Data Engineer with heavy Databricks at Remote, Remote, USA
Email: [email protected]
From: Syeda Hajra, Absolute IT [email protected]
Reply to: [email protected]

Data Engineer (Hybrid, PA; open to USC and GC only; must have strong Databricks)

Mandatory Skills:
- 5+ years of data engineering
- 3+ years in Python, Spark (PySpark), SQL
- Databricks platform
- Data pipelines
- Cloud (AWS, Azure, GCP, or similar)
- Microservices

Location: 3 days/week onsite in Bethlehem, PA.

Databricks is a cloud-based data engineering platform widely used by companies to process, transform, and explore large quantities of data, including through machine learning models. It allows organizations to quickly realize the full potential of combining their data, ETL processes, and machine learning. A rough illustration of this kind of PySpark pipeline work appears after the listing.

We are seeking a Data Engineer for a CONTRACT position in Bethlehem, PA (hybrid, 3 days/week).

Required Skills:
- 5+ years of data engineering experience.
- 3+ years of hands-on experience in Python, Spark (PySpark), and SQL.
- 1+ years of experience working within the Databricks platform (mandatory).
- 3+ years of experience building end-to-end production data pipelines.
- 3+ years of experience working within a cloud ecosystem (AWS, Azure, GCP, or similar).
- 1+ years of experience building microservices architectures using containerization technologies such as Docker or Kubernetes.
- Knowledge of data warehousing concepts.
- Proficiency in understanding and incorporating software engineering principles into the design and development process.
- Efficiency in handling data: tracking data lineage, ensuring data quality, and improving data reliability.
- Excellent verbal and written communication; proven interpersonal skills and the ability to convey key insights from complex analysis in summarized terms; ability to communicate effectively with technical teams.

Keywords: information technology, green card, Pennsylvania
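As a rough, hypothetical illustration of the kind of PySpark pipeline work the posting describes (reading raw data, applying basic quality checks, and writing aggregated output), here is a minimal sketch. The file paths and column names (events.csv, event_ts, region, amount) are placeholders, not part of the posting, and on Databricks the built-in spark session and a Delta table target would normally be used instead.

    # Minimal PySpark sketch: read raw CSV, clean it, aggregate, write output.
    # All paths and column names below are illustrative placeholders.
    from pyspark.sql import SparkSession, functions as F

    # On Databricks a SparkSession already exists as `spark`; this builder is
    # only needed when running the sketch locally.
    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    raw = (
        spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv("/tmp/events.csv")  # placeholder input path
    )

    # Basic quality step: drop rows missing key fields, derive an event date.
    cleaned = (
        raw.dropna(subset=["event_ts", "region", "amount"])
           .withColumn("event_date", F.to_date("event_ts"))
    )

    # Aggregate daily totals per region.
    daily_totals = (
        cleaned.groupBy("event_date", "region")
               .agg(F.sum("amount").alias("total_amount"),
                    F.count("*").alias("event_count"))
    )

    # On Databricks this would typically be written as a Delta table;
    # plain Parquet is used here so the sketch also runs outside Databricks.
    daily_totals.write.mode("overwrite").parquet("/tmp/daily_totals")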
Posted: 01:57 PM 28-Feb-23