Urgent requirement for Big Data Engineer with Spark, Scala at Remote (USA)
Email: [email protected]
From: Praveen Kumar, Magicforce ([email protected])
Reply to: [email protected]
Job Title: Data Engineer with Spark, Scala
Location: Remote
Duration: 1+ year

Job Description:

Mandatory Skills: Databricks, Spark, Scala

- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Databricks Engineer with a focus on AWS.
- Strong programming skills in languages such as Python or Scala.
- Experience with data engineering tools and technologies, including ETL frameworks.
- Knowledge of cloud computing platforms, especially AWS.
- Familiarity with big data processing frameworks such as Apache Spark.
- Ability to optimize and fine-tune Databricks clusters for performance.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills.

Preferred Qualifications:

- AWS certification in Big Data or a related field.
- Previous experience with data lakes and data warehousing solutions.
- Familiarity with data governance and compliance standards.

Keywords: Urgent requirement for Big Data Engineer with Spark, Scala at Remote
02:24 AM 19-Feb-25