Urgent Hiring For Cloud Data Engineer (Lead) :: REMOTE (USA)
Email: [email protected]
From: Kayla Cooper, Adame Services <[email protected]>
Reply to: [email protected]

Hi Partner,

We are hiring for the job role below. Please take a look and let me know if you have any suitable candidates.

Title: Cloud Data Engineer (Lead)
Location: 100% remote (USA)
Duration: 1 year
Must-have skills: Google Cloud Composer, Dataflow, Dataproc clusters, Apache Beam, Hadoop, BigQuery

Job Description:
We are seeking a skilled Cloud Data Engineer with a minimum of 5 years of experience designing, building, and optimizing data pipelines. The ideal candidate will have expertise in Google Cloud Platform (GCP) technologies, including Google Cloud Composer, Dataflow, Dataproc clusters, Apache Beam, Hadoop, and BigQuery.

Key Responsibilities:
- Design, develop, and maintain robust, scalable data pipelines using GCP technologies such as Google Cloud Composer, Dataflow, and Dataproc clusters.
- Use Apache Beam to build data processing pipelines, ensuring efficient data ingestion, transformation, and loading.
- Optimize data pipelines for performance and reliability, ensuring high availability and low latency.
- Collaborate with cross-functional teams to gather requirements and design data solutions that meet business needs.
- Implement process automation and application development using Java, Python, Unix shell scripting, and PL/SQL.
- Work within Agile software development lifecycle disciplines, including analysis, design, coding, and testing.
- Build high-performance, low-latency Big Data pipelines using Hadoop, PySpark, and Apache Beam.
- Experience with database Change Data Capture (CDC) ingestion and streaming ingestion applications such as Striim and NiFi is a plus.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Minimum of 5 years of experience in data engineering roles.
- Strong proficiency in GCP services such as Google Cloud Composer, Dataflow, Dataproc clusters, and BigQuery.
- Hands-on experience with Apache Beam for building data processing pipelines.
- Proficiency in programming languages such as Java and Python, plus Unix shell scripting and PL/SQL.
- Experience with database CDC ingestion and streaming ingestion applications is desirable.
- Familiarity with Agile software development practices.
- Excellent problem-solving and analytical skills with strong attention to detail.
- Strong communication and interpersonal skills, with the ability to work effectively in a collaborative team environment.

If you meet the qualifications and are passionate about leveraging cloud technologies to build scalable, efficient data pipelines, we encourage you to apply.

Thanks & Regards,
Kayla Cooper | Technical Recruiter
C: +13473777344
www.adameservices.com
https://www.linkedin.com/in/khushboo-kundra-5b8a47251/
Adame Services
1309 Coffeen Avenue STE 1200, Sheridan, Wyoming 82801, USA
Posted: 10:03 PM, 21-Mar-24