GCP Data Engineer at Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=1485747&uid=
From: Arun kumar, yochana IT [email protected]
Reply to: [email protected]

Sr. Cloud Data Engineer - USA, remote position, Contract

Mandatory Skills: Data Fabric - Big Data Processing - Apache Spark

Job Description:
10 to 15 years of proven experience in modern cloud data engineering, broad exposure to the wider data landscape, and solid software engineering experience.
Prior experience architecting and building successful self-service, enterprise-scale data platforms in a greenfield environment with a microservices-based architecture.
Proficiency in building end-to-end data platforms and data services in GCP is a must.
Proficiency in tools and technologies: BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, Pub/Sub.
Experience with microservices architectures: Kubernetes, Docker. Our microservices are built on a TypeScript, NestJS, NodeJS stack; candidates with this experience are preferred.
Experience building semantic layers.
Remote position; engineers will work Eastern US business hours.
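For candidates gauging fit, a minimal sketch of the kind of pipeline this stack implies: an Airflow DAG that stages exported files from Cloud Storage into a BigQuery staging table and then runs a transformation query. It assumes an Airflow 2.4+ deployment with the apache-airflow-providers-google package installed; the project, dataset, bucket, and table names are hypothetical placeholders, not details from the posting.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="events_daily_load",           # hypothetical pipeline name
    schedule="0 6 * * *",                 # daily, after upstream exports land
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Stage newline-delimited JSON exports from a GCS bucket into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="example-raw-exports",                                  # placeholder bucket
        source_objects=["events/{{ ds }}/*.json"],
        destination_project_dataset_table="example_project.staging.events_raw",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

    # Move the staged rows into a curated table for the day being processed.
    transform = BigQueryInsertJobOperator(
        task_id="transform_events",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `example_project.curated.events`
                    SELECT * FROM `example_project.staging.events_raw`
                    WHERE DATE(event_ts) = '{{ ds }}'
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform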