Data Engineer with GCP or GCP Data Engineer at Remote, Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48 https://jobs.nvoids.com/job_details.jsp?id=2080427&uid=

Hi,

One of our clients is looking for a Data Engineer with GCP (GCP Data Engineer) for a HYBRID position in Dallas, Texas (locals only).

Role : Data Engineer with GCP or GCP Data Engineer
Location : HYBRID (Dallas, Texas) (Locals Only)
Client : CTS
Experience : 8+ years
Rate : $55/hr. on C2C
Required Skills : GCP, Python, Apache Spark, BigQuery, PySpark, Cloud Composer, Cloud Dataflow, Cloud Dataproc, Cloud SQL, Data Fusion

Responsibilities :
The GCP Data Engineer will create, deliver, and support custom data products, as well as enhance and expand team capabilities. They will analyze and manipulate large datasets supporting the enterprise by activating data assets to support enabling platforms and analytics. Google Cloud Data Engineers will be responsible for designing transformation and modernization on Google Cloud Platform using GCP services.

- Build data systems and pipelines on GCP using Dataproc, Dataflow, Data Fusion, BigQuery, and Pub/Sub (see the PySpark sketch after this list)
- Implement schedules, workflows, and tasks for Cloud Composer/Apache Airflow (see the Airflow DAG sketch after this list)
- Create and manage data storage solutions using GCP services such as BigQuery, Cloud Storage, and Cloud SQL
- Monitor and troubleshoot data pipelines and storage solutions using GCP's Stackdriver and Cloud Monitoring
- Develop efficient ETL/ELT pipelines and orchestration using Dataprep and Google Cloud Composer
- Develop and maintain data ingestion and transformation processes using PySpark and Dataflow
- Automate data processing tasks using scripting languages such as Python or Bash
- Ensure data security and compliance with industry standards by configuring IAM roles, service accounts, and access policies
- Automate cloud deployments and infrastructure management using Infrastructure as Code (IaC) tools such as Terraform or Google Cloud Deployment Manager
- Participate in code reviews and contribute to development best practices and the use of developer-assist tools to create robust, fail-safe data pipelines
- Collaborate with Product Owners, Scrum Masters, and Data Analysts to deliver user stories and tasks and ensure deployment of pipelines
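For illustration only, below is a minimal PySpark sketch of the kind of ingestion-and-transform job described above: it reads raw files from Cloud Storage and writes to BigQuery, assuming the spark-bigquery connector is available on the Dataproc cluster. The bucket, project, dataset, table, and column names are hypothetical placeholders, not details from this posting.

    # Illustrative sketch only -- all bucket, project, dataset, and column
    # names below are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("orders-ingest-sketch")  # hypothetical job name
        .getOrCreate()
    )

    # Read raw CSV files landed in Cloud Storage (path is a placeholder).
    raw = (
        spark.read
        .option("header", True)
        .csv("gs://example-landing-bucket/orders/*.csv")
    )

    # Basic cleanup: type the amount column and drop rows without a key.
    clean = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("order_id").isNotNull())
    )

    # Write to BigQuery via the spark-bigquery connector; a temporary GCS
    # bucket is required for the indirect write method.
    (
        clean.write
        .format("bigquery")
        .option("table", "example_project.analytics.orders")
        .option("temporaryGcsBucket", "example-temp-bucket")
        .mode("append")
        .save()
    )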
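Likewise, a minimal Cloud Composer (Apache Airflow) DAG sketch is shown below: it submits the PySpark job above to an existing Dataproc cluster and then builds a reporting table in BigQuery. The DAG id, schedule, project, region, cluster, script path, and query are assumptions for illustration, not requirements from this posting.

    # Illustrative sketch only -- DAG id, project, cluster, paths, and query
    # are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
    from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

    with DAG(
        dag_id="daily_orders_pipeline",   # hypothetical DAG id
        schedule_interval="0 6 * * *",    # run daily at 06:00 UTC
        start_date=datetime(2025, 1, 1),
        catchup=False,
        tags=["gcp", "sketch"],
    ) as dag:

        # Submit the PySpark ingestion job to an existing Dataproc cluster.
        ingest = DataprocSubmitJobOperator(
            task_id="ingest_orders",
            region="us-central1",
            project_id="example-project",
            job={
                "placement": {"cluster_name": "example-cluster"},
                "pyspark_job": {
                    "main_python_file_uri": "gs://example-code-bucket/ingest_orders.py"
                },
            },
        )

        # Build a daily reporting table in BigQuery from the ingested data.
        build_report = BigQueryInsertJobOperator(
            task_id="build_daily_report",
            configuration={
                "query": {
                    "query": (
                        "SELECT order_date, SUM(amount) AS total "
                        "FROM analytics.orders GROUP BY order_date"
                    ),
                    "useLegacySql": False,
                }
            },
        )

        ingest >> build_report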
Thanks and Regards,

Madhusudana
Recruitment Lead
Desk : (408) 675-2480 | Ext 1000
Email : [email protected]
Reqroute, Inc
1879 Lundy Ave, # 228, San Jose, CA 95131

DISCLAIMER: The information transmitted is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. Any review, retransmission, dissemination or other use of, or taking of any action in reliance upon, this information by persons or entities other than the intended recipient is prohibited. If you received this in error, please contact the sender and delete the material from any computer. If you want to be REMOVED, please reply with REMOVE in the subject line of this email.
01:44 AM 15-Jan-25