| Urgent Hiring for the Position || GCP Data Engineer with Matillion || 100% Remote, USA |
| Email: [email protected] |
|
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=3029107&uid=0aab376ec6f4493b8f3ed14dc9b3e605

Please do not share a candidate who has already been submitted to HCL America.

Job Title: GCP Data Engineer with Matillion
Location: 100% Remote
Duration: Long Term
Implementation Partner: HCL America
Mandatory Skills: GCP BigQuery, Dataflow, Python, Cloud Composer, Data Governance & Validation

Role Overview
The Senior Data Engineer will be responsible for designing, developing, and maintaining scalable data pipelines and solutions on Google Cloud Platform (GCP). The role focuses on data ingestion, transformation, orchestration, and governance across enterprise data systems. The engineer will work closely with architects, data analysts, and business stakeholders to implement high-quality, performance-optimized data solutions in BigQuery and related GCP services.

Key Responsibilities:
- Design and implement end-to-end data pipelines using Cloud Dataflow (Python/Apache Beam) for batch and streaming data.
- Develop, optimize, and maintain BigQuery stored procedures (SPs), SQL scripts, and user-defined functions (UDFs) for complex transformations and business logic.
- Build and manage data orchestration workflows using Cloud Composer (Airflow) with appropriate operators and dependencies.
- Establish secure and efficient connections to source systems for data ingestion and integration.
- Manage data ingestion workflows from on-premises and cloud sources into Google Cloud Storage (GCS) and BigQuery.
- Execute historical data migration from legacy data warehouses (preferably Snowflake, Teradata, Netezza, Oracle, SQL Server) to BigQuery, ensuring accuracy and performance optimization.
- Design and maintain data validation and testing frameworks to ensure data quality and reliability across pipelines.
- Implement data governance practices, including metadata management, lineage tracking, and access control.
- Collaborate with data analysts and architects to define scalable, reusable data models, views, and semantic layers in BigQuery.
- Troubleshoot data pipeline failures, perform root cause analysis, and implement preventive measures.
- Optimize the cost and performance of GCP workloads using BigQuery and Dataflow best practices.

Technical & Soft Skills Required:
- Strong hands-on experience with Google Cloud Platform (GCP), with expertise in BigQuery, Dataflow, Cloud Composer, and GCS.
- Proficiency in Python programming for data engineering (Dataflow pipelines, validation scripts, automation).
- Expertise in BigQuery SQL, stored procedures, views, and user-defined functions (UDFs).
- Experience migrating data from on-premises or cloud data warehouses (preferably Snowflake, Teradata, Netezza, Oracle, SQL Server) to BigQuery.
- Strong understanding of ETL/ELT frameworks, data modeling, and schema design (star/snowflake).
- Familiarity with data governance, metadata, and lineage frameworks on GCP.
- Knowledge of data validation and testing techniques, including reconciliation, rule-based checks, and automation.
- Hands-on experience in workflow orchestration using Cloud Composer (Airflow) with custom operators.
- Strong SQL tuning and BigQuery performance optimization skills.
- Good communication and collaboration skills for working with cross-functional teams in Agile environments.
- Proven ability to troubleshoot, optimize, and enhance complex data workflows at scale.

Thanks & Regards,
Mansi Sen | LinkedIn
US IT Recruiter | KTEK Resourcing
[email protected]
Address: 2277 Plaza Dr., Suite 240, Sugar Land, TX 77479
We are an E-Verify participating employer.
| [email protected] |
| 08:38 PM 06-Jan-26 |