
GCP Data Engineer - Philadelphia, PA - Onsite

Greetings,

My name is Sukumar, and I represent GAC Solution Inc., a staff augmentation firm providing a wide range of on-demand talent and total workforce solutions. We have strong domain expertise across all verticals; repositioning professionals is what we do, and we do it well.

I am reaching out to you today as your profile matches an immediate job opportunity we have with our premier client.

Please take a look at the job description below and let me know your interest.

Role: GCP Data Engineer

Location: Philadelphia, PA - Onsite

Type: Contract

Job Description:

We are seeking a highly skilled Data Engineer to design, build, and maintain scalable data platforms that enable large-scale ingestion, storage, processing, and analysis of structured and unstructured data. This role will focus on constructing data products (data lake / data warehouse), optimizing data pipelines, and implementing robust ETL workflows to support analytics, machine learning, and operational reporting.

The ideal candidate will be proficient in distributed computing, cloud-based data architectures (GCP), and modern data processing frameworks. Experience with real-time data streaming (Kafka, Apache Beam), MLOps, and infrastructure automation (Terraform, Jenkins) is highly preferred.

Key Responsibilities

Data Platform & Architecture Development

Design, implement, and maintain scalable data platforms for efficient data storage, processing, and retrieval.

Build cloud-native and distributed data systems that enable self-service analytics, real-time data processing, and AI-driven decision-making.

Develop data models, schemas, and transformation pipelines that support evolving business needs while ensuring operational stability.

Apply best practices in data modeling, indexing, and partitioning to optimize query performance and cost efficiency, keeping sustainability best practices in mind.

ETL, Data Pipelines & Streaming Processing

Build and maintain highly efficient ETL pipelines using SQL and Python to process large-scale datasets.

Implement real-time data streaming pipelines using Kafka, Apache Beam, or equivalent technologies.

Develop reusable internal data processing tools to streamline operations and empower teams across the organization.

Write advanced SQL queries for extracting, transforming, and loading (ETL) data with a focus on execution efficiency.

Ensure data validation, quality monitoring, and governance using automated processes and dashboards.

MLOps & Cloud-Based Data Infrastructure

Deploy machine learning pipelines with MLOps best practices to support AI and predictive analytics applications.

Optimize data pipelines for ML models, ensuring seamless integration between data engineering and machine learning workflows.

Work with cloud platforms (GCP) to manage data storage, processing, and security.

Utilize Terraform, Jenkins, and other CI/CD tools to automate data pipeline deployments and infrastructure management.

Collaboration & Agile Development

Work in Agile/DevOps teams, collaborating closely with data scientists, software engineers, and business stakeholders.

Advocate for data-driven decision-making, educating teams on best practices in data architecture and engineering.

Required Skills & Qualifications

5+ years of experience as a Data Engineer working with large-scale data processing.

Strong proficiency in SQL for data transformation, optimization, and analytics.

Expertise in programming languages (Python, Java, Scala, or Go) with an understanding of functional and object-oriented programming paradigms.

Experience with distributed computing frameworks.

Proficiency in cloud-based data engineering on AWS, GCP, or Azure.

Strong knowledge of data modeling, data governance, and schema design.

Experience with CI/CD tools (Jenkins, Terraform) for infrastructure automation.

Preferred Qualifications

Experience with real-time data streaming (Kafka, or equivalent).

Strong understanding of MLOps and integrating data engineering with ML pipelines.

Familiarity with knowledge graphs and GraphQL APIs for data relationships.

Background in retail, customer classification, and personalization systems.

Knowledge of business intelligence tools and visualization platforms.


I would also appreciate it if you could send me an updated resume.

Best Regards,

Sukumar

Sr Technical Recruiter

E: [email protected]

www.gacsol.com  

www.muffins.ai

Experts in Digitalization and Platform Engineering - Enterprise 4.0

11:41 PM 25-Mar-25






