
Interview on Saturday || Local to AZ only || Client round F2F || Big Data Engineer with GCP || Phoenix, AZ (locals only)
Email: [email protected]
Hello Vendors,

Hope you are doing well!

Candidates should be available on Saturday for the interview.

Please make sure the candidate's profile has not already been submitted to IMPETUS/AMEX in the last 4-5 months.

Visa: OPT EAD, H1B, or USC only. Candidates with 5 years of experience are workable.

Not workable: please do not share H4 EAD, GC EAD, or GC candidates.

Please share your interest and resume at [email protected].

Role: Big Data Engineer with GCP
Location: Phoenix, AZ (locals only)

Job Description:

We are looking for a Big Data Engineer with expertise in Google Cloud Platform (GCP) to design, develop, and optimize large-scale data processing systems. The ideal candidate will have experience working with GCP data services, big data frameworks, and data pipeline orchestration to drive scalable and efficient data solutions.

Key Responsibilities:

- Design, develop, and maintain end-to-end data pipelines on GCP.
- Work with BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and other GCP services for data processing.
- Optimize data storage, retrieval, and transformation processes for scalability and performance.
- Develop and maintain ETL/ELT pipelines using Apache Spark, Apache Beam, or Cloud Data Fusion (see the Beam sketch after this list).
- Ensure data quality, governance, and security within the cloud environment.
- Collaborate with data scientists, analysts, and application teams to deliver data-driven solutions.
- Automate data workflows and orchestration using Cloud Composer (Apache Airflow) (see the DAG sketch after this list).
- Implement real-time data streaming solutions using Pub/Sub, Kafka, or similar tools (see the Pub/Sub sketch after this list).
- Monitor and troubleshoot data pipelines to ensure reliability and performance.
- Work with Terraform, CloudFormation, or other Infrastructure as Code (IaC) tooling for environment setup and automation.
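
To make the pipeline work concrete, here is a minimal sketch of the kind of streaming Beam pipeline this role describes: read events from Pub/Sub and append them to BigQuery, runnable on Dataflow. The project, subscription, and table names (my-project, events-sub, analytics.events) and the event schema are hypothetical placeholders, not details from this posting.

"""Minimal sketch of a streaming Beam pipeline: Pub/Sub -> BigQuery.
All resource names below are hypothetical placeholders."""
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def parse_event(message: bytes) -> dict:
    # Assumes each Pub/Sub message is a JSON object with these fields.
    event = json.loads(message.decode("utf-8"))
    return {"event_id": event["id"], "ts": event["ts"], "payload": event.get("payload", "")}


def run() -> None:
    options = PipelineOptions()  # pass --project/--region/--temp_location on the CLI
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "Parse" >> beam.Map(parse_event)
            | "WriteBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()

Submitted with --runner=DataflowRunner this launches as a Dataflow streaming job; with the default DirectRunner it runs locally for testing.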
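
For the orchestration bullet, a minimal Cloud Composer DAG sketch, assuming Airflow 2.x with the Google provider package installed (as bundled with Composer 2). The DAG id, dataset, and SQL are hypothetical.

"""Minimal sketch of a Composer (Airflow 2.x) DAG running a daily BigQuery rollup."""
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_events_rollup",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Aggregates yesterday's raw events into a reporting table (assumed schema).
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                "query": """
                    SELECT DATE(ts) AS day, COUNT(*) AS events
                    FROM `my-project.analytics.events`
                    WHERE DATE(ts) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
                    GROUP BY day
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "daily_rollup",
                },
                "writeDisposition": "WRITE_APPEND",
                "useLegacySql": False,
            }
        },
    )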
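
And for the real-time streaming bullet, a minimal sketch using the google-cloud-pubsub client library directly (as opposed to reading Pub/Sub through Beam above). Topic and subscription names are again hypothetical.

"""Minimal sketch of publish/consume with google-cloud-pubsub."""
import json
from concurrent import futures

from google.cloud import pubsub_v1

PROJECT = "my-project"  # hypothetical project id


def publish_event(event: dict) -> None:
    # Publish one JSON-encoded event; publish() returns a future.
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT, "events")
    publisher.publish(topic_path, json.dumps(event).encode("utf-8")).result()


def consume(timeout: float = 30.0) -> None:
    subscriber = pubsub_v1.SubscriberClient()
    sub_path = subscriber.subscription_path(PROJECT, "events-sub")

    def callback(message: pubsub_v1.subscriber.message.Message) -> None:
        print(json.loads(message.data))
        message.ack()  # acknowledge so the message is not redelivered

    streaming_pull = subscriber.subscribe(sub_path, callback=callback)
    with subscriber:
        try:
            streaming_pull.result(timeout=timeout)
        except futures.TimeoutError:
            streaming_pull.cancel()
            streaming_pull.result()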

Required Skills & Qualifications:

- 10+ years of experience in Big Data Engineering with a focus on GCP.
- Hands-on experience with Google Cloud BigQuery, Dataflow, Dataproc, Cloud Composer (Airflow), and Pub/Sub.
- Strong programming skills in Python, Java, or Scala.
- Experience with SQL, NoSQL databases, and data warehousing concepts.
- Expertise in Apache Spark, Apache Beam, or Hadoop ecosystems.
- Familiarity with real-time data processing and streaming technologies.
- Knowledge of CI/CD, DevOps practices, and Infrastructure as Code (IaC).
- Strong understanding of data governance, security, and compliance best practices.
- Experience with Terraform, Kubernetes, or Docker is a plus.
- GCP certification (e.g., Professional Data Engineer) is a plus.

Preferred Qualifications:

- Experience working with multi-cloud or hybrid cloud environments.
- Familiarity with machine learning workflows and MLOps.
- Experience integrating GCP services with third-party tools and APIs.





