
Data Engineer with GCP :: Scottsdale, AZ :: Local candidates only :: H-1B only :: Scottsdale, Arizona, USA
Email: [email protected]
Role: Data Engineer with GCP

Location: Scottsdale, AZ (Onsite)

Note: H-1B holders with passport only

Need local candidates only.

GCP + Python is the key skillset; Java is good to have.

Apache Airflow, Cloud Composer, Dataflow, Dataproc, BigQuery, Database Migration Service (DMS), Java, Python, Scala, Pub/Sub, Cloud Data Fusion, Analytics Hub, Workflows, Datastream, Dataform, Big Data / Hadoop ecosystem, ANSI SQL

Job Summary:

Solid experience with, and an understanding of, the considerations for large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.

Works on creating a framework for ML Ops and LLM Ops.

Works on the processes and flow of the AI Compliance and Governance framework.

Monitors the Data Lake constantly and ensures that the appropriate support teams are engaged at the right times.

Creates reports to monitor usage data for billing and SLA tracking.

Works with business and cross-functional teams to gather and document requirements to meet business needs.

Provides support as required to ensure the availability and performance of ETL/ELT jobs.

Provides technical assistance and cross-training to business and internal team members.

Collaborates with business partners on continuous improvement opportunities.

Requirements

Experience, Skills, and Qualifications:

6 years of experience in Data Engineering, with an emphasis on Data Warehousing and Data Analytics.

Very strong hands-on Python expertise.

Strong Data and GCP Vertex AI knowledge.

6 years of experience with one of the leading public clouds.

6 years of experience in the design and build of scalable data pipelines covering extraction, transformation, and loading.

6 years of experience with Python/Scala, with working knowledge of Notebooks.

6 years of hands-on experience on GCP Cloud data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.).

At least 5 years of experience in Data Governance and Metadata Management.

Ability to work independently, solve problems, and keep stakeholders updated.

Analyze, design, develop, and deploy solutions per business requirements.

Strong understanding of relational and dimensional data modeling.

Experience with DevOps and CI/CD-related technologies.

Excellent written and verbal communication skills, including experience in technical documentation and the ability to communicate with senior business managers and executives.

Skills

GCP, Gemini

Thanks,

Ahsan Khan

--------------------------------------

Senior Technical Recruiter

[email protected]

10:04 PM 27-Nov-24

