
C2C Requirements GCP Data Engineer-San Jose, CA, local at San Jose, California, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2139438&uid=

Title: GCP Data Engineer
Location: San Jose, CA, local
Visa: No Sponsorship

Must have -
GCP, Python, HANA, Pub/Sub, GCS, BigQuery, PySpark, Digital, Data warehouse

Position: Data Engineer with extensive experience in building data integration pipelines in a CI/CD model
Experience: Lead 12+ years, Sub-Leads 7+ years of experience

- Ability to design and develop a high-performance data pipeline framework from scratch

o    Data ingestion across systems
o    Data quality and curation
o    Data transformation and efficient data storage
o    Data reconciliation, monitoring and controls
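The reconciliation and controls responsibility above can be illustrated with a minimal, hypothetical Python sketch (the `reconcile` function and its fields are illustrative, not from the posting): it compares row counts and a column checksum between a source extract and its loaded target, the kind of control a pipeline framework would run before sign-off.

```python
# Hypothetical sketch of a data reconciliation control: compare row
# counts and a column checksum between source and target datasets.

def reconcile(source_rows, target_rows, key="amount"):
    """Return a small report flagging count and checksum mismatches."""
    src_count, tgt_count = len(source_rows), len(target_rows)
    src_sum = sum(r[key] for r in source_rows)
    tgt_sum = sum(r[key] for r in target_rows)
    return {
        "count_match": src_count == tgt_count,
        "checksum_match": src_sum == tgt_sum,
        "source_count": src_count,
        "target_count": tgt_count,
    }

# Example: a clean load where source and target agree.
source = [{"amount": 10.0}, {"amount": 20.5}]
target = [{"amount": 10.0}, {"amount": 20.5}]
report = reconcile(source, target)
```

In a real GCP pipeline the same check would typically run as a query against BigQuery rather than over in-memory rows, with mismatches routed to monitoring.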

- Support reporting model and other downstream application needs

o    Skill in technical design documentation, data modeling and performance tuning of applications

o    Lead and manage a team of data engineers, contribute to code reviews, and guide the team in designing and developing complex data pipelines adhering to the defined standards.

o    Be hands-on, perform POCs on the open-source/licensed tools in the market and share recommendations.

o    Provide technical leadership and contribute to the definition, development, integration, testing, documentation and support across multiple platforms (GCP, Python, HANA)

o    Establish a consistent project management framework and develop processes to deliver high-quality software, in rapid iterations, for the business partners in multiple geographies

o    Participate in a team that designs, develops, troubleshoots, and debugs software programs for databases, applications, tools etc.

o    Experience in balancing production platform stability, feature delivery and reduction of technical debt across a broad landscape of technologies.

o    Skill in the following platforms, tools and technologies:

- GCP cloud platform

GCS, BigQuery, streaming (Pub/Sub), Dataproc and Dataflow

- Python, PySpark, Kafka, SQL, scripting & stored procs

- Data warehouse, distributed data platforms and data lake

- Database definition, schema design, Looker Views, Models

- CI/CD pipeline
o    Proven track record in scripting code in Python, PySpark and SQL
o    Excellent structured thinking skills, with the ability to break down multi-dimensional problems
o    Ability to navigate ambiguity and work in a fast-moving environment with multiple stakeholders
o    Good communication skills and ability to coordinate and work with cross-functional teams.

Digital : Python~Digital : Google Cloud

Thanks & Regards

Shalini

Sr. Technical Recruiter
-------------------------------------
SR Talent Solution INC.

--

01:13 AM 04-Feb-25

