Immediate Job Opening_ GCP Data Engineer at Phoenix, Arizona, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2168548&uid=

NO GC
Role:       GCP Data Engineer

Location:   Phoenix, AZ (Onsite)

Duration:   18 Months

Experience: 11+ years

Rate:       $60/hr
SUMMARY:
This is for a Sr. Software Engineer (Data Engineer) who will join our Data Warehouse and Reporting team and be responsible for building out data products and capabilities in the data warehouse space that will be leveraged to enhance our customers' lives. The ideal candidate has solid cloud experience (GCP preferred) and a strong data engineering background.

PRINCIPAL DUTIES AND RESPONSIBILITIES:
Solid experience with, and an understanding of, the considerations for large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.
Monitor the Data Lake and Warehouse to ensure that the appropriate support teams are engaged at the right times.
Design, build, and test scalable data ingestion pipelines, and perform end-to-end automation of ETL process for various datasets that are being ingested.
Participate in peer reviews and provide feedback to engineers, keeping development best practices and business and technical requirements in view.
Determine the best way to extract application telemetry data, structure it, and send it to the proper tool for reporting (Kafka, Splunk).
Work with business and cross-functional teams to gather and document requirements to meet business needs.
Provide support as required to ensure the availability and performance of ETL/ELT jobs.
Provide technical assistance and cross-training to business and internal team members.
Collaborate with business partners for continuous improvement opportunities.

POSITION SPECIFICATIONS:
5+ years of experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics.
4+ years of experience with one of the leading public clouds.
4+ years of experience designing and building scalable data pipelines that handle extraction, transformation, and loading.
4+ years of experience with Python and Scala, with working knowledge of notebooks.
2+ years of hands-on experience on GCP cloud data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.).
At least 2 years of experience in Data Governance and Metadata Management.
Ability to work independently, solve problems, and keep stakeholders updated.
Analyze, design, develop, and deploy solutions as per business requirements.
Strong understanding of relational and dimensional data modeling.
Experience in DevOps and CI/CD-related technologies.
Excellent written and verbal communication skills, including experience with technical documentation and the ability to communicate with senior business managers and executives.

Thank you!

Respectfully Yours,

Kushal Raghuvanshi || Technical Recruiter

Zenith Infotek LLC
4400 State Hwy 121, Suite 335, Lewisville, TX 75056

10:02 PM 12-Feb-25