
GCP Big Data Engineer :Scottsdale, Arizona (Onsite): LTI Mindtree at Scottsdale, Arizona, USA
Email: [email protected]
Hi,

Hope you are doing well. We have multiple requirements on ServiceNow. Please go through the requirements and share your updated resume along with a visa copy, DL copy, and employer contact details.

Role: GCP Big Data Engineer

Location: Scottsdale, Arizona(onsite)

Job description

JOB SUMMARY PRINCIPAL DUTIES

Solid experience with, and understanding of, considerations for large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must

Monitor the Data Lake constantly and ensure that the appropriate support teams are engaged at the right times

Design, build, and test scalable data ingestion pipelines; perform end-to-end automation of the ETL process for the various datasets being ingested

Determine the best way to extract application telemetry data, structure it, and send it to the proper tool for reporting (Kafka, Splunk)

Hands-on experience with data cataloging and metadata management tools (Collibra, Dataplex, Alation)

Create reports to monitor usage data for billing and SLA tracking

Work with business and cross-functional teams to gather and document requirements to meet business needs

Provide support as required to ensure the availability and performance of ETL/ELT jobs

Provide technical assistance and cross-training to business and internal team members

Collaborate with business partners for continuous improvement opportunities

Requirements

JOB SPECIFICATIONS

Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field

Experience, Skills, and Qualifications

6 years of experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics

4 years of experience with one of the leading public clouds

4 years of experience in the design and build of scalable data pipelines that handle extraction, transformation, and loading

4 years of experience with Python, with working knowledge of Notebooks

2 years of experience with Kafka, Pub/Sub, Docker, Kubernetes

2 years of hands-on experience on GCP Cloud data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.)

At least 2 years of experience in Data Governance and Metadata Management

Ability to work independently, solve problems, and update stakeholders

Analyze, design, develop, and deploy solutions per business requirements

Strong understanding of relational and dimensional data modeling

Experience in DevOps and CI/CD-related technologies

Excellent written and verbal communication skills, including experience in technical documentation and the ability to communicate with senior business managers and executives

Thanks & regards, 

Abhinay Kumar

Sr Recruitment Specialist

MSR Technology Group LLC

Work: +1-925-290-7931 Ext: 166

Email: [email protected] / [email protected]

https://www.linkedin.com/in/abhinay-kumar-reddy-malipatel/

09:29 PM 26-Nov-24

