
Data Engineer (Remote, USA)
Email: [email protected]

Responsibilities

Contribute to the migration of a legacy data warehouse to a Google Cloud-based data warehouse for a major telecom client.

Collaborate with Data Product Managers and Data Architects to design, implement, and deliver successful data solutions

Help architect data pipelines for the underlying data warehouse and data marts

Design and develop complex ETL pipelines in Google Cloud data environments.

Our legacy tech stack includes Teradata; the new stack includes GCP data technologies such as BigQuery and Airflow, with SQL and Python as the primary languages.

Maintain detailed documentation of your work and changes to support data quality and data governance

Support QA and UAT data testing activities

Support deployment activities to higher environments

Ensure high operational efficiency and quality of your solutions to meet SLAs and support our commitment to our customers (Data Science and Data Analytics teams)

Be an active participant in and advocate of Agile/Scrum practices to support the health and continuous improvement of your team's processes

Basic Qualifications

8+ years of data engineering experience developing large data pipelines in complex environments

Very strong SQL skills and the ability to build complex data transformation pipelines using a custom ETL framework in a Google BigQuery environment

Exposure to Teradata and the ability to understand complex Teradata BTEQ scripts

Strong Python programming skills

Strong skills in building Airflow jobs and debugging issues

Ability to optimize queries in BigQuery

Hands-on experience with Google Cloud data technologies (GCS, BigQuery, Dataflow, Pub/Sub, Data Fusion, Cloud Functions)

Preferred Qualifications

Experience with the cloud data warehouse technology BigQuery

Nice to have: experience with GCP cloud technologies (GCS, Dataproc, Pub/Sub, Dataflow, Data Fusion, Cloud Functions)

Nice to have: exposure to Teradata

Solid experience with job orchestration tools like Airflow and the ability to build complex jobs

Experience writing and maintaining large data pipelines using a custom ETL framework

Ability to automate jobs using Python

Familiarity with data modeling techniques and data warehousing methodologies and best practices

Solid experience with code version control repositories such as GitHub

Good scripting skills, including Bash and Python

Familiarity with Scrum and Agile methodologies

Problem solver with strong attention to detail and excellent analytical and communication skills

Ability to work in an onsite/offshore model and to lead a team.

--

Posted: 02:42 AM 01-Mar-25