
ETL Lead with PySpark expertise - Minneapolis, MN - Onsite (Minneapolis, Minnesota, USA)
Email: [email protected]
From:

Mahesh Kumar,

KK Associates LLC

[email protected]

Reply to: [email protected]

Hi,

          We have the below job opportunity with one of our clients. If you have any matching profiles, please get in touch with me.

Role name: Developer

Role Description:

ETL Developer (5+ years) to create data pipeline ETL jobs using AWS Glue and PySpark within the financial services industry.

Responsibilities:
- Work with a scrum team (or teams) to deliver product stories according to priorities set by the business and the Product Owners.
- Interact with stakeholders.
- Provide knowledge transfer to other team members.
- Create and test pipeline jobs locally using AWS Glue interactive sessions.
- Performance-tune PySpark jobs.
- Use AWS Athena to perform data analysis on lake data populated into the AWS Glue Data Catalog through AWS Glue crawlers.

Must Haves:
- Responsible for designing, developing, and maintaining ETL processes to support data integration and business intelligence initiatives.
- Work closely with stakeholders to understand data requirements and ensure efficient data flow and transformation using ETL tools and PySpark.
- Develop and implement ETL processes using an ETL tool and PySpark to extract, transform, and load data.
- 4+ years of experience in ETL development with knowledge of PySpark.
- 5+ years as an ETL Developer.
- SQL expert.
- AWS Glue with Python (PySpark).
- PySpark DataFrame API.
- Spark SQL.
- Knowledge of AWS services (e.g., DMS, S3, RDS, Redshift, Step Functions).

Nice to Haves:
- ETL development experience with tools such as SAP BODS or Informatica.
- Good understanding of version control tools like Git, GitHub, TortoiseHg.
- Financial services experience.
- Agile.

Competencies: Digital : PySpark

Experience (Years): 6-8

Essential Skills: Same as Role Description above.

Desirable Skills: Same as Role Description above.

Country: United States

Branch | City | Location: Minneapolis Downtown, MN | MINNEAPOLIS | Minneapolis, MN

Keywords: ETL lead with PySpark expertise

Email is the best way to reach me if I missed your call.

Regards,

Mahesh Kumar

KK Associates LLC.

8751 Collin McKinney Pkwy, # 1302, McKinney, TX 75070

555 Metro Place North, Suite # 100, Dublin, OH 43017
Email: [email protected]

Web: www.kksoftwareassociates.com

Posted: 09:33 PM 26-Dec-24

