
ETL Developer with AWS Redshift:: Charlotte, NC at Charlotte, North Carolina, USA
Email: prem@brightsol.ai
https://rb.gy/r1ud0k
https://jobs.nvoids.com/job_details.jsp?id=2237863&uid=
From: Prem, brightsol.ai (prem@brightsol.ai)

Reply to: prem@brightsol.ai

Role: ETL Developer with AWS Redshift

Location: Charlotte, NC

Contract: Long term

Only H-1B candidates

Looking for profiles with 10+ years of experience

Passport Number Required

Job Description:

An "ETL Developer with AWS Redshift" job description would seek a skilled professional responsible for designing, developing, and maintaining data pipelines using AWS Redshift, focusing on extracting data from various sources, transforming it as needed, and loading it into the Redshift data warehouse for analysis, requiring strong proficiency in SQL, data warehousing concepts, and AWS services, particularly Redshift functionalities. 

Key Responsibilities:

Data Extraction:

Design and implement data extraction processes from diverse sources like databases, APIs, flat files, and applications using appropriate ETL tools and techniques. 
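
For illustration only, a minimal Python sketch of one such extraction path, assuming a hypothetical REST endpoint and S3 landing bucket (none of these names come from the posting):

import csv
import io

import boto3
import requests

API_URL = "https://example.com/api/orders"  # hypothetical source endpoint
BUCKET = "example-etl-landing-zone"         # hypothetical S3 landing bucket
KEY = "raw/orders/orders.csv"

def extract_to_s3():
    # Pull the source records; a real job would handle paging and retries.
    records = requests.get(API_URL, timeout=30).json()

    # Write the records to an in-memory CSV buffer.
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=["order_id", "customer_id", "amount"])
    writer.writeheader()
    for rec in records:
        writer.writerow({field: rec.get(field) for field in writer.fieldnames})

    # Stage the flat file in S3 so a downstream COPY can pick it up.
    boto3.client("s3").put_object(Bucket=BUCKET, Key=KEY, Body=buffer.getvalue())

if __name__ == "__main__":
    extract_to_s3()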

Data Transformation:

Perform data cleaning, manipulation, aggregation, and transformation to ensure data quality and consistency for analysis within Redshift. 
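
One common approach, sketched here with pandas purely for illustration (the posting does not prescribe a tool; transformations can equally be done in SQL inside Redshift). File and column names are placeholders:

import pandas as pd

def transform(path_in: str, path_out: str) -> None:
    df = pd.read_csv(path_in)

    # Basic cleaning: drop duplicate orders and default missing amounts to 0.
    df = df.drop_duplicates(subset=["order_id"])
    df["amount"] = df["amount"].fillna(0)

    # Aggregate to one summary row per customer for the reporting layer.
    summary = df.groupby("customer_id", as_index=False).agg(
        total_amount=("amount", "sum"),
        order_count=("order_id", "count"),
    )
    summary.to_csv(path_out, index=False)

if __name__ == "__main__":
    transform("orders.csv", "customer_summary.csv")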

Data Loading:

Efficiently load transformed data into Redshift tables, optimizing loading strategies for large data volumes. 
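
A minimal loading sketch, assuming the staged S3 file from the steps above and a placeholder cluster endpoint and IAM role; Redshift's COPY command is the usual bulk-load path because it parallelizes across slices:

import psycopg2

COPY_SQL = """
    COPY analytics.orders
    FROM 's3://example-etl-landing-zone/raw/orders/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
    FORMAT AS CSV
    IGNOREHEADER 1;
"""

def load():
    # Placeholder connection details; use Secrets Manager or IAM
    # authentication rather than hard-coded credentials in practice.
    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
        port=5439,
        dbname="analytics",
        user="etl_user",
        password="...",
    )
    try:
        with conn, conn.cursor() as cur:
            cur.execute(COPY_SQL)  # COPY loads in parallel across slices
    finally:
        conn.close()

if __name__ == "__main__":
    load()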

Redshift Query Optimization:

Write optimized SQL queries to retrieve data from Redshift, taking advantage of its distributed processing capabilities to ensure fast query performance. 
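
As an illustrative sketch of what that can look like, a query that range-restricts on a sort key and joins on a distribution key, with EXPLAIN used to confirm the plan (table and column names are hypothetical):

QUERY = """
    SELECT d.region, SUM(f.amount) AS total_amount
    FROM analytics.fact_sales f
    JOIN analytics.dim_customer d
      ON f.customer_key = d.customer_key     -- join on the distribution key
    WHERE f.sale_date >= DATE '2025-01-01'   -- range-restrict on the sort key
    GROUP BY d.region;
"""

def show_plan(conn):
    # `conn` is an open psycopg2 connection (see the loading sketch above).
    with conn.cursor() as cur:
        cur.execute("EXPLAIN " + QUERY)
        for (line,) in cur.fetchall():
            # Look for DS_DIST_NONE joins and range-restricted scans.
            print(line)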

Pipeline Development:

Build and maintain complex ETL pipelines using AWS services like AWS Glue, S3, Lambda, and other relevant tools. 
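
For illustration, one common orchestration pattern: a Lambda handler that starts a hypothetical Glue ETL job whenever a new file lands in S3 (the Glue job name is a placeholder):

import boto3

glue = boto3.client("glue")

def lambda_handler(event, context):
    # Triggered by an S3 ObjectCreated event; pull out the new object's location.
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    # Hand the newly arrived file to a Glue ETL job as job arguments.
    response = glue.start_job_run(
        JobName="orders-etl-job",  # hypothetical Glue job name
        Arguments={"--source_bucket": bucket, "--source_key": key},
    )
    return {"JobRunId": response["JobRunId"]}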

Data Modeling:

Design and implement data models in Redshift to facilitate efficient data analysis and reporting. 
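
A minimal star-schema sketch for Redshift with placeholder table and column names; the distribution and sort choices are illustrative, not prescribed by the posting:

DIM_CUSTOMER_DDL = """
    CREATE TABLE IF NOT EXISTS analytics.dim_customer (
        customer_key  INT PRIMARY KEY,
        customer_name VARCHAR(100),
        region        VARCHAR(50)
    )
    DISTSTYLE ALL;    -- small dimension: replicate to every node
"""

FACT_SALES_DDL = """
    CREATE TABLE IF NOT EXISTS analytics.fact_sales (
        sale_id      BIGINT,
        customer_key INT REFERENCES analytics.dim_customer (customer_key),
        sale_date    DATE,
        amount       DECIMAL(12, 2)
    )
    DISTKEY (customer_key)    -- co-locate rows that join to dim_customer
    SORTKEY (sale_date);      -- prune blocks on date-range filters
"""

def create_star_schema(conn):
    # `conn` is an open psycopg2 connection (see the loading sketch above).
    with conn, conn.cursor() as cur:
        cur.execute(DIM_CUSTOMER_DDL)
        cur.execute(FACT_SALES_DDL)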

Performance Tuning:

Monitor and troubleshoot performance bottlenecks within ETL pipelines and Redshift queries, optimizing for scalability and cost-effectiveness. 
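
One illustrative maintenance routine, assuming an open psycopg2 connection to the cluster: flag tables with large unsorted regions or stale statistics via SVV_TABLE_INFO, then VACUUM and ANALYZE them (thresholds here are arbitrary examples):

HEALTH_SQL = """
    SELECT "schema", "table", unsorted, stats_off
    FROM svv_table_info
    WHERE unsorted > 20 OR stats_off > 10
    ORDER BY unsorted DESC;
"""

def maintain(conn):
    conn.autocommit = True  # VACUUM cannot run inside a transaction block
    with conn.cursor() as cur:
        cur.execute(HEALTH_SQL)
        for schema, table, unsorted, stats_off in cur.fetchall():
            print(f"{schema}.{table}: unsorted={unsorted}%, stats_off={stats_off}%")
            cur.execute(f'VACUUM "{schema}"."{table}";')
            cur.execute(f'ANALYZE "{schema}"."{table}";')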

Data Quality Assurance:

Implement data validation checks to ensure data integrity throughout the ETL process. 
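
A minimal validation sketch with placeholder staging and target tables; checks like these would typically gate publication of a load:

def validate(conn):
    # `conn` is an open psycopg2 connection (see the loading sketch above).
    with conn.cursor() as cur:
        # Row counts should reconcile between the staging and target tables.
        cur.execute("SELECT COUNT(*) FROM staging.orders;")
        staged = cur.fetchone()[0]
        cur.execute("SELECT COUNT(*) FROM analytics.orders;")
        loaded = cur.fetchone()[0]
        if staged != loaded:
            raise ValueError(f"Row count mismatch: staged={staged}, loaded={loaded}")

        # Key columns must never be NULL in the published table.
        cur.execute(
            "SELECT COUNT(*) FROM analytics.orders "
            "WHERE order_id IS NULL OR customer_id IS NULL;"
        )
        null_keys = cur.fetchone()[0]
        if null_keys:
            raise ValueError(f"{null_keys} rows have NULL key columns")

        print("Data quality checks passed")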

Required Skills:

Strong SQL Skills:

Expertise in writing complex SQL queries for data manipulation, aggregation, and analysis within Redshift. 
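
As one illustrative example of such a query (table and column names are hypothetical), a window function selecting each customer's most recent orders:

TOP_RECENT_ORDERS_SQL = """
    SELECT customer_id, order_id, amount, order_date
    FROM (
        SELECT o.*,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id
                   ORDER BY order_date DESC
               ) AS recency_rank
        FROM analytics.orders o
    ) ranked
    WHERE recency_rank <= 3;   -- each customer's three most recent orders
"""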

AWS Redshift Knowledge:

Thorough understanding of Redshift architecture, data loading mechanisms, query optimization techniques, and best practices. 

ETL Tool Proficiency:

Familiarity with ETL tools like AWS Glue, Talend, or similar platforms for building and managing data pipelines. 

Programming Skills:

Basic proficiency in Python or other scripting languages for data manipulation and automation. 

Data Warehousing Concepts:

Understanding of data warehouse design principles, dimensional modeling, and star schema concepts. 

Cloud Computing Knowledge:

Familiarity with AWS services beyond Redshift, including S3, IAM, and VPC. 

Thanks,
Prem Kusuma
