
Urgent Need for Data Architect - Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=598072&uid=

From: Sudhir, Techgene Solutions, LLC ([email protected])

Reply to: [email protected]

Hi,

Position: Data Architect

Location: REMOTE

Duration: 6+ Months

NO OPT

NEED PP NUMBER or RECENT i94

JOB DESCRIPTION:

This is a lead role in developing a high-performance data platform, integrating data from a variety of internal and external sources to support data and analytics activities. It is a technical role that involves defining changes to the warehouse data model and building scalable, efficient processes to populate or modify warehouse data. The successful candidate will have hands-on data processing and data modeling experience in both cloud and on-prem environments.

Responsibilities

Be a technical lead in the development of high-volume platforms to drive decisions and have a tangible, beneficial impact on our clients and on business results.

Design and implement efficient data pipelines (ETLs) to integrate data from a variety of sources into the data warehouse (a minimal PySpark sketch of this kind of pipeline follows this list).

Design and implement data model changes that align with warehouse standards.

Design and implement backfill or other warehouse data management processes.

Develop and execute testing strategies to ensure high quality warehouse data.

Provide documentation, training, and consulting for data warehouse users.

Perform requirement and data analysis to support warehouse project definition.

Provide input and feedback to support continuous improvement in team processes.

Experience working in both on-prem and AWS cloud environments.

Responsible for leading onsite and offshore team members, providing technical leadership and guidance.
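
For illustration only, a minimal PySpark sketch of the kind of ETL pipeline described above. The bucket, paths, and column names are hypothetical placeholders, not details from this posting:

    # Illustrative sketch only: the S3 bucket, paths, and columns below are
    # hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: read raw order events from S3.
    raw = spark.read.json("s3://example-bucket/raw/orders/")

    # Transform: basic cleansing and conforming to the warehouse model.
    orders = (
        raw.filter(F.col("order_id").isNotNull())
           .withColumn("order_date", F.to_date("order_ts"))
           .select("order_id", "customer_id", "order_date", "total_amount")
    )

    # Load: write to a staging area; a separate COPY or merge step would
    # move this data into the warehouse table.
    orders.write.mode("overwrite").parquet("s3://example-bucket/staging/orders/")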

Qualifications

5+ years in a Data Architect or Java Lead role

7+ years of hands-on experience with SQL

Ability to write and interpret SQL, including complex joins and queries

Execution plan analysis and SQL optimization (Oracle SQL Profiler); a minimal sketch of inspecting an execution plan follows this list

5+ years of coding experience (Python and/or PySpark).

5+ years of hands-on experience with big data and cloud technologies (Snowflake, EMR, Redshift, or similar) is highly preferred.

Schema design and architecture on Redshift

Architecture and design experience with AWS cloud.

AWS services expertise: S3, RDS, EC2, ETL services (Glue, Kinesis), Redshift, Step Functions, Lambda, EMR.

Consumption layer design experience for reporting and dashboards.

Expert-level understanding of and experience with ETL fundamentals and building efficient data pipelines.

3+ years of experience with enterprise GitHub branching and release management, DevOps, and CI/CD pipelines.

Team player with strong communication and collaboration skills.

Experience with DBT or similar transformation frameworks; experience with data integration frameworks such as Airbyte, Fivetran, or similar.

Experience with Agile methodologies.

Master's degree (or a B.S. degree with relevant industry experience) in math, statistics, computer science, or an equivalent technical field.
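
For illustration only, a minimal Python sketch of inspecting the execution plan of a join query, as referenced in the SQL optimization item above. It assumes a hypothetical Redshift cluster reachable via psycopg2; the connection details and table names are placeholders:

    # Illustrative sketch only: connection details and table names are
    # hypothetical placeholders.
    import psycopg2

    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
        port=5439,
        dbname="analytics",
        user="etl_user",
        password="example-password",
    )

    query = """
        SELECT c.customer_id, SUM(o.total_amount) AS lifetime_value
        FROM orders o
        JOIN customers c ON c.customer_id = o.customer_id
        GROUP BY c.customer_id
    """

    with conn.cursor() as cur:
        # EXPLAIN shows the join strategy and data distribution steps, the
        # usual starting point for tuning keys and rewriting queries.
        cur.execute("EXPLAIN " + query)
        for (line,) in cur.fetchall():
            print(line)

    conn.close()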

Posted: 10:44 PM, 01-Sep-23

