
Opening for Azure Data Engineer in MI - need locals at Warren, Michigan, USA
Email: [email protected]
Hi, 

I hope you are doing well. A brief version of the job description is attached below; if you are interested, reply to my email with your updated resume.

Role: Azure Data Engineer

Location: Warren, MI - hybrid 3 days onsite 

Duration: Long Term 

Rate: $60/hr on C2C

Passport number required.
No OPT, CPT, or H1B visas, and no GC.

Job Description:

As a Data Engineer, you will build industrialized data assets and data pipelines in support of Business Intelligence and Advanced Analytics objectives. The Data Engineer leads and delivers new and innovative data-driven solutions that are elegant and professional. You will work closely with our forward-thinking Data Scientists, BI Developers, System Architects, and Data Architects to deliver value to our vision for the future. Our team focuses on writing maintainable tests and code that meet the customer's needs and scale without rework. Our engineers and architects work in highly collaborative environments across many disciplines (user experience, databases, streaming technology, custom rules engines, AI/ML, and most web technologies). We work on innovative technologies, understanding and inventing modern designs and integration patterns along the way. Our Data Engineers will:

You will assemble large, complex data sets that meet functional / non-functional business requirements.

You will identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

Lead and deliver exceptional data-driven solutions across many different languages, tools, and technologies.

Develop a culture that takes challenging and complex ideas and turns them into production solutions.

Have a broad, enterprise-wide view of the business and an appreciation for strategy, process, capabilities, enablers, and governance.

Be able to critically apply themselves to solving problems in an interconnected and integrated environment.

Think strategically about adopting innovative technologies that are beyond the horizon and deliver on sound solution design.

Create high-level models that can be leveraged in future analysis to extend and mature the business architecture.

Work towards continuously raising our standards of engineering excellence in quality and efficiency, and develop repeatable designs.

Perform hands-on development, lead code reviews and testing, create automation tools, and build proofs of concept.

Work alongside the operations team during any production issues related to the platform.

Apply best practices like agile methodologies, design thinking, and continuous deployment that allow you to innovate fast.

Deliver solutions across Big Data applications to support business strategies and deliver business value.

Build tools and automation to make deploying and monitoring the production environment more repeatable.

Build strong relationships with Business & Technology Partners and provide leadership, direction, best practices, and coaching to technology development teams.

Own all technical aspects of development for assigned applications.

Drive continuous improvement in applications through the use of consistent development practices and tools, and through ongoing design and code refactoring.

Collaborate with stakeholders through ongoing product and platform releases to solve existing needs, identify exciting opportunities, and predict future challenges.

Work closely with product managers and architects to prioritize and manage features, technical requirements, known defects, and issues.

Manage and appropriately escalate delivery impediments, risks, issues, and changes tied to the product development initiatives.

Data Integration: Design and develop new source-system integrations from a variety of formats, including files, database extracts, and APIs.

Data Pipelines: Design and develop highly scalable data pipelines that incorporate complex transformations and efficient code.

Data Delivery: Design and develop solutions for delivering data that meet SLAs.

Candidate Requirements:

Degree: Bachelor's degree in the data space or a related field (computer science, information technology, or mathematics).

Experience: at least 7-8 years of experience building ETL (extract, transform, load) pipelines.

ETL code, using a variety of technologies, connects to a source database, extracts the tables and fields, performs transformation and cleaning, then connects to another database and loads the data into it.
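The extract-transform-load flow described above can be sketched in a few lines of Python. This is only an illustration using the standard-library sqlite3 module; the databases, table, and column names here are hypothetical, not part of the actual role:

```python
import sqlite3

# Hypothetical source and target databases (in-memory for illustration).
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")

# Sample source table standing in for a real system of record.
src.execute("CREATE TABLE customers (id INTEGER, name TEXT, email TEXT)")
src.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "  Ada ", "ADA@EXAMPLE.COM"), (2, "Bob", None)])
src.commit()

# Extract: connect to the source and pull the tables and fields.
rows = src.execute("SELECT id, name, email FROM customers").fetchall()

# Transform: clean the data (trim whitespace, normalize case,
# drop rows with a missing email).
clean = [(cid, name.strip(), email.lower())
         for cid, name, email in rows if email is not None]

# Load: connect to the target database and write the cleaned rows.
dst.execute("CREATE TABLE customers (id INTEGER, name TEXT, email TEXT)")
dst.executemany("INSERT INTO customers VALUES (?, ?, ?)", clean)
dst.commit()
```

In practice each stage would be driven by tools such as SSIS, Spark, or Azure Data Factory rather than hand-rolled SQL, but the extract/transform/load structure is the same.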

Working with Azure cloud technologies.

Understanding databases, database structures, data models, and data architecture.

Expert in multiple tools and technologies, including Azure and Databricks:

Scala, Spark, Python, SQL, .NET, REST APIs, Angular

SQL Server, PostgreSQL, SQL Server Integration Services (SSIS), complex ETL with massive data, Power BI

08:08 PM 30-Jan-25





