Azure Big Data Engineer (Remote, USA)
Email: [email protected]

From: Sai Kiran, Msysinc ([email protected])

Reply to: [email protected]

Title: Azure Big Data Engineer

Location: Remote

Length: Long term

Restriction: W2 or C2C

Send resume to: [email protected]

Description:

Interview Type: Webcam interview

Duration: Very long term project; initial PO for 1 year, expected to extend to 4+ years

Work Arrangement: Remote, 35 hours per week

Short Description:

Develops, engineers, maintains, tests, evaluates, and implements big data infrastructure, projects, tools, and solutions, working with the latest database technologies to derive results from vast amounts of data quickly.

Job Description:

The Big Data Engineer is a vital member of a collaborative team, responsible for designing, engineering, maintaining, testing, evaluating, and implementing big data infrastructure, tools, projects, and solutions for the North Dakota University System (NDUS). This role involves working closely with the team to leverage cutting-edge database technologies for the swift retrieval of results from vast datasets. The engineer will select and integrate big data frameworks and tools to meet specific needs and manage the entire lifecycle of large datasets to extract valuable insights.

Key Responsibilities:
Design and implement scalable big data solutions tailored to NDUS's needs.
Maintain and enhance existing big data infrastructure to meet NDUS's unique requirements.
Test and evaluate new big data technologies and frameworks for compatibility with NDUS systems and goals.
Collect, store, process, manage, analyze, and visualize large datasets to derive actionable insights.
Collaborate with team members to integrate big data solutions with existing NDUS systems.
Ensure data integrity and security across all platforms used within NDUS.
Develop and optimize data pipelines for ETL/ELT processes specific to NDUS's data needs.
Document technical solutions and maintain comprehensive records in line with NDUS standards and protocols.
Stay updated with the latest trends and advancements in big data technology relevant to NDUS's strategic initiatives.

Required Qualifications:
Thorough understanding of cloud computing technologies, including IaaS, PaaS, and SaaS implementations.
Skilled in exploratory data analysis (EDA) to support ETL/ELT processes.
Proficiency with Microsoft cloud products, including Azure and Fabric.
Experience with tools such as Data Factory and Databricks.
Ability to script in multiple languages, with a strong emphasis on Python and SQL.

Preferred Qualifications:
Experience with data visualization tools.
Proficiency with Excel and Power BI.
Familiarity with Delta Lake.
Knowledge of Lakehouse Medallion Architecture.

Required Skills:
IaaS, PaaS, or SaaS data or AI implementations within Microsoft Azure (3 years)
Exploratory Data Analysis (EDA): proficiency in EDA techniques to support ETL/ELT processes (3 years)
Implementation of development and production workflows in Azure Data Factory and Databricks (2 years)
Strong scripting skills in Python and SQL (3 years)
Expert proficiency in data engineering within Microsoft Fabric (1 year)
Expert proficiency with Excel and Power BI (2 years)
Expert proficiency with the Delta Lake format and protocol (1 year)
Expert understanding of the Data Lakehouse Medallion Architecture (1 year, preferred)

Posted: 01:27 AM, 12-Nov-24

