
Azure Data Engineer | NC locals at Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2011461&uid=

H-1B profiles only

Azure Data Engineer

Raleigh, NC or Cary, NC

Spark (Scala/Python/Java), Databricks, Delta Lake, Hive, HDFS, Sqoop, Kafka, Kerberos, Impala, etc., and CDP 7.x

Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes.
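A minimal sketch of the extract-transform-load flow this bullet describes, in plain Python with made-up data; a production pipeline for this role would use Spark or Azure Data Factory rather than the list-backed "store" assumed here.

```python
# Minimal ETL sketch (illustrative only; data and target store are hypothetical).
import csv
import io

RAW = """id,amount
1,10.5
2,
3,7.25
"""

def extract(text):
    """Extract: parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with a missing amount and cast types."""
    return [
        {"id": int(r["id"]), "amount": float(r["amount"])}
        for r in rows
        if r["amount"]
    ]

def load(rows, store):
    """Load: append cleaned rows to a target store (a plain list here)."""
    store.extend(rows)
    return store

store = []
load(transform(extract(RAW)), store)
print(store)  # row 2 is dropped for its missing amount
```

The same three-stage shape carries over to Spark: `extract` becomes a read from ADLS, `transform` a set of DataFrame operations, and `load` a write to Delta Lake.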

Develop reusable frameworks that reduce development effort, thereby ensuring cost savings for the projects.

Develop quality code with well-thought-out performance optimizations in place right at the development stage.

Appetite to learn new technologies and readiness to work on cutting-edge cloud technologies.

Work with a team spread across the globe to drive project delivery and recommend development and performance improvements.

Build and implement data ingestion and curation processes using big data tools such as Spark (Scala/Python/Java), Databricks, Delta Lake, Hive, HDFS, Sqoop, Kafka, Kerberos, Impala, etc.

Hands-on expertise in implementing analytical data stores on the Azure platform using ADLS/ADF/Databricks, Cosmos DB, and CDP 7.x.

Ingest huge volumes of data from various platforms for analytics needs, writing high-performance, reliable, and maintainable ETL code.

Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance. Proficiency and extensive experience with Spark & Scala, Python, and performance tuning are a MUST.

Hive database management and performance tuning (partitioning/bucketing) is a MUST.
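A simplified model of what Hive partitioning and bucketing do, sketched in Python: partitioning maps each distinct column value to its own directory, while bucketing routes rows to a fixed number of files by hashing the clustering key. The hash below is in the style of Java's `String.hashCode()`; it illustrates the mechanism and is not guaranteed byte-compatible with Hive's internal bucketing hash.

```python
# Illustration of Hive-style partitioning and bucketing (simplified model).
NUM_BUCKETS = 4

def partition_path(table, dt):
    """Partitioning: each partition value maps to its own directory, so
    queries filtering on dt can prune whole directories from the scan."""
    return f"/warehouse/{table}/dt={dt}"

def java_string_hash(s):
    """31-based rolling hash in the style of Java's String.hashCode()."""
    h = 0
    for ch in s:
        h = (31 * h + ord(ch)) & 0xFFFFFFFF
    return h - 0x100000000 if h >= 0x80000000 else h  # wrap to signed 32-bit

def bucket_for(key, num_buckets=NUM_BUCKETS):
    """Bucketing: equal keys always hash to the same bucket file, which is
    what enables bucketed map-side joins. Python's % is never negative."""
    return java_string_hash(key) % num_buckets
```

Tuning a large Hive table usually means partitioning on a low-cardinality filter column (such as a date) and bucketing on a high-cardinality join key.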

Strong analytic skills related to working with unstructured datasets.
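One common first-pass check of the kind the anomaly-detection requirement implies, sketched in Python with hypothetical data: flag values whose z-score exceeds a threshold. Real data quality assurance for this role would layer SQL-based checks on top.

```python
# Hypothetical data-quality check: z-score outlier filter (names and data
# are illustrative, not from the posting).
from statistics import mean, stdev

def zscore_anomalies(values, threshold=2.0):
    """Return values more than `threshold` standard deviations from the mean."""
    m, s = mean(values), stdev(values)
    if s == 0:
        return []
    return [v for v in values if abs(v - m) / s > threshold]

daily_counts = [10, 11, 9, 10, 12, 10, 11, 100]
print(zscore_anomalies(daily_counts))  # the spike at 100 is flagged
```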

Performance tuning and problem-solving skills are a must.

Code versioning experience using Bitbucket/Azure DevOps (AzDo). Working knowledge of AzDo pipelines would be a big plus.

Monitor performance and advise on any necessary infrastructure changes.

Strong experience in building and designing data warehouses and data stores for analytics consumption (real-time as well as batch use cases).

Eagerness to learn new technologies on the fly and ship to production.

Expert in technical program delivery across cross-functional/LOB teams.

Expert in driving delivery through
collaboration in highly complex, matrixed environment

Possesses strong leadership and
negotiation skills

Excellent communication skills, both
written and verbal

Ability to interact with senior leadership teams in IT and business.

Preferred:

Expertise in Python and experience writing Azure Functions using Python/Node.js.

Experience using Event Hub for data
integrations.

Hands-on expertise in implementing analytical data stores on the Azure platform using ADLS/Azure Data Factory/Databricks and Cosmos DB (Mongo/Graph API).

Experience ingesting data using Azure Data Factory and building complex ETL using Databricks.

Rajesh Potlapelli

https://www.linkedin.com/in/rajesh-potlapelli/

https://www.linkedin.com/groups/9142054/

[email protected]

--

Posted: 07:51 PM 13-Dec-24

