
Immediate Hiring for the Position of Data Modeler in Durham, North Carolina, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=1173004&uid=

From: Sameera, Avance Consulting ([email protected])

Reply to: [email protected]

Job Description

Role: Data Modeler

Location: Durham, NC (Day 1 onsite)

Analyzing and translating business needs into long-term solution data models.

Evaluating existing data systems.

Working with the development team to create conceptual data models and data flows.

Developing best practices for data coding to ensure consistency within the system.

Reviewing modifications of existing systems for cross-compatibility.

Experience in the Azure cloud environment

Defining databases at the physical, logical, and conceptual levels

Kafka expert with a big data background

Required skills:

A bachelor's or master's degree in relevant business/IT studies, with at least 5 years of experience in a similar role

Experience with Kafka publish-subscribe principles (a minimal sketch follows this list)

Experience in Internet-of-Things (IoT) integration scenarios

Understanding of event-driven microservices

At least 3 years of experience with Kafka data streaming

Experience with Kafka connectors on different hyperscalers and in integration scenarios

Experience with transformation tools such as KSQL and Kafka Streams

Experience in cloud big data technologies and architectures on Azure, Google Cloud, or AWS

Experience in Java, Scala, Python, and MySQL

Business and technical consulting skills

An entrepreneurial spirit and the ability to foster a positive and energized culture

A growth mindset with a curiosity to learn and improve

Team player with strong interpersonal, written, and verbal communication skills

Fluent communication skills in English and Mandarin (spoken and written)
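
The posting itself contains no code, but as a rough illustration of the Kafka publish-subscribe model named above, here is a minimal Java sketch of a producer and a consumer sharing one topic. The broker address (localhost:9092), the topic name (iot-readings), the group id, and the class name are all invented for the example, not taken from the posting.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class PubSubSketch {
    public static void main(String[] args) {
        // Producer side: publish one reading, keyed by device id.
        Properties prod = new Properties();
        prod.put("bootstrap.servers", "localhost:9092");
        prod.put("key.serializer", StringSerializer.class.getName());
        prod.put("value.serializer", StringSerializer.class.getName());
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(prod)) {
            producer.send(new ProducerRecord<>("iot-readings", "device-42", "{\"temp\":21.5}"));
        }

        // Consumer side: any group subscribed to the topic receives the
        // record independently of the producer; that is the pub-sub decoupling.
        Properties cons = new Properties();
        cons.put("bootstrap.servers", "localhost:9092");
        cons.put("group.id", "monitoring");
        cons.put("auto.offset.reset", "earliest"); // read from the start of the topic
        cons.put("key.deserializer", StringDeserializer.class.getName());
        cons.put("value.deserializer", StringDeserializer.class.getName());
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(cons)) {
            consumer.subscribe(List.of("iot-readings"));
            for (ConsumerRecord<String, String> r : consumer.poll(Duration.ofSeconds(5))) {
                System.out.printf("%s -> %s%n", r.key(), r.value());
            }
        }
    }
}
```

The point of the pattern is the decoupling: the producer knows only the topic, and any number of consumer groups can subscribe to that topic independently.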

Roles & Responsibilities:

Managing Kafka real-time data streaming scenarios in a production environment

Working with Confluent Control Center

Supporting Kafka components such as brokers, ZooKeeper, and Schema Registry

Building complex KSQL streams (an illustrative stream-processing sketch follows this list)

Setting up Kafka scenarios on different hyperscalers

Designing and building data flows using different Kafka connectors

Developing and optimizing data models and pipelines for performance and scalability, and making them reusable in libraries for future use

Supporting the industrialization of streaming solutions

Enabling meaningful and insightful reports for data analysis and monitoring

Ensuring systematic quality assurance to validate accurate data processing

Building reusable code and libraries for future use

Optimizing applications for maximum speed and scalability

Implementing security and data protection

Translating stakeholder requirements into concrete specifications for Kafka and self-service solutions

Working diligently toward a better relationship with the customer

Creating instruction and operations manuals, and staying on the job until it is finished

Accountability, courtesy, reliability, flexibility, cooperation, and adaptability

Taking initiative and taking responsibility
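
As a companion sketch for the stream-processing responsibilities above, here is a minimal Kafka Streams topology in Java (used here in place of KSQL, since Java is in the posting's skill list). The application id, broker address, and topic names are placeholders invented for the example.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class ReadingFilter {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "reading-filter");    // placeholder id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> readings = builder.stream("iot-readings");

        // Route non-empty payloads to a cleansed topic; a real pipeline would
        // parse, validate, and enrich here instead of this simple filter.
        readings.filter((key, value) -> value != null && !value.isEmpty())
                .to("iot-readings-clean");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

A KSQL equivalent would express roughly the same filter as a persistent query (CREATE STREAM ... AS SELECT ... WHERE ...), with ksqlDB running the topology on the server side.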
