
Big Data Engineer with Azure (Contract) at Franklin, Tennessee, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=199964&uid=

From:

Gayathri J,

Vbeyond Corporation

[email protected]

Reply to:   [email protected]

Hi,

Greetings from VBEYOND,

I hope you are doing well. I am Gayathri J from VBeyond Corp. We are a global recruitment company specializing in hiring IT professionals. One of our clients is looking for a Big Data Engineer with Azure for a contract role in Franklin, TN.
If this requirement is not suitable for you, kindly share it with your network or reply REMOVE.

Role: Big Data Engineer with Azure

Location: Franklin TN

Type: Contract

Mandatory Skills:

10 years of experience

Big Data with Azure

Scala

Kafka

Python

Summary:

This role will be part of the Data Exchange group and will report to the Software Engineering Senior Manager.

It will require coordination across multiple teams.

This role will be a key player in defining and implementing the organization's Big Data strategy and in driving the implementation of IT solutions for the business.

This role will also provide direction for implementing best practices to determine optimum solutions.

Minimum Education, Licensure and Professional Certification Requirement:

Bachelor's Degree or equivalent experience

Minimum experience required (number of years necessary to perform role): 10+ years

Required Skills/Qualifications:

Bachelor's Degree or equivalent experience

Minimum 10 years of IT experience

Minimum 3 years implementing Big Data solutions

Proficiency in developing batch and streaming applications using PySpark/Scala and Kafka (see the streaming sketch after the Responsibilities section)

At least 3 years of experience working on cloud implementations

At least 2 years of experience using the Azure Databricks platform and Databricks Delta

Ability to lead cross-functional solutions

Experience using various data services on Azure

Proficient in database concepts and technologies, including MS SQL Server, DB2, Oracle, Cosmos DB, and NoSQL databases

Proficiency in file formats such as (but not limited to) Avro, Parquet, and JSON

Familiarity with data modeling, data architecture, and data governance concepts

Adept at designing and leveraging APIs, including integrations that drive dynamic content

Exposure to at least one of: Azure DevOps, AWS, or Google Cloud

Demonstrated problem-solving skills and the ability to work collaboratively with other stakeholders or team members to resolve issues

Excellent communication skills; able to collaborate effectively with remote teams, both onshore and offshore

Healthcare or financial background is a plus

Job Description Overview:

In this role, you will be responsible for full life cycle data solutions, from conception through deployment. Improve coding quality and reliability by implementing good standards and processes (best practices). Increase productivity by implementing tools and processes. Serve as the technology go-to person on any technical questions. Resolve complex technical issues. Ensure quality is maintained by following development patterns and standards. Prepare deployment and post-deployment plans to support the conversion and deployment of the solution. Interact with architects, technical project managers, and developers to ensure that solutions meet requirements and customer needs.

Responsibilities:

Build ETL processes that allow data to flow seamlessly from source to target using tools like Databricks, Azure Data Factory, and SSIS (see the sketches following this list).

Load and enhance dimensional data models.

Leverage code in tools like Databricks, SQL, Scala, and Spark to apply business rules, ensuring data is clean and interpreted correctly by all business stakeholders.

Perform peer code reviews and QA.

Fine-tune existing code to make processes more efficient.

Maintain and create documentation describing our data management processes.

a) Drive development and delivery of key business initiatives for the Big Data platform in collaboration with other stakeholders.

b) Collaborate with business stakeholders in gathering business requirements.

c) Perform POCs on the Big Data platform to determine the optimum solution.

d) Work with vendors in evaluating Big Data technologies and resolving technical issues.

e) Effectively collaborate with remote teams (onshore and offshore) for solution delivery.
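For illustration only, here is a minimal sketch of the kind of streaming pipeline described above: a Spark Structured Streaming job reading from Kafka and writing to a Delta table. It assumes Spark 3.x with the Kafka connector (spark-sql-kafka-0-10) and Delta Lake libraries on the classpath; the broker address, topic name, and paths are invented placeholders, not details from this posting.

import org.apache.spark.sql.SparkSession

object KafkaToDelta {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("kafka-to-delta").getOrCreate()

    // Read a stream of records from a Kafka topic (placeholder broker and topic).
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()

    // Kafka delivers key/value as binary; cast to strings for downstream parsing.
    val events = raw.selectExpr(
      "CAST(key AS STRING) AS key",
      "CAST(value AS STRING) AS value",
      "timestamp")

    // Write the stream to a Delta table; the checkpoint makes the query
    // restartable with exactly-once delivery into the sink (placeholder paths).
    events.writeStream
      .format("delta")
      .option("checkpointLocation", "/tmp/checkpoints/events")
      .start("/tmp/delta/events")
      .awaitTermination()
  }
}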
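And a comparable batch sketch for the "apply business rules" responsibility, under the same Spark/Delta assumptions; the dataset, column names, paths, and the rule itself are hypothetical, invented for illustration.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object BatchCleanse {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("batch-cleanse").getOrCreate()

    // Read the raw extract (placeholder Parquet path).
    val raw = spark.read.parquet("/tmp/raw/orders")

    // Hypothetical business rule: drop records missing an order id and
    // null out negative amounts so downstream models treat them as unknown.
    val cleaned = raw
      .filter(col("order_id").isNotNull)
      .withColumn("amount", when(col("amount") < 0, lit(null)).otherwise(col("amount")))

    // Write the cleansed data as Delta for the dimensional model loads (placeholder path).
    cleaned.write.format("delta").mode("overwrite").save("/tmp/curated/orders")
  }
}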

Thanks & Regards

Gayathri J

VBeyond Corporation || PARTNERING FOR GROWTH

Hillsborough, New Jersey, USA

Email ID: [email protected]

Contact No: 862-227-3677

Website: www.vbeyond.com

Disclaimer:
 You received this message in response to your interest in such jobs or your past interaction with our company. If you have received this email in error or prefer not to receive such emails in the future, please reply with "REMOVE" in the subject line. All remove requests will be honored ASAP. We sincerely apologize for any inconvenience caused.
