
Big data Engineer with Azure - Franklin TN - Contract (10 Years Must) at Franklin, New Jersey, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=138592&uid=

From:
Harinath,
VBeyond Corporation
[email protected]
Reply to: [email protected]

Greetings from Harinath,

VBeyond Corporation (http://www.Vbeyond.com) is an Equal Opportunity Employer and an IT consulting firm headquartered in Hillsborough, NJ, serving clients nationwide. We provide services in the IT, Telecom, Finance & Accounting, and Engineering categories.

If you are interested, kindly send me your updated resume, your contact details, and a convenient time to talk.

Big data Engineer with Azure

Franklin TN

Contract

Mandatory:

10 Years of Experience

Big Data with Azure

Scala

Kafka

Summary: This role will be part of the Data Exchange group and will report to the Software Engineering Senior Manager. It will require coordination across multiple teams. This role will be a key player in defining and implementing the Big Data Strategy for the organization, along with driving implementation of IT solutions for the business. It will also provide direction on best practices to determine optimum solutions.

Minimum Education, Licensure and Professional Certification requirement:

Bachelor's Degree or equivalent experience. Minimum experience required (number of years necessary to perform role): 10+ years.

Required Skills/Qualifications:

Bachelor's Degree or equivalent experience
Minimum 10 years of IT experience
Minimum 3 years implementing Big Data solutions
Proficiency in developing batch and streaming applications using PySpark/Scala and Kafka
At least 3 years' experience working on cloud implementations
At least 2 years' experience using the Azure Databricks platform and Databricks Delta
Ability to lead cross-functional solutions
Experience using different data services on Azure
Proficiency in database concepts and technologies, including MS SQL Server, DB2, Oracle, Cosmos DB, and NoSQL databases
Proficiency in file formats such as (but not limited to) Avro, Parquet, and JSON
Familiarity with Data Modeling, Data Architecture, and Data Governance concepts
Adept at designing and leveraging APIs, including integrations that drive dynamic content
Exposure to at least one of: Azure DevOps, AWS, or Google Cloud
Demonstrated problem-solving skills and the ability to work collaboratively with other stakeholders or team members to resolve issues
Excellent communication skills; able to collaborate effectively with remote teams, both onshore and offshore
Healthcare or Financial background is a plus

Job Description Overview

In this role, you will be responsible for full life cycle data solutions, from conception through deployment.
Improve coding quality and reliability by implementing good standards and processes. Increase productivity by implementing tools and processes.
Serve as the technology go-to person on any technical questions. Resolve complex technical issues.
Ensure quality is maintained by following development patterns and standards.
Prepare deployment and post-deployment plans to support the conversion and deployment of the solution.
Interact with architects, technical project managers, and developers to ensure that solutions meet requirements and customer needs.
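The batch and streaming work described above (PySpark/Scala with Kafka) typically reduces to a consume-parse-aggregate loop. As a minimal, dependency-free sketch of that pattern, the following pure-Python example processes a micro-batch of JSON events; in the role itself this logic would live in a PySpark Structured Streaming job reading from a Kafka topic, and the event names and fields below are hypothetical.

```python
import json
from collections import Counter

# Hypothetical business events; in production these would arrive on a
# Kafka topic and be handled by a PySpark Structured Streaming job.
raw_messages = [
    '{"event": "claim_submitted", "amount": 120.5}',
    '{"event": "claim_submitted", "amount": 80.0}',
    '{"event": "claim_paid", "amount": 80.0}',
]

def process_batch(messages):
    """Parse one micro-batch of JSON events and count them by type."""
    counts = Counter()
    for msg in messages:
        event = json.loads(msg)
        counts[event["event"]] += 1
    return dict(counts)

print(process_batch(raw_messages))
# {'claim_submitted': 2, 'claim_paid': 1}
```

The same parse-then-aggregate shape carries over to Spark: `json.loads` becomes a schema-driven `from_json`, and the `Counter` becomes a `groupBy().count()` over the stream.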

Responsibilities

Build ETL processes to allow data to flow seamlessly from source to target using tools like Databricks, Azure Data Factory, and SSIS. Load and enhance dimensional data models.
Apply business rules in code, using tools like Databricks, SQL, Scala, and Spark, to ensure data is clean and interpreted correctly by all business stakeholders.
Perform peer code reviews and QA. Fine tune existing code to make processes more efficient.
Maintain and create documentation to describe our data management processes.
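The business-rule step above can be sketched in miniature. Below is a pure-Python illustration of the kind of row-level cleaning an ETL job applies before loading a dimensional model; in this role the equivalent code would run in Spark/Databricks, and the field names and rules here are hypothetical examples, not the client's actual schema.

```python
from datetime import datetime

def clean_row(row):
    """Apply simple business rules to one source row before loading.

    Hypothetical rules: trim identifiers, upper-case state codes,
    default missing amounts to 0.0, and parse ISO dates.
    """
    return {
        "member_id": row["member_id"].strip(),
        "state": row.get("state", "").strip().upper(),
        "amount": float(row.get("amount") or 0.0),
        "service_date": datetime.strptime(row["service_date"], "%Y-%m-%d").date(),
    }

raw = {"member_id": " M123 ", "state": "tn", "amount": None, "service_date": "2022-11-14"}
print(clean_row(raw))
```

In Spark the same rules would be expressed as column expressions (`trim`, `upper`, `coalesce`, `to_date`) so they run distributed rather than row by row.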

a) Drive development and delivery of key business initiatives for the Big Data Platform in collaboration with other stakeholders.

b) Collaborate with business stakeholders in gathering business requirements.

c) Perform POCs on the Big Data Platform to determine the optimum solution.

d) Work with vendors in evaluating Big Data technologies and resolving technical issues.

e) Collaborate effectively with remote teams (onshore and offshore) for solution delivery.

Thanks & Regards,
Harinath M
Vbeyond Corporation
IT Services & Solutions Firm
Hillsborough, New Jersey, USA
Direct: 908-633-2603
Email ID: [email protected]

Disclaimer: You received this message in response to your interest in such jobs or your past interaction with our company. If you have received this email in error or prefer not to receive such emails in the future, please reply with "REMOVE" in the subject line. All remove requests will be honoured ASAP. We sincerely apologize for any inconvenience caused.
07:53 PM 14-Nov-22

