
AWS Architect with Databricks - C2C - Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=60777&uid=

From: Mano, Sight Spectrum ([email protected])
Reply to: [email protected]

Hi,
Greetings from Sight Spectrum!
Please find the job description below for your review. If you are interested, please send your updated resume for further discussion.

Role: AWS Architect with Databricks

Duration: Long-term Contract

Job Description:

Role Summary:

This role is responsible for transforming extensive and complex data into consumable business capabilities. You will create system architecture, designs, and specifications, using in-depth engineering skills and knowledge to solve complex development problems and achieve engineering goals; determine and source appropriate data for a given analysis; and work with data modelers and analysts to understand the business problems they are trying to solve, then create or augment data assets to feed their analysis. You will also act as a resource and mentor for less experienced colleagues.

Core Responsibilities:

- Hands-on experience with data analysis and architecture, Spark, PySpark, Python, Sqoop, Pig, Hive, NoSQL data stores, object stores, API design and development, and Kafka.

- Create business value by leading design, getting your hands dirty writing code, and ultimately deploying big data and machine learning capabilities.

- Possess expert knowledge of performance, scalability of large-scale distributed data systems, system architecture, and data engineering best practices.

- Give the highest priority to operational excellence: evaluate system performance and security, design system metrics, and drive quality improvements.

- Provide leadership, work collaboratively, and mentor others on a fantastic team.

- Your expertise is deep and broad; you're hands-on, producing both detailed technical work and high-level architectural designs.

- 5+ years of recent hands-on experience building data ingestion and transformation pipelines leveraging Spark.

- 5+ years of recent hands-on experience in an object-oriented language (Java, Scala, Python).

- 5+ years of experience designing and building data pipelines and data-intensive applications.

- Experience using big data frameworks (e.g., Hadoop, Spark) and databases for complex data assembly and transformation.

- Experience working with healthcare data is a plus.

