
Omnicell Data Platform Developer (Hybrid), Cranberry, PA (USC or GC only)
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2264289&uid=

From: nitu, RCI ([email protected])

Reply to:   [email protected]

Job Title: Omnicell Data Platform Developer (Hybrid)

Location: Cranberry, PA

Duration: 6+ Months Contract to hire

Interview mode: Phone and Skype (locals only)

Visa: USC or GC only

Experience level: 12+ years

Location: Cranberry, PA. Local candidates only; hybrid schedule, onsite 2-3 days a week.

Description:

Omnicell is the world leader in pharmacy robotics, and we're expanding beyond inventory management into inventory analytics. The OmniSphere helps hospitals and health systems understand how medications flow through their business, from the loading dock to the nurse's glove, and then applies clinical expertise and advanced machine learning to uncover opportunities to adjust that flow to improve safety, cost, efficiency, and patient outcomes. The next step for us is to help busy clinicians act on those opportunities by building efficient, industry-leading workflows.

To do that, we take terabytes of data from thousands of devices and translate them into simple, actionable steps our clients can take to improve their overall performance. This is achieved through a sleek new microservices architecture primarily composed of Kafka, Spark, PostgreSQL, .NET Core, and Angular, all running in AWS.

Responsibilities:

Translate business requirements into effective technology solutions.

Identify and establish technology choices (along with data architects) to enhance the data platform.

Set coding and design standards

Help lead the design, architecture, and development of the Omnicell Data Platform

Conduct design and code reviews

Resolve defects/bugs during QA testing, pre-production, production, and post-release patches

Analyze and improve efficiency, scalability, and stability of various system resources once deployed

Provide technical leadership to agile teams onshore and offshore: Mentor junior engineers and new team members, and apply technical expertise to challenging programming and design problems

Help define the technology roadmap that will support the product development roadmap

Continue to improve code quality by tracking, reducing, and avoiding technical debt

Focus on always putting the customer first.

Required Knowledge and Skills:

Deep development experience with distributed/scalable systems and high-volume transaction applications, including participation in architecting big data projects

Hands-on programming experience in Scala, Python, and other object-oriented programming languages.

Expert in Big Data technologies such as Apache Kafka, Apache Spark, real-time streaming, Structured Streaming, and Delta Lake.

Excellent analytical and problem-solving skills.

Energetic, motivated self-starter who is eager to excel, with excellent interpersonal skills.

Expert at balancing the drive for the right architecture against the realities of customer commitments and the need to ship software.

Experience developing ETL processing flows with MapReduce technologies like Spark and Hadoop.

Experience with Snowflake or Databricks.

Experience developing with ingestion and clustering frameworks such as Kafka, ZooKeeper, and YARN.

Education:

Bachelor's degree preferred; may consider relevant experience in lieu of a degree.

10+ years' experience in software engineering with a degree; 12+ years' experience in software engineering in lieu of a degree.

Preferred Knowledge and Skills:

Master's degree.

Hands-on working experience with cloud infrastructure such as AWS. Able to scale code and deploy applications in the public cloud using technologies such as AWS Lambda, Docker, and Kubernetes.

Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O.

Experience working with healthcare specific data exchange formats including HL7 and FHIR.

Experience building stream-processing systems using solutions such as Storm or Spark Streaming.

Experience with various messaging systems, such as Kafka or RabbitMQ.

Working knowledge of Databricks, Team Foundation Server, Codefresh, and Datadog.

Work Conditions:

Hybrid

Team collaboration hours between 8 a.m. and 4 p.m. ET.

Ability to travel 10% of the time.

Posted: 08:31 PM, 18-Mar-25