
Position: Hadoop with DevOps
Location: Hybrid in NY; local candidates only
Rate: $60-65/hr C2C
Work authorization: No H1B/OPT; USC/GC strongly preferred
Contact: [email protected]
Must have LinkedIn. Only legitimate candidates will be entertained. The PV is very technical, so please submit strong candidates who can handle the call properly. Please submit a quality resume of 4-5 pages only.

The Team
Bloomberg runs on data! It's our business and our product.
From the biggest banks to elite hedge funds, financial institutions need timely, accurate data to capture opportunities and evaluate risk in fast-moving markets. Bloomberg's application teams face complex compute challenges in large-scale data processing, low-latency data retrieval, and high-volume requests over regionally distributed data lakes and warehouses. The Cloud Native Compute Service (CNCS) group provides backbone solutions to these problems by building secure, highly available, and regulatory-compliant multi-tenant platforms powering AI, business intelligence, data engineering, and analytics use cases at Bloomberg.

The Apache Hadoop platform (HDFS, HBase, Hive, Oozie, YARN/Spark, etc.) houses the largest datasets in the firm, multi-petabytes in scale, and its most mission-critical compute workloads. Our Hadoop engineering team manages the massive underlying infrastructure, serving hundreds of billions of requests per day and running tens of thousands of jobs that hit hundreds of thousands of tables every day. We also provide standard methodologies and domain expertise on Hadoop services to applications across various product domains at Bloomberg.

Who are you?

You are a dedicated and motivated engineer interested in building and managing large-scale distributed systems.

You are an innovative problem solver who likes building tools and orchestration frameworks to relentlessly automate away toil.

You want to make a significant impact and contribute to open-source software.

We'll trust you to:

Build and develop tools to operate and support a massive data and compute infrastructure (10s of PB) through aggressive process automation.

Provide tenants with self-serve tools to provision Hadoop resources and environments.

Understand and improve the usability, reliability, and scalability of open-source Apache Hadoop services to optimize for the needs of Bloomberg application teams.

You'll need to have:

3+ years of experience with Infrastructure as Code (IaC) practices and technologies like Ansible, Chef, or Terraform.

Systems programming experience in Python, Go, or Java.

A degree in Computer Science, Engineering, or a similar field of study, or equivalent work experience.

Experience with Chaos Engineering or testing strategies for infrastructure and platform operations.

A solid understanding of the Linux operating system, scripting, and OS troubleshooting.

Strong problem-solving and communication skills.

We'd love to see:

Experience with workflow automation tools like Airflow, Argo, etc.

CI/CD experience with tools like Jenkins, TeamCity, etc., to manage the SDLC of infrastructure and platform automation tooling.

Experience automating deployments of technologies in the Hadoop ecosystem (HDFS, HBase, Hive, Spark, Oozie, etc.).

Experience with cloud-native computing technologies like Kubernetes and containers.

The ability to operate well in Agile team cadences.

09:55 PM 14-Jan-25

