
Big Data Hadoop Administrator (Remote, USA)
Email: [email protected]
From: Vikrama, ValiantIQ ([email protected])

Reply to: [email protected]

Job Role: Big Data Hadoop Administrator

Duration: 12 Months

Location: 100% Remote

Visa: H4-EAD, OPT-EAD, CPT-EAD, GC, USC

Seeking an experienced Hadoop administrator with an understanding of Cloudera.

Job Description:

At Client, the Big Data Design Engineer is responsible for architecture design and implementation of the Big Data platform, Extract/Transform/Load (ETL) processes, and analytic applications.

Primary Responsibilities:

Oversees implementation and ongoing administration of Hadoop infrastructure and systems

Manages Big Data components/frameworks such as Hadoop, Spark, Storm, HBase, Hadoop Distributed File System (HDFS), Pig, Hive, Sqoop, Flume, Oozie, Avro, etc.

Analyzes latest Big Data analytic technologies and innovative applications in both business intelligence analysis and new offerings

Aligns with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and expand existing environments

Handles cluster maintenance and creation/removal of nodes using tools such as Ganglia, Nagios, and Cloudera Manager Enterprise

Handles performance tuning of Hadoop clusters and Hadoop MapReduce routines

Screens Hadoop cluster job performances and capacity planning

Monitors Hadoop cluster connectivity and security

Manages and reviews Hadoop log files

Handles HDFS and file system management, maintenance, and monitoring

Partners with infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability

Collaborates with application teams to install operating system and Hadoop updates, patches, and version upgrades when required

Acts as point of contact for vendor escalation

Requirements:

Bachelor's degree in a related field

Seven (7) years of experience in architecture and implementation of large and highly complex projects

Skills and Competencies:

Experience with Airflow, Argo, Luigi, or a similar orchestration tool

Experience with DevOps principles and CI/CD

Experience with Docker and Kubernetes

Experience with NoSQL databases such as HBase, Cassandra, or MongoDB

Experience with streaming technologies such as Kafka, Flink, or Spark Streaming

Experience working with the Hadoop ecosystem, building data assets at enterprise scale

Strong communication skills through written and oral presentations

Experienced Hadoop administrator with an understanding of Cloudera

Thanks & Regards,

Vikrama Rao

Recruitment Executive - ValiantIQ Inc.

"Searching Best Minds"

Email: [email protected]

P: 18032918038
F: (302) 482-3672

Disclaimer:
If you are not interested in receiving our e-mails, please reply with "REMOVE" in the subject line for automatic removal. Please also list all e-mail addresses to be removed, including any addresses that may be forwarding our e-mails to you. We apologize for the inconvenience.

Keywords: continuous integration, continuous deployment, green card
Posted: 07:07 PM 31-Jan-25

