
We are looking for a Big Data Engineer/Hadoop Developer with 10+ years of experience - Onsite role (AZ) - Direct Client requirement
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=896574&uid=

Hi Friends,

We are looking for a Big Data Engineer/Hadoop Developer with 10+ years of experience. If you have any suitable candidates, please reply to me with their resumes.

Job Title: Big Data Engineer/Hadoop Developer

Location: Onsite - Phoenix, AZ

Must be very strong in GCP and Big Data.

Description:

We are searching for a highly skilled and experienced Big Data Engineer/Hadoop Developer to join our team. The ideal candidate should possess a minimum of 7 years of experience in developing, designing, and maintaining applications using Java and J2EE technologies, with a focus on Big Data and Hadoop technologies for at least 4 years. The candidate should be well-versed in various Big Data ecosystems, including Cloudera, Hortonworks, and NoSQL platforms such as HBase and Cassandra. Proficiency in Hadoop's ecosystem tools like MapReduce, Hive, Pig, Sqoop, Zookeeper, Kafka, Flume, Impala, and Apache Spark is essential.

Key Responsibilities:

Design, develop, and integrate scalable Big Data solutions using Hadoop and associated technologies.

Utilize expertise in Hadoop File System and its ecosystem tools for data processing and analysis.

Implement data ingest pipelines using technologies like Spring Integration, Apache Storm, and Kafka.

Work with AWS services like EMR and EC2 for efficient processing of Big Data.

Write and optimize MapReduce programs, Pig jobs, and Hive queries for data analysis.

Leverage relational databases like Oracle, MySQL, PostgreSQL, and MS-SQL Server.

Design and optimize data distribution using Hive QL, partitioning, and bucketing techniques.

Utilize Oozie Workflow Engine for running Hadoop MapReduce and Pig jobs.

Extend the Hive library using custom UDFs to query data in non-standard formats.

Ingest data from various databases using Sqoop and migrate ETL operations into HDFS systems using Pig scripts.

Evaluate and use big data analytics libraries like MLlib and Spark-SQL for data exploration.

Utilize Apache Ignite for handling streaming data and work with build tools like Maven and Jenkins.

Experience with cloud platforms such as AWS, Azure, and GCP.

Proficiency with IDEs like Eclipse, IntelliJ, MyEclipse, RAD, and NetBeans.

Participate in Business Intelligence activities related to data warehouse, ETL, and report development.

Expertise in Waterfall and Agile software development models and project planning using Microsoft Project Planner and JIRA.
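As an illustration of the MapReduce work described above, here is a minimal word-count sketch in the Hadoop Streaming style (the function names and the local-simulation harness at the end are our own illustration, not part of the role requirements; on a real cluster each stage would read stdin and write stdout):

```python
# Word-count in the Hadoop Streaming style: Hadoop pipes input lines to
# the mapper on stdin, sorts the mapper's key<TAB>value output by key,
# then pipes the sorted stream to the reducer.
from itertools import groupby

def mapper(lines):
    """Emit one 'word<TAB>1' line per word."""
    for line in lines:
        for word in line.split():
            yield f"{word}\t1"

def reducer(sorted_lines):
    """Sum counts per word; input must already be sorted by key."""
    pairs = (line.split("\t") for line in sorted_lines)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(count) for _, count in group)}"

# Local simulation of the map -> shuffle/sort -> reduce pipeline:
result = list(reducer(sorted(mapper(["big data big"]))))
# result == ["big\t2", "data\t1"]
```

The same mapper/reducer pair could be submitted to a cluster with the `hadoop-streaming` jar, with the framework handling the shuffle/sort step that `sorted()` simulates here.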

Technical Skills:

Big Data Technologies: HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Oozie, Avro, Hadoop Streaming, Zookeeper, Kafka, Impala, Apache Spark, Hue, Ambari, Apache Ignite.

Languages: Java, C, SQL, Python, PL/SQL, Pig Latin, HQL.

IDE Tools: Eclipse, IntelliJ.

Frameworks: Hibernate, Spring, Struts, Junit.

Operating Systems: Windows (XP, 7, 8), UNIX, Linux, Ubuntu, CentOS.

Application Servers: JBoss, Tomcat, WebLogic, WebSphere; Java Servlets.

Reporting Tools/ETL Tools: Tableau, Power View for Microsoft Excel, Informatica.

Databases: Oracle, MySQL, DB2, Derby, PostgreSQL, NoSQL databases (HBase, Cassandra).

Qualifications:

Bachelor's degree in a related field.

Strong problem-solving skills and a deep understanding of Big Data technologies.

Experience with Agile and Waterfall software development methodologies.

Proficiency in various programming languages, including Java, Scala, and Python.

Strong communication skills and the ability to work collaboratively in a team.

--

HARISH

IT Recruiter

[email protected]

Contact: 517-409-1222

https://www.linkedin.com/in/harish-naidu-3a9819243/

--

08:08 PM 30-Nov-23

