
GCP BigData and Hadoop Ecosystems Developer at Phoenix, Arizona, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2085014&uid=

From:

Lavanya,

KK Associates

[email protected]

Reply to:   [email protected]

We are hiring a GCP BigData and Hadoop Ecosystems Developer for one of our clients. Interested candidates, please send resumes to [email protected].
Visa status: H1B / GC EAD / GC / USC only

Position: GCP BigData and Hadoop Ecosystems Developer
Location: Phoenix, AZ (onsite)

Competencies: Digital: BigData and Hadoop Ecosystems; Digital: BigData and Hadoop Ecosystem - MapR

Experience (Years): 6-8

Essential Skills:

- 5-7 years of experience in data technologies: modeling, data warehousing, data marts, ETL pipelines, and data visualization.
- Hands-on experience with Big Data, data lakes, Databricks, Hadoop, and Hive.
- Cloud experience, particularly with GCP, is an added advantage.
- Experience working with relational databases such as PostgreSQL, MySQL, SQL Server, AWS RDS, and BigQuery.
- Experienced with Docker, Kubernetes, and containerization.
- Extensive experience setting up CI/CD pipelines using tools such as Jenkins, Bitbucket, GitHub, Maven, SVN, and Azure DevOps.

Keywords: continuous integration continuous deployment green card Arizona
Posted: 09:04 AM 16-Jan-25








