
TODAY INTERVIEW || GCP Consultant || USA (REMOTE) at Remote, Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=689375&uid=

Hi,

This email is in regard to an excellent job opportunity with one of our clients. We are looking for a
GCP Consultant, Location: USA (REMOTE), per the job description below. If you are interested, please call me ASAP, or reply to this email with your updated resume, contact details, and the best time to reach you.

NOTE: PLEASE ALSO SHARE YOUR UPDATED HOTLIST. IF I DON'T ANSWER YOUR CALL, PLEASE DROP A MESSAGE ON LINKEDIN (
https://www.linkedin.com/in/mohd-tariq-053242247/
)

REQUIREMENT- 1

Job Role: GCP Lead / Architect with DevOps Exp

Location: USA (REMOTE)

Job Type: Contract (6+ months or more)

Experience: 10+ Years

PASSPORT NUMBER MANDATORY

Mandatory Skills:

GKE (Kubernetes), Service Mesh Implementation (Istio, GKE Ingress)

CI/CD DevOps (Blue-Green Deployment; Tooling: Jenkins, Harness), Terraform, Applications (Java, Python), Ansible, Chef, Puppet

Job Description:
A good understanding of GCP Services and architecting principles
Proficient in container-based solution services; should have handled at least 2-3 large-scale Kubernetes-based infrastructure build-outs and provisioning of services in GCP.
Proficient in cloud migration and modernization projects of various scale and volume
GCP Certified (preferably)
Should have experience in GCP Services (Compute Engine, GKE, Cloud Functions, Pub/Sub, Storage, data tools, Cloud Build & GCR)
Experience in Container Orchestration
Experience in Infrastructure as Code Tooling like Terraform
Knowledge of CI/CD pipeline tooling and deployments.
Experience with at least one configuration management tool such as Ansible, Chef, Puppet, Salt, etc.

REQUIREMENT- 2

Job Role: GCP Architect with Infrastructure & Terraform Exp, Networking, Security

Location: USA (REMOTE)

Job Type: Contract (6+ months or more)

Experience: 10+ Years

PASSPORT NUMBER MANDATORY

Mandatory Skills: Java; migration experience from AWS to Google Cloud

Job Description:

A good understanding of GCP Services and architecting principles.
Should have experience in GCP networking fundamentals & GCP Security fundamentals.
Strong Terraform & Scripting skills and Strong Linux Experience
Knowledge of CI/CD pipeline tooling and deployments.
Strong knowledge of container technologies such as Docker/Kubernetes
GCP Certified (preferably)
Experience in Infrastructure as Code Tooling like Terraform
Experience with at least one configuration management tool such as Ansible, Chef, Puppet, Salt, etc.

REQUIREMENT-3

Job Role: GCP Lead with Kubernetes

Location: USA (REMOTE)

Job Type: Contract (6+ months or more)

Experience: 10+ Years

PASSPORT NUMBER MANDATORY

Mandatory Skills: AWS ECS to GKE migration experience, GCP Services, CI/CD, CI/CD pipeline design

Job Description:
We are looking for a Lead with a good understanding of GCP Services and architecting principles.
The ideal candidate should have experience in GCP Services (Compute Engine, GKE, Cloud Functions, Pub/Sub, Storage, data tools, Cloud Build & GCR), container orchestration, Infrastructure as Code tooling such as Terraform, and CI/CD pipeline tooling and deployments.
The candidate should also have handled at least 2-3 large-scale Kubernetes-based infrastructure build-outs and provisioning of services in GCP, as well as cloud migration and modernization projects of various scale and volume.
GCP Certification is preferred. The Architect (ATC) should be proficient in SDLC pipeline configuration, DevOps, GCP security services, Kubernetes, and other related skills.

REQUIREMENT- 4

Job Role: GCP Architect with Applications Migrations

Location: USA (REMOTE)

Job Type: Contract (6+ months or more)

Experience: 10+ Years

PASSPORT NUMBER MANDATORY

Mandatory Skills: ETL, Redshift, EMR, AWS Glue, GCP BigQuery, GCP Dataproc, GCP Dataplex, Bigtable

Job Description:
Experience as an Architect migrating from Redshift to BigQuery
Design custom solutions for migration and data sync between the two environments
GCP Data Engineering certification is a MUST
Extensive data engineering experience on GCP
Expert in PySpark programming
Minimum of 10 years of total experience, with 5 years working with Data Fusion & Dataflow using Beam programming in Java
Experience developing with custom templates and streaming pipelines, and troubleshooting data pipelines in Dataflow
Develop custom plugins on Data Fusion; good experience with Cloud Dataproc
Must have used GCP services such as Cloud Composer, Cloud Storage, BigQuery, Pub/Sub, IAM, etc.
Good to have: knowledge of SAP BODS/Talend data pipelines

Thanks & Regards

Tariq Ahmad

Email: [email protected]

LinkedIn: https://www.linkedin.com/in/mohd-tariq-053242247/

Office: 805-222-0532 Ext 161

WhatsApp: 332-228-3588

--

10:59 PM 27-Sep-23

