
Looking for a GCP Data Architect (hybrid role) in Phoenix, Arizona, USA
Email: hr@shayaancorp.com
https://short-link.me/15H5b
https://jobs.nvoids.com/job_details.jsp?id=2183507&uid=
Job Title: GCP Data Architect (not DevOps)
Location: Phoenix, AZ (hybrid)
Rate: $65/hr on C2C

Job Description:
We are seeking a GCP Data Architect with 10+ years of experience in designing and implementing scalable data solutions on Google Cloud Platform (GCP). The ideal candidate will have deep expertise in BigQuery, Dataflow, Airflow (Composer), SQL, and data pipeline development. This role involves architecting, orchestrating, and optimizing data workflows to support large-scale cloud data ingestion, transformation, and analytics.
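To make the orchestration side of the role concrete, here is a minimal sketch of the kind of Cloud Composer (Airflow) DAG described above: stage files from Cloud Storage into BigQuery, then run a SQL transformation. This is an illustration, not part of the posting; the project, bucket, dataset, and table names (example-landing-bucket, example_ds, and so on) are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_sales_ingest",  # hypothetical pipeline name
    schedule="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:
    # Stage raw CSV files from a GCS landing bucket into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="example-landing-bucket",  # hypothetical bucket
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="example_ds.sales_staging",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the staging rows into the reporting table with a SQL job.
    transform = BigQueryInsertJobOperator(
        task_id="transform",
        configuration={
            "query": {
                "query": """
                    INSERT INTO example_ds.sales_daily (order_id, total)
                    SELECT order_id, SUM(amount)
                    FROM example_ds.sales_staging
                    GROUP BY order_id
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform  # run the transform only after the load succeeds
```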
Key Responsibilities:
Architect and design scalable and efficient cloud-based data solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
Develop, optimize, and manage data pipelines for real-time and batch processing using Dataflow (Apache Beam) and BigQuery (see the Beam sketch after this list).
Orchestrate and schedule workflows in Airflow (Cloud Composer) to automate data ingestion, transformation, and processing.
Implement data governance, security, and compliance best practices in cloud data architectures.
Design and implement data warehouse and data lake solutions for structured and unstructured data.
Collaborate with cross-functional teams including data engineers, analysts, and business stakeholders to translate business needs into technical solutions.
Drive performance tuning and optimization of SQL queries and data pipelines.
Ensure data reliability, quality, and availability by implementing robust monitoring and alerting mechanisms.
Stay up to date with emerging GCP technologies and best practices in cloud data engineering.
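As referenced in the Dataflow bullet above, a minimal Apache Beam sketch of the batch pattern: read CSV lines from Cloud Storage, parse and filter them, and append the rows to BigQuery. The bucket, table spec, and schema are hypothetical, and the default DirectRunner stands in for Dataflow, which would normally be selected through pipeline options.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    """Turn one CSV line into a BigQuery-ready row dict."""
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}


def run() -> None:
    # In production the Dataflow runner and project would be set via
    # PipelineOptions flags; the defaults are enough to show the shape.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/sales/*.csv")
            | "Parse" >> beam.Map(parse_line)
            | "KeepPositive" >> beam.Filter(lambda row: row["amount"] > 0)
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:example_ds.sales",
                schema="order_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline shape handles streaming by swapping the text source for a Pub/Sub read and adding windowing, which is why the posting pairs the two modes.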
Required Skills & Experience:
10+ years of experience in data architecture, data engineering, or cloud-based data solutions.
Strong expertise in Google Cloud Platform (GCP) with hands-on experience in BigQuery, Dataflow, Cloud Storage, Pub/Sub, and IAM.
Proficiency in SQL for data analysis, transformation, and optimization.
Experience in developing and managing ETL/ELT pipelines for large-scale data processing.
Hands-on experience in Apache Airflow (Cloud Composer) for workflow orchestration.
Knowledge of streaming and batch data processing patterns using Dataflow/Apache Beam.
Experience with Python or Java for developing custom data transformations.
Strong understanding of data modeling, warehousing concepts, and schema design (see the partitioning sketch after this list).
Experience with CI/CD pipelines, Infrastructure as Code (Terraform, Deployment Manager), and DevOps practices for cloud environments.
Familiarity with data security, governance, and compliance requirements in cloud-based data platforms.
Excellent problem-solving skills with a strong analytical mindset.
Ability to work independently and collaborate with cross-functional teams in an Agile environment.
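As referenced in the schema-design bullet above, a minimal sketch of one common BigQuery tuning lever: a partitioned, clustered table queried with a partition filter, so date-bounded queries scan only the matching partitions. Project, dataset, and column names are hypothetical, and it assumes the google-cloud-bigquery client library with default credentials.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Partitioning bounds how much data a date-filtered query scans;
# clustering co-locates rows for per-user lookups within a partition.
ddl = """
CREATE TABLE IF NOT EXISTS example_ds.events (
  event_ts TIMESTAMP,
  user_id  STRING,
  action   STRING
)
PARTITION BY DATE(event_ts)
CLUSTER BY user_id
"""
client.query(ddl).result()

# Filtering on the partition column prunes partitions, which is the
# main cost and performance lever for queries like this one.
sql = """
SELECT user_id, COUNT(*) AS actions
FROM example_ds.events
WHERE DATE(event_ts) = @day
GROUP BY user_id
"""
job = client.query(
    sql,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("day", "DATE", "2025-01-01")
        ]
    ),
)
for row in job.result():
    print(row.user_id, row.actions)
```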
Preferred Qualifications:
GCP Professional Data Engineer or GCP Professional Cloud Architect Certification.
Experience with real-time streaming architectures using Kafka or Pub/Sub (see the Pub/Sub sketch after this list).
Knowledge of machine learning workflows and integration with GCP AI/ML services.
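For the streaming item above, a minimal Pub/Sub sketch: publish one JSON event and pull it back with a streaming subscriber. The topic and subscription names are hypothetical, and it assumes the google-cloud-pubsub client library with default credentials and pre-created resources.

```python
import json
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT_ID = "example-project"  # hypothetical project

# Publish one JSON event; publish() returns a future that resolves to
# the server-assigned message ID.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, "sales-events")
payload = json.dumps({"order_id": "o-1", "amount": 12.5}).encode("utf-8")
print("published message", publisher.publish(topic_path, payload).result())

# Pull it back with a streaming subscriber; in a real architecture the
# consumer is usually a Dataflow job rather than a hand-rolled callback.
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT_ID, "sales-events-sub")

def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    print("received", message.data)
    message.ack()  # acknowledge so Pub/Sub does not redeliver

future = subscriber.subscribe(sub_path, callback=handle)
try:
    future.result(timeout=10)  # listen briefly, then shut down
except TimeoutError:
    future.cancel()
```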

Regards,

Vamsi
hr@shayaancorp.com
O: 732-798-5943
Shayaan Corporation
