Data Engineer (GCP, SQL, Python, Data Pipeline Development) : Atlanta, GA, USA
Email: [email protected]
From: Deepika Dua, Stellar Consulting Solutions LLC [email protected]
Reply to: [email protected]

Hello Sir,

My name is Deepika and I am a recruiter at Stellar Consulting. I came across your profile while looking for qualified candidates for the Data Engineer (GCP, SQL, Python, Data Pipeline Development) role and was impressed by your experience and skills. I think you would be a great fit, as you have almost all of the skills and experience the role requires. Below are the job details for your reference:

Job Title : Data Engineer (GCP, SQL, Python, Data Pipeline Development)
Location : Atlanta, GA
Duration : Long Term Contract

Job Description:
We are looking for a Data Engineer with expertise in Google Cloud Platform (GCP), SQL, and Python, specializing in data pipeline development. The ideal candidate will design, build, and optimize scalable data pipelines to support analytics, reporting, and AI/ML use cases. You will work with GCP services such as BigQuery, Cloud Composer, Dataflow, and Pub/Sub to process large-scale datasets efficiently.

Key Responsibilities:
Data Pipeline Development: Design and implement ETL/ELT pipelines using Cloud Dataflow, Apache Beam, Cloud Composer (Airflow), and Pub/Sub.
Database & Data Warehousing: Develop and optimize queries for BigQuery, Cloud SQL (PostgreSQL, MySQL), and Firestore.
Data Processing & Transformation: Write efficient Python and SQL scripts to transform and clean data for analytics and reporting.
Workflow Orchestration: Automate and schedule data workflows using Cloud Composer (Apache Airflow).
Real-Time & Batch Processing: Implement real-time data streaming with Pub/Sub, Dataflow, and Kafka while supporting batch processing pipelines.
Performance Optimization: Tune SQL queries, optimize data storage, and improve pipeline efficiency to reduce latency and cost.
Security & Compliance: Ensure data governance, security, and compliance with best practices in IAM, VPC, encryption, and auditing.
Monitoring & Troubleshooting: Implement monitoring, logging, and alerting using Cloud Logging and Cloud Monitoring (formerly Stackdriver).
Collaboration: Work closely with data analysts, data scientists, and software engineers to integrate data solutions into business applications.

Required Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 4+ years in data engineering with a focus on GCP, SQL, and Python.
Cloud Certifications (Preferred): Google Professional Data Engineer Certification

Technical Skills:
Cloud Services (GCP): BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer (Airflow)
Programming & Scripting: Python, SQL (PostgreSQL, MySQL, BigQuery SQL), Scripting
Data Processing & Pipelines: Apache Beam, Dataflow, Apache Kafka, Apache Spark
Database & Warehousing: BigQuery, Cloud SQL, Firestore, Spanner
Workflow Orchestration: Apache Airflow (Cloud Composer)
DevOps & CI/CD: Terraform, Docker, Kubernetes, GitHub Actions
Security & Compliance: IAM, Data Encryption, VPC, Audit Logging

Soft Skills:
Strong analytical and problem-solving skills
Ability to work in fast-paced, agile environments
Excellent communication and collaboration skills
Passion for data-driven decision-making

Deepika Dua
Sr. Technical Recruiter at Stellar Consulting Solutions, LLC
07:59 PM 03-Feb-25