GCP Data Engineer at Remote, Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=641392&uid=

From: Satyam Yadav, Tek Inspiration [email protected]
Reply to: [email protected]

Hi,

Hope you are doing great! Please share suitable profiles and mention visa status and current location.

Job Description

Job title: GCP Data Engineer
Type of role: Contract
Number of resources needed: 1
Remote / Hybrid / Onsite: Hybrid (2-3 days/week at the office)
Location of the resource: Dallas, TX preferred, but open to Pleasanton, CA and Phoenix, AZ
Visa restrictions: US citizens and green card holders preferred (visa holders may require about 5 weeks before starting)
Contract duration: 6 months, with potential extension linked to performance
Target start date: Immediate

Project information (to share during an introductory interview, e.g. the project's goals, the relationship between the client and GFT, location of the team members, etc.):
This is for a portfolio company of Cerberus Technology Solutions, which we have worked with for over 4 years. We don't know the name of the portfolio company yet. They are in the process of migrating their data analytics platform from Snowflake to GCP.

Do candidates need specific industry experience? No, but experience in data transformations for retail and e-commerce business use cases is preferred.

Technical and General Skillset Required

Required Qualifications:
- Must be an SME who can advise the client on the best ways to proceed regarding Azure and/or GCP technologies.
- 7+ years of proven experience developing and deploying data pipelines in GCP or Azure. Must have experience in both: at least 7+ years in one and a minimum of 2-3 years with the other.
- 5+ years of Snowflake, BigQuery, and/or Databricks experience.
- 5+ years of proven experience building frameworks for data ingestion, processing, and consumption using GCP Dataflow, Cloud Composer, and BigQuery.
- 4+ years of strong experience with SQL, Python, Java, and API development.
- 2+ years of proven expertise in creating real-time pipelines using Kafka and Pub/Sub.
- Knowledge of GitHub Actions for CI/CD.
- Experience building high-quality data pipelines with monitoring and observability.
- 2+ years of experience building dashboards and reports with Power BI and/or ThoughtSpot.

Preferred Qualifications:
- Extensive experience in data transformations for retail and e-commerce business use cases is a plus.
- Bachelor's or Master's in computer engineering, computer science, or a related area.
- Knowledge of building machine learning models.

Mandatory Certifications Required:
- GCP Professional Architect or GCP Data Engineer

Regards,
Satyam Yadav
| 11:42 PM 14-Sep-23 |