Urgent req for GCP Data Engineer (only local candidates from PA) || Immediate Placement | Remote, USA
Email: [email protected]
From: Sudhanshu Shekhar, Vyze ([email protected]); reply to: [email protected]

Job Description - GCP Data Engineer

Location: Remote (client is located in PA, so local candidates will be preferred)
Duration: 6 months
LinkedIn profile required; 8+ years of experience required; GCP-certified candidates will be preferred.

Must Have:
- Strong experience on GCP; financial or banking background
- Strong Databricks experience
- Strong communication skills
- 4-5 years of GCP experience
- Experience working in GCP-based big data deployments (batch/real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, and Airflow
- 2+ years of coding skills in Java/Python
- Work with the data team to analyze data, build models, and integrate massive datasets from multiple data sources for data modeling
- Extracting, loading, transforming, cleaning, and validating data; designing pipelines and architectures for data processing
- Architecting and implementing next-generation data and analytics platforms on GCP
- Experience working with Agile and Lean methodologies
- Experience working with either a MapReduce or an MPP system at any size/scale
- Geographic Information Systems (GIS), geoanalytics, geospatial analysis, ArcGIS, Carto, Unfolded, and H3 are a plus

Responsibilities:
- Build data systems and pipelines on cloud providers (GCP preferred)
- Build algorithms and prototypes (geospatial models are a plus)
- Implement tasks for Apache Airflow
- Support and organize data in a data warehouse (Snowflake/BigQuery)
- Develop efficient ETL/ELT pipelines

Keywords: Pennsylvania
Posted: 10:53 PM, 21-Dec-23