GCP Big Data Engineer at Phoenix, Arizona, USA
Email: [email protected]
Job Title: GCP Big Data Engineer
Location: Phoenix, AZ or Palo Alto, California (F2F for final call)
Main Skills: Spark, CouchDB, GCP

Key Requirements:
- Experience: 6+ years of hands-on experience with Big Data technologies, including Apache Spark and CouchDB.
- Cloud Expertise: Proven experience with Google Cloud Platform (GCP), including services such as BigQuery, Cloud Dataproc, Dataflow, Pub/Sub, and Cloud Storage.
- Spark Development: Strong experience designing and optimizing Spark applications, including writing efficient batch and real-time data processing jobs.
- CouchDB: In-depth knowledge of CouchDB architecture, replication, and management in distributed environments.
- ETL Processes: Hands-on experience designing and building ETL pipelines using Apache Beam, Dataflow, or similar tools.
- Programming Skills: Proficiency in languages such as Python, Java, or Scala for developing data solutions.
- Data Engineering Tools: Experience with data orchestration and workflow tools such as Apache Airflow.
- Version Control and CI/CD: Familiarity with version control systems such as Git and with CI/CD tools for deploying big data pipelines.
- Problem Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot and optimize distributed systems.
- Collaboration: Strong communication skills and experience collaborating across teams and business units.
- Cloud Certification: GCP certification (e.g., Professional Data Engineer) is a plus.

Preferred Skills:

--
Samuel McCoy
Account Manager
[email protected]
LinkedIn: https://www.linkedin.com/in/sai-krishna-putta-845518232/
Email is the best way to reach me.
07:55 PM 04-Dec-24