GCP Data Engineer - Onsite at Phoenix, Arizona, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=1192172&uid=

From: Roopendra Reddy, Stacknexus [email protected]
Reply to: [email protected]

Job Description:
Experience: 6-9 years
Skills: GCP, Big Data, Hive, Spark, SQL, Kafka

Requirements:
- Bachelor's degree in Engineering or Computer Science (or equivalent), or a Master's in Computer Applications (or equivalent)
- 5+ years of software development experience, including leading teams of engineers and scrum teams
- 3+ years of hands-on experience with MapReduce, Hive, and Spark (core, SQL, and PySpark)
- Hands-on experience writing and understanding complex SQL (Hive/PySpark dataframes), and optimizing joins while processing large volumes of data
- Experience in UNIX shell scripting

Responsibilities:
- Design system solutions, develop custom applications, and modify existing applications to meet distinct and changing business requirements
- Handle coding, debugging, and documentation, working closely with the SRE team
- Provide post-implementation and ongoing production support
- Develop and design software applications, translating user needs into system architecture
- Assess and validate application performance and the integration of component systems, and provide process flow diagrams
- Test the engineering resilience of software and automation tools
- Identify innovative ideas and build proofs of concept to deliver against the existing and future needs of our customers
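The join-optimization requirement above can be illustrated with a toy sketch. Assuming a small dimension table and a large fact table (both the table names and records below are hypothetical), building an in-memory hash map on the small side and streaming the large side once avoids a quadratic nested-loop comparison; this is the same idea PySpark's broadcast join applies at cluster scale:

```python
# Toy illustration of the hash/broadcast-join idea used when
# optimizing joins over large datasets (e.g. Spark broadcast joins).
# All table names and records here are hypothetical examples.

def hash_join(large_rows, small_rows, key):
    """Join by building a hash map on the small side, then streaming
    the large side once -- O(n + m) instead of O(n * m)."""
    lookup = {}
    for row in small_rows:          # build phase: small "dimension" table
        lookup.setdefault(row[key], []).append(row)
    joined = []
    for row in large_rows:          # probe phase: large "fact" table
        for match in lookup.get(row[key], []):
            joined.append({**row, **match})
    return joined

# Hypothetical sample data
orders = [{"cust_id": 1, "amount": 30}, {"cust_id": 2, "amount": 15},
          {"cust_id": 1, "amount": 25}]
customers = [{"cust_id": 1, "name": "Ann"}, {"cust_id": 2, "name": "Bo"}]

result = hash_join(orders, customers, "cust_id")
```

In PySpark the same effect is typically requested with a broadcast hint on the small DataFrame, so the engine skips shuffling the large side.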
Software Engineers who join our Loyalty Technology team will be assigned to one of several exciting teams developing a new, nimble, and modern loyalty platform.

Additional good-to-have requirements:
- Solid data warehousing concepts
- Knowledge of the financial reporting ecosystem is a plus
- Experience with data visualization tools like Tableau, Sisense, Looker
- Expertise in distributed ecosystems
- Hands-on programming experience with Python/Scala
- Expert knowledge of Hadoop and Spark architecture and their working principles
- Ability to design and develop optimized data pipelines for batch and real-time data processing
- Experience in analysis, design, development, testing, and implementation of system applications
- Demonstrated ability to develop and document technical and functional specifications and analyze software and system processing flows
- Aptitude for learning and applying programming concepts
- Ability to communicate effectively with internal and external business partners

Preferred Qualifications:
- Knowledge of cloud platforms like GCP/AWS, building
03:32 AM 07-Mar-24