Data Engineer (Kafka, MongoDB, OpenShift, AWS) || Phoenix, AZ (3 days in office) || C2C, Phoenix, Arizona, USA
Email: [email protected]
Role: Data Engineer (Kafka, MongoDB, OpenShift, AWS)
Location: Phoenix, AZ (3 days in office)
Type: C2C
Experience: 10+ years overall
Visa: H1B (passport number required)
Vertical: Banking

JD: Data Engineer with expertise in Kafka, MongoDB, and OpenShift (AWS). The ideal candidate will be responsible for designing, building, and maintaining robust data pipelines and ensuring the efficient flow of data across various systems. You will work closely with data scientists, analysts, and other stakeholders to optimize data processing and storage solutions.

Experience & Skill Requirements:
- Design and implement scalable data pipelines using Kafka for real-time data ingestion and processing (a minimal sketch follows below).
- Hands-on experience with Java, Spark, and big data technologies.
- Manage and optimize MongoDB databases, ensuring high availability and performance.
- Develop containerized Spring Boot/Java applications and deploy them on OpenShift for streamlined operations.
- Utilize AWS services (e.g., S3, Redshift) for data storage, processing, and analysis.
- Collaborate with cross-functional teams to gather requirements and deliver data solutions that meet business needs.
- Monitor and troubleshoot data pipeline performance, implementing improvements as necessary.
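As an illustration of the pipeline work described above, here is a minimal Java sketch that consumes JSON messages from a Kafka topic and writes them into a MongoDB collection. The topic name "transactions", the database "payments", the consumer group id, and the localhost connection strings are assumptions for the example, not details of this role.

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.bson.Document;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class TransactionsToMongo {
    public static void main(String[] args) {
        // Kafka consumer configuration (broker address and group id are assumed values).
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "txn-loader");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (MongoClient mongo = MongoClients.create("mongodb://localhost:27017");
             KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {

            MongoCollection<Document> txns =
                    mongo.getDatabase("payments").getCollection("transactions");
            consumer.subscribe(Collections.singletonList("transactions"));

            // Poll the topic and persist each JSON record as a MongoDB document.
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    txns.insertOne(Document.parse(record.value()));
                }
            }
        }
    }
}

A production pipeline would add error handling, manual offset commits, and bulk writes, but the basic flow of ingesting from Kafka and landing in MongoDB is the same.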