AWS PySpark with Python Exp at Plano, Texas, USA
Email: harish@tanishasystems.com
https://rb.gy/r1ud0k
https://jobs.nvoids.com/job_details.jsp?id=1867417&uid=

From: Harish, Tanisha Systems Inc <harish@tanishasystems.com>
Reply to: harish@tanishasystems.com

Hi,

Please send me your updated resume if you are interested in the role below.

Role: AWS PySpark with Python Exp
Location: Plano, TX / Wilmington, DE (Hybrid Role)
Job Mode: Contract
Required Exp: 11+ Years

Required qualifications, capabilities, and skills:
- Formal training or certification in software engineering concepts and 10+ years of applied experience
- Hands-on practical experience delivering system design, application development, testing, and operational stability
- Advanced in one or more programming languages - Java, Python, Go
- A strong understanding of business technology drivers and their impact on architecture design, performance and monitoring, and best practices
- Design and building of web environments on AWS, including working with services such as EC2, ALB, NLB, Aurora Postgres, DynamoDB, EKS, ECS Fargate, MFTS, SQS/SNS, S3, and Route53
- Advanced in modern technologies such as Java 8+, Spring Boot, RESTful microservices, AWS or Cloud Foundry, and Kubernetes
- Experience using DevOps tools in a cloud environment, such as Ansible, Artifactory, Docker, GitHub, Jenkins, Kubernetes, Maven, and SonarQube
- Experience and knowledge of writing Infrastructure-as-Code (IaC) and Environment-as-Code (EaC) using tools like CloudFormation or Terraform
- Experience with high-volume, SLA-critical applications and building on messaging and/or event-driven architectures
- Deep understanding of the financial industry and its IT systems

Preferred qualifications, capabilities, and skills:
- Expert in one or more programming languages, preferably Java
- AWS Associate-level certification in Developer, Solutions Architect, or DevOps
- Experience building AWS infrastructure such as EKS, EC2, ECS, S3, DynamoDB, RDS, MFTS, Route53, ALB, and NLB
- Experience with high-volume, mission-critical applications and building on messaging and/or event-driven architectures using Apache Kafka
- Experience with logging, observability, and monitoring tools, including Splunk, Datadog, Dynatrace, CloudWatch, or Grafana
- Experience in automation and continuous delivery methods using scripts, Gradle, Maven, Jenkins, and Spinnaker
- Experience with microservices architecture and high-volume, SLA-critical applications and their interdependencies with other applications, microservices, and databases
- Experience developing processes, tooling, and methods to help improve operational maturity

Regards,
Harish Kumar
Lead Recruiter, Tanisha Systems Inc
Email: harish@tanishasystems.com
Website: www.tanishasystems.com
Address: 99 Wood Ave South, Suite #308, Iselin, NJ 08830
LinkedIn: https://www.linkedin.com/in/harish-kumar-71907714b/
02:57 AM 23-Oct-24