| Full Stack Java Heavy Kafka Streaming-Hybrid-C2C at Remote, Remote, USA |
| Email: [email protected] |
|
http://bit.ly/4ey8w48 https://jobs.nvoids.com/job_details.jsp?id=347367&uid=

From: kishore, American it solutions [email protected]
Reply to: [email protected]

We have an exclusive requirement with a client based out of Dallas, TX (candidates need to be local enough for 2 days onsite per week). Rates are open (as per market).

I. WORK TO BE PERFORMED:

This role is part of the Renaissance project. The resource will architect, design, and build streaming solutions as part of a risk-based set of systems. We need a major contributor to the development of scalable, resilient, hybrid cloud-based distributed computing solutions supporting critical financial risk management activities. You will help transform the enterprise into a data-driven organization. The role calls for experience in cloud development and the ability to design large-scale, microservices-based streaming solutions. This person should also have hands-on technical skills in creating prototypes and in setting the right standards around software development practices.

II. SKILL AND EXPERIENCE REQUIRED:

Must Have:
- 7-10+ years of technical experience building newly configured and designed data-centric software solutions
- Advanced knowledge of Java 8+, with experience using multithreading, collections, the Streams API, and functional programming on real enterprise projects
- Minimum of one year developing cloud-native streaming applications using Kafka, Kafka Streams, and Spring Boot
- Hands-on experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink, etc.
- Hands-on experience with one of these distributed data stores: HBase, Cassandra, MongoDB, AWS DynamoDB
- Some hands-on experience with a distributed message broker such as Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc.
- Hands-on experience with AWS foundational services such as VPCs, security groups, EC2, RDS, S3 ACLs, KMS, the AWS CLI, and IAM, plus experience with Big Data architectures and BI solutions

Additional qualifications (Nice to have):
- Intermediate working knowledge of DevOps tools: Terraform, Ansible, Jenkins, Maven/Gradle, Nexus/Artifactory, CI/CD pipelines, etc.
- Comprehensive debugging and troubleshooting skills, resourcefulness, and strong research skills
- Proficiency and demonstrated skill in both oral and written business communication
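As a rough illustration of the Java 8+ skills listed above (Streams API, collections, functional programming, multithreading), here is a minimal, self-contained sketch; the `Trade` record, the symbols, and the amounts are hypothetical and stand in for the kind of risk/financial data the role describes:

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.Collectors;

public class StreamDemo {
    // Hypothetical trade event, illustrative only.
    record Trade(String symbol, double amount) {}

    // Streams API + functional style: group trades by symbol and sum amounts.
    static Map<String, Double> totalsBySymbol(List<Trade> trades) {
        return trades.stream()
                .collect(Collectors.groupingBy(
                        Trade::symbol,
                        Collectors.summingDouble(Trade::amount)));
    }

    public static void main(String[] args) throws Exception {
        List<Trade> trades = List.of(
                new Trade("AAPL", 100.0),
                new Trade("MSFT", 50.0),
                new Trade("AAPL", 25.0));

        // Multithreading: offload the aggregation to a worker pool,
        // as a streaming consumer thread might.
        ExecutorService pool = Executors.newFixedThreadPool(2);
        Map<String, Double> totals = pool.submit(() -> totalsBySymbol(trades)).get();
        pool.shutdown();

        System.out.println(totals.get("AAPL")); // 125.0
    }
}
```

A real Kafka Streams/Spring Boot application would apply the same functional aggregation style to a `KStream` topology rather than an in-memory list.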
| [email protected] View All |
| 12:09 AM 09-Feb-23 |