Hiring: Databricks Developer with Strong Java, Weehawken, NJ (Day One Onsite) at Weehawken, New Jersey, USA
Email: [email protected]
http://bit.ly/4ey8w48 https://jobs.nvoids.com/job_details.jsp?id=871742&uid=

From: Swati, SIRI INFO SOLUTIONS [email protected]
Reply to: [email protected]

Hello,

Have a nice day! Greetings from Siri Info Solutions. I am Swati with Siri Info Solutions. Siri Info Solutions is a global staff augmentation firm providing a wide range of talent-on-demand and total workforce solutions. We have an immediate opening for the below position with one of our premium clients.

Role: Databricks Developer with Strong Java
Location: Weehawken, NJ (Day One Onsite)
Duration: Long Term

Job Description:

Must-have skills: Java, PySpark, Azure Databricks, and Azure PaaS experience.

Responsibilities:
- 6 to 8 years of IT experience, with at least 4+ years in cloud-native data engineering solution design/architecture and development.
- Strong hands-on expertise in building data engineering solutions using Java, Apache Spark, and Scala.
- Deep expertise in performance tuning and optimization of Java/Spark/Scala programs.
- Ability to design and build data engineering tasks using the Databricks framework.
- Hands-on experience with Azure cloud services such as Azure Data Factory, Azure Data Lake Storage, Azure Event Hubs, Azure SQL, Azure analytical services, and Azure Cosmos DB.
- Demonstrated expertise in designing and building efficient, scalable data engineering solutions using Massively Parallel Processing (MPP) architecture.
- Micro-batch and continuous streaming in Databricks, delivering high-latency, low-latency, or ultra-low-latency data as required, using the built-in Apache Spark modules.
- Data ingestion into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processing of the data in Azure Databricks.
- Build data pipelines for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL.
- Experience writing HQL queries against a Hive data warehouse and performance-tuning Hive scripts.
- Well versed in relational and dimensional modeling techniques such as Star and Snowflake schemas, OLTP, OLAP, normalization, and fact and dimension tables.
- Proficient in creating Technical Design Documents (TDDs).
- Familiar with DevOps principles, with experience in CI/CD implementations using Azure-native services.
- Very good customer relationship and stakeholder management skills.
- Excellent communication and presentation skills.

Please share your employer details, documents (visa copy and DL copy), and your updated resume, along with:
- Current location
- Visa status
- Preferred time to call you
- Contact no.
- Skype ID
- LinkedIn URL
- Education details for both Bachelor's and Master's (university, stream, year of passing)
Posted: 09:59 PM 20-Nov-23