Data Bricks Consultant || Jackson, Michigan (onsite)
Email: [email protected]
Hello, my name is Divya Pandey, and I am a Staffing Specialist at Resource Logistics. I am reaching out to you about an exciting job opportunity with one of our clients. The complete job description is below for your review:

Job Title: Data Bricks Consultant
Location: Jackson, Michigan (onsite)
Hire Type: Contract
Rate: $70/hr (C2C)

Skill matrix:
Note: The primary skill is Data Bricks (it encompasses the related skill sets Python, PySpark, SQL, and ADF). Refer to the technical areas/responsibilities defined for this role. The skill sets are also listed explicitly (ADF, SQL, Python, PySpark) so you can quickly check for these high-level skills in candidate profiles.

1. Data Bricks (Must Have, Primary Skill; Proficiency 4-5)
- Strong hands-on experience in PySpark and Apache Spark
- Strong hands-on experience with the Medallion architecture
- Experience migrating native Spark workloads to Databricks
- Experience building data governance solutions such as Unity Catalog and Azure Purview
- Highly experienced in usability optimization (auto compaction, Z-ordering, vacuuming), cost optimization, and performance optimization
- Build a very strong orchestration layer in Databricks workflows/ADF
- Build CI/CD for Databricks in Azure DevOps
- Process near-real-time data through Auto Loader and DLT pipelines
- Implement a security layer in Delta Lake
- Implement massively parallel processing layers in Spark SQL and PySpark
- Implement cost-effective infrastructure in Databricks
- Experience extracting logic and data from on-prem layers, SAP, and ADLS into PySpark/ADLS using ADF/Databricks

2. Azure Synapse Analytics / Azure Data Factory (ADF) (Must Have; Proficiency 4-5)
- Hands-on experience in Azure Synapse Analytics, Azure Data Factory, Data Bricks, Azure Storage, Azure Key Vault, and SQL Pools
- CI/CD pipeline design and other Azure services such as Functions and Logic Apps
- Linked services, various runtimes, datasets, pipelines, and activities
- Strong hands-on experience with activities such as control-flow logic and conditions (For Each, If, Switch, Until), Lookup, Stored Procedure, scripts, validations, Copy Data, Data Flow, Azure Functions, Notebooks, and SQL Pool stored procedures
- Strong hands-on experience deploying code across the landscape (Dev -> QA -> Prod), GitHub, CI/CD pipelines, etc.

3. SQL Server stored procedures (Must Have; Proficiency 4-5)
- Strong hands-on experience creating SQL stored procedures
- Functions and stored procedures, including calling one stored procedure from another and processing data record by record
- Dynamic SQL

4. Python (Must Have; Proficiency 4-5)
- Strong background in Python libraries such as PySpark, Pandas, NumPy, pymysql, and Oracle clients
- Strong hands-on experience retrieving data through APIs
- Able to install libraries and help users troubleshoot issues
- Knowledge of retrieving data through stored procedures via Python
- Able to debug Python code

5. Spark (Must Have; Proficiency 4-5)
- Hands-on experience with Spark Pools and PySpark
- Able to merge data/delta loads through notebooks
- Strong background in Python libraries and PySpark

Thanks & Regards,
Divya Pandey, Technical Recruiter
Email: [email protected]
Resource Logistics, Inc.
Posted: 11:00 PM, 06-May-24