Rohith Dhane - Lead Data Engineer
rohith.dhane69@gmail.com
Location: Wilmington, Delaware, USA
Relocation: YES
Visa: GC
Profile Summary
Overview:
Rohith Dhane is a highly experienced and technically proficient Senior Data Engineer with over 12 years of hands-on experience building and managing end-to-end data engineering solutions. His expertise spans Big Data technologies, cloud platforms (AWS, Azure, GCP), and modern data pipelines for large-scale real-time and batch processing systems.

Core Competencies:
Cloud Platforms: Proficient in AWS (S3, EMR, Lambda, Glue, Redshift), Azure (Data Factory, Synapse, Databricks), and GCP (BigQuery, Dataflow, Dataproc).
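
As a minimal illustration of the AWS tooling listed here (a sketch, not code from any of the engagements below), the snippet stages a file in S3 with boto3 and triggers a Glue job; the bucket, key, and job name are hypothetical.

    import boto3

    # Hypothetical bucket, key, and Glue job name, used purely for illustration.
    BUCKET = "example-data-lake"
    KEY = "raw/orders/2024-01-01/orders.csv"
    GLUE_JOB = "orders-transform-job"

    s3 = boto3.client("s3")
    glue = boto3.client("glue")

    # Stage a local extract in S3, then kick off the Glue transformation job.
    s3.upload_file("orders.csv", BUCKET, KEY)
    run = glue.start_job_run(
        JobName=GLUE_JOB,
        Arguments={"--input_path": f"s3://{BUCKET}/{KEY}"},
    )
    print("Glue run id:", run["JobRunId"])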

Big Data Technologies: Deep expertise in the Hadoop ecosystem (Hive, HDFS, MapReduce, Pig, Sqoop), Spark (PySpark, Scala), Kafka, HBase, MongoDB, Cassandra.

ETL & Data Pipelines: Skilled in designing scalable ETL pipelines using Apache Airflow, Spark, Python, Talend, Informatica, and Data Fusion.
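
A minimal sketch of the kind of Airflow pipeline described here, assuming Airflow 2.4+ with the TaskFlow API; the tasks, schedule, and data are placeholders rather than details from any of his projects.

    from datetime import datetime
    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def example_etl():
        @task
        def extract() -> list[dict]:
            # Placeholder: pull rows from a source system.
            return [{"id": 1, "amount": 42.0}]

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # Placeholder: apply a business rule to each row.
            return [{**r, "amount_cents": int(r["amount"] * 100)} for r in rows]

        @task
        def load(rows: list[dict]) -> None:
            # Placeholder: write the transformed rows to the warehouse.
            print(f"loading {len(rows)} rows")

        load(transform(extract()))

    example_etl()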

Programming Languages: Proficient in Python, Scala, Shell scripting, SQL, and HiveQL.

DevOps & Automation: Experienced with CI/CD pipelines using Jenkins and with infrastructure automation using Terraform and AWS SAM.

BI & Visualization: Developed reports and dashboards using Tableau and Power BI.

Monitoring & Performance: Worked with tools like AppDynamics, Datadog, ELK Stack, Dynatrace, and IDERA.

Professional Highlights:
CareFirst (2023-Present): Led cloud-native ETL workflows using PySpark and AWS services, built robust data ingestion and transformation pipelines, and implemented real-time analytics using Kafka and Spark Streaming.
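
A hedged sketch of what a Kafka-to-Spark streaming ingestion like the one described might look like in PySpark Structured Streaming; the broker, topic, schema, and S3 paths are invented for illustration (and the spark-sql-kafka package must be on the classpath).

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.appName("claims-stream-example").getOrCreate()

    # Hypothetical event schema; a real payload would be project-specific.
    schema = StructType([
        StructField("claim_id", StringType()),
        StructField("member_id", StringType()),
        StructField("amount", DoubleType()),
    ])

    # Read from Kafka (broker and topic names are placeholders).
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "claims-events")
        .load()
        .select(from_json(col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    # Write the parsed stream to S3 as Parquet with checkpointing.
    query = (
        events.writeStream.format("parquet")
        .option("path", "s3a://example-bucket/claims/")
        .option("checkpointLocation", "s3a://example-bucket/checkpoints/claims/")
        .start()
    )
    query.awaitTermination()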

CBRE (2021-2023): Migrated legacy systems to AWS EMR, built real-time data processing applications using Spark and Kafka, and managed infrastructure using CloudFormation and Jenkins.
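
For illustration, a sketch of submitting a Spark step to a running EMR cluster with boto3, one common pattern after an EMR migration; the cluster id and script location are hypothetical.

    import boto3

    emr = boto3.client("emr")

    # Hypothetical cluster id and job script location.
    response = emr.add_job_flow_steps(
        JobFlowId="j-EXAMPLECLUSTER",
        Steps=[{
            "Name": "nightly-spark-batch",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                # command-runner.jar lets EMR run spark-submit as a step.
                "Jar": "command-runner.jar",
                "Args": ["spark-submit", "--deploy-mode", "cluster",
                         "s3://example-bucket/jobs/nightly_batch.py"],
            },
        }],
    )
    print("Step ids:", response["StepIds"])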

Merck Pharma (2019-2021): Developed and deployed data pipelines on GCP using Airflow, BigQuery, and Dataflow. Built REST APIs for data ingestion and managed real-time streaming data solutions.
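
A minimal sketch of a GCP-side load using the google-cloud-bigquery client; the project, dataset, table, and bucket names are placeholders, not details from the engagement.

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical destination table and GCS source path.
    table_id = "example-project.analytics.trial_events"
    uri = "gs://example-bucket/trial_events/*.parquet"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    # Load the Parquet files from GCS and block until the job finishes.
    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()
    print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}.")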

Dollar Tree (2016-2019): Played a key role in the Azure cloud migration and in data warehousing with Synapse Analytics, developed ETL pipelines, and built dashboards using Tableau.
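
As a rough sketch of the Azure side, the snippet below triggers a Data Factory pipeline run with the azure-mgmt-datafactory client; the subscription, resource group, factory, and pipeline names are all hypothetical.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    # Hypothetical subscription id, resource group, factory, and pipeline.
    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
    run = client.pipelines.create_run(
        resource_group_name="rg-data-platform",
        factory_name="adf-retail",
        pipeline_name="daily_sales_load",
        parameters={"run_date": "2024-01-01"},
    )
    print("Pipeline run id:", run.run_id)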

Concentrix (2013-2016): Designed and executed ETL processes using Talend and Informatica, optimized database performance, and implemented cloud-based Redshift solutions via AWS Data Pipeline.
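
A sketch of the standard S3-to-Redshift COPY pattern such solutions typically rely on, driven from Python with psycopg2; the cluster endpoint, table, and IAM role are placeholders.

    import os
    import psycopg2

    # Hypothetical cluster endpoint and credentials (password via environment).
    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
        port=5439, dbname="analytics", user="etl_user",
        password=os.environ["REDSHIFT_PASSWORD"],
    )

    copy_sql = """
        COPY sales.daily_orders
        FROM 's3://example-bucket/orders/2024-01-01/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
        FORMAT AS PARQUET;
    """

    # Redshift's COPY pulls the files directly from S3 into the target table.
    with conn, conn.cursor() as cur:
        cur.execute(copy_sql)
    conn.close()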

Technical Summary:
Big Data Tools: Hadoop, Spark, Hive, Kafka, Flume, Sqoop, HBase, Pig

Languages: Python, Scala, SQL, Shell Scripting

Databases: Snowflake, Teradata, Oracle, SQL Server, MongoDB, DynamoDB

Cloud: AWS, Azure, GCP

ETL Tools: Talend, Informatica, Airflow, Data Factory

DevOps: Jenkins, Git, Terraform, AWS SAM

BI Tools: Tableau, Power BI

Soft Skills & Methodologies:
Strong communication and problem-solving skills

Agile/Scrum project delivery experience

Adept at stakeholder collaboration and requirement gathering

Proven ability to mentor and guide junior engineers