Rahul Dev Suryadevara - Data Engineer
[email protected]
Location: Dallas, Texas, USA
Relocation:
Visa: GCEAD
Mobile: 979-221-1004

Summary:
15+ years of experience designing, developing, and optimizing large-scale data solutions across AWS, Azure, and hybrid cloud platforms. Proven expertise in building scalable ETL pipelines, optimizing complex SQL queries, and delivering real-time analytics using AWS Glue, Redshift, S3, Lambda, EMR, Athena, Lake Formation, and Amazon Kinesis, as well as Azure Data Factory, Azure Synapse, Azure Databricks (Delta Lake, Delta Live Tables, Unity Catalog), Azure Data Lake Storage, Azure Blob Storage, Azure Stream Analytics, and both traditional databases (SQL Server, Oracle, MySQL) and cloud databases (Amazon RDS, Azure SQL Database, Google Cloud SQL). Experienced in implementing real-time data streaming and event-driven architectures using Kafka and other messaging systems to support low-latency data processing. Skilled in data modeling, reporting, and dashboard development with Power BI, Tableau, Cognos, SSRS, and Informatica PowerCenter to support key business decisions. Experienced in leading cloud migrations, implementing data governance frameworks with Unity Catalog and Azure Purview, and collaborating with analysts and data scientists to ensure high-quality, validated datasets.
Education:
International Technological University, USA, Master's in Electrical Engineering (2009)
Jawaharlal Nehru Technological University, India, Bachelor of Science in Electrical and Electronics Engineering (2006)

Certification:
SnowPro Associate Certification
SnowPro Core Certification
Microsoft Fabric Data Engineer Associate

Skills:

ETL Tools: IDMC, IICS, Informatica PowerCenter 10.5/10.2/9.x/8.x, SQL Server Integration Services (SSIS), Informatica PowerExchange, Databricks, DBT (Data Build Tool), Apache Spark, Trino, Apache Hive, Flink
Cloud ETL Tools: AWS (AWS Glue, AWS Data Pipeline, Amazon S3, AWS Lake Formation, Amazon Redshift, AWS EMR, Lambda, AWS Athena), Azure (Azure Data Factory, Azure Synapse, Azure Databricks Delta Live Tables, Azure Data Lake Storage Gen2, Azure Blob Storage, Azure SQL Database), Databricks (Delta Lake, Delta Live Tables, Unity Catalog, Auto Loader, Workflows), Apache Iceberg
Reporting Tools: Cognos 11/10.x/8.x, PowerPlay, Cognos Query, TM1, SQL Server Reporting Services (SSRS), ELK (Elasticsearch, Logstash, Kibana), Tableau, Power BI
Data Architecture: Lakehouse patterns, data modeling, governance (RBAC, metadata), Unity Catalog, open table formats (Apache Iceberg, Delta Lake, Apache Hudi), file formats (Apache Avro, Parquet)
Databases: Oracle 11g/10g/9i/8i, SQL Server 2000/2005, MS SQL, MS Access, IBM DB2 v9.5/9.1, UDB, Teradata 14.1
NoSQL Databases: Azure Cosmos DB, MongoDB, Cassandra
Databricks: Streaming workloads, Delta Lake, Unity Catalog, Delta Live Tables
Languages: SQL, T-SQL, PL/SQL, Unix shell scripting, Python 3, PySpark, Spark SQL, Hive, Amazon Redshift, Scala (familiar), Snowflake (Snowpipe, Snowpark, Streams, stored procedures)
Other Tools: TWS, TOAD, SQL*Loader, Microsoft Office
CI/CD Tools: Bitbucket, GitHub, GitHub Actions (YAML pipelines), Jenkins, Ansible, Azure DevOps
Scheduling Tools: Dagster, Control-M & Autosys



Professional Experience:

Bank of America, Plano, TX Sep 2024 - Present
Lead Data Engineer
Spearheaded the design and implementation of scalable data pipelines using Azure Databricks, Azure Data Factory, and PySpark, delivering seamless ETL processes that accelerated data processing by over 50%.
Implemented Medallion Architecture in Snowflake, organizing data into Bronze, Silver, and Gold layers to optimize data flow, transformation, and analytics in data lakehouse environments.
Applied Snowflake SQL and Python for data modeling and query optimization, leveraging Snowflake's features for high-performance, cost-efficient data storage; tuned Snowflake performance through clustering keys, partitioning, and caching strategies.
Leveraged Kafka streaming services to implement event-based data pipelines, ensuring reliable real-time data processing and enabling event-driven architectures to support dynamic business use cases (see the ingestion sketch after the tools list below).
Led cross-functional teams in the adoption of CI/CD pipelines with GitHub Actions and Azure DevOps, significantly increasing deployment frequency and reducing time-to-production.
Developed and enforced data governance frameworks using Unity Catalog and Azure Purview, ensuring compliance with industry data privacy regulations.
Provided technical leadership and mentorship to junior engineers, helping to build a collaborative team environment and improving team productivity and knowledge transfer.

Tools: Azure Databricks, Azure Data Factory, PySpark, Python, GitHub Actions, Unity Catalog, Azure DevOps, Delta Lake, Azure Event Hub, Medallion Architecture, Snowflake, Kafka
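
Illustrative sketch (an assumption, not taken from the project): a minimal PySpark Structured Streaming job that lands Kafka events in a Bronze Delta table, the first hop of the medallion pattern described above. The broker address, topic name, and storage paths are hypothetical placeholders.

    # Minimal Kafka -> Bronze Delta ingestion sketch; broker, topic, and
    # paths are hypothetical placeholders, not the project's real config.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("bronze_ingest").getOrCreate()

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
           .option("subscribe", "transactions")               # placeholder topic
           .option("startingOffsets", "latest")
           .load())

    # Land payloads as-is in Bronze; parsing and validation happen in Silver.
    bronze = raw.select(
        F.col("key").cast("string"),
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp").alias("ingest_ts"),
    )

    (bronze.writeStream
     .format("delta")
     .option("checkpointLocation", "/mnt/bronze/_chk/transactions")  # placeholder
     .outputMode("append")
     .start("/mnt/bronze/transactions"))  # placeholder path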

Tata Consultancy Services (JPMC), Plano, TX Nov 2023 - Aug 2024
Lead Data Engineer
Developed real-time ingestion pipelines using Azure Databricks and Delta Lake to handle high-throughput data, supporting event and clickstream data processing for large-scale applications.
Enhanced data engineering workflows using PySpark and Python to streamline data transformations and improve data processing speeds by 40%.
Leveraged Medallion Architecture to create a scalable framework for data processing that improved data quality, ensured consistency, and enhanced overall operational efficiency.
Integrated data between AWS and Azure cloud environments to ensure seamless data synchronization using Lambda, Glue, and Azure Functions, enabling hybrid cloud strategies and enhanced data availability.
Applied Snowflake SQL, including window functions, CTEs, and advanced joins, to optimize complex queries for performance and scalability (see the query sketch after the tools list below).
Mentored and led a team of data engineers, focusing on building multi-functional teams and promoting a culture of collaboration and innovation.

Tools: Azure Databricks, Azure Data Factory, Python, PySpark, GitHub Actions, Tableau, AWS, Azure, Medallion Architecture, Snowflake
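
Illustrative sketch of the kind of Snowflake SQL described above (a CTE plus a window function), run through the snowflake-connector-python driver. All identifiers and connection parameters are hypothetical; in practice credentials would come from a secrets store.

    import snowflake.connector

    # CTE + running-total window function; table and columns are hypothetical.
    QUERY = """
    WITH daily AS (
        SELECT account_id,
               event_date,
               SUM(amount) AS daily_total
        FROM analytics.transactions      -- hypothetical table
        GROUP BY account_id, event_date
    )
    SELECT account_id,
           event_date,
           daily_total,
           SUM(daily_total) OVER (
               PARTITION BY account_id
               ORDER BY event_date
           ) AS running_total
    FROM daily
    ORDER BY account_id, event_date
    """

    conn = snowflake.connector.connect(
        account="my_account",   # hypothetical connection parameters
        user="my_user",
        password="...",
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
    )
    try:
        for row in conn.cursor().execute(QUERY):
            print(row)
    finally:
        conn.close()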

Tata Consultancy Services (Delta Airlines), Atlanta, GA Feb 2023 - Nov 2023
Senior Data Engineer
Played a pivotal role in migrating ETL workflows from on-premises infrastructure to Azure, reducing operational overhead and improving system performance by 45%.
Designed and built cloud-native data pipelines leveraging Azure Data Factory, Azure Databricks, and Power BI, optimizing data availability and enabling faster decision-making processes for business leaders.
Reduced manual processing time by 95% by developing automated data cleaning and transformation workflows in PySpark and Python (see the cleaning sketch after the tools list below).
Collaborated with data scientists to optimize data pipelines for machine learning, ensuring high-quality data for training and deployment of models.
Implemented data governance strategies using Azure Purview and Unity Catalog to ensure compliance with internal and external data privacy regulations.
Developed and maintained interactive BI dashboards and reports using Power BI and Tableau, providing actionable insights to both technical and non-technical stakeholders.
Tools: Azure Databricks, Azure Data Factory, Power BI, Python, PySpark, Informatica IICS, Tableau, Azure Purview, AWS
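
A minimal sketch of an automated cleaning and transformation pass of the kind described above, in PySpark. The input path, column names, and cleaning rules are hypothetical placeholders.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("clean_flights").getOrCreate()

    # Hypothetical input path and column names.
    raw = spark.read.option("header", True).csv("/mnt/raw/flights.csv")

    clean = (raw
             .dropDuplicates(["flight_id"])                      # drop re-sent rows
             .withColumn("dep_delay", F.col("dep_delay").cast("int"))
             .na.fill({"dep_delay": 0})                          # default missing delays
             .withColumn("carrier", F.upper(F.trim(F.col("carrier"))))
             .filter(F.col("flight_date").isNotNull()))          # require a date

    clean.write.mode("overwrite").format("delta").save("/mnt/silver/flights")
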
Optum (Randstad), Charleston, WV Apr 2019 - Dec 2022
Senior Data Engineer
Project: Integrated Eligibility (IE), Child Support, and Child Welfare applications

Collaborated closely with source data application teams and product owners to design, implement, and optimize analytics solutions that provided actionable insights to support data-driven decision-making.
Led the design and implementation of scalable data migration solutions using Azure Data Factory, Azure Databricks, and Azure Data Lake Storage, successfully migrating large volumes of data to cloud environments.
Developed and maintained data pipelines for batch and streaming data, integrating Azure Stream Analytics, Event Hub, and Databricks, enabling real-time analytics and enhanced reporting capabilities.
Optimized data ingestion processes by leveraging Azure Data Lake and Azure Blob Storage, reducing data processing time by 50% and ensuring seamless storage and retrieval of large datasets.
Worked closely with cross-functional teams to gather and define business requirements, ensuring that the developed ETL processes met the data needs of the organization.
Developed and maintained data models in Azure Databricks, utilizing Delta Lake and Unity Catalog for optimal performance, scalability, and governance (see the upsert sketch after the tools list below).
Provided ongoing support and maintenance for production data systems, resolving issues and ensuring high availability, reliability, and security of cloud-based data platforms.

Tools: Azure Databricks, Azure Data Factory, Power BI, Python, PySpark, Informatica IICS, Tableau, Azure Purview, AWS
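
A hedged sketch of a Delta Lake upsert of the sort used to maintain the data models above, using the delta-spark DeltaTable API. The staging path, target table, and key column are hypothetical.

    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("case_upsert").getOrCreate()

    # Incremental changes staged by the ingestion pipeline (hypothetical path).
    updates = spark.read.format("delta").load("/mnt/staging/case_updates")

    # Merge into a Unity Catalog-governed target (hypothetical name and key).
    target = DeltaTable.forName(spark, "gold.child_support_cases")
    (target.alias("t")
     .merge(updates.alias("s"), "t.case_id = s.case_id")
     .whenMatchedUpdateAll()
     .whenNotMatchedInsertAll()
     .execute())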

Highmark Health Solutions (Celerity), Pittsburgh, PA Apr 2018 - Feb 2019
Senior Business Intelligence Developer
Analyzed business requirements and developed models using Cognos Framework Manager, publishing packages for report development.
Contributed to the analysis and data modeling of HR, invoicing, and inventory systems using Erwin.
Performed data analysis on source and target systems, applying data warehousing concepts including staging tables, dimensions, facts, and star and snowflake schemas.
Created Framework Manager models (metadata modeling) for Report Studio reports, including star schema design for both relational and dimensional FM models, and built various report types such as list, crosstab, chart, sectioned, drill-through, drill-up/down, and master-detail reports.

BMW, New Jersey Oct 2017 - Mar 2018
Senior Business Intelligence Developer
Designed and developed dynamic cubes, importing data from Framework Manager as well as directly from data sources.
Implemented validation rules and on-screen errors within the user interface; retrieved exceptions (e.g., unmapped items from ETL processes such as trial balance loads) and integrated with TDM for seamless committing of approved changes to data.
Designed in-memory dynamic cubes using Cube Designer and deployed them to the Cognos 11 environment to improve performance on large data sets; used Aggregate Advisor to optimize dynamic cube performance.

SAAB, O'Hare, IL Jun 2017 - Oct 2017
Senior ETL/MSBI Developer
Used SSIS tasks such as Conditional Split and Derived Column for data scrubbing and validation checks during staging, before loading data into the data warehouse; created SSIS packages to consolidate and merge data from different sources into a single source.
Created sub-reports, drill-down reports, summary reports, and parameterized reports using SSRS. Created Elasticsearch queries using JSON and curl commands, and built Kibana visualizations and dashboards on top of those queries (see the query sketch below).
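
A hedged sketch of the kind of Elasticsearch JSON query mentioned above, sent over HTTP with the requests library (equivalent to the curl calls). The host, index, and field names are hypothetical.

    import requests

    # Match recent error events; index and fields are hypothetical.
    query = {
        "query": {
            "bool": {
                "must": [{"match": {"status": "ERROR"}}],
                "filter": [{"range": {"@timestamp": {"gte": "now-1d"}}}],
            }
        },
        "size": 20,
    }

    resp = requests.post(
        "http://localhost:9200/app-logs/_search",  # placeholder host and index
        json=query,
        timeout=10,
    )
    resp.raise_for_status()
    for hit in resp.json()["hits"]["hits"]:
        print(hit["_source"])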

Wells Fargo, Hoffman Estates, IL Mar 2017 - Jun 2017
Senior Business Intelligence Developer
Involved in designing, developing, and deploying reports in an MS SQL Server environment using SSRS.
Created parameterized queries and generated tabular reports, sub-reports, crosstabs, and drill-down reports with expressions, functions, charts, maps, sorting, data source definitions, and subtotals in SSRS 2015.
Designed, deployed, and maintained complex canned reports using SQL Server 2014 Reporting Services (SSRS).

Orlando Utilities Commission, Orlando, FL Dec 2015 - Feb 2017
Senior Business Intelligence Developer
Structured metadata and built Cognos models using Framework Manager.
Designed DMR models over relational data using Framework Manager and deployed packages to the report servers; migrated reports and packages from Cognos 10.2.1 to 10.2.2.
Created List Reports, Cross tab Reports and Control Charts.
Implemented security for data access, object-level access, and package access.

Caterpillar (SOASIS), Peoria, IL Feb 2015 - Nov 2015
Senior ETL/BI Developer
Participated in design team and user requirement gathering meetings to understand the existing business model and customer requirements; managed multiple projects and coordinated with the offshore team for on-time delivery of report modules.
Scheduled and saved reports on daily, weekly, and monthly bases using Cognos Scheduler; performed performance tuning by analyzing and comparing turnaround times between SQL and Cognos.

Aon Hewitt/Hexaware, Lincolnshire, IL Jul 2014 - Jan 2015
Senior ETL/BI Developer
Created models and packages and published them to Cognos Connection using Framework Manager.
Developed standard templates, dashboard reports, standard reports, charts, drill-through reports, and master-detail and conditionally formatted reports using Report Studio.

American Eagle Outfitters (AEO), Pittsburgh, PA Nov 2013 - Jun 2014
Senior ETL Developer
Utilized Framework Manager for metadata modeling, creating and publishing packages to Cognos Connection; developed ETL Informatica mappings with complex transformations.
Performed bulk data loads from multiple data sources (Oracle) into Teradata using BTEQ, MultiLoad, and FastLoad.

Johnson & Johnson, Titusville, NJ Nov 2012 - Oct 2013
ETL Developer/ L3 Support
Worked with Informatica PowerCenter components including Source Analyzer, Warehouse Designer, mappings and transformations, Repository Manager, Workflow Manager, and Workflow Monitor; provided Tier 3 support for EDW production processes, including issue investigation, root cause analysis, and defect resolution.

Abbott Laboratories, Waukegan, IL Jun 2011 - Oct 2012
Business Intelligence Developer
Generated various list reports, grouped reports, crosstab reports, chart reports, and drill-down and drill-through reports; built complex reports in Cognos 8 Report Studio, including drill-down reports from a DMR-modeled Framework Manager model.
Used advanced Report Studio features such as conditional formatting to generate multi-page reports.

Hartford Life, Chicago, IL Jan 2010 - May 2011
Business Intelligence Developer
Created Framework Manager models in Cognos 8.4, and built dashboards, simple list reports, crosstab reports, and various drill-through reports according to requirements.
Developed Test Cases according to Business and Technical Requirements and prepared SQL scripts to test data.