
SURYA VINAY
Azure Data Engineer
Email: [email protected]
Location: Austin, Texas, USA
Relocation: Open
Visa: H1

SUMMARY:
Over 11 years of IT experience, including 5 years specializing in Azure Data Engineering,
contributing across Product- and Service-based organizations.
Experience in the Supply Chain Management, Logistics, and E-commerce domains.
Extensive experience in Azure Data Engineering, specializing in data ingestion, processing, and
integration.
Proficient in developing and managing data pipelines using Azure Data Factory, incorporating services
such as Azure Storage, Azure SQL, Azure Databricks, and PySpark.
Expertise in data processing and analytics for actionable insights and decision-making.
Experience in migrating SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL,
Databricks, and Azure SQL Data Warehouse, including controlling and granting database access.
Experience in developing Spark applications using Spark SQL in Databricks for data extraction,
transformation, and aggregation from multiple file formats.
Well versed in Agile methodology and project management tools such as JIRA and VSO.
Extensive experience in Software Development Life Cycle (SDLC) & Software Testing Life Cycle (STLC).
Well versed in different management scenarios like Quality Assurance, Defect Tracking, System
Integration, and Task Scheduling.
SKILL SUMMARY:
Cloud Platforms: Azure (Data Factory, SQL Database, Data Lake)
Data Engineering: ETL processes, Data Integration, Data Warehousing
Databases: SQL Server, Azure SQL Database
Big Data Technologies: Azure Databricks, Spark, ADLS (Azure Data Lake Storage)
Data Modeling: Star Schema, Snowflake Schema, Dimensional Modeling
Programming: SQL, PySpark, Python, Spark SQL
Version Control & DevOps: Git, CI/CD pipelines
Tools: Apache Spark, Databricks, Azure Data Factory (ADF)
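As an illustration of the dimensional-modeling skills above, a minimal star-schema sketch: one hypothetical fact table joined to a hypothetical dimension table (all table and column names here are invented for the example), shown with sqlite3 so it runs standalone.

```python
import sqlite3

# Hypothetical star schema: fact_sales references dim_product by a
# surrogate key; analytics queries join fact to dimension and aggregate.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    qty INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO fact_sales VALUES (1, 3, 30.0), (1, 1, 10.0), (2, 2, 50.0);
""")
# The canonical star-schema query shape: fact-to-dimension join, then
# group by a dimension attribute.
cur.execute("""
    SELECT p.name, SUM(f.amount) AS revenue
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name ORDER BY revenue DESC
""")
rows = cur.fetchall()
print(rows)  # [('gadget', 50.0), ('widget', 40.0)]
```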
PROFESSIONAL EXPERIENCE
Risposta Corp, Apr 2022 - Oct 2024
Azure Data Engineer
Client: New Clicks Group
Project: Third Party Billing
Clicks Group is a retail-led healthcare group recognized as one of the leading supply chain management
companies in the region. Through its market-leading retail brands Clicks, GNC, The Body Shop, and
Claire's, the group has over 850 stores across southern Africa.
Responsibilities:
Designed and implemented effective database solutions (Azure Blob Storage) to store and retrieve
data.
Deployed Azure Data Factory pipelines to orchestrate data movement into Azure SQL Database.
Developed Spark applications using PySpark and Spark SQL for data extraction, transformation, and
aggregation from multiple file formats, analyzing and transforming data to uncover insights into
customer usage patterns.
Optimized data processing and storage for performance and cost efficiency.
Developed, unit tested, and debugged complex data processing scripts using SQL and Python to ensure
data quality and reliability.
Developed Databricks jobs using PySpark and Spark SQL.
Managed various data file formats such as JSON, CSV, Avro, Parquet, and Delta Lake for effective
data storage and retrieval.
Involved in migration activities, moving data from SQL Server to the Azure cloud.
Participated in Agile projects, following DevOps processes with Git, Jenkins, and Azure DevOps to
ensure continuous integration and continuous delivery (CI/CD).
Created multiple aggregate tables and business metrics for business stakeholders to track the
revenue model and features.
Developed multiple datasets from the raw data and scheduled them using Databricks notebooks.
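The aggregate-table work described above can be sketched with a small, self-contained example. Since Spark SQL is largely ANSI-compatible, the illustrative query below runs on sqlite3 instead of a Spark cluster; the billing_events table and its columns are hypothetical, not from the actual project.

```python
import sqlite3

# Hypothetical raw table: one row per billing event. In the real
# pipeline this would be a Databricks table read via Spark SQL.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE billing_events (
    customer_id TEXT, feature TEXT, amount REAL)""")
cur.executemany(
    "INSERT INTO billing_events VALUES (?, ?, ?)",
    [("c1", "shipping", 120.0), ("c1", "returns", 30.0),
     ("c2", "shipping", 80.0)],
)
# Aggregate revenue per feature -- the kind of business-metrics table
# a stakeholder would track against the revenue model.
cur.execute("""
    SELECT feature, SUM(amount) AS revenue, COUNT(*) AS events
    FROM billing_events
    GROUP BY feature
    ORDER BY revenue DESC
""")
metrics = cur.fetchall()
print(metrics)  # [('shipping', 200.0, 2), ('returns', 30.0, 1)]
```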
Cigniti Technologies, Apr 2021 - Apr 2022
Azure Data Engineer
Client: Accelya
Project: United Airlines
Accelya has been innovating and transforming the airline industry for clients such as United Airlines,
partnering with IATA on industry-wide initiatives and creating strategic solutions that simplify airline
processes, consistently driving the airline industry forward.
Responsibilities:
Created pipelines in ADF using Linked Services and Datasets to extract, transform, and load
data from different sources such as Azure SQL, Blob Storage, and Azure SQL Data Warehouse.
Designed and created Tables, Views, User-Defined Data Types, Indexes, Stored Procedures,
Cursors, Triggers, and Transactions.
Designed and developed the SQL Server database structure, Stored Procedures, and Triggers.
Tuned Spark application performance by setting the right batch interval, the correct level of
parallelism, and memory allocation.
Developed, unit tested, and debugged complex data processing scripts using SQL and Python to ensure
data quality and reliability.
Attended daily meetings to synchronize the model and resolve issues.
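The tuning knobs named above map to standard Spark configuration keys (the batch interval itself is passed to StreamingContext rather than set as a key). A minimal sketch with placeholder values, not recommendations:

```python
# Illustrative Spark tuning settings only; the values here are
# placeholders, and the right numbers depend on cluster size and
# workload. These are standard keys from the Spark configuration docs.
tuning = {
    "spark.default.parallelism": "200",     # level of parallelism for RDD ops
    "spark.sql.shuffle.partitions": "200",  # shuffle parallelism for Spark SQL
    "spark.executor.memory": "4g",          # executor heap (memory tuning)
    "spark.executor.cores": "4",            # cores per executor
}

# Render the same settings as spark-submit flags so they can be applied
# without editing application code.
flags = " ".join(f"--conf {k}={v}" for k, v in sorted(tuning.items()))
print(flags)
```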
Zen3 Info-solutions (Tech Mahindra), Feb 2019 - Dec 2020
Azure Data Engineer
Client: Microsoft
Project: Xbox Support
The Microsoft Xbox team rebuilt its support website (support.xbox.com) to cover more regions with a
brand-new look and feel, making it faster and simpler for Xbox users to get help with their accounts,
games, subscriptions, and devices. In addition to the new look and more intuitive design, the site
provides notifications based on what's happening with Xbox and the user's account, so the information
is clear and ready for the user.
Responsibilities:
Collected and imported data from various sources into Azure storage solutions, such as Azure SQL
Database, Azure Data Lake Storage, or Azure Cosmos DB.
Developed data processing pipelines using tools like Azure Data Factory or Azure Databricks to clean,
transform, and aggregate data.
Implemented ETL (Extract, Transform, Load) processes to integrate data from different sources into a
unified data warehouse or data lake.
Developed Databricks jobs using PySpark and Spark SQL.
Involved in migration activities, moving data from SQL Server to the Azure cloud.
Developed and maintained ETL pipelines using Azure Data Factory.
Managed data storage solutions such as Azure SQL Database, Azure Data Lake Storage, and Azure
Cosmos DB.
Documented data workflows, processes, and changes made to data pipelines.
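The file-format handling mentioned in these roles can be illustrated with a standard-library sketch of moving the same records between two of the listed formats, JSON and CSV. In the actual pipelines, ADF or Databricks readers/writers would do this, and Avro, Parquet, and Delta Lake need libraries beyond the stdlib; the record fields below are invented for the example.

```python
import csv
import io
import json

# Hypothetical source payload, as it might arrive from an API.
raw_json = ('[{"user_id": "u1", "region": "EU", "tickets": 2},'
            ' {"user_id": "u2", "region": "US", "tickets": 1}]')
records = json.loads(raw_json)

# JSON -> CSV: flatten the dicts into rows with a fixed header.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["user_id", "region", "tickets"])
writer.writeheader()
writer.writerows(records)

# CSV -> dicts again. Note that CSV is untyped, so numbers come back as
# strings -- one reason schema-carrying formats like Avro, Parquet, and
# Delta Lake are preferred for pipeline interchange.
round_tripped = list(csv.DictReader(io.StringIO(buf.getvalue())))
print(round_tripped[0])  # {'user_id': 'u1', 'region': 'EU', 'tickets': '2'}
```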
Logit One Pvt Ltd, Jul 2013 - Jan 2019
Client: DHL, Gosselin, Ahlers & Feige
Senior QA Engineer (Sep 2015 - Jan 2019)
Associate Product Analyst (Nov 2014 - Aug 2015)
QA Engineer (Jul 2013 - Oct 2014)
Logit4SEE is a logistics enterprise application that combines air freight's predictability with ocean
freight's lower costs, giving customers a single solution for speed, reliability, and knowing where their
shipment/cargo is. Built on Java/J2EE technologies, Logit4SEE covers all logistics activities, including
routing information and transparency at key handover moments (terminal/carrier, carrier/import,
transport, and transshipment), integrating with various tracking systems (carrier and vessel).
Responsibilities:
Understood client requirements by participating in planning meetings.
Analyzed business requirements and created Test Plan documents (MTD: Master Test Plan
Document) and test case designs.
Prepared high-level scenarios and test cases based on the requirements.
Executed test cases using SQL Developer, PuTTY, WinSCP, and web applications.
Reported defects in Rally, JIRA, and QC, and tracked them on a daily basis.
Responsible for effective communication between the project team (BSA/AD/client) and the customer.
Responsible for sprint status updates showing the progress of the testing effort and open issues to be
resolved.
EDUCATION: Bachelor's Degree from Osmania University.