Gowtham Krishna G - Azure Data Engineer
[email protected]
Location: , , USA
Relocation: Yes
Visa: H1b
Resume file: Azure Data Architect_1752499507178.docx
Gowtham Ghattamaneni
[email protected]
M: 470-480-1899
Around 10 years of Data Architect, Data Engineering, and ETL experience across the Healthcare, Banking, Finance, and Automobile domains, with expertise in on-premises databases (Microsoft SQL Server, Oracle, DB2, MySQL, PostgreSQL) and cloud-based database platforms including Azure, Snowflake, and Databricks.

Summary of Qualification:

Strong experience in logical and physical database design and development, normalization, and data modeling, with good knowledge of RDBMS concepts, OLTP and OLAP, and design patterns.
Developing and implementing an overall organizational data strategy that is in line with business processes.
The strategy includes data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.
Identifying data sources, both internal and external, and working out a plan for data management that is aligned with organizational data strategy.
Coordinating and collaborating with cross-functional teams, stakeholders, and vendors for the smooth functioning of the enterprise data system.
Managing end-to-end data architecture, from designing the technical architecture and developing the application to testing and implementing the proposed solution, both on-prem and in the cloud.
Planning and executing big data solutions using technologies such as Snowflake, Azure Synapse, Databricks, and GCP.
Experience on various cloud platforms migrating on-premises applications to the cloud; configuring Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and public and private network access, along with infrastructure deployment and continuous integration (CI/CD) processes using Git and Azure DevOps.
Defining and managing the flow of data and dissemination of information within the organization.
Integrating technical functionality, ensuring data accessibility, accuracy, and security.
Conducting continuous audits of data management system performance, refining whenever required, and immediately reporting any breach or loophole to stakeholders.
Performed data governance by defining roles and responsibilities and ensuring clear accountability for stewardship of the company's principal information assets using Collibra.
Creating and maintaining common data dictionaries and the tools and methods that support data standards for an organization.
Involved in ETL Design and Architecture Process and Data Warehouse Implementation/development for Retail, Health care and Banking Domains.
Knowledge of and experience with building data models using data modeling tools (ER/Studio, Erwin).
Writing T-SQL scripts to create database objects (stored procedures, user-defined functions, triggers, views, subqueries, and Common Table Expressions (CTEs) for temporary result sets) across T-SQL, MySQL, PL/SQL, Sybase (ASE), DB2, MongoDB, and PostgreSQL databases.
Involved in Query Optimization and Performance Tuning using SQL Profiler and Index Tuning Wizard.
Worked on SQL Server migrations from SQL Server 2008/2008 R2/2012 to 2014; skilled in scripting for homogeneous and heterogeneous source migrations.
Experienced with all phases of the data warehouse development life cycle, from gathering requirements and testing to identifying facts and dimensions and creating star and snowflake schemas.
Hands-on experience with ETL processes to extract, transform, and load data using SSIS, Informatica, Azure Data Factory, Snowpipe, and Azure Databricks.
Experience with the reporting tools SSRS, Power BI, Tableau, and Cognos; created dashboards and KPIs, with an understanding of Tableau Server and Power BI Desktop/Service administration, security, and architecture.
Proficient in building cubes, defining referenced relationships, and creating hierarchies, perspectives, partitions, and aggregations in SSAS; wrote MDX queries.
Working with Azure SQL databases and writing scripts to create database objects within Azure SQL.
Good knowledge of the Hadoop framework, the Hadoop Distributed File System (HDFS), parallel processing, and ecosystem tools (MapReduce, Hive, Pig, HBase, Sqoop), as well as AWS Redshift.
Ability to configure SQL Server to extract and store data via XML, HTML, JSON, and API web services.
Developed Python and Spark applications using Azure Databricks for batch and streaming processing.
Experience with mainframe environments, transitioning mainframe functions to the data warehouse.
Writing complex Access and Excel macros with Excel VBA for different business environments.
Experience in working with software methodologies like Agile, SCRUM, and Waterfall.
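The CTE pattern mentioned above can be sketched with Python's built-in sqlite3 module; the table, columns, and values below are purely illustrative, not drawn from any engagement listed here.

```python
import sqlite3

# Hypothetical sales table; names and values are invented for the example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('East', 100), ('East', 250), ('West', 400);
""")

# A CTE builds a temporary result set (regional totals) that the outer
# query then filters, the usage described in the T-SQL bullet above.
query = """
    WITH region_totals AS (
        SELECT region, SUM(amount) AS total
        FROM sales
        GROUP BY region
    )
    SELECT region, total
    FROM region_totals
    WHERE total > 300
    ORDER BY region;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('East', 350.0), ('West', 400.0)]
```

The same WITH-clause syntax carries over to T-SQL, PostgreSQL, and MySQL 8+, which is why the CTE is a portable way to stage an intermediate result without a temp table.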

Technical Skills:

Platforms: Windows XP/2003, Windows Server 2008, Vista, Windows 7, Oracle 11g/10g/9i/8i, PL/SQL, SQL*Plus, TOAD, SQL*Loader, .NET Framework 1.0/2.0/3.5/4.0, CSOM, SSOM, REST API, MS Excel APIs, Linux, FTP, REST, JSON.
Languages: T-SQL, SQL, PL/SQL, MySQL, VB.NET, C#, HTML, MDX/DAX, JavaScript.
RDBMS: Microsoft SQL Server 2005/2008/2012/2014/2016, MS Access, DB2, Oracle 11g/10g; Tableau Desktop 7.0/8.0/9.0.
SQL Server Tools: DTS, SSIS, SSAS, SSRS, SSMS, Power BI, Configuration Manager, Enterprise Manager, Query Analyzer, Profiler, Database Tuning Advisor, SAP BO Data Services 9.0, IBM DataStage 9.0.
Other Tools: MS Office Suite, MS Visio, Crystal Reports, MIS Reports, TFS, SharePoint 2013.
Work Experience
State of Alabama May 2023 to Present
Chicago, IL (Remote)
Databricks Lead Engineer/Data Architect
Responsibilities:

Implementing data integration from legacy systems (DB2, VSAM) and various other source systems and file formats into an Oracle database using Spark SQL in Databricks, and scheduling jobs with Azure Data Factory.
Developing database migration procedures, technical specifications, support documents, and training guides; knowledge of Oracle migration utilities (Import/Export, RMAN, etc.) and SQL Server migration utilities (backup/restore).
Experience designing and supporting Azure environments and Azure DevOps, including IaaS and PaaS.
Extensive experience in Azure SQL and Azure NoSQL, Azure Stream Analytics, Azure APIs and Event Hub, Azure Storage Solutions, Azure Log Analytics.
Developing deployment automation and Azure Pipelines for workloads in Azure (AKS, Azure Data Factory, Azure Key Vault, etc.) using Azure DevOps.
Experience managing and deploying Azure cloud-based infrastructure and applications, managing virtual networks and subnets, and integrating web applications.
Created virtual machines, subnets, network security groups, virtual networks, and NSG rules.
Configuring and troubleshooting VPN connections: site-to-site, point-to-site, VNet-to-VNet, and VNet peering.
Deploy and configure Storage Accounts, App Services, App Service Plans, App Service Environments, App Insights, Function Apps, Key Vaults, SQL DB, COSMOS DB and other approved Azure resources.
Experience deploying resources with PowerShell and the Azure CLI; configuring, modifying, and deploying ARM templates.
Knowledge of Azure Databricks and big data tools: the Hadoop ecosystem (Hive, Spark, Pig, Sqoop, Flume), SnowSQL, Python, and PySpark.
Good understanding of Spark architecture for streaming and batch processing using Azure Databricks in Azure and AWS environments, including configuring and managing Databricks clusters.
Experience extracting, transforming, and loading data from source to target using PySpark and Spark SQL in Databricks notebooks.
Experience in data analytics using the Azure Databricks workspace, managing Databricks notebooks, and integrating data with Delta Lake using Python and Spark SQL.
Knowledge of Databricks Unity Catalog for data governance and data management, and CI/CD implementation using Azure DevOps and Git repositories.
Tools used: Jira and Confluence for ticketing and tracking, ServiceNow, GitHub, SharePoint, Visio, Databricks, Data Lake, Azure Data Factory, DB2, Microsoft SQL Server, UNIX, Python, PySpark, and SQL.
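As an illustration of the Databricks-plus-Data-Factory scheduling described above, a minimal ADF pipeline definition running a single notebook activity might look like the following; the pipeline name, linked service, and notebook path are hypothetical placeholders, not taken from this project.

```json
{
  "name": "pl_load_legacy_to_oracle",
  "properties": {
    "activities": [
      {
        "name": "RunTransformNotebook",
        "type": "DatabricksNotebook",
        "linkedServiceName": {
          "referenceName": "ls_databricks",
          "type": "LinkedServiceReference"
        },
        "typeProperties": {
          "notebookPath": "/ETL/transform_legacy_feed"
        }
      }
    ]
  }
}
```

In practice such a pipeline would be attached to a schedule or tumbling-window trigger in Data Factory rather than run manually.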


Safeway (Albertsons Companies, Inc.) Sep 2021 to April 2023
Chicago, IL(Remote)
Data Architect
Responsibilities:
Analyze, design, and build modern data solutions using Azure PaaS services to support data visualization; understand the current production state of the application and determine the impact of new implementations on existing business processes.
Extract, transform, and load data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, U-SQL, and Azure Data Lake Analytics; ingest data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW, Cosmos DB) and process it in Azure Databricks, Azure Synapse, and AWS Glue.
Perform data analysis and data migration from other databases to Snowflake.
Produce data mapping documents from source to target systems, convert scripts to the new EDW data warehouse version, and validate data from source to destination.
Involved in ETL processes creating Snowpipe and Data Factory pipelines to extract, transform, and load data from the data lake into Snowflake using Azure Data Factory, Databricks, Snowflake data warehousing, and Master Data Management.
Developed Spark applications using PySpark and Spark SQL for data extraction, transformation, and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns.
Writing T-SQL statements and SQL Server objects: tables, views, stored procedures, temp tables, table variables, CTEs, and functions.
Writing bash and shell scripts to automate script conversions and the deployment process between multiple environments.
Interact with front-end applications through web and client/server application development using the Microsoft .NET Framework with C#, JavaScript, and embedded visualizations.
Knowledge of Azure Databricks and big data tools: the Hadoop ecosystem (Hive, Spark, Pig, Sqoop, Flume), SnowSQL, Python, PySpark, and CI/CD implementations.
Experience as a DevOps/platform architect on Azure, Azure DevOps, and Terraform; able to act as a build engineer/manager designing and managing release management activities in Azure DevOps.
Experience with Azure Pipelines in Azure DevOps performing release management activities, using Git and TFS version control to track code changes and file versions.
Create Git repositories and grant the required level of access; create branches and merge code into the master/release branch; configure build jobs to pull code from Git version control and create build packages; run jobs that deploy the code to target servers through automated release pipelines from the publish branch using ARM templates and YAML.
Tools used: Jira and Confluence for ticketing and tracking, ServiceNow, GitHub, SharePoint, Visio, Snowflake, Databricks, Data Lake, Azure Data Factory, DB2, Microsoft SQL Server, UNIX, and Python.
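The build-and-release flow described above could be sketched as an Azure Pipelines YAML definition like the one below; the trigger branch, build scripts, and artifact name are hypothetical placeholders.

```yaml
# Hypothetical build pipeline: build on merge, run tests, publish an artifact.
trigger:
  branches:
    include:
      - main            # build whenever code merges to the main branch

pool:
  vmImage: ubuntu-latest

steps:
  - script: python -m pip install -r requirements.txt
    displayName: Install dependencies
  - script: python -m pytest tests/
    displayName: Run tests
  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: $(Build.ArtifactStagingDirectory)
      ArtifactName: drop
```

A release pipeline (or additional deployment stages) would then pick up the `drop` artifact and apply ARM templates to the target environment.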

International Monetary Fund (IMF) Dec 2019 to Sep 2021
Washington D.C
Azure Data Architect
Responsibilities:


Design and develop data warehouse architecture, schema, and structure; build the data warehouse using star and snowflake schema methods.
Perform data modeling, entity-relationship design, normalization, and data integrity processes.
Migrating data from on-premises SQL Server databases to cloud-based Azure SQL Managed Instance and Azure Synapse.
Perform data governance processes using the cloud-based Collibra metadata management tool.
Working with Azure Data Factory and Data Lake to move data from on-premises systems to Azure SQL databases.
Working with integration runtimes to create linked services, Data Factory pipelines, datasets, and data flows, along with Azure Databricks and AWS Glue.
Performing performance tuning and query tuning to improve performance and reduce application and report server downtime.
Writing T-SQL statements and SQL Server objects: tables, views, stored procedures, temp tables, table variables, CTEs, and functions.
Performing database maintenance tasks such as backup and restore scripts, and working closely with DBAs to resolve blocking and locking issues as well as deployment and production issues.
Interact with front-end applications through web and client/server application development using the Microsoft .NET Framework with C#, JavaScript, and embedded visualizations; PySpark and CI/CD implementations.
Experience as a DevOps/platform architect on Azure, Azure DevOps, and Terraform; able to act as a build engineer/manager designing and managing release management activities in Azure DevOps.
Experience with Azure Pipelines in Azure DevOps performing release management activities, using Git and TFS version control to track code changes and file versions.
Create Git repositories and grant the required level of access; create branches and merge code into the master/release branch; configure build jobs to pull code from Git version control and create build packages; run jobs that deploy the code to target servers through automated release pipelines from the publish branch using ARM templates and YAML.
Working with Power BI Service and Power BI Desktop, Power Pivot, Power Query, Power View, and DAX functions to create reports, dashboards, and KPIs; creating reports with Tableau and Cognos.
Developed a test framework using Python automation to execute tests in batch with shell scripting.
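A minimal sketch of such a batch test runner in Python, assuming each test is a shell command whose exit code signals pass or fail; the commands below are illustrative, not from the actual framework.

```python
import subprocess
import sys

# Hypothetical batch runner: executes each test command in a shell and
# records whether it exited cleanly. A real framework would also collect
# stdout/stderr logs and produce a report.
def run_batch(commands):
    results = {}
    for cmd in commands:
        completed = subprocess.run(cmd, shell=True, capture_output=True, text=True)
        results[cmd] = (completed.returncode == 0)
    return results

# Illustrative commands; a real suite would invoke pytest or similar.
passing = f'"{sys.executable}" -c "assert 1 + 1 == 2"'
failing = f'"{sys.executable}" -c "assert 1 + 1 == 3"'
outcome = run_batch([passing, failing])
print(outcome)
```

Driving shell commands through `subprocess.run` keeps the runner agnostic to what each test actually does, which matches the shell-scripting batch execution described above.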

Environment: MS SQL Server 2012/2014/2016/2017/2019, T-SQL, SSIS 2008/2012/2015/2017, SQL Azure cloud services, Oracle 11g/10g/9i/8i, Tableau Desktop 9.0, Power BI Desktop and Service.


State of MO - Department of Health and Senior Services (DHSS) Jan 2019 to Dec 2019
Jefferson City, MO
Data Architect
Responsibilities:


Migrating data from on-premises SQL Server databases to Azure SQL databases and Azure Databricks.
Creating data warehouse architecture, designing the schema and structure of the data warehouse, and building it using star and snowflake schema methods.
Working with Azure Data Factory and Data Lake to move data from on-premises systems to Azure SQL databases.
Working with integration runtimes to create linked services, deploy SSIS packages, and use Azure Databricks.
Creating ETL SSIS packages with data conversion and error handling, and scheduling SQL Server Agent jobs.
Performing performance tuning and query tuning to improve performance and reduce application and report server downtime.
Writing T-SQL statements and SQL Server objects: tables, views, stored procedures, temp tables, table variables, CTEs, and functions.
Performing database maintenance tasks such as backup and restore scripts, and working closely with DBAs to resolve blocking and locking issues as well as deployment and production issues.
Interact with front-end applications through web and client/server application development using the Microsoft .NET Framework with C# and JavaScript, with embedded visualizations and backend scripting/parsing using Perl and Python.
Good knowledge of the Hadoop framework, HDFS, parallel processing, and ecosystem tools (MapReduce, Hive, Pig, HBase, Sqoop, Scala), along with AWS Redshift and experience with Spark.
Data integration using Python to pull and integrate data from APIs.
Detail-oriented analysis with hands-on experience in quantitative/statistical analysis, forecasting, budgeting, and accounting using Excel VBA, Tableau, and Cognos.
Developed Tableau data visualizations using table calculations, sophisticated joins, and data structures.
Working with Power BI Service and Power BI Desktop, Power Pivot, Power Query, Power View, and DAX functions to create reports, dashboards, and KPIs; creating reports with Tableau and Cognos.
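The Python API-integration work above can be illustrated with a short sketch; the payload below stands in for a real API response, and every field name is hypothetical.

```python
import json

# Sample payload standing in for an API response; fields are invented.
payload = json.loads("""
{
  "results": [
    {"id": 1, "name": "alpha", "score": 0.9},
    {"id": 2, "name": "beta",  "score": 0.7}
  ]
}
""")

# Flatten the response into (id, name, score) tuples, the shape a
# staging-table insert would expect.
rows = [(r["id"], r["name"], r["score"]) for r in payload["results"]]
print(rows)  # [(1, 'alpha', 0.9), (2, 'beta', 0.7)]
```

A real integration would fetch the payload with `urllib.request` or an HTTP client library, handle pagination and errors, and then load the flattened rows into the target database.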

Environment: MS SQL Server 2012/2014/2016/2017, T-SQL, SSIS 2008/2012/2015/2017, SQL Azure cloud services, Oracle 11g/10g/9i/8i, Tableau Desktop 9.0, Power BI Desktop and Service.

Market Source, Atlanta, GA Jan 2018 to Jan 2019
Integration Developer
Responsibilities:


Integrating data from different data sources and third-party applications using REST and SOAP APIs.
Working with Azure SQL databases and writing scripts to create database objects within Azure SQL; knowledge and working experience with Data Lake, Data Factory, and Azure Databricks.
Interact with front-end applications through web and client/server application development using the Microsoft .NET Framework with C# and VB.NET as programming languages.
Writing T-SQL, PL/SQL, and MySQL joins, correlated and non-correlated subqueries to reduce the complexity of joining data sets, Common Table Expressions (CTEs) for temporary results, temporary tables, and pivoted tables.
Creating new SSIS packages and modifying/optimizing existing SSIS packages to implement ETL processes for data warehouse development and claims.
Worked on Azure AD B2C identity management to create, delete, and update users in AD.
Ability to configure SQL Server to extract data via XML, HTML, JSON, and API web services and store it in the database.
Worked in the Hadoop ecosystem with Hive, using managed and external tables and interactive and iterative operations on HDFS file systems.
Worked on ETL techniques using Informatica PowerCenter 9.x to extract and transform data from various sources (flat files, mainframe VSAM files, RDBMS OLTP sources) and load it into transactional/dimensional databases.
Developing new Excel VBA code and modifying/optimizing existing VBA (macro) code as necessary, including pivot tables.
Working with Power BI Service and Power BI Desktop, Power Pivot, Power Query, Power View, and DAX functions to create reports, dashboards, and KPIs; creating reports with Tableau and Cognos.
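The XML/JSON extraction described above can be sketched with Python's standard library; the table and field names below are hypothetical.

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical rows extracted from a SQL Server table.
rows = [{"claim_id": 101, "status": "open"},
        {"claim_id": 102, "status": "closed"}]

# JSON serialization of the row set.
as_json = json.dumps(rows)

# XML serialization: one <claim> element per row, status as an attribute.
root = ET.Element("claims")
for row in rows:
    claim = ET.SubElement(root, "claim", status=row["status"])
    claim.text = str(row["claim_id"])
as_xml = ET.tostring(root, encoding="unicode")

print(as_json)
print(as_xml)
```

SQL Server itself can emit these shapes directly with FOR JSON PATH and FOR XML; this sketch only mirrors that output on the client side.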

Environment: MS SQL Server 2012/2014/2016/2017, T-SQL, SSIS 2008/2012/2015/2017, SQL Azure cloud services, Oracle 11g/10g/9i/8i, Tableau Desktop 9.0, Power BI Desktop and Service.




CVS Health, Woonsocket, RI March 2015 to Jan 2018
SQL BI/Tableau/Data Warehouse Developer
Responsibilities:


Writing new T-SQL, PL/SQL, and Teradata queries for table design and creation.
Involved in Data Warehouse Design and Development by using star Schema Design Method.
Migrated Data from SQL server 2012 to SQL Server 2014.
Implemented stored procedures with proper Set Commands and enforced business rules via checks and constraints.
Writing T-SQL, PL/SQL, and MySQL joins, correlated and non-correlated subqueries to reduce the complexity of joining data sets, Common Table Expressions (CTEs) for temporary results, temporary tables, and pivoted tables.
Worked with Teradata utilities (BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, Queryman), Teradata parallel support, UNIX shell scripting, and APIs.
Performance tuning to optimize existing queries: removed unnecessary columns, eliminated redundant and inconsistent data by applying normalization techniques, established joins, and created indexes where necessary; reviewed and modified database designs for better performance.
Implemented database management tasks: backup, restore, recovery, disaster recovery, and high availability.
Troubleshooting and resolving database integrity issues, performance issues, blocking and deadlocking issues, replication issues, connectivity issues, and security issues.
Creating new SSIS packages and modifying/optimizing existing SSIS packages to implement ETL processes for data warehouse development and claims.
Designed, developed, and deployed custom reports to Report Manager in an MS SQL Server environment using SSRS 2012, creating data-driven, drill-through, drill-down, tabular, and matrix reports along with charts and graphs.
Interact with front-end applications through web and client/server application development using the Microsoft .NET Framework with C#.
Created action filters, parameters, and calculated sets for preparing dashboards and worksheets in Tableau.
Connecting to different data sources to blend and model data within Tableau Server 9.0, creating interactive dashboards for reporting and data analysis.
Good analytic capabilities in analyzing data and reporting requirements and identifying KPIs, insights, and trends.
Worked with Power BI mobile app and created mobile layouts to view reports and dashboards effectively.
Good experience with DAX, OLAP, and OLTP in enterprise data warehouse (EDW) development.
Detail-oriented analysis with hands-on experience in quantitative/statistical analysis, forecasting, budgeting, and accounting using Excel VBA.
Developing new Excel VBA code and modifying/optimizing existing VBA (macro) code as necessary, including pivot tables.
Wrote Unix/Linux shell, Perl, and Python scripts for the migration process.
Subscribing and connecting to Azure SQL databases and writing scripts to create database objects within Azure SQL; knowledge of Data Lake, big data, and AWS Redshift.
Ability to configure SQL Server to extract and store data via XML, HTML, JSON, and API web services; expert Excel user, including VBA programming and Python scripting.
Proven experience working with large teams in Agile/Scrum models.
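The star-schema design work in this role can be illustrated with a toy fact/dimension join using sqlite3; all table and column names are invented for the example, not taken from the actual warehouse.

```python
import sqlite3

# Minimal star schema: one fact table keyed to one dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER);
    INSERT INTO dim_product VALUES (1, 'Aspirin'), (2, 'Bandage');
    INSERT INTO fact_sales VALUES (1, 10), (1, 5), (2, 3);
""")

# The classic star-schema query: join the fact to its dimension on the
# surrogate key and aggregate the measure.
rows = conn.execute("""
    SELECT d.name, SUM(f.qty)
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.name
    ORDER BY d.name
""").fetchall()
print(rows)  # [('Aspirin', 15), ('Bandage', 3)]
```

In a full star schema the fact table would carry several such surrogate keys (date, store, customer), each joining to its own dimension, which is what makes drill-down reporting cheap.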

Environment: MS SQL Server 2012/2014, T-SQL, SSIS 2008/2012, SSRS 2012/2014, MySQL 5.0.22/5.1.24, Oracle 11g/10g/9i/8i, Tableau Desktop 9.0, and Azure cloud services.

Bajaj Auto Ltd, Hyderabad, India May 2012 to Dec 2014
SQL Server SSIS/SSRS/SSAS Developer
Responsibilities:


Wrote stored procedures and SQL scripts in both SQL Server and PL/SQL.
Designed T-SQL scripts to identify long-running queries and blocking sessions.
Created database objects like tables, views, indexes, stored-procedures, triggers, and user defined functions.
Performed normalization and de-normalization of tables.
Developed backup and restore scripts for SQL Server 2005.
Installed and configured SQL Server 2005 with the latest service packs.
Wrote T-SQL queries for data retrieval.
Performed administrative SQL Server tasks, including daily backups and recovery procedures.
Maintained the security, integrity, and availability of SQL Server.
Ability to identify and resolve bottlenecks, specifically fragmentation, locking (page-level, row-level, and leaf-level), blocking, and memory and resource contention.

Environment: SQL Server 2008 R2/2008/2005, T-SQL, Oracle 9i, Visual Basic, SSIS, SSRS 2005/2008, Excel.


EDUCATIONAL/PROFESSIONAL QUALIFICATION

Bachelor of Engineering, Nagarjuna University.
Master of Engineering, Hartford, Connecticut.
