| BHARATH - SR DATA ENGINEER |
| [email protected] |
| Location: Austin, Texas, USA |
| Relocation: No |
| Visa: H1B |
| Resume file: ADE_Bharath_Uppalapati_Resume_1772745148113.docx |
|
Bharath Uppalapati Certification Number: D929-6374
Data Engineer and Microsoft SQL BI / PL-SQL / .Net Developer
Contact: (248-823-8786) | [email protected]

OBJECTIVE
Dedicated and motivated professional seeking a Data Engineering or Software Developer position where I can leverage my education and approximately 14 years of experience to contribute to organizational growth.

EXPERIENCE SUMMARY
Over 13 years of IT experience in ETL, database design, data warehouse development and business intelligence with Microsoft SQL Server 2016/2014/2012/2008(R2)/2005 in development, test and production environments, across business domains including Financial, Healthcare, Pharmaceutical and Insurance.
11+ years of experience working on Facets claim, member and provider information, HIPAA transactions, and EDI files (834, 837, 835, 999, 277CA and proprietary formats).
4 years of experience with Amazon Web Services (AWS), creating and managing EC2, Elastic MapReduce, Elastic Load Balancers, Elastic Container Service (Docker containers), S3, Lambda, Elastic File System, RDS, CloudWatch, CloudTrail, IAM and Kinesis Streams.
4+ years on Azure Cloud: migrating SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Databricks and Azure SQL Data Warehouse; controlling and granting database access; and migrating on-premises databases to Azure Data Lake Store using Azure Data Factory.
Experience developing Spark applications using Spark SQL in Databricks for data extraction, transformation and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns.
Created data pipelines into the data lake using Airflow and Python workers, standardizing the data into one place that acts as a single source of truth for all reporting and analytical purposes.
Experience working on encounter generation and submission applications such as MDE (Facets) and QNXT from TriZetto, and the Edifecs application from Edifecs.
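The standardization work described above can be sketched in plain Python. This is a minimal, standard-library-only illustration (the real pipelines use Spark SQL, Airflow and Python workers; the feed contents, field names and `standardize` helper here are hypothetical): records from two differently formatted source feeds are normalized into one list that can serve as a single source of truth.

```python
import csv
import io
import json

def standardize(csv_text, json_text):
    """Merge records from a CSV feed and a JSON feed into one
    normalized list of dicts (lower-cased keys, stripped values)."""
    records = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        records.append({k.strip().lower(): v.strip() for k, v in row.items()})
    for row in json.loads(json_text):
        records.append({k.strip().lower(): str(v).strip() for k, v in row.items()})
    return records

# Hypothetical feeds from two source systems.
csv_feed = "Member_ID,Plan\nM001,Gold\nM002,Silver\n"
json_feed = '[{"Member_ID": "M003", "Plan": "Bronze"}]'

unified = standardize(csv_feed, json_feed)
```

In a real data-lake pipeline each loop would be an Airflow task writing to a common landing zone rather than an in-memory list.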
Experience in Extracting, Transforming and Loading (ETL) data from Excel, flat files, Teradata and Oracle to MS SQL Server using the BCP utility, DTS and SSIS.
Experience migrating SSIS packages, reports and Analysis Services projects from SQL Server 2008 to SQL Server 2012, and later to 2016 and 2022.
Experience in data conversion and data migration using SSIS and DTS across different sources such as Oracle, MS Access and flat files.
Sound experience developing .Net applications with ADO.Net, C#.Net, VB.Net, ASP.Net and the .Net Framework; hands-on experience with ASP.NET 4.0, IIS, Web Services, ADO.NET, CSS, AJAX and Java Servlets.
Skilled in rapid application development tools such as Microsoft Visual Studio .Net 4.0, MVC 3.0/4.0, ASP.NET, ADO.NET, Classic ASP and COM components, with experience in the front-end technologies HTML5, CSS3, JavaScript, jQuery and the AngularJS framework.
Experience in report writing using SSAS and SSRS on SQL Server (2008(R2)/2005/2000), with a sound understanding of SSAS, OLAP cubes and their architecture; experience designing and building dimensions and cubes with star schemas using SQL Server Analysis Services (SSAS).
Excellent knowledge of creating databases, tables, stored procedures, DDL/DML triggers, views, user-defined data types, cursors and indexes using T-SQL; experience creating and managing index fragmentation to achieve better query performance.
Created database objects such as tables, multi-OU views, materialized views, collections, table partitioning, sequences, synonyms and indexes using Oracle tools like SQL*Plus, SQL Developer and Toad.
Good knowledge of Oracle performance: experience in SQL and PL/SQL tuning and query optimization using tools such as SQL Trace, Explain Plan, TKPROF, hints and DBMS_PROFILER.
Exposure to GitHub, Visual SourceSafe, TFS, Tortoise and Bitbucket for version control.
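The database-object work above (tables, indexes, views) can be sketched with Python's built-in sqlite3 module as a lightweight stand-in for SQL Server; the table, index and view names below are illustrative, not from any actual project schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Table, index, and view -- the same object types listed above,
# expressed in sqlite3's SQL dialect rather than T-SQL.
cur.execute("CREATE TABLE Claim (ClaimID TEXT PRIMARY KEY, MemberID TEXT, Amount REAL)")
cur.execute("CREATE INDEX IX_Claim_Member ON Claim (MemberID)")
cur.execute("CREATE VIEW v_ClaimTotals AS "
            "SELECT MemberID, SUM(Amount) AS Total FROM Claim GROUP BY MemberID")

cur.executemany("INSERT INTO Claim VALUES (?, ?, ?)",
                [("C1", "M001", 120.0), ("C2", "M001", 80.0), ("C3", "M002", 50.0)])
totals = dict(cur.execute("SELECT MemberID, Total FROM v_ClaimTotals").fetchall())
```

The index on `MemberID` is the kind of object whose fragmentation would be monitored in SQL Server to keep lookup queries fast; the view exposes an aggregate without granting access to the base table.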
TECHNICAL EXPERTISE
ETL: SSIS, Informatica, Python, Airflow, PySpark (beginner)
Languages: Azure SQL, Snowflake SQL, RDS, T-SQL, PL/SQL, PostgreSQL, C#, Python, VB.Net, ASP, ASP.Net, UNIX, C, C++, Java (Eclipse), Windows batch scripting, Unix shell scripting, AWK, Git Bash scripting, job scheduling (Tidal)
Reporting Tools: Looker, SQL Server Reporting/Analysis Services (SSRS/SSAS), Tableau, Data Studio
Change Management Tools: Tortoise SVN, Visual SourceSafe (VSS), Team Foundation Server (TFS), Bitbucket, TortoiseGit
Web Technologies: JavaScript, HTML, CSS, JSON, XML, AngularJS
Cloud Technologies: Azure Data Lake, Data Factory, Azure Databricks, AWS, Google Cloud Platform, Spark, Hive, Kafka, Snowflake SQL, EC2, Elastic MapReduce, Elastic Load Balancers, Elastic Container Service (Docker containers), S3, Lambda, Elastic File System, RDS, CloudWatch, CloudTrail, IAM and Kinesis Streams

PROJECT SUMMARY
Elevance Health Inc / Anthem Inc (BCBS)
FTE: Jan 2014 to June 2024
Employer: AANVI Cloud Technologies (Contractor): June 2024 to present
Role: Engineer III
Technologies: Python, Azure, AWS, Snowflake, T-SQL, PL-SQL, SSIS, Informatica, Tableau, C#, .Net, SSAS
Location: Virginia Beach, VA and Austin, TX
Description: The project at this company is to support existing applications, create new ones, and validate data. Based on BA requests, fetch data using Python, create database objects using BIDS and SQL, and develop SSRS/Looker reports.
Key Contributions:
Responsible for making changes per user stories in the encounters application MDE-Xpress Encounter Pro (2014-2019). Later moved to a new vendor application, Edifecs, used to submit claims to the Medicaid and Medicare states (2019 to date). Anthem currently holds 32 Medicaid states plus Medicare.
Responsible for supporting CPEC team applications; played a key role in the implementation of the new OH FIDE project by creating and modifying stored procedures, SSIS packages, Informatica workflows and Tidal jobs to move claims from 01 (Pending) to 02 (Paid) status.
Extracted, transformed and loaded data from source systems into Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL and U-SQL (Azure Data Lake Analytics).
Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks.
Implemented proofs of concept for SOAP and REST APIs; worked on REST APIs to retrieve analytics data from different data feeds.
Created pipelines in ADF using linked services, datasets and pipelines to extract, transform and load data between sources such as Azure SQL, Blob Storage and Azure SQL Data Warehouse, including write-back.
Worked on pipelines from Oracle, MS SQL Server, Postgres and APIs to Teradata using Python, Matillion and Airflow.
Worked on Google Cloud Platform services including BigQuery, Cloud Dataproc and Apache Airflow; hands-on experience using Stackdriver and Dataproc clusters in GCP to access logs for debugging; used the Cloud Shell SDK in GCP to configure and deploy Dataproc, Storage and BigQuery (BQ).
Involved in setting up AWS EC2 instances to test the functionality of Edifecs servers; handled installation, administration and configuration of the ELK stack on AWS.
Good experience with Amazon Web Services (AWS), creating and managing EC2, Elastic MapReduce, Elastic Load Balancers, Elastic Container Service (Docker containers), S3, Lambda, Elastic File System, RDS, CloudWatch, CloudTrail, IAM and Kinesis Streams.
Hands-on experience with the Edifecs Drools engine, KPs, rules, config, EAM, TM, XE, web servers, profiles and severity files.
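The pending-to-paid status move mentioned above can be sketched with sqlite3. This is only an illustration of the idea (the actual work lives in SQL Server stored procedures, SSIS packages and Tidal-scheduled jobs; the `Encounter` table and its columns are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Encounter (ClaimID TEXT, Status TEXT)")
cur.executemany("INSERT INTO Encounter VALUES (?, ?)",
                [("C1", "01"), ("C2", "01"), ("C3", "02")])

# Promote pending claims (01) to paid (02), the transition the
# Tidal-scheduled stored procedures perform in the description above.
cur.execute("UPDATE Encounter SET Status = '02' WHERE Status = '01'")
statuses = [s for (s,) in cur.execute("SELECT Status FROM Encounter")]
```

In production this transition would typically be wrapped in a transaction and gated on adjudication checks rather than applied unconditionally.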
Involved in merging hotfixes into Edifecs KPs and deploying them to Dev servers; well experienced in editing and relaxing the rules.
Involved in modifying XML, Java and DRL code for the Edifecs application.
Implemented multiple Anthem markets (SC, NJ, TX and DC), following the implementation guide and mapping document to enable 837, 835, 999 and 277CA transactions and file formats.
Good knowledge of EDI transactions such as 834, 837, 835, 999, 277CA and proprietary file formats; good understanding of member, provider and claims loops (2300, 2400) and CAS balancing on the files.
Used AXWAY (B2Bi) to conduct HIPAA compliance checks before submitting files to the state.
Very good understanding and development knowledge of the MDE and Edifecs applications.
Involved in developing SSIS packages using BIDS for the incremental loading process, moving data from transactional systems into RDBMS tables.
Migrated SSIS packages from 2008 to 2012, later from 2012 to 2016, and later to 2022; created complex SSIS packages using proper control and data flow elements with error handling.
Worked on Informatica packages to load data from Facets and VLM into the MDE database.
Created tabular and parameterized reports per business requirements using SSRS and Tableau.
Designed and implemented SQL code, stored procedures and triggers to automate tasks on SQL Server, Teradata, PostgreSQL and Oracle servers.
Developed PL/SQL packages, procedures, functions and shell scripts to validate data and insert the validated data into Oracle standard interface tables and then into Oracle base tables using APIs.
Responsible for designing and developing a web application (C# .Net front end, SQL Server 2012 back end) and back-end jobs using SSIS packages; designed web pages using ASP.Net server controls, HTML, CSS, AJAX, Bootstrap and JavaScript.
Created shell and bash scripts to run Tidal jobs and to cleanse 837-format files before loading them into database tables.
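X12 EDI files such as the 837 are segment-delimited text, commonly with `~` between segments and `*` between elements. A minimal Python sketch of that structure and of claim-to-line balancing follows; the sample fragment is invented, and real 837s carry envelopes, hierarchical loops and delimiters declared in the ISA segment, all omitted here.

```python
def parse_x12(edi_text, seg_term="~", elem_sep="*"):
    """Split a raw X12 string into a list of segments, each a list
    of elements; the first element is the segment ID."""
    segments = []
    for raw in edi_text.strip().split(seg_term):
        raw = raw.strip()
        if raw:
            segments.append(raw.split(elem_sep))
    return segments

# Tiny, made-up 837-style fragment: one claim (CLM) with two
# service lines (SV1). CLM02 is the claim charge amount and
# SV102 the line charge amount.
sample = "CLM*ABC123*500***11:B:1~SV1*HC:99213*250*UN*1~SV1*HC:99214*250*UN*1~"
segs = parse_x12(sample)
clm_total = float(segs[0][2])
line_total = sum(float(s[2]) for s in segs if s[0] == "SV1")
```

A balancing check like `clm_total == line_total` is the simplest form of the file-level balancing described above; CAS-segment balancing extends the same idea to adjustment amounts on remittances.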
Created AWK (UNIX) scripts based on requests from BAs and HCA analysts; these scripts are interim solutions to submit 837, NCPDP and proprietary files to the state and federal governments within the due date.
Used Tortoise and Bitbucket as version control tools.
Tools Used: Azure SQL, Python 3.7, AWS, Google Cloud Platform, Spark, Hive, Kafka, Snowflake SQL, EC2, Elastic MapReduce, Elastic Load Balancers, Elastic Container Service (Docker containers), S3, Lambda, Elastic File System, RDS, CloudWatch, CloudTrail, IAM and Kinesis Streams, SQL Server 2016/2014/2012/2008 R2 Enterprise Edition, SSIS, C#, VB.Net Framework 4.0, JavaScript, jQuery, HTML, CSS, Visual Studio 2012, MVC 4.0, ADO.NET, Java, Oracle 11g.

Employer: Sanguine Software Solutions, March 2013 to Jan 2014
Role: SQL Developer / Report Developer
Client: Hill Physicians Medical Group
Location: San Ramon, CA
Description: Primed Management Consulting Services is the management services organization (MSO) that organized Hill Physicians in 1984 and has played a key role in Hill Physicians' success. The approximately 500 employees of Primed support physicians by providing the technology infrastructure, claims processing, customer service, patient and case management, and authorization reviews that allow physicians to do what they do best: practice medicine. The project at this company was to support the existing IDX application and maintain the data warehouse using SSIS, with extensive hands-on experience migrating Oracle data to Microsoft SQL Server and working with MDX code to create different kinds of SSRS reports.
Key contributions:
Extensively designed packages and data mappings using control flow task elements: Sequence Container, Data Flow Task, Execute SQL Task, For Each Loop Container, For Loop Container and Script Task in SSIS Designer.
Experience using different transformations in SSIS, such as Fuzzy Lookup, Derived Column, Lookup, Row Count, OLE DB Command and Data Conversion.
Involved in developing SSIS packages using BIDS for the incremental loading process, moving data from the transactional system into the data warehouse and making it viewable in the cube for the QNXT project.
Experience creating complex SSIS packages using proper control and data flow elements with error handling; enhanced and deployed SSIS packages from the development server to the production server.
Using SSIS, built high-performance data integration solutions, including extraction, transformation and load (ETL) packages for data warehousing, and scheduled the SSIS packages and jobs.
Involved in daily loads (full and incremental) into staging and ODS areas, troubleshooting process issues and errors using SQL Server Integration Services (SSIS) 2008.
Experience creating tabular, matrix, list, parameterized, sub and ad hoc reports (SSRS); experience integrating SharePoint Server with SSRS and deploying SSRS reports onto SharePoint Server.
Developed multidimensional objects (cubes, dimensions) using MS Analysis Services; involved in creating dimensions using star and snowflake schemas.
Implemented several custom logics for Oracle Forms Personalization and developed custom forms using the Forms 6i and 10g tools based on the client's requirements.
Involved in High-Level Design (HLD) and Solution Design Document (SDD) preparation, coding, code reviews, testing and code migration.
Created various UNIX shell scripts to FTP files from the legacy instance directory to the destination instance directory; developed control files to load data from flat files (.CSV, .TSV, .DAT etc.) to staging tables using SQL*Loader.
Developed screen layouts and report layouts using Forms and Reports and participated in implementation.
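The star-schema work above (dimensions plus a fact table, later surfaced through a cube) can be sketched with sqlite3; the dimension, fact and column names here are illustrative stand-ins, not any actual warehouse schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One dimension and one fact table arranged in a star:
# each fact row carries a surrogate key into the dimension.
cur.execute("CREATE TABLE DimProvider (ProviderKey INTEGER PRIMARY KEY, Name TEXT)")
cur.execute("CREATE TABLE FactClaim (ProviderKey INTEGER, Amount REAL)")
cur.executemany("INSERT INTO DimProvider VALUES (?, ?)", [(1, "Hill"), (2, "Bay")])
cur.executemany("INSERT INTO FactClaim VALUES (?, ?)",
                [(1, 100.0), (1, 150.0), (2, 75.0)])

# The kind of aggregate an SSAS cube would precompute:
# a measure rolled up by a dimension attribute.
rollup = dict(cur.execute(
    "SELECT d.Name, SUM(f.Amount) FROM FactClaim f "
    "JOIN DimProvider d ON d.ProviderKey = f.ProviderKey GROUP BY d.Name"))
```

A snowflake schema would further normalize `DimProvider` into sub-dimensions; the star keeps the dimension denormalized so cube queries need only one join per dimension.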
Involved in troubleshooting and fine-tuning databases for performance and concurrency.
Effectively used views to prevent unauthorized access; tuned SQL queries using execution plans for better performance.
Tools Used: MS SQL Server 2008, Microsoft SQL BI 2008, T-SQL, DTS, SQL Server Enterprise Manager, SQL Profiler, dashboards, XML scripts, Oracle 10g.

Employer: Sanguine Software Solutions, Nov 2011 to Dec 2012
Role: SQL Developer
Client: Amerigroup Corporation
Location: Virginia Beach, VA
Description: At Amerigroup Corporation, we strive to be our customers' most valued partner by providing a fully integrated suite of hosted technology solutions, better insurance plans for all types of individuals, and compliance management for our clients, with a focus on customer satisfaction. The project at this company was to create a data mart and deliver different types of reports used internally by the Encounters, Medicare and Medicaid teams.
Key contributions:
Created SSIS packages to extract data and scheduled jobs to call the packages and stored procedures; designed and tested packages to extract, transform and load data using SQL Server Integration Services (SSIS).
Developed SSIS 2008 packages that pull data from Oracle, Access, flat files and Excel into SQL Server 2008 for further data analysis and reporting, using multiple transformations.
Developed an ASP.NET-based application for patient data enrollment and for browsing medical dictionaries (MedDRA, WHODD).
Created stored procedures to generate different market claims for different types of accounts; imported the various files provided by clients into the relational database systems.
Worked with T-SQL to create tables, views, triggers, stored procedures and functions; created complex queries involving joins, sub-queries, functions, temporary tables etc.
Created indexes for fast retrieval of data and used string functions for space management in the database.
Involved in performance tuning and query optimization using execution plans.
Used Tortoise as the version control tool.
Tools Used: SQL Server 2008 R2 Enterprise Edition, T-SQL, SQL Server Integration Services, SQL Server Analysis Services, SQL Server Reporting Services, Windows Server 2005, MS Excel 2003-2007, C#, VB.

ACADEMIC EXPERIENCE
Project #1: Casino Database Design
Summary: Created a sample database design to support casino transactions, employee details, income and promotions using Microsoft SQL Server 2008.
Created and managed schema objects such as tables, views, indexes and referential integrity based on user requirements, converting them into technical specifications (Customer, Employee, Comps Cust, Offer Comps, Slot Machine, Payout, Payout Cust, Comps and several other tables needed for complete transactions).
Created complex stored procedures and functions to support efficient data storage and manipulation.
Designed and created views for security purposes; implemented rules, defaults and user-defined data types.
Designed and implemented stored procedures and triggers for automating tasks.
Loaded data from various sources, such as flat files, into the SQL Server database using SSIS packages.

Project #2: You Bet Your Life (Insurance Company Project)
Summary: Created a sample database design to support insurance company claims, provider, health plan and employee details, and created income and promotions reports using SSRS 2008.
Gathered requirements and designed the solution.
Actively designed the database to speed up certain daily jobs and stored procedures; optimized query performance by creating indexes.
Wrote T-SQL statements for data retrieval and was involved in performance tuning of T-SQL queries and stored procedures.
Created joins and sub-queries for complex queries involving multiple tables.
Project #3: Driverless Smart Train Shuttling Between Two Busy Stations
Summary: The driverless train concept involves highly automated control technologies, and great care must be taken when implementing it in practice to avoid accidents. The main intention of this project is to run the train without a human driver, control the doors automatically, and stop the train automatically at stations, so that it can ultimately carry passengers from one station to another when a real system is constructed. The prototype module was built with involvement from the electrical, electronic and mechanical fields. To make the train autonomous, software for the microcontroller was also included in this project, written in assembly language; in other words, the entire system can be called an embedded system.

EDUCATIONAL BACKGROUND AND CERTIFICATIONS
Microsoft Certified Professional in SQL Server 2008.
Master's in Computer Science, FDU, New Jersey.
Bachelor's in Electronics, JNTU, Hyderabad, India.