
Venkata G - Data Analyst
[email protected]
Location: McLean, Virginia, USA
Relocation: NO
Visa: H1B

Summary:
17+ years of experience in information technology, with progressive development and implementation of various data analytics solutions
Strong experience in Data Profiling, Data Migration, Data Conversion, Data Quality, Data Governance, Data Integration
Designed workflows with multiple sessions to load security-related data, providing object- and data-level security.
Thorough knowledge of creating ETLs to load data from different data sources, and a good understanding of Informatica installation.
Well versed in data profiling using the Informatica Analyst tool.
Gathered business requirements, performed Gap Analysis, Data Cleansing, Design and Implementation of ODS, Data Marts, Schemas based on Facts and Measures
Proficient in Azure-based data solutions, including Azure Data Factory, Azure SQL
Designed and implemented ETL pipelines using Azure Data Factory for efficient data transfer and transformation.
Implemented row-level security and dynamic data masking in Azure SQL for enhanced data protection
Efficient in developing logical and physical data models and organizing data per business requirements using Erwin, ER Studio, ERP, and SAP HANA Studio in both OLTP and OLAP applications
Good working knowledge of extracting data sets and performing analysis on Databricks and Snowflake
Worked across different phases of the Software Development Life Cycle (Agile and Waterfall)
Proficient in understanding business requirements, effort estimation, and resource planning
Analyzed requirements and converted them into code
Hands on experience in designing Data Lakes and building data ingestion pipelines
Experienced in conducting sprint planning, retrospective and show-n-tell meetings for clients
Responsible for user story creation with acceptance criteria and JIRA backlog management
Extensive working experience in ETL transformations using Informatica, BODS
Strong understanding of data warehouse design methodologies and RDBMS concept and principles
Data Warehousing using fact tables, dimension tables and star/snowflake schema modeling
Worked extensively with Data migration, Data cleansing, ETL processes for data warehouses
Programming and script-writing skills in Unix shell, Oracle SQL, Hive (HQL), and PySpark
Proficient in analytical and data modeling tools such as Python (NumPy, Pandas)
Good understanding of GCP, AWS and Snowflake services, Databricks
Hands on experience with AWS S3 and Redshift
Extensive experience with various BI visualization tools such as Tableau, Power BI, and SAP Lumira Designer.
Able to connect Tableau to different data sources, presenting clear visualizations of data and making effective use of filters and parameters.
Generated Tableau Dashboard with quick/context/global filters, parameters and calculated fields on Tableau reports.
Strong knowledge of data warehousing and of preparing source-to-target mapping documents, including the various transformations involved.
Interact with the business team to identify business logic and coordinate with the technical team on documentation.
Strong knowledge of writing SQL code to compare and validate data across various systems (a minimal validation sketch follows this summary).
Able to write complex SQL joining multiple tables to identify duplicates, using various join conditions to cross-verify systems.
Hands on experience in SQL queries and optimizing the queries in Oracle, SQL Server, DB2, and Teradata
Perform pattern checks and profile data using SQL code; identify relationships between tables using key columns.
Hands-on experience with stored procedures, debugging, and identifying latency.
Provide status reporting of team activities against the program plan or schedule
Hands-on experience with the data-testing automation process and raising defects in Jira
Keep the project manager and product committee informed of task accomplishment, issues and status.
Extensive experience in Analysis, Design, Development, Implementation, Deployment and maintenance of Business Intelligence applications
Extensive knowledge in Enterprise deployment and maintenance of SAP Business Objects Full-client, Designer, XI 3.0 and XI 3.1/BI 4.0/4.1/4.2 versions.
Hands on experience in dimensional data modeling, Star Schema Modeling, Snow Flake Modeling, Physical and Logical data modeling.
Proactively involved with the Testing team in efficiently improving the quality of the product.
Statistical analysis of data related to Incidents, Requests and Changes in JIRA & Service Manager
Utilized Azure DevOps and Big Data for agile project management, improving sprint planning and execution
Implemented CI/CD pipeline using Azure Pipelines for automated testing and deployment of data solutions
Works well individually or in a team environment and possesses excellent communication skills, both written and verbal.
Business Objects Enterprise Certified Professional (BOECP) from SAP Business Objects
AWS certified Solutions Architect
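A minimal sketch of the cross-system SQL validation mentioned above, assuming two database connections; sqlite3 stands in here for the actual source and target systems (Oracle, SQL Server, Teradata, etc.), and the claims table and its columns are invented for illustration:

import sqlite3

import pandas as pd

def compare_tables(src_conn, tgt_conn, table, key):
    """Compare row counts and flag keys whose rows differ between systems."""
    src = pd.read_sql_query(f"SELECT * FROM {table}", src_conn).set_index(key)
    tgt = pd.read_sql_query(f"SELECT * FROM {table}", tgt_conn).set_index(key)
    print(f"source rows: {len(src)}, target rows: {len(tgt)}")
    missing = src.index.difference(tgt.index)   # in source, absent from target
    extra = tgt.index.difference(src.index)     # in target, absent from source
    common = src.index.intersection(tgt.index)
    # Exact column-by-column comparison of overlapping rows; a real check
    # might tolerate rounding differences on numeric columns.
    bad = (src.loc[common] != tgt.loc[common]).any(axis=1)
    return missing, extra, bad[bad].index

if __name__ == "__main__":
    src_conn, tgt_conn = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for conn in (src_conn, tgt_conn):
        conn.execute("CREATE TABLE claims (claim_id INTEGER, amount REAL)")
    src_conn.executemany("INSERT INTO claims VALUES (?, ?)",
                         [(1, 100.0), (2, 250.5), (3, 75.0)])
    tgt_conn.executemany("INSERT INTO claims VALUES (?, ?)",
                         [(1, 100.0), (2, 999.9)])
    missing, extra, mismatched = compare_tables(src_conn, tgt_conn, "claims", "claim_id")
    print("keys missing from target:", list(missing))
    print("keys with mismatched values:", list(mismatched))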

Technical Skills:

DW/BI Tools: Business Objects XIR2, XI 3.0, XI 3.1, BI 4.0/4.1
Other Reporting Tools: SAP Business Explorer (BEx) Analyzer, BEx Query Designer, SAP Lumira, QAAWS, Live Office, Analysis for Office
Data Visualization: Tableau 8.x, 9.x, Power BI
Cloud Technologies: Databricks, Snowflake, AWS, AWS S3, AWS Redshift, Azure
Modeling Tools: Erwin, ER Studio, SAP HANA Studio
ETL Tools: Informatica 9.1, Informatica Developer (IDQ), IDQ Analyst 9.1/10.1, Azure Data Factory
Environment: Linux, Windows 2008, Windows 2003, XP
Databases: Oracle 9i/11g, SQL Server 2008, MS Access 2000, Netezza, Teradata, DB2, DBeaver
Other Tools: Hadoop (HDFS, Sqoop, Pig, Hive, HBase), HPQC, WinSCP, Guidewire, Data Loader, SQL*Loader, Python 2.7, SAP ERP

Education:
Bachelor of Technology, Sri Vasavi Engineering College, JNTU University

Professional Experience:

Nationwide Insurance, OH Sep 2021 - Present

Role: Sr. Data Analyst
Responsibilities:

Gathered business requirements, performed Gap Analysis, Data Cleansing, Design and Implementation of ODS, Data Marts, Schemas based on Facts and Measures
Gathering data from disparate systems for reporting and analysis.
Prepared the Data Lineage and traceability from Source to target mapping documents
Working on the Guidewire ClaimCenter (P&C) application for data integration
Worked on six source systems for property & casualty claims data (including InsuranceNow, PCDE, ClaimsQuest, and Guidewire) and supported the modeling team
Working with the data governance team to understand data standardization and redefinition
Prepared the Profiling data sets which helped the modeling team for creating physical and logical models
Prepared the STTMs (source-to-target mappings) with business rules and joins defined
Helped the development team with questions and clarifications on the STTMs and design
Creating data dictionaries and metadata for data validation and handling
Created automation scripts using Python for data extraction from flat files and other source systems (see the sketch after this list)
Worked extensively on Databricks (single tenant) and Snowflake for data extraction, profiling, and analysis
Prepared functional and technical documents for data reporting needs and Informatica data flows
Created mapping specification and re-usable mapplets using IDQ developer
Identified data quality issues that must be corrected in the source system. Identified issues that can be corrected in ETL processing.
Worked on moving data from a legacy system to a new system. Identified data quality issues that must be handled in the code that moves data from the legacy system to the new system.
Wrote SQL code to compare and validate data across various systems
Monitored and shared data quality metrics via scorecards and reports to track data quality
Understanding the different source systems & designing the physical model using business requirements
Worked on enhancement requests for existing tables in DWH as per new business requirements on fact and dimension tables
Helped the testing team on preparing test cases, functional testing, and data validations.
Performed source-to-target data validations as part of the regression testing team.
Implemented row-level security in Power BI for secure, role-based data access.
Designed and implemented Power BI datasets connecting to Azure SQL Database and Azure Analysis Services.
Created interactive dashboards with drill-through capabilities and custom DAX measures.
Involved in various phases of development; analyzed and developed the system following Agile Scrum methodology
Created Jira stories and tasks to track the status in detail.
Created tableau reports using data from multiple source systems. Prepared the quality metrics dashboards.
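A minimal sketch of the kind of flat-file extraction automation referenced above; the staging folder, delimiter, and output file name are hypothetical, not the actual project layout:

import glob
import os

import pandas as pd

STAGING_DIR = "staging"              # assumed landing folder for source feeds
OUTPUT_FILE = "combined_claims.csv"  # illustrative output name

def extract_flat_files(pattern="*.txt", sep="|"):
    """Read every matching flat file, tag each row with its source, combine."""
    frames = []
    for path in sorted(glob.glob(os.path.join(STAGING_DIR, pattern))):
        df = pd.read_csv(path, sep=sep, dtype=str)   # keep raw values as text
        df["source_file"] = os.path.basename(path)   # lineage back to the feed
        frames.append(df)
    return pd.concat(frames, ignore_index=True) if frames else pd.DataFrame()

if __name__ == "__main__":
    combined = extract_flat_files()
    combined.to_csv(OUTPUT_FILE, index=False)
    n_files = combined["source_file"].nunique() if not combined.empty else 0
    print(f"wrote {len(combined)} rows from {n_files} files")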

Environment: Databricks, Snowflake, MySQL, XML, Informatica IDQ, Python, Power BI, DBeaver, AWS S3, AWS Redshift, Guidewire application (P&C), Jira


Food and Drug Administration, MD Aug 2018 - Sep 2021

Role: Sr. Data Analyst
Responsibilities:

Gathered business requirements, performed Gap Analysis, Data Cleansing, Design and Implementation of ODS, Data Marts, Schemas based on Facts and Measures
Gathering data from disparate systems for reporting and analysis.
Prepared the Data Lineage and traceability from Source to target mapping documents
Creating data dictionaries and metadata for data validation and handling
Prepared functional and technical documents for data reporting needs and Informatica data flows
Worked with SAP Hana model team and actively took part in creating analytical views and calculation views for reporting needs based on the business requirements
Identified data quality issues that must be corrected in the source system. Identified issues that can be corrected in ETL processing.
Worked on moving data from a legacy system to a new system. Identified data quality issues that must be handled in the code that moves data from the legacy system to the new system.
Profiled data with detailed analysis of data objects, including join and redundancy analysis, relationship inference, and domain discovery as required; conducted single-column and multi-column analysis (a minimal profiling sketch follows this list)
Wrote SQL code to compare and validate data across various systems
Creating technical specification documentation like process flows, and system changes for development and QA.
Monitored and shared data quality metrics via scorecards and reports to track data quality
Understanding the different source systems & designing the physical model using business requirements
Creating source-to-target mappings with all transformation logic documented, and updating the data dictionary
Worked on enhancement requests for existing tables in DWH as per new business requirements on fact and dimension tables
Created tableau reports like aging of inventory, inventory cost using data from multiple source systems.
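A minimal sketch of the single-column and multi-column analysis mentioned above, assuming the data has already been pulled into a pandas DataFrame; the sample columns are invented:

import pandas as pd

def profile_columns(df):
    """Single-column analysis: null percentage, distinct count, min/max."""
    return pd.DataFrame({
        "null_pct": df.isna().mean() * 100,
        "distinct": df.nunique(),
        "min": df.min(),
        "max": df.max(),
    })

def composite_key_check(df, cols):
    """Multi-column analysis: does this column set uniquely identify rows?"""
    dupes = df.duplicated(subset=cols, keep=False)
    return int(dupes.sum()), df[dupes]

if __name__ == "__main__":
    df = pd.DataFrame({
        "submission_id": [1, 2, 2, 4],
        "product_code": ["A1", "B2", "B2", None],
        "qty": [10, 5, 5, 8],
    })
    print(profile_columns(df))
    n, rows = composite_key_check(df, ["submission_id", "product_code"])
    print(f"{n} rows violate the candidate key")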

Environment: SAP Hana, Informatica 9.1, Tableau 9.2, Power BI, Oracle

Performance Food Group, VA Dec 2017 - Aug 2018

Role: Sr. Data Analyst
Responsibilities:

Gathering data from disparate systems for reporting and analysis.
Identified data quality issues that must be corrected in the source system. Identified issues that can be corrected in ETL processing.
Worked on moving data from a legacy system to a new system. Identified data quality issues that must be handled in the code that moves data from the legacy system to the new system.
Profiled data with detailed analysis of data objects, including join and redundancy analysis, relationship inference, and domain discovery as required; conducted single-column and multi-column analysis
Performed data analysis and profiling on multiple heterogeneous systems, including IBM mainframes, SAP ERP, and RDBMS systems
Monitored and shared data quality metrics via scorecards and reports to track data quality
Understanding the different source systems & designing the physical model using business requirements
Creating source-to-target mappings with all transformation logic documented, and updating the data dictionary
Worked on enhancement requests for existing tables in DWH as per new business requirements on fact and dimension tables
Used Python to extract data from various sources and performed data profiling to analyze data patterns (a minimal pattern-profiling sketch follows this list)
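A minimal sketch of the Python pattern profiling mentioned above: each value is reduced to a character-class pattern (letters to A, digits to 9) and pattern frequencies are counted so outliers stand out; the sample part numbers are made up:

from collections import Counter

def value_pattern(value):
    """Map letters to 'A' and digits to '9'; keep other characters as-is."""
    return "".join("A" if ch.isalpha() else "9" if ch.isdigit() else ch
                   for ch in str(value))

def pattern_profile(values):
    """Return (pattern, count) pairs, most common first."""
    return Counter(value_pattern(v) for v in values).most_common()

if __name__ == "__main__":
    part_numbers = ["AB-1234", "CD-5678", "EF-9012", "1234-XY", "GH-3456"]
    for pattern, count in pattern_profile(part_numbers):
        print(f"{pattern}: {count}")   # the lone '9999-AA' pattern stands out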

Environment: Informatica 9.1, MS Visio, ERP, Netezza, SQL Server 2008, IBM iSeries

Capital One, VA Sep 2016 - Nov 2017

Role: Sr. Data Analyst
Responsibilities:

Involved in Data Analysis, Data Profiling, Data Cleansing & Quality, Data Migration
Interacting with business process owners and analysts to understand the functional requirements and translate them into technical documents
Finalized business rules and implemented them in IDQ
Prepared lineage documents from source to target
Prepared metadata documents with information on all data sets and data elements consumed
Worked on Teradata to define and implement data quality checks
Created data quality checks and scorecards using the IDQ Analyst tool
Created various data objects: relational data objects, Hive tables, Hadoop datasets, flat-file data objects, and logical data objects
Created mapping specifications and implemented filters in IDQ
Created LDOs using various transformations such as Expression, Parser, Labeler, Standardizer, Filter, Router, Joiner, Lookup, Sorter, and Rank.
Created scorecards and re-usable rules in IDQ developer
Created mapping specification and re-usable mapplets using IDQ developer
Migrated content from one server to another by taking XML backups
Performed migration and validation, repointing to new connections during migration.
Creating data quality standardization and cleansing rules using the facts discovered from profiling results (an illustrative cleansing rule is sketched after this list)
Performed data quality checks, cleansed incoming data feeds, and profiled source-system data per business rules using IDQ
Cleansed and standardized address data by configuring the AddressDoctor component in the IDQ tool
Extensively used Agile Method for daily scrum to discuss the project related information
Performed sanity testing on profiled data to ensure rules are valid
Ensured all information was passed to the offshore team and resolved their queries by clarifying with the client
Created various Tableau dashboards showing KPIs
Developed Tableau workbooks from multiple data sources using Data Blending.
Able to connect Tableau to different data sources, presenting clear visualizations of data and making effective use of filters and parameters.
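The standardization itself was done in IDQ (AddressDoctor); this Python sketch only illustrates the kind of cleansing rule derived from profiling results, with an invented abbreviation map and sample rows:

import re

# Hypothetical suffix expansions of the sort profiling might surface.
SUFFIXES = {"ST": "STREET", "AVE": "AVENUE", "RD": "ROAD", "BLVD": "BOULEVARD"}

def standardize_address(raw):
    """Trim, uppercase, collapse whitespace, and expand street suffixes."""
    addr = re.sub(r"\s+", " ", raw.strip().upper())
    return " ".join(SUFFIXES.get(word.rstrip("."), word)
                    for word in addr.split(" "))

if __name__ == "__main__":
    for s in ["  123  main st. ", "45 ocean Ave", "9 Elm   RD"]:
        print(f"{s!r} -> {standardize_address(s)!r}")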

Environment: Informatica 9.1, MS Visio, Netezza, SQL Server 2008, Power BI, Tableau 9.x, IDQ Analyst/Developer 9.6.1/10.1, Teradata 15

National Grid, India Jun 2015 - May 2016

Role: Senior Data Analyst
Responsibilities:
Prepared specification documents (BRD/FRD) based on business rules given by the business.
Prepared the System Test cases and Unit Test Cases with different scenarios
Participated in Data Validation discussions with Data Architects
Involved in Data Migration between MS SQL server and Oracle
Created SQL queries to validate Data transformation and ETL Loading.
Developed and executed BO test cases for Unit testing, Integration Testing, System testing.
Converted WEBI Reports to Tableau Dashboards for Advanced Visualizations
Involved in Regression testing of the WEBI reports to have data consistency across the systems.
Coordinate the review, presentation and release of design layouts, analysis, test cases and other documentation
End-to-end client interaction to obtain business answers.
Focal point and Lead for all BO Design issues and responsible for presenting the prototypes of critical requirements to the client.
Created web intelligence reports with various functionalities like merge dimensions, input controls, graphs, used ranking to filter the data and used various custom variables for the reporting requirements
Validated data in the report against the business data by running the SQL queries against the Teradata and Oracle.
Created Reference Table using Informatica Analyst tool as well as Informatica Developer tool.
Worked on developing mapplets, mappings, workflows, and applications.
Experience in Error Handling and Trouble Shooting Techniques using the log files.
Worked on creating profiles in the Informatica Data Quality Developer tool, and analyst tool.
Experience in creating rules and applying them to profiles in developer and analyst tool.
Extensively worked on Labeler, Parser, and Match transformations to identify duplicate records (a simplified matching sketch follows this list).
Experience in using Standardizer transformation to standardize the data.
Developed Tableau visualizations and dashboards using Tableau Desktop.
Developed Tableau workbooks from multiple data sources using Data Blending.
Able to connect Tableau to different data sources, presenting clear visualizations of data and making effective use of filters and parameters.
Developed standard reports within Tableau by connecting to different databases: Oracle 10g and MS SQL Server.
Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
Experience in preparing the Dashboard Specification Documents.
Conducted Unit Testing of the dashboard.
Designed dashboards according to the color guide given by the client.
Responsible for doing comparative analysis between different reporting tools of SAP, to decide the feasible tool according to the business requirements of the client.
Worked on HPQC for continuous update with project schedules, quality, issues/defects and managing resources.
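The matching itself used IDQ's Match transformation; this Python sketch only illustrates the underlying idea: normalize identifying fields into a match key and group records that share it. The customer rows and key fields are invented:

import pandas as pd

def match_key(name, city):
    """Crude match key: lowercase and strip everything non-alphanumeric."""
    clean = lambda s: "".join(ch for ch in str(s).lower() if ch.isalnum())
    return clean(name) + "|" + clean(city)

def find_duplicates(df):
    """Return groups of records that share the same match key."""
    out = df.copy()
    out["match_key"] = [match_key(n, c) for n, c in zip(out["name"], out["city"])]
    return out[out.duplicated("match_key", keep=False)].sort_values("match_key")

if __name__ == "__main__":
    customers = pd.DataFrame({
        "id": [1, 2, 3, 4],
        "name": ["J. Smith", "J Smith", "Ann Lee", "Bob Ray"],
        "city": ["Boston", "boston", "Austin", "Austin"],
    })
    print(find_duplicates(customers))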

Environment: Business Intelligence 4.1, HPQC, Tableau 8.x, Erwin, Informatica 9.1, Oracle, SQL Server

Citi Bank, India Apr 2014 - May 2015

Role: Data Analyst
Responsibilities:
Responsible for gathering requirements and identifying the data sources required for the requests.
Worked on Different source systems to analyze data and load them into the data warehouse
Development of scripts for loading the data into the base tables of EDW using FastLoad, MultiLoad and BTEQ utilities of Teradata.
Proficient in importing/exporting enormous amounts of data from files
Manipulated and validated data according to changes made in business rules
Experienced in Agile and Scrum Methodologies
Prepared the application document i.e., functional and technical requirement documents
SME for the BO production support in the project and responsible for working with the clients on new enhancements
Prepared the RCA (Root Cause Analysis) documents for data validations
Worked on complex enhancements in WEBI reports like merge dimensions, creation of detail objects, using breaks, page break options, sections
Prepared two POCs on SAP Lumira for interactive analysis and showcased them to the client.
Created storyboards and documents using various functions such as charts, tables, drilling, and filters within a single document.

Environment: Business Objects XI R3.1, Windows 2008, Oracle SQL Developer 3.2, ServiceNow

Zurich, India Jun 2010 - Apr 2014

Role: BI Developer
Responsibilities:
Interacting with users in collecting requirements and on design discussions of Business Objects universe and reports.
Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data within the Oracle database.
Developed a Testing format to compare the data at different systems.
Worked with Data governance and Data Quality Teams.
Identified business rules for data migration and performed data administration through data models and metadata.
Designed mappings to transfer data from various sources like Flat Files, XML Files, SQL Server.
Worked with Internal modeling team for data analysis and profiling.
Played a key role in the planning, testing, and implementation of system enhancements and conversions.
Did multi-dimensional analysis using slice & dice, drill (up, down, through and across) by setting up different hierarchies and defining the scope of analysis.
Extensively used WEBI report functions such as cascading, grouping, hierarchies, sub-reports, charts, variables, and sorting.
Created report templates for consistency in the look and feel of the reports; created reports on P&C data using Guidewire as the source application.
Involved in testing reports before publishing them to the repository
Developed Tableau visualizations and dashboards using Tableau Desktop.
Developed Tableau workbooks from multiple data sources using Data Blending.
Able to connect Tableau to different data sources, presenting clear visualizations of data and making effective use of filters and parameters.
Developed standard reports within Tableau by connecting to different databases: Oracle 10g and MS SQL Server.
Created action filters, parameters, and calculated sets for preparing dashboards and worksheets in Tableau.
Conducted Unit Testing of the dashboard.
Designed dashboards according to the color guide given by the client.

Environment: Guidewire (P&C), Tableau 8.x, Business Objects XIR3.1, Oracle 11g, SQL Server 2008, MS Office

Goldstone Technologies, India May 2007 - May 2010

Role: Report Developer /Data Analyst
Responsibilities:
Work with users to identify the most appropriate source of record and profile the data required
Involved in defining the business/transformation rules
Worked with internal architects, assisting in the development of current- and target-state data architectures
Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines
Involved in defining the source to target data mappings, business rules, business and data definitions
Document data quality and traceability documents for each source interface
Generate weekly and monthly reports.
Coordinated with business users to design new reporting needs in an appropriate, effective, and efficient way, building on existing functionality
Remain knowledgeable in all areas of business operations in order to identify systems needs and requirements.
Implementation of Metadata Repository, Maintaining Data Quality, Data Cleanup procedures, Data Standards
Responsible for Estimations and effort tracker and daily updates to the management.
Used the HPQC for the defect analysis and respond to it with the status required.
Developed WEBI reports as per the client's requirements.
Created multi-tab reports using logos and various other functions: sections, filters, drill filters, complex variables, merge dimensions, hyperlinks, input controls, multi-data providers, sub-queries, results from another query, and custom SQL
Used stacked bar charts and line graphs in report design.
Prepared test data using the Guidewire application; created test scenarios for WEBI reports
Preparation of ETL mapping documents according to the Business Requirements.
Analyzed sources and targets, transformed and mapped the data, and loaded it into targets using Informatica.
Extensively involved in Data Extraction, Transformation and Loading (ETL process) from Source to target systems using Informatica.
Mainly used transformations such as Expression, Router, Joiner, Lookup, Update Strategy, and Sequence Generator.
Used Joiner transformation for extracting data from multiple sources.
Creating Informatica mappings, Mapplets and reusable transformations.
Created tasks and workflows in the Workflow Manager and monitored sessions in the Workflow Monitor.
Performed unit testing and validated the data.

Environment: Business Objects XI R3.1 SP3, BI 4.1, Informatica 9.1, SQL Server 2008, DB2 9.0, Guidewire application, HPQC