Venkata Sai - ETL / Informatica Developer |
[email protected] |
Location: Des Moines, Iowa, USA |
Relocation: |
Visa: H1B |
Venkata Sai Teja Jandhyala
Professional Summary
- Around 10 years of experience with a strong background in SDLC, data integration, and application integration tools such as Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS), as well as data engineering.
- Experience in analysis, design, development, and implementation of data warehouses, data marts, and Decision Support Systems (DSS) using Informatica PowerCenter with Oracle, MS SQL Server, and DB2.
- Strong data processing experience designing and implementing data warehouse and data mart applications, mainly transformation processes, using Informatica PowerCenter and UNIX shell scripting.
- Experience in SQL and PL/SQL, developing and executing stored procedures, functions, and triggers, and tuning queries while extracting and loading data.
- Experience working in Agile environments; the most recent project was in the insurance industry.
- Proficient in the Informatica PowerCenter methodology.
- Hands-on experience identifying the most critical information within an organization and creating a single source of truth to power business processes.
- Handled major production GO-LIVE and user acceptance test activities.
- Extensive work on ETL processes covering data sourcing, mapping, transformation, conversion, and loading.
- Proficient in data warehousing techniques for data cleansing and Slowly Changing Dimension types 1, 2, and 3 (see the SCD Type 2 sketch following this summary).
- Actively monitored Informatica job runs.
- Experience in performance tuning and debugging of existing ETL processes.
- Experienced on development and production support teams, handling critical situations to meet deadlines for successful completion of tasks/projects and providing post-implementation support.
- Strong experience in all phases of development, including ETL of data from various sources into data warehouses and data marts, and IICS cloud (CDI, CAI, CIH).
- Excellent interpersonal and communication skills; technically competent and results-oriented, with problem-solving skills and the ability to work independently using sound judgment.
- Extensive knowledge of data modeling, data conversions, data integration, and data migration, with specialization in Informatica PowerCenter.
- Experience in Salesforce-Informatica integration.
- Strong expertise in designing and developing business intelligence solutions in staging, populating Operational Data Stores (ODS), Enterprise Data Warehouses (EDW), and data marts / Decision Support Systems using the Informatica PowerCenter 9.x/8.x/7.x/6.x ETL tool.
- Extensive experience and knowledge of the project lifecycle, including requirements gathering, development, and execution.
- Coordinated and led an offshore team on day-to-day activities and provided the necessary knowledge transfer sessions.
- Expertise in data modeling using star schema/snowflake schema, OLAP/ROLAP tools, fact and dimension tables, and physical and logical data modeling using Erwin 4.x/3.x.
- Experience documenting high-level designs, low-level designs, STMs (source-to-target mappings), unit test plans, unit test cases, and deployment documents.
- Implemented SCD Type 1 and SCD Type 2 for loading data into data warehouse dimension tables.
- Experienced in repository configuration and in using transformations to create Informatica mappings, mapplets, sessions, worklets, workflows, and processing tasks with Informatica Designer / Workflow Manager to move data from multiple source systems into targets.
- Experience in AWS (Amazon Web Services) and S3 buckets.
- Developed cloud migration solutions using IICS and AWS; designed and implemented strategies to migrate structured and unstructured datasets of varying sizes, formats, and complexity into a unified system, ensuring data integrity throughout the process.
- Very good understanding of the IICS Data Integration console, used to create mapping templates that bring data into the staging layer from different source systems, and of monitoring.
- Good knowledge of IICS transformations such as Expression, Joiner, Union, Lookup, Sorter, Filter, and Normalizer, and of concepts such as macro fields to templatize column logic, smart match fields, bulk field renaming, and full pushdown optimization to push data from staging to ODS at scale using data integration templates.
- Hands-on experience with GCP: BigQuery, GCS buckets, Cloud Functions, Cloud Dataflow, Pub/Sub, Cloud Shell, and Dataproc.
- Led projects from development through support and maintenance.
- Experience leading small teams and providing all technical documentation needed to accomplish the business requirements.
- Experience converting Informatica pipelines into Azure Data Factory.
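To illustrate the SCD Type 2 pattern referenced above, here is a minimal Python sketch using an in-memory SQLite database. The table and column names (dim_customer, stg_customer, customer_id, city) are hypothetical; a production load would implement the equivalent expire-then-insert logic inside Informatica or the target database.

```python
# Minimal SCD Type 2 sketch: expire changed dimension rows, then insert new
# versions. SQLite is used purely for illustration; all names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (
    customer_id TEXT, city TEXT,
    eff_date TEXT, end_date TEXT, current_flag INTEGER);
INSERT INTO dim_customer VALUES ('C1', 'Des Moines', '2020-01-01', '9999-12-31', 1);
CREATE TABLE stg_customer (customer_id TEXT, city TEXT);
INSERT INTO stg_customer VALUES ('C1', 'Detroit');   -- changed attribute
INSERT INTO stg_customer VALUES ('C2', 'Dearborn');  -- brand-new key
""")

load_date = "2024-01-01"

# Step 1: close out the current row wherever a tracked attribute changed.
cur.execute("""
UPDATE dim_customer
SET end_date = ?, current_flag = 0
WHERE current_flag = 1
  AND customer_id IN (
      SELECT s.customer_id FROM stg_customer s
      JOIN dim_customer d ON d.customer_id = s.customer_id
      WHERE d.current_flag = 1 AND d.city <> s.city)
""", (load_date,))

# Step 2: insert a new current version for changed and brand-new keys
# (after step 1, changed keys no longer have a current row).
cur.execute("""
INSERT INTO dim_customer
SELECT s.customer_id, s.city, ?, '9999-12-31', 1
FROM stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id AND d.current_flag = 1
WHERE d.customer_id IS NULL
""", (load_date,))
conn.commit()

for row in cur.execute("SELECT * FROM dim_customer ORDER BY customer_id, eff_date"):
    print(row)
```

The expire step must run before the insert step; the full history for each key is preserved, with exactly one row flagged current.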
Education:
- Bachelor's in Electronics and Communication, Vignan University, 2014
- Master's in CSIT, Sacred Heart University, CT, 2018
- Master's in Information Systems Security, University of the Cumberlands, KY, 2023

Technical Skills
ETL Tools: Informatica PowerCenter 10.1/9.6/9.1/8.6/8.1, Siperian/Informatica MDM 10.0/9.1, Informatica Intelligent Cloud Services (IICS), Alteryx, SSIS
Reporting Tools: OBIEE, Cognos BI (Framework Manager, Query Studio, Report Studio)
Programming Languages: SQL, PL/SQL, NoSQL, MySQL, HTML, basic UNIX shell scripting, basic Perl scripting
Databases: Oracle 9i/10g/11g, Teradata, DB2, Siebel, MS SQL Server 2010/2005, Netezza, AWS S3, Snowflake, Anaplan, Kafka
Operating Systems: Windows 98/NT/2000/XP/7, UNIX
Programming Tools: SQL*Plus, SQL*Loader, Siebel Analytics, TOAD 10.0, PuTTY
Office Tools: MS PowerPoint, MS Word, MS Excel
Scheduling Tools: DAC, Control-M, Skybot, Robot, crontab, UC4, Maestro, CA Workstation
Defect Management: HP Quality Center, HP ALM, JIRA, ServiceNow
Internet Tools: HTML, XML, XSLT

Professional Experience

Michigan Central, Detroit, MI May 2024 - Present
GCP Cloud Data Engineer
Role & Responsibilities:
- Led the initiative in gathering and analyzing business requirements, documented and provided inputs to team members on subtasks, and performed all code/application integration reviews.
- Hands-on experience with GCP: BigQuery, GCS buckets, Cloud Functions, Cloud Dataflow, Pub/Sub, Cloud Shell, and Dataproc.
- Analyzed client requirements related to new data sources, proposed solutions, and worked closely with data domain experts.
- Worked on creating POC documents.
- Solved big-data problems specific to time, load, and volume.
- Testing activities involved unit test plan preparation, unit testing, sprint testing, UAT, and performance testing.
- Studied the existing architecture and worked on architectural requirements.
- Worked with data flows and server connectivity (databases, data structures, database connectivity).
- Developed scripts using SQL, PL/SQL, Python (basic), Linux bash scripting, and JSON.
- Designed, wrote, and maintained Apache Airflow jobs that coordinate between multiple technologies (see the sketch below).
- Developed ETL/ELT solutions and automated data flows.
Environment: Google Cloud Platform, Python, JSON, SQL Server, WinSCP, PuTTY.
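A minimal sketch of the kind of Airflow coordination described in this role: stage a file from GCS, load it into BigQuery via the bq CLI, then run a validation hook. The DAG id, bucket, dataset, and table names are hypothetical placeholders, not the project's actual pipeline.

```python
# Hedged Airflow DAG sketch (Airflow 2.x): GCS file -> BigQuery -> validation.
# All resource names below are illustrative assumptions.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def validate_load(**context):
    # Placeholder validation hook; a real check might compare BigQuery
    # row counts against a source manifest for the run date.
    print("validation passed for run", context["ds"])


with DAG(
    dag_id="daily_sales_ingest",            # hypothetical name
    start_date=datetime(2024, 5, 1),
    schedule_interval="0 6 * * *",          # daily at 06:00
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    stage_file = BashOperator(
        task_id="stage_file",
        bash_command="gsutil cp gs://example-landing/sales_{{ ds }}.csv /tmp/",
    )
    load_bq = BashOperator(
        task_id="load_bq",
        bash_command=(
            "bq load --source_format=CSV --skip_leading_rows=1 "
            "example_dataset.sales /tmp/sales_{{ ds }}.csv"
        ),
    )
    validate = PythonOperator(task_id="validate", python_callable=validate_load)

    stage_file >> load_bq >> validate
```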
Farm Bureau Financial Services, West Des Moines, IA April 2023 - November 2023
Data Engineer / Informatica Developer (PowerCenter and IICS)
Role & Responsibilities:
- Led the initiative in gathering and analyzing business requirements, documented and provided inputs to team members on subtasks, and performed all code/application integration reviews.
- Delivered tasks via Agile-Scrum methodology to design, develop, test, and deliver the code.
- Modified existing business logic in Informatica ETL flows.
- Designed and developed the conversion of existing stored procedures into Informatica mappings.
- Designed new ETL flows per the mapping sheets.
- Involved in ETL development, creating the required mappings for the data flow using Informatica.
- Involved in designing mapplets and reusable transformations according to the data flow requirements.
- Performed data masking for flat files FTP'd to third-party vendors to protect PII (see the sketch below).
- Migrated infrastructure and data applications out of legacy data centers into cloud and hybrid environments.
- Developed, supported, and maintained ETL processes using Informatica PowerCenter and data subset solutions using Informatica Persistent Data Masking.
- Validated output according to specifications.
- Enhanced existing mappings according to the business logic.
- Automated data masking validation for source and target databases.
- Created ETL and data warehouse standards documents: naming standards, ETL methodologies and strategies, standard input file formats, and data cleansing and preprocessing strategies.
- Created mapping documents with detailed source-to-target transformation logic and source/target column information.
- Developed Cloud Integration parameterized mapping templates (DB and table object parameterization) for stage, dimension (SCD Type 1, SCD Type 2, CDC, and incremental load), and fact load processes.
- Designed, developed, and implemented ETL processes using IICS Data Integration.
- Created IICS connections using various cloud connectors in IICS Administrator.
- Installed and configured the Windows Secure Agent and registered it with the IICS org.
- Extensively used cloud transformations: Aggregator, Expression, Filter, Joiner, Lookup (connected and unconnected), Rank, Router, Sequence Generator, Sorter, Update Strategy, and Union.
- Attended daily scrum calls to post status updates.
- Testing activities involved unit test plan preparation, unit testing, sprint testing, UAT, and performance testing.
- Gathered reporting requirements and enhanced existing logic as required.
- Prepared the ETL design document covering database structure, change data capture, error handling, and restart and refresh strategies.
- Converted existing Informatica PowerCenter pipelines, dataflows, and complex transformations into Azure Data Factory.
Environment: Informatica 10.4.0, DB2, SQL Server, Mainframe, CONTROL-M, DbVisualizer, Azure Data Studio, WinSCP, PuTTY, ServiceNow, GIT, Jenkins, FileZilla, UltraEdit, ADF
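A minimal sketch of masking PII columns in a delimited flat file before it is FTP'd to a vendor. The column names and salt are hypothetical, and this is only an illustration of the idea; the project itself used Informatica Persistent Data Masking, per the role above.

```python
# Hedged sketch: deterministic one-way masking of PII columns in a CSV.
# Column names and the salt are illustrative assumptions.
import csv
import hashlib

PII_COLUMNS = {"ssn", "email", "phone"}  # hypothetical column names
SALT = "example-salt"                    # would come from a secret store


def mask(value: str) -> str:
    # Deterministic hash so joins on the masked value still line up
    # across files, while the raw PII is not recoverable.
    return hashlib.sha256((SALT + value).encode()).hexdigest()[:16]


def mask_file(src_path: str, dst_path: str) -> None:
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            for col in PII_COLUMNS & set(row):
                if row[col]:
                    row[col] = mask(row[col])
            writer.writerow(row)


if __name__ == "__main__":
    mask_file("customers.csv", "customers_masked.csv")  # hypothetical paths
```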
Ford Credit, Dearborn, MI August 2021 - March 2023
ETL / Informatica Intelligent Cloud Services Lead Developer (IICS)
Role & Responsibilities:
- Worked on IICS-to-Anaplan integration projects.
- Developed mappings to transform and load data into Anaplan using the Anaplan V2 connector.
- Converted several Alteryx workflows to Informatica.
- Developed data tasks, command tasks, and notification tasks.
- Developed integrations using CDI for flat files and Oracle.
- Developed integrations using CAI for SAP (S/4HANA).
- Worked on session-level properties such as buffer block size, DTM buffer size, and line sequential buffer length.
- Followed Agile-Scrum methodology to design, develop, test, and deliver the code.
- Developed IICS-to-GitHub integration to migrate code to higher environments.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Participated in business meetings to understand requirements and provide solutions.
- Participated in daily status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
- Performed unit testing at various levels of the ETL and was actively involved in team code reviews.
- Worked PROD support incidents to monitor and fix issues.
- Worked with IICS Data Integration, Application Integration, mapping tasks, synchronization tasks, taskflows, and the taskflow monitor.
- Used the Administrator tab to create File Processor, flat file, Anaplan V2, SQL Server, Teradata, and other ODBC, native, and LDAP connections.
- Developed mappings that extract, transform, and load source data into targets using transformations like Aggregator, Filter, Router, Joiner, and Expression to meet business logic.
- Developed parallel taskflows, parallel taskflows with decision, and sequential taskflows associated with the mappings, and monitored the results.
- Developed pre- and post-session commands at the mapping task level in the workflow.
- Modified existing business logic in Informatica ETL flows.
- Used BTEQ, FastLoad, MultiLoad, and FastExport scripts to automate pre-session and post-session processes.
- Used UNIX shell scripts to automate pre-session and post-session processes.
- Documented test scenarios as part of unit testing before requesting migration to higher environments, and handled production deployments.
- Used the Debugger to identify bugs in existing mappings by analyzing data flow.
- Used tools like WinSCP and PuTTY to move files and run scripts on Linux servers.
- Used deployment tools like XML export and GIT for code deployment and migration.
- Validated output according to specifications.
- Automated/scheduled jobs to run daily, monthly, and at other intervals, with email notifications for failures and successes (see the REST API sketch below).
- Good experience designing and implementing data warehousing and business intelligence solutions using ETL tools like Informatica PowerCenter, Informatica PowerExchange, Informatica Intelligent Cloud Services (IICS), and Alteryx.
- Experience with IICS concepts relating to data integration, Monitor, Administrator, deployments, permissions, and schedules.
- Experience integrating data using IICS for reporting needs.
Environment: IICS, Alteryx, Teradata, XML, SQL Server 2017, WinSCP, PuTTY, GIT.
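One way such IICS jobs can be triggered from an external scheduler is through Informatica's published REST API. This sketch follows the documented v3 login / v2 job pattern, but the endpoint paths, response fields, and the task name are assumptions to verify against current Informatica documentation before use.

```python
# Hedged sketch: trigger an IICS mapping task via the REST API
# (v3 login, then the v2 job endpoint). Verify endpoints against
# current Informatica docs; names below are assumptions.
import requests

LOGIN_URL = "https://dm-us.informaticacloud.com/saas/public/core/v3/login"


def run_task(username: str, password: str, task_name: str) -> None:
    # Step 1: authenticate; capture the session id and the org's API base URL.
    resp = requests.post(LOGIN_URL, json={"username": username, "password": password})
    resp.raise_for_status()
    body = resp.json()
    session_id = body["userInfo"]["sessionId"]
    base_url = body["products"][0]["baseApiUrl"]  # org-specific base

    # Step 2: start the mapping task by name (taskType MTT = mapping task).
    job = requests.post(
        f"{base_url}/api/v2/job",
        headers={"icSessionId": session_id},
        json={"@type": "job", "taskName": task_name, "taskType": "MTT"},
    )
    job.raise_for_status()
    print("started run:", job.json())


if __name__ == "__main__":
    run_task("svc_user", "secret", "mt_load_anaplan_actuals")  # hypothetical
```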
Nationwide Insurance, Des Moines, IA July 2019 - September 2020
ETL / Informatica Developer
Role & Responsibilities:
- Followed Agile-Scrum methodology to design, develop, test, and deliver the code.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
- Performed unit testing at various levels of the ETL and was actively involved in team code reviews.
- Worked on several investments projects.
- Worked with Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Used Informatica Workflow Manager to create workflows, sessions, and batches to run the mappings.
- Developed mappings that extract, transform, and load source data into targets using transformations like Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Joiner, Rank, and Expression to meet business logic.
- Developed Informatica workflows and sessions associated with the mappings using Workflow Manager and monitored the results using Workflow Monitor.
- Developed pre- and post-session commands at the session level in the workflow.
- Developed mappings to send files daily to AWS.
- Modified existing business logic in Informatica ETL flows.
- Used variables and parameters in the mappings to pass values between mappings and sessions.
- Used BTEQ, FastLoad, MultiLoad, and FastExport scripts to automate pre-session and post-session processes.
- Used UNIX shell scripts to automate pre-session and post-session processes.
- Worked extensively on the batch framework that runs all Informatica job scheduling.
- Documented test scenarios as part of unit testing before requesting migration to higher environments, and handled production deployments.
- Used the Debugger to identify bugs in existing mappings by analyzing data flow.
- Modified stored procedures, functions, and packages.
- Decommissioned unused BEE (Bash Execution Engine) processes.
- Used tools like WinSCP and PuTTY to move files and run scripts on Linux servers.
- Used deployment tools like UCD (UrbanCode Deploy), XML export, and GIT for code deployment and migration.
- Validated output according to specifications.
- Used z/OS Explorer and hands-on mainframe work to create ChangeMan packages.
- Automated/scheduled jobs to run daily with email notifications for failures.
- Good experience designing and implementing data warehousing and business intelligence solutions using ETL tools like Informatica PowerCenter, Informatica PowerExchange, and Informatica Intelligent Cloud Services (IICS).
- Experience with IICS concepts relating to data integration, Monitor, Administrator, deployments, permissions, and schedules.
- Experience integrating data using IICS for reporting needs.
- Worked with the Kafka REST API to collect and load data onto the Hadoop file system.
- Designed patterns to load data from Oracle DB into Salesforce using Informatica Intelligent Cloud Services.
- Loaded files to Hive and HDFS from Oracle and SQL Server using Sqoop (see the sketch below).
- Imported and exported data from sources like DB2, SQL Server, and Teradata to HDFS using Sqoop.
Environment: Informatica 10.1.1, 9.5, Oracle, XML, SQL Server 2008, Mainframe, CA Workstation, AQT, WinSCP, PuTTY, ServiceNow, AWS, GIT, UCD, Big Data.
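A minimal sketch of scripting the kind of Sqoop import into HDFS mentioned above. The JDBC connect string, credentials file, table, and target directory are hypothetical placeholders.

```python
# Hedged sketch: wrap a Sqoop import from Oracle into HDFS in a Python
# driver script. All connection details below are illustrative assumptions.
import subprocess

SQOOP_CMD = [
    "sqoop", "import",
    "--connect", "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB",  # hypothetical
    "--username", "etl_user",
    "--password-file", "/user/etl/.oracle_pwd",  # keeps the secret out of argv
    "--table", "POLICY_TXN",
    "--target-dir", "/data/raw/policy_txn",
    "--num-mappers", "4",
    "--as-textfile",
]

result = subprocess.run(SQOOP_CMD, capture_output=True, text=True)
if result.returncode != 0:
    raise RuntimeError(f"sqoop import failed:\n{result.stderr}")
print("import finished:", result.stdout.splitlines()[-1:])
```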
Farmers Insurance, Woodland Hills, CA November 2018 - July 2019
Informatica Developer / ETL Developer
Role & Responsibilities:
- Followed Agile-Scrum methodology to design, develop, test, and deliver the code.
- Modified existing business logic in Informatica ETL flows.
- Involved in performance tuning and optimization of mappings to manage very large volumes of data.
- Worked with Informatica utilities: Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
- Created mappings using transformations like Joiner, Filter, Aggregator, Lookup, Router, Sorter, Expression, Normalizer, Sequence Generator, and Update Strategy.
- Developed complex ETL mappings on the Informatica 10.x platform as part of the risk data integration effort.
- Implemented SCD Type 1 and SCD Type 2 for loading data into data warehouse dimension tables.
- Implemented error handling for invalid and rejected rows by loading them into error tables.
- Worked extensively on the batch framework that runs all Informatica job scheduling.
- Analyzed sources, transformed data, mapped the data, and loaded the data into targets using Informatica PowerCenter Designer.
- Developed complex mappings such as Slowly Changing Dimension Type II with timestamping in Mapping Designer.
- Used Informatica Workflow Manager to create workflows, database connections, sessions, and batches to run the mappings.
- Used variables and parameters in the mappings to pass values between mappings and sessions.
- Created stored procedures, functions, packages, and triggers using PL/SQL.
- Implemented restart strategies and error handling techniques to recover failed sessions.
- Used UNIX shell scripts to automate pre-session and post-session processes.
- Performed performance tuning to improve data extraction, data processing, and load times.
- Worked with data modelers to understand financial data models and provided suggestions on the logical and physical data models.
- Designed presentations based on the test cases and obtained UAT signoffs.
- Documented test scenarios as part of unit testing before requesting migration to higher environments, and handled production deployments.
- Recorded defects in the defect tracker during SIT and UAT.
- Identified performance bottlenecks and suggested improvements.
- Performed unit testing for the jobs developed to ensure they met the requirements.
- Handled major production GO-LIVE and user acceptance test activities.
- Created architecture diagrams for the project based on industry standards.
- Defined escalation process metrics for aborts and met SLAs for production support tickets.
Environment: Informatica 10.1.1, 9.5, Oracle, XML, SQL Server 2008, Web services, DB2, Mainframe, DAC (scheduler).

Eagle Creek Software Services, Eden Prairie, MN December 2017 - October 2018
ETL Developer
Role & Responsibilities:
- Involved in requirement definition and analysis in support of data warehouse efforts.
- Developed ETL mappings and transformations using Informatica PowerCenter.
- Extensively used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager, and Informatica Workflow Manager.
- Extensively used ETL to load data from flat files to Oracle.
- Worked as part of the AWS build team.
- Created data synchronization scripts to insert, update, or upsert data into Salesforce based on the type of data that needed to be updated in Salesforce (see the sketch below).
- Automated/scheduled the cloud jobs to run daily with email notifications for failures.
- Developed data mappings between source systems and warehouse components using Mapping Designer.
- Worked extensively on transformations like Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Lookup, Sorter, Union, Stored Procedure, and Sequence Generator.
- Implemented Type 1 and Type 2 Slowly Changing Dimensions.
- Created, launched, and scheduled workflows/sessions.
- Involved in performance tuning of mappings and sessions.
- Troubleshot connectivity problems; looked up and read session, event, and error logs for troubleshooting.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Used UNIX scripting to apply rules to the raw data within AWS.
Environment: Informatica PowerCenter 9.6.1, Workflow Manager, Workflow Monitor, PL/SQL, Oracle 11g, Erwin, Autosys, SQL Server 2005, AWS, Toad 9.0.
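A minimal sketch of a Salesforce upsert of the kind used in the data synchronization scripts above, shown here with the simple_salesforce library. The external-id field and credentials are hypothetical, and the project may equally have used Informatica's Salesforce connector for the same pattern.

```python
# Hedged sketch: idempotent Salesforce upsert keyed on a custom external id.
# Credentials and the Legacy_Id__c field are illustrative assumptions.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="etl.user@example.com",  # hypothetical credentials
    password="secret",
    security_token="token",
)

# Upsert on an external-id field: Salesforce updates the record if the key
# already exists and inserts it otherwise, so reruns are safe.
record = {"LastName": "Doe", "Email": "jane.doe@example.com"}
status = sf.Contact.upsert("Legacy_Id__c/CUST-0042", record)
print("HTTP status:", status)  # 201 = created, 204 = updated
```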
Merck, West Point, Pennsylvania March 2017 - December 2017
ETL Informatica Developer
Role & Responsibilities:
- Involved in gathering and analyzing business requirements.
- Designed and developed mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, and Joiner transformations.
- Worked with Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Extracted data from Hive files, transformed it in accordance with the business logic, and loaded it into Oracle tables using the ETL tool.
- Involved in creating Oracle staging table structures and modifying existing tables per the design specifications.
- Developed mapplets, reusable transformations, source and target definitions, and mappings using Informatica PowerCenter.
- Developed Informatica workflows and sessions associated with the mappings using Workflow Manager.
- Involved in debugging Informatica mappings and in performance and unit testing of Informatica sessions and target data.
- Generated SQL queries to check the consistency of data in the tables and to update the tables per the business requirements.
- Involved in performance tuning of mappings in Informatica.
- Performed unit testing at various levels of the ETL and was actively involved in team code reviews.
- Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
- Wrote PL/SQL packages and stored procedures to implement business rules and validations.
Environment: Informatica PowerCenter 9.6.1, Oracle 11g, SQL, SQL Developer, UNIX, Maestro, PuTTY.

Confidential Group, Bangalore, India July 2013 - August 2016
Informatica Developer
Role & Responsibilities:
- Followed Agile-Scrum methodology to design, develop, test, and deliver the code.
- Modified existing business logic in Informatica ETL flows.
- Designed new ETL flows per the mapping sheets.
- Involved in ETL development, creating the required mappings for the data flow using Informatica.
- Used transformations like Source Qualifier, Filter, Aggregator, Joiner, Expression, Lookup, Router, Sorter, and Union.
- Involved in designing mapplets and reusable transformations according to the data flow requirements.
- Validated output according to specifications.
- Enhanced existing mappings according to the business logic.
- Attended daily scrum calls to post status updates.
- Testing activities involved unit test plan preparation, unit testing, sprint testing, UAT, and performance testing.
- Gathered reporting requirements and enhanced existing logic as required.
- Developed Oracle PL/SQL packages, procedures, and functions.
- Coded Oracle SQL to create ad hoc reports on an as-needed basis.
- Used Oracle Warehouse Builder to implement changes to the operational data store and to create data marts.
- Involved in data analysis for source and target systems.
- Good understanding of data warehousing concepts, star schema, and snowflake schema.
- Involved in supporting and maintaining Oracle Import, Export, and SQL*Loader jobs (see the sketch below).
Environment: Informatica PowerCenter 9.5.1, PL/SQL.
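A minimal sketch of automating an Oracle SQL*Loader job of the kind maintained above: write a control file, then shell out to sqlldr. The table, columns, file names, and connect alias are hypothetical placeholders.

```python
# Hedged sketch: generate a SQL*Loader control file and invoke sqlldr.
# All object and file names below are illustrative assumptions.
import subprocess
from pathlib import Path

CTL = """\
LOAD DATA
INFILE 'rates.dat'
APPEND INTO TABLE stg_rates
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(rate_code, rate_value, eff_date DATE 'YYYY-MM-DD')
"""

Path("rates.ctl").write_text(CTL)

# userid uses an externally authenticated alias here to avoid embedding a
# password; a classic user/password@tns string also works.
result = subprocess.run(
    ["sqlldr", "userid=/@etl_db", "control=rates.ctl", "log=rates.log"],
    capture_output=True, text=True,
)
print(result.returncode, result.stdout)
```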