Ramakoteswararao - Data Engineer
[email protected]
Location: Richland, Washington, USA
Relocation: Yes, open to relocate
Visa: H1B
Resume file: Ramakoteswararao @_1753280123487.docx
PROFESSIONAL SUMMARY
With 16+ years of overall IT experience, I have worked in various roles across a varied technology stack on integration/migration projects, end-to-end delivery of big data customer-journey projects, and data warehousing projects. This experience includes development and operations of applications for Verizon, T-Mobile, USAA, M&T Bank and MMC.

PROFILE SUMMARY
- Strong grounding in data warehousing and business intelligence project management, involving technologies such as Informatica, Oracle and Teradata; concepts such as client/server, grid and cloud; and approaches such as dimensional, logical and physical data modeling.
- Knowledge of advanced concepts of ETL, ELT, data mining/cleansing/profiling/masking, BI analytics, big data, Hadoop, visualization tools, Python, R, etc.
- Extensively used ETL methodology for data extraction, transformation and loading in corporate-wide ETL solutions using Informatica DEI/BDM 10.4.0/10.2.2/10.2.1/10.2.0, Informatica PowerCenter 10.5.1/10.1.0/9.6/9.5.1/9.1.0/8.5/7.1 and Informatica IICS.
- Developed scripts that keep files flowing smoothly through processes in Cloud Data Integration (CDI) and Cloud Application Integration (CAI).
- Created mapping tasks and taskflows in CDI/CAI based on requirements.
- Created processes, sub-processes and reusable processes, reusable service connectors and reusable app connections to Oracle Web Services, to read files from local and SFTP folders using file parsers, and to write to target folders.
- Created multiple taskflows for loading data from different sources to Salesforce using the Salesforce connector with Data Synchronization Tasks and Mapping Tasks, using the Bulk API and Standard API as required.
- Worked on databases such as Oracle 19c/12c/11g/10g/9i, Teradata 16/15/14/13, SQL Server 2018, Netezza and Snowflake DB.
- Extensive experience using SQL, PL/SQL, SQL*Plus, SQL*Loader, Unix shell programming/scripting and Hadoop; knowledge of SQL, PL/SQL, Visio, Erwin, DataStage, Spark, Python and Scala.
- Good knowledge of cloud platforms such as Informatica IICS, Snowflake, AWS and Azure Data Factory.
- Experience loading data into Teradata; implemented the Teradata Parallel Transporter connection as part of performance tuning by creating stored procedures.
- Involved in complete ETL code migration from Oracle RDBMS to Teradata.
- Interacted with management to identify key dimensions and measures for business performance.
- Extensively worked on data migration, data cleansing and data staging of operational sources using ETL processes, and provided data mining features for data warehouses.
- Optimized mappings using optimization techniques such as Pushdown Optimization (PDO).
- Implemented performance tuning techniques on targets, mappings, sessions and SQL statements.
- Exposure to dimensional data modeling, including star schema and snowflake schema modeling.
- Extensive experience in relational database/data warehouse environments: slowly changing dimensions (SCD), operational data stores and data marts (a minimal SCD Type-2 sketch follows this list).
- Proficient in all phases of the Software Development Life Cycle (SDLC), including requirements definition, system implementation, system testing, acceptance testing, production deployment and production support.
- Used job scheduling tools such as Control-M, UC4 (Appworx), Cisco Tidal, Autosys and Automic.
- Substantial experience in the telecommunications, financial and banking domains.
- Excellent communication, interpersonal and analytical skills, with a strong ability to perform individually as well as part of a team.
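The SCD bullet above refers to the Type-2 pattern of expiring the current dimension row and inserting a new version when a tracked attribute changes. Below is a minimal, illustrative Python sketch of that pattern; it uses the standard-library sqlite3 module for portability, and the table, column names and dates are hypothetical (the actual projects implemented this in Informatica mappings against Oracle/Teradata):

import sqlite3

# Hypothetical customer dimension with SCD Type-2 tracking columns.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY AUTOINCREMENT,
    customer_id  TEXT,     -- natural key from the source system
    city         TEXT,     -- tracked attribute
    eff_start    TEXT,
    eff_end      TEXT,
    is_current   INTEGER   -- 1 = active version, 0 = expired
);
INSERT INTO dim_customer (customer_id, city, eff_start, eff_end, is_current)
VALUES ('C100', 'Dallas', '2023-01-01', '9999-12-31', 1);
""")

def scd2_upsert(conn, customer_id, city, load_date):
    """Expire the current row if the tracked attribute changed, then insert a new version."""
    cur = conn.execute(
        "SELECT customer_key, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,))
    row = cur.fetchone()
    if row and row[1] == city:
        return  # no change; nothing to do
    if row:
        # Close out the old version.
        conn.execute(
            "UPDATE dim_customer SET eff_end = ?, is_current = 0 "
            "WHERE customer_key = ?", (load_date, row[0]))
    # Insert the new current version.
    conn.execute(
        "INSERT INTO dim_customer (customer_id, city, eff_start, eff_end, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)", (customer_id, city, load_date))
    conn.commit()

scd2_upsert(conn, "C100", "Austin", "2024-06-01")  # city change creates a new version
for r in conn.execute("SELECT * FROM dim_customer ORDER BY customer_key"):
    print(r)

The same expire-then-insert logic maps naturally onto an update flow plus an insert flow in an ETL mapping.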
EDUCATION
B.E. in Electronics and Communication Engineering (ECE), Madras University, 2002.

TECHNICAL SKILL SET
ETL Tools: Informatica DEI/BDM 10.4.0/10.2.2/10.2.1/10.2.0, Informatica PowerCenter 10.5.1/10.1.0/9.6.1/9.5.1/9.1.0/8.x/7.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Integration Services, Repository Services, Administration Console), Informatica IICS
Data Modeling Tools: Microsoft Visio 2007/2000, Erwin 7.0/4.1
Databases: Oracle 19c/12c/11g/10g/9i/8i/8.0/7.x, Teradata 16/15/14/13, SQL Server 2018, MS Access, Hive, Snowflake DB
Tools: Toad, SQL Navigator, SQL*Loader, MS Access Reports, Informatica Data Analyzer, Teradata Utilities (MLoad, FLoad, Teradata SQL Assistant)
Languages: SQL, PL/SQL, C, Python, Unix Shell Scripting/Sed, Scala
Reporting Tools: Cognos 11.x, Tableau 2020.x
Scheduling Tools: Control-M, UC4 (Appworx), Cisco Tidal, Automic, Autosys
Other Tools: Hadoop (HDFS, Pig, Sqoop, Hive, HBase), Spark, MongoDB, Flume, Kafka
Operating Systems: HP-UX, IBM AIX 4.2/4.3, MS-DOS 6.22, Windows 7, Windows NT 4.0, Sun Solaris 5.8, Unix, Red Hat Linux 7.2.2
Cloud Technologies: Informatica IICS, Snowflake, AWS, Azure Data Factory (ADF)

CERTIFICATIONS AND TRAININGS SUMMARY
- Informatica Certified Developer, Certificate No: 292412
- Oracle SQL Developer Certification

WORK/PROFESSIONAL EXPERIENCE
Client: MMC (Marsh & McLennan Companies), USA
Duration: May 2023 to date
Project Name: CIS (Customer Information Services)
Role: Data Engineer
Environment: Informatica IICS, Informatica PowerCenter 10.5.x, Oracle 19c, SQL Server, HP-UX, Automic Scheduler

Project Description: The project migrates Oracle FinancialForce and ERP systems data to Certinia on Salesforce. It was critical for consolidating various business processes and enhancing data integrity, and included comprehensive analysis of existing data warehouse structures. From Certinia, data pipelines are built for financial business applications such as CIS (Customer Information Services) to provide reports on Project Value Add (PVA), time transactions, accounts receivable and backlog version amounts for MMC (Marsh & McLennan Companies); a Bulk API-style Salesforce load is sketched below.
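The Salesforce loads in this project were built with IICS taskflows and the Salesforce connector rather than hand-written code; as a hedged illustration of the underlying Bulk API pattern, the following Python sketch uses the third-party simple_salesforce package, with placeholder credentials and a hypothetical custom object:

from simple_salesforce import Salesforce

# Placeholder credentials; the real project authenticated through the
# IICS Salesforce connector, not through this package.
sf = Salesforce(
    username="[email protected]",
    password="password",
    security_token="token",
)

# Hypothetical records destined for a custom Certinia/Salesforce object.
records = [
    {"Name": "PVA-2024-001", "Amount__c": 125000.0},
    {"Name": "PVA-2024-002", "Amount__c": 98000.0},
]

# Bulk API insert, mirroring the "Bulk API" loads described in this resume.
results = sf.bulk.Project_Value_Add__c.insert(records, batch_size=10000)
for res in results:
    print(res["id"], res["success"], res["errors"])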
Roles & Responsibilities
- Participated in all phases, including client interaction, requirements gathering, design, coding, testing, release, support and documentation.
- Designed, developed and implemented ETL processes using IICS data integration.
- Identified ETL specifications based on business requirements and created ETL mapping documents and high-level documentation for product owners.
- Designed and developed mapping tasks and taskflows for large volumes of data.
- Extensively used parameters (input and in/out parameters), expression macros and source partitioning.
- Extensively used cloud transformations: Aggregator, Expression, Filter, Joiner, Lookup (connected and unconnected), Router, Sequence Generator, Sorter and Update Strategy.
- Developed cloud integration parameterized mapping templates (DB and table object parameterization) for stage, dimension (SCD Type-2, CDC and incremental loads) and fact load processes.
- Developed a CDC load process for moving data from source to the SQL data warehouse using Informatica Cloud CDC for Oracle.
- Developed complex parallel Informatica Cloud taskflows with multiple mapping tasks and taskflows.
- Performed loads into a Snowflake instance using the Snowflake connector in IICS to support data analytics for the Treasury team.
- Created scripts to create on-demand cloud mapping tasks using the Informatica REST API, and scripts to start and stop cloud tasks through Informatica Cloud API calls (a minimal sketch follows this list).
- Applied performance tuning techniques while loading data into target systems using IICS.
- Loaded data into interface tables from multiple data sources such as text files and Excel spreadsheets as part of ETL jobs.
- Automated data processes using scripting languages and ETL tools to reduce manual effort and increase efficiency.
- Troubleshot data issues and provided timely resolutions to ensure data availability and integrity.
- Documented ETL processes, data models and data dictionaries to ensure data lineage and facilitate knowledge sharing.
- Created trouble tickets for data that could not be parsed.
- Performed unit testing and system integration testing.
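A minimal sketch of the start/stop scripting mentioned in the list above, using Python's requests package against the Informatica Cloud REST API v2. The login URL, endpoint paths, payload fields and the "MTT" task-type code are written from memory of the v2 API and should be verified against the IICS documentation for the relevant POD and release:

import requests

# Region-specific login endpoint (assumption; varies by IICS POD).
LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"

def login(username, password):
    """Authenticate and return (server_url, session_id) for subsequent calls."""
    resp = requests.post(
        LOGIN_URL,
        json={"@type": "login", "username": username, "password": password},
        headers={"Accept": "application/json"},
    )
    resp.raise_for_status()
    body = resp.json()
    return body["serverUrl"], body["icSessionId"]

def start_task(server_url, session_id, task_id, task_type="MTT"):
    """Start a task; 'MTT' is the v2 code for a mapping task (assumption)."""
    resp = requests.post(
        f"{server_url}/api/v2/job",
        json={"@type": "job", "taskId": task_id, "taskType": task_type},
        headers={"icSessionId": session_id, "Accept": "application/json"},
    )
    resp.raise_for_status()
    return resp.json()

def stop_task(server_url, session_id, task_id, task_type="MTT"):
    """Stop a running task via the job/stop endpoint (assumption)."""
    resp = requests.post(
        f"{server_url}/api/v2/job/stop",
        json={"@type": "job", "taskId": task_id, "taskType": task_type},
        headers={"icSessionId": session_id, "Accept": "application/json"},
    )
    resp.raise_for_status()

if __name__ == "__main__":
    server_url, session_id = login("user", "secret")      # placeholder credentials
    print(start_task(server_url, session_id, "0001ABC"))  # placeholder task id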