Informatica ETL
[email protected]
Location: Dallas, Texas, USA
Relocation: Yes
Visa: H1
Sravanthi G
Email: [email protected] | Phone: +1 947 228 6768 Ext 12

SUMMARY:
- 12+ years of experience in end-to-end development and implementation of data integration solutions using Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS) for clients in the Healthcare, Corporate Finance, and Investment Banking domains.
- Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, combined with project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
- Extensive ETL testing experience using Informatica 9.1/8.6.1/8.5/8.1/7.1/6.2/5.1 (PowerCenter/PowerMart: Designer, Workflow Manager, Workflow Monitor, and Server Manager), Teradata, and Business Objects.
- Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER/Studio.
- Experience with code migration, data migration, and extraction/transformation/loading using Informatica PowerCenter and PowerExchange with Oracle, SQL Server, Teradata, XML, flat files, and COBOL on UNIX and Windows NT/2000/9x.
- Expertise in relational databases such as Oracle 11g/10g/9i/8x, SQL Server 2008/2005, DB2 8.0/7.0, UDB, MS Access, and Teradata.
- Strong experience in extracting, transforming, and loading (ETL) data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect on Oracle, DB2, and SQL Server databases.
- Experience integrating various data sources with multiple relational databases such as Oracle and SQL Server, and integrating data from fixed-width and delimited flat files.
- Extensively developed ETL processes supporting data extraction, transformation, and loading using Informatica PowerCenter and Informatica Cloud Real Time (ICRT).
- Well acquainted with Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
- Experience in change implementation, monitoring, and troubleshooting of AWS Snowflake databases and cluster-related issues.
- Worked extensively with complex mappings using transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, and Aggregator.
- Strong experience developing sessions/tasks, worklets, and workflows using Workflow Manager tools: Task Developer, Workflow Designer, and Worklet Designer.
- Experience with advanced Informatica techniques such as dynamic caching, memory management, and parallel processing to increase performance throughput.
- Familiar with cloud components and connectors for making API calls to access data from cloud storage (Google Drive, Salesforce, Amazon S3, Dropbox) in Talend Open Studio.
- Experience using the SAS ETL tool, the Talend ETL tool, and SAS Enterprise Data Integration Server.
- Collaborated with stakeholders to gather requirements, analyze data integration needs, and design scalable, efficient workflows that support business objectives.
- Worked on loading data into Snowflake in the cloud from various sources.
- Extensive experience developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
- Experience resolving ongoing maintenance issues and bug fixes; monitored Informatica sessions and performance-tuned mappings and sessions.
- Experience in all phases of data warehouse development, from requirements gathering through coding, unit testing, and documentation.
- Extensive experience writing UNIX shell scripts and automating ETL processes with shell scripting.
- Proficient in integrating various data sources, including Oracle 11g/10g/9i, MS SQL Server, DB2, Teradata, VSAM files, and flat files, into the staging area, ODS, data warehouse, and data marts.
- Led the integration of data from diverse sources including databases, flat files, APIs, and web services, adhering to industry standards and best practices.
- Designed and implemented complex transformations, mappings, and workflows to meet regulatory compliance requirements and address specific business challenges.
- Optimized ETL processes for performance, significantly improving data processing times and operational efficiency.
- Provided ongoing production support, troubleshooting, and resolution of data integration issues, ensuring minimal downtime and maximum reliability.
- Strong SQL skills for querying and manipulating data in relational databases such as Oracle, SQL Server, and MySQL.
- Skilled in designing and implementing ETL workflows using Informatica PowerCenter, ensuring seamless data extraction, transformation, and loading.
- Strong data modeling skills; capable of designing efficient data warehouses and data marts to support reporting and analytics requirements.
- Skilled in performance tuning and optimization of ETL processes to improve data processing efficiency and reduce processing times.
- Extensive experience in UNIX environments; adept at leveraging shell scripting for automation and data processing tasks.
- Experienced in Agile methodologies; adept at adapting to changing project requirements and delivering high-quality solutions on time.
- Experience developing sophisticated Continuous Integration and Continuous Delivery (CI/CD) pipelines, including software configuration management, test automation, version control, and static code analysis.
- Mastered the R programming language during graduate studies, specializing in data visualization techniques to create visually compelling representations of complex datasets, enabling informed decision-making and enhancing analytical insight.
- Excellent communication and problem-solving skills, with a commitment to continuous learning and professional development in the rapidly evolving field of data management and analytics.

TECHNICAL SKILLS:
Informatica ETL Products: Informatica PowerCenter 10.2.x/10.1/9.x/8.x/7.x/6.x/5.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica Data Integration Hub (DIH), Informatica Data Quality (IDQ) 10.2/10.1, Informatica Advanced Data Transformation (ADT) 10.2/10.1, Informatica MDM Product 360, Informatica MDM, Informatica Intelligent Cloud Services (IICS)
Reporting Tools: Power BI, Tableau
Databases: Oracle 11g/10g, SQL Server 2012/2008
Programming Languages: SQL, PL/SQL, UNIX shell scripting, R, Python, MS-DOS scripting
Development Tools: Toad, SQL Developer, Azure SQL Database
Scheduling Tools: Autosys, Control-M
Cloud Platforms: Amazon AWS, GCP, Azure
Hadoop Administration: Hadoop Administration, Hortonworks Administration
Apache Products: Apache Spark, Pig, Hive, Sqoop, HBase, Cassandra, Oozie, ZooKeeper, Ambari, Flume, Impala, Kafka
ELK/Elastic Stack: Elasticsearch, Logstash, Kibana, Filebeat
Data Governance: Informatica EDC/EIC, Informatica Axon
Data Profiling: Informatica Data Profiling, Informatica Data Analyst, IBM Information Analyzer (IA)
Methodologies: Star/Snowflake, ETL, OLAP, SDLC
Other: Data Modeling, Data Warehousing, Data Integrity, Data Ingestion, Data Manipulation, Data Masking, Data Architecture, Extract Transform Load (ETL), Database Schema
Application Tools: Jira, ServiceNow, Git, Linux, PuTTY, WinSCP, Git Bash, Bitbucket, Confluence, Citrix, VDI, HPLAM
Software Development: Object-Oriented Programming (OOP), Service-Oriented Architecture (SOA), API Integration
Databricks ETL & Other: Databricks Spark big data ETL, Scala 2.1.1
Development Methodologies: Agile, Scrum, Waterfall

PROFESSIONAL EXPERIENCE:

Dell, Texas, USA
Informatica ETL IICS Developer | Jan 2024 - Present
Responsibilities:
- Developed and implemented ETL solutions using Informatica Cloud to integrate data from various sources into cloud platforms.
- Designed and optimized data integration workflows, ensuring robust data quality and integrity.
- Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Parsed high-level design specifications into simple ETL coding and mapping standards.
- Extracted data from different source systems: Oracle, DB2, MySQL, flat files, and XML files.
- Developed ETL programs using Informatica PowerCenter 9.6.1/9.5.1 to implement business requirements.
- Used Informatica as the ETL tool to pull data from source systems and files, then cleanse, transform, and load the data into Teradata using Teradata utilities.
- Created DWH databases, schemas, and tables, and wrote SQL queries against Snowflake.
- Managed data migration projects, successfully transferring legacy data to modern cloud infrastructure.
- Automated data flows and processes, reducing manual intervention and improving efficiency.
- Created comprehensive technical documentation and maintained standards for data governance and data security.
- Collaborated with cross-functional teams, including business analysts and data architects, to understand requirements and deliver solutions.
- Provided technical support and troubleshooting for ETL processes, ensuring minimal downtime.
- Participated in Agile ceremonies, including scrum meetings and sprint planning, to ensure timely project delivery.
- Configured encryption, access controls, and auditing mechanisms within IICS to protect data privacy and integrity.
- Worked extensively with data migration, data cleansing, and extraction, transformation, and loading of data from multiple sources into the data warehouse.
- Ensured high availability and fault tolerance of IICS solutions through proper configuration and management.
- Utilized IICS Data Masking to protect sensitive data and comply with privacy regulations.
- Created reusable transformations and components in PowerCenter to streamline development and maintenance.
- Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT.
- Worked extensively with different caches, such as the index cache, data cache, and lookup cache (static, dynamic, and persistent), while developing mappings.
- Implemented change data capture (CDC) mechanisms within IICS to handle real-time data updates and synchronization.
- Ensured ETL/ELT jobs succeeded and loaded data successfully into Snowflake.
- Integrated PowerCenter with third-party tools such as Tableau, QlikView, and Cognos for enhanced reporting capabilities.
- Leveraged strong Java, Angular, and Spring skills to build and maintain robust, scalable web applications as part of data management and integration projects.
- Conducted regular security audits and assessments to identify and address potential vulnerabilities.
- Created comprehensive documentation for data integration processes, best practices, and configuration details within IICS.
- Monitored and maintained ETL workflows to ensure they ran smoothly, and handled any failures promptly.
- Provided training and knowledge-sharing sessions to end users and stakeholders, empowering them to effectively use IICS features and functionality.
- Supported migration of ETL code from development to QA, and from QA to production.
- Demonstrated ability to identify and resolve performance bottlenecks to ensure optimal data processing efficiency.
- Conducted performance tuning of SQL queries and PL/SQL scripts, reducing execution times and enhancing overall system efficiency.
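The change-data-capture step mentioned above can be illustrated with a minimal Python sketch (the real work was done inside IICS; the key and field names here are hypothetical, not from the actual project):

```python
# Minimal change-data-capture sketch: diff a source snapshot against the
# current target state and classify each row as insert, update, or delete.
# Rows are plain dicts keyed by a business key ("id" is a placeholder name).

def capture_changes(source_rows, target_rows, key="id"):
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    inserts = [src[k] for k in src.keys() - tgt.keys()]   # new in source
    deletes = [tgt[k] for k in tgt.keys() - src.keys()]   # gone from source
    updates = [src[k] for k in src.keys() & tgt.keys() if src[k] != tgt[k]]
    return {"insert": inserts, "update": updates, "delete": deletes}
```

A real CDC feed would read committed changes from logs or streams rather than diffing full snapshots; the sketch only shows the classification logic.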
- Reduced data discrepancies by 25% through rigorous quality checks and continuous process improvements.
- Applied OOP and SOA principles, design patterns, and industry best practices to all development tasks, ensuring high-quality, maintainable code.
- Used Power BI to create automated reports that updated dynamically with the latest data from IICS and other integrated systems.
- Applied a deep understanding of RDBMS concepts to manage and optimize databases, including Oracle, MS SQL Server, MySQL, and PostgreSQL.
- Integrated IICS with various cloud services and platforms such as Salesforce, AWS, Azure, and Google Cloud Platform.
- Reduced data processing delays by addressing bottlenecks in IICS workflows through the optimization of mappings, sessions, and database interactions.
- Used Snowflake Tasks to schedule procedures and Streams to load data in real time.
- Worked closely with the operations team to continuously monitor and resolve emerging bottlenecks in the IICS environment, ensuring sustained high performance.

UBS Business Solutions Limited, Mumbai, India
Sr. Informatica Developer | Aug 2017 - Jul 2022
Responsibilities:
- Leveraged IICS to design, develop, and implement end-to-end data integration solutions.
- Proficient in various IICS components for data integration, including Cloud Data Integration, Cloud Application Integration, and Cloud Data Quality.
- Developed complex business logic in ETL mappings using advanced transformations such as Router, Joiner, and Lookup.
- Demonstrated expertise in handling complex data integration scenarios and ensuring seamless data flow across cloud and on-premises environments.
- Successfully mapped source system data to target data warehouse schemas using IICS.
- Implemented comprehensive data transformation logic to meet diverse business requirements, ensuring data accuracy and consistency.
- Designed and orchestrated data integration workflows to automate data movement between disparate systems and applications.
- Implemented robust data quality measures within IICS, including data validation rules, cleansing techniques, and adherence to data governance standards.
- Conducted thorough data quality audits and implemented corrective actions to address identified issues.
- Optimized data integration processes within IICS for enhanced performance and scalability.
- Conducted performance tuning activities, including optimizing SQL queries, job configurations, and resource utilization.
- Performed ETL processes in IICS to prepare data for Tableau and Power BI, ensuring that data was clean, accurate, and ready for analysis.
- Migrated ETL workflows from on-premises PowerCenter to cloud-based IICS environments, ensuring minimal downtime.
- Automated ETL processes using scheduling tools such as Autosys and Control-M as well as Informatica's own scheduler.
- Integrated IICS with various cloud services and platforms such as Salesforce, AWS, Azure, and Google Cloud Platform.
- Performed change management activities, including source code management, creating activity records, and developing implementation plans.
- Utilized Agile methodologies to manage ETL development cycles and ensure timely delivery of data integration solutions.
- Participated in GitHub discussions and community forums to engage with other developers and stay current on best practices.
- Contributed to open-source projects by submitting pull requests and reporting issues on GitHub.
- Managed metadata and lineage within Informatica to maintain transparency and traceability of data flows.
- Designed and implemented seamless data exchange between IICS and cloud-based applications, ensuring data consistency and availability.
- Identified and resolved performance bottlenecks in Informatica Intelligent Cloud Services (IICS) workflows, improving data processing times by 30%.
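The data validation rules described above can be sketched in Python as a simple rule-driven pass (an illustrative simplification of what an ETL tool's validation step does; the rule names and field names are hypothetical):

```python
# Illustrative data-validation pass: apply named rule functions to each
# record and split the batch into passing rows and failing rows with the
# list of rules each failing row violated.

RULES = {
    "customer_id is required": lambda r: r.get("customer_id") is not None,
    "amount is non-negative":  lambda r: r.get("amount", 0) >= 0,
    "country is 2 letters":    lambda r: len(r.get("country", "")) == 2,
}

def validate(rows):
    good, bad = [], []
    for row in rows:
        failed = [name for name, rule in RULES.items() if not rule(row)]
        if failed:
            bad.append((row, failed))   # quarantine with reasons
        else:
            good.append(row)
    return good, bad
```

Keeping the failure reasons alongside each rejected row mirrors the audit trail a data quality audit needs.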
- Conducted unit testing and integration testing of ETL workflows to ensure accurate data transformation and loading.
- Implemented audit logging and error handling mechanisms in Informatica PowerCenter.
- Maintained documentation and version control of ETL processes in Informatica PowerCenter.
- Configured and managed connections to various data sources, including SQL Server, Oracle, and flat files, in both PowerCenter and IICS.
- Developed complex ETL logic for slowly changing dimensions (SCD Types 1, 2, and 3) in Informatica PowerCenter.
- Analyzed production issues and implemented efficient programming solutions, reducing downtime and improving application performance.
- Proficient in leveraging IICS connectors and APIs for seamless cloud integration.
- Integrated SQL and PL/SQL scripts with IICS mappings and workflows to streamline ETL processes and ensure consistent data flows.
- Implemented stringent security measures within IICS to safeguard sensitive data and ensure compliance with regulatory standards.

Silicon Matrix Enduring Solutions, Hyderabad, India
Informatica Developer | Jun 2014 - Aug 2017
Responsibilities:
- Designed and developed ETL mappings, workflows, and sessions using Informatica PowerCenter.
- Worked with various transformations, including Connected and Unconnected Lookup, Aggregator, Expression, Filter, Router, Update Strategy, Sequence Generator, and Sorter.
- Expertise in slowly changing dimensions (SCD), maintaining historical as well as incremental data using Type I, Type II, and Type III strategies.
- Worked extensively on Informatica performance tuning, resolving source-level, target-level, and mapping-level bottlenecks.
- Implemented complex mappings and mapplets in PowerCenter to address data integration requirements.
- Used UNIX to navigate the system, check for specific files and file contents, change permissions, and see who the current users are.
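The SCD Type 2 strategy mentioned above (keep history by expiring and versioning rows rather than overwriting them) can be sketched in Python; this is a minimal illustration, not the PowerCenter implementation, and the column names are hypothetical:

```python
# Sketch of Slowly Changing Dimension Type 2 logic: when a tracked attribute
# changes, expire the current dimension row and insert a new version, so the
# full history of each business key is preserved.
import datetime

def apply_scd2(dimension, incoming, key="cust_id", tracked=("address",),
               today=None):
    today = today or datetime.date.today()
    current = {r[key]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec[key])
        if cur is None:
            # brand-new key: insert the first version
            dimension.append({**rec, "valid_from": today, "valid_to": None,
                              "is_current": True})
        elif any(cur[c] != rec[c] for c in tracked):
            cur["valid_to"] = today        # expire the old version
            cur["is_current"] = False
            dimension.append({**rec, "valid_from": today, "valid_to": None,
                              "is_current": True})
        # unchanged rows are left untouched (Type 1 would overwrite instead)
    return dimension
```

Type 1 would simply overwrite the tracked columns, and Type 3 would keep one "previous value" column; Type 2 is the only variant that preserves unlimited history.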
- Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks, and used Autosys to schedule jobs.
- Used Informatica debugging techniques to debug mappings, and used session log files and bad files to trace what occurred during loading.
- Introduced parallel processing techniques to enhance data extraction and loading speeds, leading to significant time savings.
- Responsible for implementing incremental loading mappings using mapping variables and parameter files.
- Experience in PL/SQL programming (stored procedures, triggers, packages) using Oracle (SQL, PL/SQL) and SQL Server, and in UNIX shell scripting for job scheduling.
- Involved in designing and creating Hive tables to load data into Hadoop, and in processes such as merging, sorting, creating, and joining tables.
- Effectively communicated technical concepts and solution designs to non-technical stakeholders, facilitating alignment and understanding.

ETL Developer | Oct 2010 - May 2014
Responsibilities:
- Utilized Informatica PowerCenter to extract data from various sources such as databases, flat files, and APIs, and to transform and cleanse the data per business requirements.
- Designed and developed ETL mappings using Informatica PowerCenter to move data from source to target systems efficiently.
- Created and managed workflows in Informatica Workflow Manager to orchestrate the execution of ETL processes.
- Implemented data quality checks and validations within Informatica mappings to ensure the accuracy and completeness of the data.
- Optimized Informatica mappings and workflows for better performance, including identifying and resolving bottlenecks.
- Maintained comprehensive documentation of ETL processes, mappings, workflows, and data dictionaries for easy understanding and future reference.
- Conducted unit testing of Informatica mappings and workflows to validate data transformations and troubleshoot issues as needed.
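The incremental-loading pattern described above (a mapping variable persisted in a parameter file between runs) can be sketched in Python; this is a simplified stand-in for Informatica's mechanism, and the file name and row fields are hypothetical:

```python
# Illustrative incremental-load driver: persist a high-water mark between
# runs (analogous to an Informatica mapping variable written back to a
# parameter file) and extract only rows newer than the last successful load.
import json
import os

def load_watermark(path):
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)["last_loaded_ts"]
    return 0  # first run: load everything

def save_watermark(path, ts):
    with open(path, "w") as f:
        json.dump({"last_loaded_ts": ts}, f)

def incremental_extract(rows, path="watermark.json"):
    wm = load_watermark(path)
    delta = [r for r in rows if r["updated_ts"] > wm]
    if delta:
        # advance the watermark only after a successful extract
        save_watermark(path, max(r["updated_ts"] for r in delta))
    return delta
```

Re-running the extract with no new source rows returns an empty delta, which is exactly the restart-safe behavior an incremental mapping needs.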
- Worked closely with business analysts, data architects, and other stakeholders to understand requirements and translate them into technical solutions.
- Used SQL to clean, filter, aggregate, join, and manipulate data according to business requirements, and wrote queries to check for duplicate records, missing values, referential integrity violations, and other data anomalies.
- Utilized version control systems such as Git to manage code changes and ensure the integrity of Informatica artifacts.
- Stayed current with the latest Informatica technologies and best practices through self-learning and training programs.
- Monitored system resources (e.g., CPU, memory, disk I/O) in UNIX during ETL job execution to ensure optimal performance and resource utilization.
- Followed coding standards, naming conventions, and best practices in ETL development to maintain consistency and code quality.
- Provided support and troubleshooting assistance to users and other team members for Informatica-related issues.
- Ensured compliance with data governance policies and implemented security measures to protect sensitive information during ETL processes.

ACADEMIC PROFILE:
Master of Science in Computer Science Engineering, University of Illinois Springfield, USA (2022-2023)
Bachelor's in Computer Science and Engineering, P.V.P. Siddhartha Institute of Technology, India (2007-2010)
Diploma in Computer Engineering, A.A.N.M & V.V.R.S.R Polytechnic, India (2004-2007)