Khagender Arekkuti - Data Engineer
rajesh@dotitsol.com
Location: Dallas, Texas, USA
Relocation:
Visa: H1B
Resume file: Khagender_Resume_Data_Engineer__1746727631747.docx
PROFESSIONAL SUMMARY:
Over 12 years of experience in design, development, testing, and implementation, with a major focus on Business Intelligence applications, Data Integration (ETL/ELT), and Data Warehousing. Experienced with the MSBI stack and Azure, and has handled major data projects. Expert in data-related initiatives and in BI applications such as Power BI, Tableau, QlikView, and IBM Cognos. Wide experience with big data technologies such as Hadoop, Spark, and MongoDB. Proactive and flexible; works independently with self-direction in a team environment. Enthusiastic about learning new cutting-edge technologies.

SKILLS/TOOLS:
ETL Tools: Informatica PowerCenter, IBM InfoSphere DataStage, SSIS, SSAS, Tableau Prep, Azure Data Factory, Databricks
BI Tools: MS Power BI, Azure Synapse Analytics, SSRS, IBM Cognos Analytics, SQL Server Management Studio, Einstein Analytics / Tableau CRMA
Programming: SQL, Python, VB, T-SQL, R
Internet Technologies: HTML, CSS, VB.NET, AJAX, JavaScript, ASP
Front End: Developer 2000, Visual InterDev, and Visual Studio
Databases: Oracle (10g, 11g, 12c), DB2, SQL Server, Salesforce, SAP HANA, Snowflake, Common Data Service, Teradata, MongoDB, Azure SQL
Operating Systems: UNIX (Solaris, HP), Windows 10/7/XP
Other Tools: HP ALM, JIRA, Octane, JMP, Minitab, Microsoft Visio

EDUCATION:
Bachelor of Technology (B.Tech), Jawaharlal Nehru Technological University, India
Master of Science (MS) in Technology Management, Texas A&M University-Commerce

PROFESSIONAL EXPERIENCE:

IBM, RTP, NC | February 2021 - Present
Sr. Data Engineer
The Device Management System (DMS) monitors outages and incidents of ATM devices and improves the resolution and maintenance processes. The system was migrated to the Azure cloud using Azure SQL Data Warehouse, Azure Blob Storage, Azure Functions, Azure Batch Services, Azure Data Factory V2, PowerShell, YAML, JSON, and ARM templates.
- Designed ELT processes: loaded external and on-premises data, transformed it using stored procedures, retained it in Azure SQL Data Warehouse, and refreshed Tableau data sources using Azure Data Factory (ADF).
- Gathered business requirements by conducting user interviews and documented users' queries, business analysis, and reporting needs.
- Designed and developed various Cognos reports that included complex functions, drill-throughs, render variables, conditional blocks, cascading prompts, and conditional formatting.
- Created multiple metadata models using Framework Manager, following Cognos best practices.
- Analyzed and developed ETL processes.
- Created and managed data pipelines, recipes, and data flows using robust applications such as IBM's AppConnect and Salesforce's Data Manager.
- Built various CRM Analytics (Einstein Analytics) dashboards using Analytics Studio.
- Designed Tableau data sources, workbooks, and dashboards in Tableau Desktop 2019.4 and 2020.2.
- Created and optimized indexes, partitioning, validation rules, natural keys, and exception-handling rules.
- Loaded MySQL data into SQL Server using dynamic SQL, linked servers, and SSIS.
Environment: SQL Server Management Studio, Azure SQL Data Warehouse, ADF, PowerShell, JSON, Tableau CRM Analytics (Einstein Analytics), AppConnect, Data Manager, Analytics Studio, MySQL, SSIS.

Microsoft Corporation, Redmond, WA | April 2020 - January 2021
Data Analytics Engineer
Microsoft Azure Portal Reporting provides metrics and insights on the Azure Portal's usage, performance, reliability, user preferences, and other indicators to internal clients, helping them improve and enhance the user experience with the portal.
- Designed various reports and dashboards using Grafana and Power BI.
- Used Azure Data Explorer (ADX) to write Kusto queries and analyze data across multiple sources.
- Extensively wrote Kusto Query Language (KQL) queries, following best practices, to generate reports and dashboards in Power BI and Grafana.
- Managed and extracted business insights from large data sets.
- Analyzed requirements and worked with clients and business units to answer key business questions.
- Analyzed the Geneva Monitoring System's telemetry data (metrics and logs) and generated reports and dashboards.
- Identified new data requirements and analyzed strategies and reporting mechanisms.
- Followed up on action items and worked with cross-functional teams to address issues.
- Worked with internal clients to determine business requirements, set priorities, and define key performance indicators (KPIs).
Environment: Power BI, Grafana, Azure Data Explorer (ADX), Kusto Query Language (KQL), Python, Geneva Monitoring/Azure Monitor (metrics, logs)

Dearborn National / HCSC, Lombard, IL | December 2018 - March 2020
Sr. Data Engineer
Disability Claims Service (DCS) is a claims management system that provides enhanced customer service through a revamp of the technology, processes, and services leveraged for STD and LTD claims administration. The STAR project is a critical claim-system replacement project for Dearborn National. STAR is a new claims system that uses software from FINEOS, a third-party vendor, along with an Operational Data Store (ODS) to facilitate reporting requirements.
- Designed SSIS packages to transfer data from various sources.
- Created various MSBI (SSIS) packages using data transformations such as Merge, Aggregate, Sort, Multicast, Conditional Split, SCD (Slowly Changing Dimension), and Derived Column.
- Used various transformations in SSIS control flow, including loop containers.
- Implemented event handlers and error handling in SSIS packages.
- Created SSIS packages for data conversion using the Data Conversion transformation.
- Responsible for analyzing, designing, developing, and maintaining the BI infrastructure utilizing Power BI and Tableau.
- Performed complex custom analytics as needed by the business.
- Involved in performance tuning of slow-running stored procedures and queries.
- Used SSIS to build high-performance data integration solutions, including extraction, transformation, and load (ETL) packages for data warehousing.
- Scheduled the SSIS packages to automate their execution at regular intervals.
- Developed tabular reports, sub-reports, matrix reports, drill-down reports, and charts using SQL Server Reporting Services (SSRS).
- Validated and tested reports, then published them to the report server.
- Designed and developed specific databases for the collection, tracking, and reporting of data.
- Performed extensive data validation by writing complex SQL queries; involved in back-end testing and worked on data-quality issues.
- Also responsible for gathering requirements, project management, constructing functional designs, communicating with team members and business stakeholders, and supporting existing Business Intelligence applications.
Environment: MSBI stack (SSAS, SSIS, SSRS), Power BI, Windows 10 Enterprise, SQL Server, SQL Server Management Studio v11, Tableau Desktop 2019, Tableau Prep Builder 2020, Salesforce, Crystal Reports, Azure tools (Databricks, Data Lakes, Data Factory), Git/GitHub

Johnson & Johnson, Piscataway, NJ | October 2017 - November 2018
Data Engineer
The CORE Platform is an integrated revenue, price, and contract management solution that aggregates and synthesizes functionality from a wide variety of legacy systems, such as CARS/IS, Trinity, IMS, Ideal, Wildcat, and CCPS, into a single system with advanced capabilities. One of the important features of CORE is its capability to act as a single-source platform and reporting structure for revenue, price, and contract management. The foundational data supporting the platform is derived from a variety of sources and integrations and includes a Product Master, a Customer Master, a Pricing Master, and a Membership Master List.
The following modules are implemented as part of the CORE solution:
- Master Data Management (MDM)
  o Customer Master
  o Product Master
  o Pricing Master
  o Membership Master
- Revenue Manager (RM)
- Contract Manager (CM)
- Validata
- Business Information (BI) and Reporting

- Architected data integration for data extraction from various sources, staging, and transformation using MS SSIS, C# scripts, T-SQL, and SQL Data Tools.
- Architected ingestion of time-series streaming data from sensors, landing it in the MapR Hadoop ecosystem, then designed Azure Data Factory pipelines, datasets, MS Azure triggers, and Azure Alerts.
- Created hybrid integration using MS Azure SQL tables, external tables in warehouse containers, blob storage, and on-premises SQL tables.
- Designed real-time report visualizations from streaming data sets using on-premises SQL table data and MS Azure, and presented them in Power BI.
- Worked with various SSIS control flow tasks and data transformation tasks such as Data Conversion, Derived Column, Lookup, Fuzzy Lookup, Conditional Split, etc.
- Created checkpoints and configuration files in SSIS packages.
- Extensively used the MERGE command as an alternative to small SSIS packages in dimension and fact loads.
- Developed Python scripts for streaming data sets and for machine learning statistical experiments and algorithms.
- Collaborated with internal stakeholders and other IT staff as needed to evaluate potential ETL processes or integrations between systems, leveraging existing options where appropriate.
- Analyzed functional requirements from stakeholders to arrive at appropriate technical design and implementation options for a given application.
- Worked with the Technical Solutions and Business Intelligence teams on project planning and on breaking larger work items into iterative deliverables.
Environment: SSRS, SSIS, Power BI, SQL Server, Management Studio, Azure, Windows 10

Verizon, Irving, TX | April 2016 - September 2017
Sr. Data Engineer / Sr. Analyst
Real-Time Customer Insights (RTCI) is central to the CMB/VZW Customer Engagement Strategy. RTCI tracks service events, network alarms, and customer interactions in real time and generates next-best actions/offers (NBA/O) and insights. These NBA/O and insights are consumed by channels to provide personalization and an integrated cross-channel customer experience.
- Involved in requirement gathering from business users.
- Created Data Flow tasks with Merge Joins to get data from multiple sources on multiple servers, with Lookups to access and retrieve data from a secondary dataset, with Row Count to collect statistics on rows inserted and updated, and with Script Tasks to run custom code.
- Designed, deployed, and maintained complex canned reports using SQL Server 2013 Reporting Services (SSRS) and PowerPivot.
- Created complex SSIS/ETL packages to load data from various heterogeneous sources into the database, transforming the data to fit the current reporting system.
- Designed incremental daily loads for large and small data warehouses and data marts per the reporting requirements.
- Collaborated with internal partners and organizations in the design, testing, development, and maintenance of business intelligence and reporting solutions to support business needs, with a specific focus on Power BI.
- Developed, validated, and executed complex data sourcing and transformation; developed advanced reports and dashboards using SQL, VB, SSIS, and Power BI.
- Implemented advanced Power BI functionality and features, including Power BI server, data gateway, row-level security, etc.
- Worked on large data sets from multiple data sources and applied data warehousing techniques to act as a single source point.
- Involved in writing complex SQL queries to improve report performance.
- Designed database tables, indexes, keys, database triggers, and stored procedures in Oracle.
- Created incremental refreshes for data sources on Tableau Server.
- Administered users, user groups, and scheduled instances for reports in Tableau.
- Created report schedules, data connections, projects, and groups on Tableau Server, and worked closely with business power users to create complex reports and dashboards using Tableau Desktop 9.2/10.1.
- Used actions (Filter / Highlight / URL) to drill down to detailed reports and to navigate from one dashboard to another.
- Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop.
- Created prompts, customized calculations, conditions, and filters (local, global) for various analytical reports and dashboards.
- Involved in testing the SQL scripts for report development, Tableau reports, dashboards, and scorecards, and handled performance issues effectively.
Environment: Power BI, SSIS, SSRS, SQL, PowerPivot, SQL Server 2013, VB, Tableau 9.2/10.1, Hive, Hadoop, Sqoop, HQL

GM/OnStar, Detroit, MI | March 2015 - March 2016
Sr. Data Engineer
OnStar AtYourService is a commerce and engagement offering that connects drivers with retailers and merchants on their drive, providing information, convenience, and money-saving values tied to their specific destinations. In addition, OnStar advisors can help locate hotels and book reservations for OnStar subscribers.
- Responsible for developing software applications using sound, repeatable, industry best practices and in accordance with GM's software development project methodology.
- Designed databases, adding tables and relationships and creating stored procedures and views.
- Optimized stored procedures with SQL Profiler to improve report latency and accelerate delivery.
- Formulated SQL scripts to analyze historical data and extracted the data required to support management decisions.
- Designed complex ETL packages using SSIS to extract data from pre-staging tables into OLAP with incremental loads.
- Built complex SSAS cubes with multiple fact measure groups and multiple dimension hierarchies.
- Designed and created various reports and dashboards using Tableau.
- Used SSRS to create, design, and deliver tabular, matrix, and sub-reports.
- Optimized SSRS reports with snapshots, caches, execution log views, embedded queries, and stored procedures.
- Implemented an interface to organize reports, sort data sources, schedule report executions, and track report history.
Environment: SQL Server 2012, T-SQL, SSIS, SSAS, SSRS, PowerPivot, SSMS, SSDT, Excel, Tableau Desktop 8.3/9.0

Kaiser Permanente, Oakland, CA | July 2014 - February 2015
Sr. Data Engineer
The HR Data Warehouse (HRDW) is a foundational system that is tightly integrated with the My HR application components. It is used for operational reporting and analytical applications for the HR/Payroll/Benefits, HR Service Center, and Recruitment functions of the organization.
- Created complex stored procedures, triggers, functions, indexes, tables, views, and other T-SQL code.
- Involved in unit testing and user acceptance testing to verify that the data extracted from source systems loaded into the target.
- Experienced in creating database objects such as procedures, functions, triggers, and indexes.
- Created and managed event handlers, package configurations, logging, and system and user-defined variables for SSIS packages.
- Deployed SSIS packages with minimal changes using XML configuration files.
- Extracted data from various heterogeneous sources and created packages using SSIS, Import/Export data, BULK INSERT, and BCP utilities.
- Used SQL Profiler to optimize stored procedures.
- Created various SSRS reports involving a variety of features such as charts, filters, sub-reports, drill-down, drill-through, and multi-valued parameters.
- Administered the interface to organize reports and data sources, schedule report execution and delivery, and track reporting history using SSRS.
- Created complex SSAS cubes with multiple fact measure groups and multiple dimension hierarchies based on OLAP reporting needs.
- Involved in designing partitions in cubes to improve performance using SSAS.
- Created reports using Crystal Reports with standard, summary, and cross-tab layouts, SQL command objects, selection criteria, grouping, sub-reports, etc.
- Designed and developed Crystal Reports, including cross-tab and Top N reports, for displaying vital data, plus summary reports with drill-down capabilities.
- Created calculated measures using MDX to implement business requirements.
- Worked on star and snowflake schemas and used the fact and dimension tables to build cubes, perform processing, and deploy them to the SSAS database.
Environment: Microsoft SQL Server 2008 R2/2012, MS Access, ETL packages (SSIS 2012), SQL Server Reporting Services (SSRS 2012), SSAS, Power BI, PowerPivot, MDX, Teradata, .NET, VB.NET, HTML, Visual Studio, Crystal Reports, Visual SourceSafe, T-SQL, XML.

JPMorgan Chase & Co., Columbus, OH | March 2013 - June 2014
Cognos Reports Developer
Metrics and Insights operations are split into Portfolio reporting and Dot Com reporting. Portfolio reporting contains household- and account-level information, while Dot Com reporting focuses on non-customer and customer-accessible website usage.
- Developed and maintained over 85 reports on the Metrics and Insights website, serving about 1,500 users every month.
- Used SAS to extract and transform data from the Enterprise Data Warehouse (EDW).
- Trained end users to use Report Studio, Query Studio, Workspace Advanced, and Metric Studio.
- Conducted post-implementation feedback analysis to analyze end users' usage of the Cognos reports.
- Built various segment trend reports, which executive management uses to analyze current and future market trends.
- Developed reports (MDX queries) using cube-based packages.
- Built a variety of executive dashboards for a quick glance at various attributes/KPIs for executive management.
- Migrated and validated Essbase cubes in each environment (DEV, UAT, and PROD) during a move to new Essbase servers.
- Upgraded all DEV, UAT, and PROD servers with the latest Cognos 10.2 software.
- Migrated reports, packages, cubes, and folders from one environment to another (DEV, UAT, and PROD) upon BICC user requests.
- Created data connections in Cognos Administration to connect various cubes through Framework Manager.
- Developed new Essbase cubes for internal clients.
- Created various roles to give different levels of users access to the reports.
Environment: Cognos 10.2.1 (Framework Manager, Transformer, Report Studio, Analysis Studio, Query Studio), JavaScript, HTML, Java, SAS, Essbase v11.1, DB2, Windows XP, Oracle 11g, Unix.