VEERASEKHAR RAJU
SAP BODS/Data Migration Consultant
Phone: +1 (561) 830-6644
E-mail: [email protected]
Location: Dallas, Texas, USA
Relocation: Yes
Visa: H1B

PROFESSIONAL SUMMARY:
Over 10 years of experience as an SAP Data Migration Consultant (BODS) with SAP HANA and SAP S/4HANA, covering design, ETL development, implementation, data analysis, maintenance, and support.

SAP Data Services Consultant with 10+ years of experience in data migration, integration, and data warehousing projects, designing and implementing ETL solutions using SAP Data Services to migrate data from legacy systems to SAP and non-SAP systems.
Domain expertise in ePayment services, automobile, and telecommunications.
Extensive experience in system analysis, development, implementation, production support, and maintenance of data warehouse business applications in the ePayment, automobile, and telecommunications domains.
Involved in all phases of the SDLC (Software Development Life Cycle) from analysis, design, development, testing, implementation, and maintenance, with timely delivery against aggressive deadlines.
Experience in migration of SAP and non-SAP systems to SAP S/4HANA.
Experience in SAP BPDM flows for Customer Master and Material Master.
Good experience in data migration using LTMC, IDocs, and BAPI functions.
Knowledge of migrating Material Master data using Rapid Deployment Solutions (RDS).
Strong experience in extracting, transforming, and loading (ETL) data from various sources into data warehouses and data marts using SAP BODS as the ETL tool on SAP HANA, Oracle, Sybase, MS SQL Server, and GCP databases.
Involved in the complete testing process after upgrading SAP BODS 3.2 to 4.2.
Good experience in implementing Slowly Changing Dimensions (SCDs).
Designed and developed SAP Data Services jobs to extract data from flat files, CSV files, Excel files, and SQL Server.
Extensively worked on integrating BODS with different source systems such as SAP ECC, RDBMS sources, flat files, AWS, and GCP.
Well experienced in handling nested structures such as XML and JSON files.
Designed and developed SAP Data Services jobs to extract data from SAP ECC systems.
Extensive knowledge of the Data Services Management Console for scheduling and monitoring jobs.
Good knowledge of Linux/Unix commands.
Good knowledge of the Power BI reporting tool.
Worked on production support projects.
Created SOPs (Standard Operating Procedures) explaining the procedure to be followed for outages or network glitches in the production environments.
Experience in production support of Dollar Universe batch processes and troubleshooting using SAP BODS batch job logs, UNIX scripts, and DB processes.
Experience in automating users' monthly and daily reports based on requirements.
Good understanding of the Agile Scrum process and extensive experience working in a collaborative environment.
Good communication and interpersonal skills and a strong understanding of ETL best practices.
Excellent analytical skills, oral and written communication skills.
Self-motivated team player with excellent problem-solving skills and ability to quickly learn new technologies and tools.
SOFTWARE SKILLS:
Databases : Oracle, Microsoft SQL Server, SAP HANA, Sybase, GCP
ERP : SAP ECC 6.0, SAP S/4 HANA, SAP BRIM
Languages : SQL, PL/SQL, Unix/Linux
ETL Tools : SAP BODS 3.2, 4.2, 4.3; Talend Open Studio 6.2
Scheduling Tools : Dollar Universe (Automic UniViewer), Appworx 8, 9
Other Tools : Toad 9.5, Oracle SQL Developer, SVN, SAP GUI
Operating Systems : Windows, Unix/Linux
Ticketing Tools : ISMP, ServiceNow, BMC Remedy


PROFESSIONAL EXPERIENCE:
1. Working as a Software Engineer for Mobolutions LLC, Plano, Texas, USA, from June 2023 to date.
2. Worked as Consultant - Delivery for Worldline Global Services, Bangalore, from May 2020 to May 2023.
3. Worked as a System Analyst for Atos India Private Ltd., Bangalore, from Aug 2015 to April 2020.
4. Worked as a Software Engineer for Minacs Ltd. (Aditya Birla Minacs), Bangalore, from January 2013 to Aug 2015.
5. Worked as an Intern for Minacs Ltd. (Aditya Birla Minacs), Bangalore, from May 2012 to December 2012.

EDUCATIONAL QUALIFICATION:
B.Tech from JNTU.

PROJECT PROFILE:

1. Brightspeed


Client : Brightspeed
Company : Mobolutions LLC
Duration : May 2023 - Till Date
Technology : SAP BODS 4.2, SAP DI, SAP IS, SAP S/4 BRIM, SAP HANA, SAP GUI, Google BigQuery.
Project Description:
Brightspeed is a broadband and telecommunications services company that serves homes and businesses.
Responsibilities:
Extracted data from the non-SAP source system Google BigQuery and loaded it into SAP HANA staging tables.
Created different staging jobs for extracting the data and loading it into SAP HANA staging tables.
Created multiple jobs, e.g., Customer Master stage, for extracting data from the staging area.
Created different relevancy jobs to extract data from the SAP HANA staging tables, transform it, and load it into data marts by applying validation rules.
Analyzed failed records with the business/functional team and reprocessed them.
After the valid-stage tables, further enriched the data against lookup tables so that it meets SAP standards.
Validation lookups are the most frequently used lookups and were analyzed accordingly.
After all enrichments, the data is loaded into the enriched tables.
Created different jobs to migrate data from SAP HANA to SAP S/4 BRIM using LTMC as the loading technique.
Performed reconciliation after loading the data into SAP tables and sent detailed reports (a sample reconciliation query sketch follows this list).
Used SAP Information Steward for data quality and profiling activities.
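As a minimal illustration of the reconciliation step mentioned above, the checks typically compare record counts between the enriched staging layer and the loaded SAP target and list records that failed to load; all table and column names below are hypothetical, not taken from the project.

    -- Hypothetical reconciliation: compare staging vs. loaded record counts
    SELECT 'ENRICHED_STAGE' AS layer, COUNT(*) AS record_count
    FROM enriched_customer_master
    UNION ALL
    SELECT 'SAP_TARGET' AS layer, COUNT(*) AS record_count
    FROM sap_customer_loaded;

    -- Hypothetical check: records present in staging but missing in the target
    SELECT s.customer_id
    FROM enriched_customer_master s
    LEFT JOIN sap_customer_loaded t
      ON t.customer_id = s.customer_id
    WHERE t.customer_id IS NULL;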


2. PASS SIX Payment Services

Client : Worldline SIX Payment Services
Company : Worldline Global Services
Duration : Mar 2022 - April 2023
Technology : SAP BODS 4.2, SAP S/4HANA, SAP ECC, SAP GUI, MS SQL Server, SQL Server Management Studio, Dollar Universe (Automic UniViewer), Putty, Linux/Unix, WinSCP, WebFOCUS.
Project Description:
Worldline is the global leader in business and payments transactional services. They are in Merchant Services & Terminals, eMobility and eTransactional Services and Financial Processing.
Responsibilities:
Extracted data from SAP ECC as the source and loaded it into SAP HANA staging tables.
Created different staging jobs for extracting the data and loading it into SAP HANA staging tables.
Created multiple jobs, e.g., Material Master stage, for extracting data from the staging area.
Created different relevancy jobs to extract data from the SAP HANA staging tables, transform it, and load it into data marts by applying validation rules.
Analyzed failed records with the business/functional team and reprocessed them.
After the valid-stage tables, further enriched the data against lookup tables so that it meets SAP standards.
Validation lookups are the most frequently used lookups and were analyzed accordingly.
After all enrichments, the data is loaded into the enriched tables.
Created different jobs to migrate data from SAP HANA to SAP S/4 HANA using different mechanisms: IDocs, BAPI functions, and LTMC.
Tested IDocs manually by passing sample data using WE19 and reprocessed them using BD87.
Created a datastore for SAP S/4 HANA and imported the MATMAS_BAPI_03 and DEBMAS IDocs.
Loaded data into SAP S/4 HANA with IDoc status code 53 (application document posted).

3. equensWorldline Data Migration

Client : equensWorldline
Company : Worldline Global Services
Duration : May 2020 - Feb 2022
Technology : SAP BODS 4.2, SAP S/4HANA, SAP ECC, SAP GUI, Oracle 12c, SQL Developer, Dollar Universe (Automic UniViewer), Putty, Linux/Unix, WinSCP, WebFOCUS.
Project Description:
Worldline is the global leader in business and payments transactional services. They are in Merchant Services & Terminals, eMobility and eTransactional Services and Financial Processing.
Responsibilities:
Implemented data migration flows for Material Master.
Extracted legacy data (.CSV files) into BODS 4.2 SP08 using File Locator.
Moved files from one directory to another using Linux/Unix commands.
Split files using Linux/Unix commands and loaded the data accordingly.
Created different jobs for extraction, transformation, and upload of data into SAP S/4 HANA.
Created stage jobs for extracting data from SAP ECC into the Oracle 12c staging area.
Created multiple jobs, e.g., Material Master stage, for extracting data from the staging area.
Created different relevancy jobs to extract data from the staging tables, transform it, and load it into the ODS area by applying validation rules.
Created different jobs for the reconciliation report on the data loaded into SAP S/4 HANA.
Analyzed failed records with the business/functional team and reprocessed them.
Worked on IDocs with message type MATMAS and IDoc type MATMAS03 from SAP BODS.
Loaded data into SAP HANA from ECC EHP8.

4. BCMC (Bancontact/Mister Cash)

Client : Worldline
Company : Worldline Global Services
Duration : Jan 2017 - Apr 2020
Technology : SAP BODS 4.2, SAP HANA, SAP ECC, SAP GUI, Oracle, MS SQL Server, SQL Server Management Studio, Dollar Universe (Automic UniViewer), Putty, Linux/Unix, WinSCP, WebFOCUS.
Project Description:
Worldline is the global leader in business and payments transactional services. They are in Merchant Services & Terminals, eMobility and eTransactional Services and Financial Processing.
This project involves providing architecture support and solution implementation for the Global Data Warehouse and the Credit, Debit, and ERPCRM data marts, and provides support for migration of the existing ETL architecture from BODS 3.2 to Data Services 4.2.
This project is an end-to-end implementation focused on providing solutions to the BCMC customer for various kinds of transactions such as POS, ATM, E&M, and customer-to-customer. In this project, data is profiled and undergoes quality checks with Data Quality transforms.
Data is extracted from multiple source systems into staging and later into the ODS and DWH areas, where it is integrated by applying business rules; dimensions, facts, and aggregates are created, and further reporting is done on the DWH objects.
Responsibilities:
Created Workflows, Data Flows, Transformations and Batch Jobs.
Extracted Data from different sources (SAP ECC, Oracle database and Flat files) and loaded to Target tables.
Implemented SCD Type 1 and Type 2 for the customer-selected dimensions using the Table Comparison, History Preserving, and Key Generation transforms (a conceptual SQL sketch of the Type 2 logic follows this list).
Extensively worked on different transforms like Query, Case, Merge, Map Operation and worked on files as sources and targets.
Used LOOKUP, LOOKUP_EXT and Custom functions to derive the columns in BODS ETL.
Implemented an error handling and email alerting solution for all BODS ETL jobs that automatically sends an email to the concerned team on job failure or success.
Created datastores, imported metadata, and created file formats, batch jobs, data flows, and Query transforms; executed jobs.
Scheduled the batch jobs using the $U (Dollar Universe) scheduling tool and created the job dependencies.
Moved files from one directory to another using Linux/Unix commands.
Split files using Linux/Unix commands and loaded the data accordingly.
Created test logs and documentation after completing development.
Involved in Unit Testing and Integration testing.
Involved in preparing estimations and planning files for new development requests.
Involved in code review, code migration from DEV to other test environments (INT, UAT).
Created PFDs (Production Follow-up Documents) for recovery of aborted jobs, containing complete information about the job, i.e., source, target, and dependencies.
Monitored the batch jobs by using Management Console in other environments (INT, UAT).
Attended the sprint retrospective and sprint planning meetings.
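The Type 2 history handling mentioned above was built with the SAP Data Services transforms; as a rough conceptual equivalent only (hypothetical table and column names, not the actual job design), the logic corresponds to closing the current dimension row and inserting a new version:

    -- Hypothetical SCD Type 2 sketch: close the current version of changed customers
    UPDATE dim_customer d
    SET d.valid_to = SYSDATE,
        d.current_flag = 'N'
    WHERE d.current_flag = 'Y'
      AND EXISTS (SELECT 1
                  FROM stg_customer s
                  WHERE s.customer_id = d.customer_id
                    AND s.customer_name <> d.customer_name);

    -- Insert a new current version for changed or new customers
    INSERT INTO dim_customer (customer_id, customer_name, valid_from, valid_to, current_flag)
    SELECT s.customer_id, s.customer_name, SYSDATE, NULL, 'Y'
    FROM stg_customer s
    WHERE NOT EXISTS (SELECT 1
                      FROM dim_customer d
                      WHERE d.customer_id = s.customer_id
                        AND d.current_flag = 'Y'
                        AND d.customer_name = s.customer_name);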

5. ERPCRM/ERPGVA (Business Intelligence Data Warehouse Application)
Client : Worldline
Company : Atos India Private Ltd.
Duration : Aug 2015 - Dec 2016
Technology : SAP BODS 3.2, 4.2, SAP ECC, Oracle 12c/11g, Dollar Universe (Automic UniViewer), SQL Developer, MS SQL Server, Putty, Linux/Unix, WinSCP, WebFOCUS.
Project Description:
Worldline is the global leader in business and payments transactional services. They are in Merchant Services & Terminals, eMobility and eTransactional Services and Financial Processing.
This project involves providing Architecture support and solution implementation for the Global Data Warehouse and Credit, Debit and ERPCRM Data Marts.
ERPCRM is the acronym for Enterprise Resource Planning Customer Relationship Management. It is the data warehouse system for all activities of Worldline except the handling of financial transactions. It therefore stores information about the terminals and card readers (stock, manufacturing, orders to the suppliers, and orders from the customers), the feed & services (work orders for technicians), the contracts with customers (sales, renting, leasing, and maintenance of terminals), and the use of bank cards.
Responsibilities:
Built BODS jobs, workflows, data flows, and ABAP data flows as per the mapping specifications.
Preparing the Technical Specification documents from Functional specifications.
Worked extensively on different types of transforms and used ABAP data flows for extracting data from ECC tables.
Extensively used ETL to load data from SAP ECC tables to SQL server.
Implemented an error handling and email alerting solution for all BODS ETL jobs that automatically sends an email to the concerned team on job failure.
Created scripts using BODS.
Actively involved in creating datastores, file formats, and system configurations.
Involved in preparation and execution of unit, integration, and end-to-end test cases.
Ensured proper dependencies and proper running of loads (incremental and delta loads).
Scheduled Batch jobs using Data Services Management Console.
Experienced in extracting data from different source files such as .txt, Excel, and XML files.
Created sequential and parallel workflows and data flows.
Executed the jobs using the Management Console.
Actively involved in hypercare activities during production.
Coordinating and interfacing with clients and the technical team.
Supported the TCC testing team for the issues in INT and UAT environments.
Attended the sprint retrospective and sprint planning meetings.

6. MMS-GDC (Business Intelligence Data Warehouse Application)
Client : AHM (American Honda Motor Company)
Company : Minacs Ltd.
Duration : Jan 2013 - Aug 2015
Technology : SAP BODS 4.0, Oracle 11g, Sybase, LogiXML, TOAD, Putty, WinSCP, Appworx 8, 9.

Project Description:
The Business Intelligence Data Warehouse Application is used for tracking sales and marketing for automobile giants like Honda. The application can send service reminders and special campaign offers.
Customer Information Reports: give all information about customer status (Active, Inactive, Lost, Hibernating, and Prospect) month-wise and quarter-wise, plus Top N customer reports for identifying loyal customers.
Response Revenue Reports: give all information about the revenue generated customer-wise, dealer-wise, district-wise, zone-wise, region-wise, and at the overall national level for each month, quarter, and year.

Responsibilities:
1. Scheduled and monitored jobs in the administrative Management Console.
2. With predefined SQLs, manually verified the data loads and job execution status in the control tables (a sketch of such a check follows this list).
3. Expertise in generating reports and addressing ad-hoc report requests from customers using Oracle and Toad.
4. Documented resolutions for known issues and responsible for updating and uploading the documentation to SharePoint.
5. Involved in end-to-end testing after daily delta loads.
6. Tested the entire AHM customer portal after the daily delta loads and data refresh.
7. Attended the sprint retrospective and sprint planning meetings.
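A minimal sketch of the kind of predefined SQL used for this verification, assuming a hypothetical control table (table and column names are illustrative only, not the project's actual objects):

    -- Hypothetical control-table check: latest execution status per batch job
    SELECT job_name,
           run_date,
           status,
           records_loaded
    FROM etl_job_control
    WHERE run_date >= TRUNC(SYSDATE)
    ORDER BY job_name;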

Personal Details:
Name : Veerasekhar Raju C
DOB : 26-Feb-1989