| Aishwarya - Sr Cloud Data Engineer |
| [email protected] |
| Location: Princeton Junction, New Jersey, USA |
| Relocation: yes |
| Visa: H1B |
| Resume file: Aishwarya_SCloud Data Engineer_Resume_1763667595525.docx |
|
Aishwarya
609 934 5280
_____________________________________________________________________________________

Summary:
Senior Cloud Data Engineer/Data Analyst with 8+ years of demonstrated experience interpreting, analyzing, and migrating data by building ETL pipelines, and visualizing data to drive business solutions. Exceptional knowledge of analytics tools, ETL tools, and cloud platforms for big data analysis, gained on projects across leading consulting firms.

Technical Skills:
Software Products: Jupyter Notebook, VSCode, Eclipse, PyCharm, Spyder, RStudio, PuTTY, GitHub, IBM AIX, SAP Business Intelligence (BI), Informatica Lifecycle Management (ILM), SAS Administrator
ETL and CI/CD Tools: AWS Glue, AWS DMS, AWS SCT, Azure Data Factory (ADF), Informatica PowerCenter, SAP BODS, Databricks, Jenkins, Apache Sqoop, OpenShift, CDM, dbt
Frameworks: Apache Spark, Hadoop, Airflow, NiFi, Kubernetes, Apache Kafka, AWS Lightsail, OpenRouter
Databases/Data Warehouses: MySQL, MS SQL Server, Oracle DB, Presto, Apache Hive, PostgreSQL, Snowflake, Teradata, AWS Redshift, AWS RDS, AWS Aurora, BigQuery, Daiquery
Cloud Platforms: AWS, Azure, Google Cloud Platform (GCP)
BI Reporting/Visualization: Tableau, Microsoft Power BI, Microsoft Excel (advanced), Unidash
ML/AI Libraries: NumPy, Scikit-Learn, PyTorch, TensorFlow, Keras, Matplotlib, SciPy, Pandas
Data Governance/PM Tools: Resource Monitoring Tool, Dynatrace, Informatica Data Quality (IDQ), Oracle Enterprise Data Quality (EDQ), ServiceNow, JIRA, Confluence, Microsoft SharePoint, Visio, Mural
Languages: SQL, T-SQL, PL/SQL, SnowSQL, Python, PySpark, Scala, R, Unix, Java, JavaScript
Certifications: SQL Server, Power BI, Tableau, AWS, Azure, GCP, Google Analytics, Google Agentspace

Education:
Master's - Rutgers Business School, Rutgers University - Newark, NJ - 2020
Bachelor's - Sastra University - Thanjavur, India - 2016

Professional Work Experience:

Deloitte Consulting LLP, NJ, USA | March 2022 - Present
Role: Senior Cloud Data Engineer
Responsibilities:
- Perform data analysis and create governance policies for BI platforms (Tableau, Power BI) to monitor and resolve dashboard performance issues such as slow queries, high CPU/disk usage, and high memory usage.
- Create JIRA user stories, facilitate Scrum meetings and Sprint planning, follow Agile methodologies, and present governance and measurement steps that end users/admins can follow, using Resource Monitoring Tool (RMT) and Dynatrace for monitoring and alerting.
- Deployed Cloud Mart, a GenAI-powered e-commerce application with a chatbot that demonstrates how modern businesses solve customer-support challenges, increasing efficiency by 60%, built on an AWS Lightsail container integrated with the OpenRouter API.
- Analyze and map high-volume data and create ER diagrams to surface insights; migrate data from a legacy MS SQL Server database to PostgreSQL using AWS services including S3, SCT, Glue, DMS, Redshift, RDS, Aurora, CloudWatch, and IAM.
- Lead high-volume data analysis and data mapping to migrate data from a legacy Oracle DB to PostgreSQL using Azure Data Factory (ADF) and GitHub for a healthcare-domain client.
- Manage high-volume data migration from a client SQL Server to BigQuery using Dataproc, Dataflow, Cloud Composer, Cloud Monitoring, and Cloud IAM on Google Cloud Platform (GCP); create test plans to validate data in DEV, TEST, and PROD environments.
- Automate ETL/ELT processes using Python, Postgres, PySpark, OpenShift, Hadoop, Jenkins (CI/CD), and GitHub, saving up to 5-9 hours/day.
- Perform data integrity, quality, and profiling checks using the Oracle Enterprise Data Quality (EDQ) tool.
- Perform data analysis using advanced SQL (query optimization, stored procedures, CTEs, and joins) to build Unidash dashboards measuring targeted, reached, and engaged audiences for small business groups at Meta, improving customer engagement and retention by 40%.
- Manage privacy/compliance-related projects to maintain compliance with a third-party ads company.
- Develop efficient, scalable Python data pipelines to move data between diverse groups at Meta by creating DAGs using tools such as Hive, Presto, Daiquery, CDM, VSCode, and PL/SQL.

Capgemini America Inc, NJ | Feb 2020 - March 2022
Role: Senior Data Analyst
Responsibilities:
- Led data migration from various sources to Snowflake using ingestion tools such as AWS Glue, AWS DMS, NiFi, Apache Sqoop, and AWS S3, with SnowSQL and Python. Performed data quality checks and profiling in AWS Glue to verify data integrity, quality, and consistency before migration, enabling a seamless move to the cloud data warehouse.
- Automated code testing with Python in Jupyter Notebook, cutting SIT/UAT time by 4 hours.
- Managed end-to-end ETL/ELT data migration from sources such as BW and SAP S/4HANA to target Oracle tables using SAP BusinessObjects Data Services (BODS). Performed troubleshooting via log monitoring and root-cause analysis during failures, triaging issues and avoiding pipeline delays.
- Migrated healthcare data from IBM AIX to Linux using the SAS Administration tool and PuTTY.
- Analyzed financial KPIs related to healthcare billing and budgets, forecasted claims trends, created reports, and optimized cost projections using Power BI and SQL.
- Analyzed customer account and device discrepancies across Billing (Gateway), Unified Inventory (UIM), and Broadband Provisioning System (BPS) using PL/SQL and Tableau to build a solution.
Ana-Data Consulting Inc, NJ | June 2019 - Feb 2020
Role: Data Analyst
Responsibilities:
- Analyzed total annual returns of JP Morgan's Emerging Market Bond Index investment data based on Moody's bond factors for developing countries using Excel, and performed regression and clustering analysis with ML models and statistical algorithms in R and Python, delivering actionable insights on emerging-market bonds and improving investment-strategy and risk-assessment accuracy.
- Analyzed complex CRM e-commerce sales data for a Super Store by creating DAX measures and interactive, scalable Power BI dashboards on the company's web server to track metrics such as sales variance, trends, forecasts, customer retention, current- vs. prior-year transactions, and top-selling products, and to give recommendations to clients. This enabled data-driven sales decisions with real-time sales performance, improving business visibility and customer-retention tracking.
- Documented all project activities, created reports, and presented them to the client weekly, enhancing stakeholder engagement and project transparency.
- Performed heart-disease analysis using ML algorithms to identify age and health factors that contribute to heart attacks. Designed a Power BI dashboard that users can utilize to build preventive measures; later integrated AI that helped detect risk at an earlier stage.

Tata Consultancy Services Pvt. Ltd, India | Jun 2016 - Aug 2018
Role: Data Engineer
Responsibilities:
- Analyzed data, generated reports, and created visualizations for informed decision-making using the SAP Business Intelligence (BI) tool.
- Managed DR and critical production migration activities using the Informatica PowerCenter ETL tool and PuTTY (Unix). Provided source and target connections and user access to the required databases. Automated the migration process, yielding cost and time savings of up to 8 hours/day.
- Monitored data quality and data integrity using the Informatica Data Quality (IDQ) tool.
- Managed the complete data lifecycle, including creation, storage, usage, and retirement, with a focus on optimizing utility, minimizing costs, and reducing risks, using the Informatica Lifecycle Management (ILM) tool. This facilitated storage optimization, data quality, security, and data retention and archival.
- Documented change requests and incidents in ServiceNow, ensuring audit readiness.