
Gajalakshmi - Data Analyst
[email protected]
Location: Dearborn, Michigan, USA
Relocation: Open to relocate
Visa: OPT
Employer Details:
E-mail: [email protected]
Phone: +1 (732) 481-2818
Summary:
Data Engineer & Analyst specializing in building scalable data pipelines and transforming complex datasets into actionable insights. Experienced with SQL, Python, cloud platforms, and BI tools, designing robust ETL workflows, data models, and performance dashboards that support strategic decision-making across public sector and financial services environments.
Technical Skills:
Programming Languages: Python, SQL, Java
Data Analysis & Analytics: Exploratory Data Analysis (EDA), Statistical Analysis, A/B Testing, Impact Analysis, Data Validation, Data Cleansing
Data Engineering / ETL: ETL Pipelines, Data Transformation, Data Integration, Data Modeling, Data Migration
Big Data & Processing: Apache Spark, Databricks
Workflow Orchestration: Apache Airflow
Cloud Platforms: AWS (S3), Azure, Google Cloud Platform (GCP)
Databases & Data Warehousing: Oracle Database, Snowflake
Data Visualization & BI: Power BI, Tableau
Machine Learning: Regression, Clustering, Predictive Modeling, Feature Engineering
Data Tools: Excel (Advanced Excel, Pivot Tables, Data Reconciliation)
DevOps / CI-CD: CI/CD Pipelines, Cloud Build, Artifact Registry
APIs & Development: REST APIs, Tableau Extension API
Operating Systems: Windows, macOS, Linux
Methodologies: Agile, Cross-Functional Collaboration
Domains Worked: Healthcare, State Government, Education, Nonprofit (Crime Analytics), Banking & Finance

WORK EXPERIENCE
State of Michigan - DHHS, Lansing, MI Sep 2025 - Present
Data Analyst
Responsibilities:
Analyzed large, multi-source health services datasets through exploratory data analysis (EDA) in Python to identify patterns, trends, anomalies, and risks impacting program outcomes and compliance.
Developed and optimized advanced SQL queries to extract, transform, and validate data for reporting and analytical requests.
Designed interactive Power BI dashboards to monitor performance metrics, compliance indicators, and workload distribution for data-driven decision making.
Performed advanced Excel analysis, including data cleansing, pivot tables, and reconciliation checks, to ensure reporting accuracy and support executive decision-making.
Executed A/B testing and impact analysis in collaboration with cross-functional Agile teams to evaluate performance changes and quantify business outcomes.
Translated policy and operational requirements into scalable data models, KPIs, and dashboards to assess program effectiveness and workload distribution.
University of Michigan-Dearborn, MI, USA Sep 2024 - Sep 2025
Grad - Data Analyst
Responsibilities:
Developed a Python-based application for Windows, macOS, and Linux that automated complex data cleaning and document extraction, delivering ready-to-use CSVs to non-technical users and reducing processing time by 80%.
Designed interactive dashboards and performed advanced data cleaning using Python and Power BI, enhancing data accessibility and enabling real-time decision-making for academic advisors.
Performed Extract, Transform, Load (ETL) processes on multi-term enrollment data to uncover registration trends, identify capacity constraints, and support data-driven course planning decisions.
Applied statistical analysis and machine learning techniques to predict student success rates, leveraging historical data to identify patterns and optimize academic support interventions.
Provided technical support to students, improving their efficiency with academic software tools for coursework and research.

Advocates for Community Transformation (Act), TX, USA May 2024 - Jul 2024
Data Analyst Intern
Responsibilities:
Created interactive dashboards using Power BI to present crime trends and actionable insights to stakeholders.
Utilized Python and SQL to clean, transform, and prepare complex datasets for analysis, ensuring data accuracy and consistency across multiple sources.
Designed predictive models using clustering and regression algorithms, combined with geospatial mapping tools, to identify high-crime areas, improving resource allocation by 20%.

Oracle, India Aug 2022 - Aug 2023
Data Engineer
Responsibilities:
Developed automated pipelines using Apache Airflow, Spark, and AWS S3 to unify payments data from multiple systems, improving data availability and reliability for analytics teams.
Built machine learning models in Python to assess customer creditworthiness using transaction and loan data, improving credit scoring accuracy by 15% and enabling more reliable lending decisions.
Migrated financial datasets from on-premise Oracle databases to Snowflake, optimizing query performance and enabling faster reporting through scalable cloud storage.
Created Python scripts and ETL workflows to automate loan data validation and reconciliation, reducing manual effort and ensuring accurate financial records.
Partnered with product and finance teams to evaluate payment and loan performance using SQL and Tableau, uncovering key drivers of transaction volume and cost.
Integrated GenAI-driven insights into Tableau dashboards using the Tableau Extension API, enabling dynamic, context-aware analytics and improving decision-making efficiency across teams.

HSBC - India Jun 2020 - Aug 2022
Data Engineer
Responsibilities:
Designed and optimized ETL pipelines for high-volume transaction data, improving data ingestion efficiency by 40% using Databricks.
Used SQL to query and analyze payment traffic, tracking transaction performance and building interactive Tableau dashboards to support business decisions.
Implemented Python scripts to automate data validation, transformation, and preprocessing workflows, reducing manual effort by 60%.
Developed and applied machine learning models to improve merchant fraud detection, increasing model accuracy by 25% and reducing false positives by 15%.
Collaborated with cross-functional teams to design scalable data workflows and ensure smooth integration between analytical and transactional systems.

Software Engineer Jul 2019 - Jun 2020
Responsibilities:
Architected and implemented business logic for Corporate Banking using Java, Spring Boot, and REST APIs, significantly enhancing backend processing capabilities.
Managed CI/CD pipelines on Google Cloud Platform (GCP) using Cloud Build and Artifact Registry to streamline release cycles.
Collaborated with cross-functional teams to design scalable data workflows and ensure smooth integration between analytical and transactional systems.
EDUCATION
Master of Science in Data Science - University of Michigan, GPA: 3.94/4.0 - Apr 2025
Bachelor of Technology in Information Technology - Anna University, GPA: 3.69/4.0 - Apr 2019
PROJECTS
Financial Transactions Pipeline
Developed an end-to-end ETL pipeline using Apache Airflow, Spark, and AWS S3 to process 10M+ daily transaction records. Improved data refresh speed by 40% and enhanced data reliability for downstream analytics.
Public Service Performance Dashboard
Built interactive Power BI dashboards using Azure-integrated data to visualize case workloads, response times, and regional trends. Enabled leadership teams to identify inefficiencies and improve service delivery metrics by 15%.
Credit Risk Prediction Model
Designed and deployed a Python-based machine learning model using logistic regression and feature engineering on customer loan data. Improved credit risk classification accuracy by 18%, supporting better lending decisions.
AWARDS
Received Star Performer, Pat on the Back, and Top Performer awards for significant project contributions.
