
Abhishek Rauniyar - Data Analyst
[email protected]
Location: Charlotte, North Carolina, USA
Relocation: Open To Relocate
Visa: GC Holder
ABHISHEK RAUNIYAR | Data Analyst | Green Card Holder
NC | +1 (980) 505-7767 | [email protected] | LinkedIn
SUMMARY
Results-driven Data Analyst with 8+ years of experience delivering actionable insights, interactive dashboards, and scalable data solutions across banking, financial services, insurance, and industrial manufacturing sectors. Adept at transforming raw, multi-source data into impactful visualizations and predictive models using SQL, Python, R, Tableau, and Power BI. Proven track record of enhancing data pipelines, automating ETL workflows, and optimizing reporting infrastructures across cloud and on-premises environments (AWS, Azure, Redshift, SQL Server, Oracle). Expert in statistical modeling, forecasting, and root cause analysis, with strong command of regulatory reporting (SOX, audit readiness), operational KPIs, treasury analytics, and risk scoring systems. Skilled in working with large-scale datasets, conducting time series analysis, and deploying enterprise-wide dashboards that improve decision-making speed by up to 40%. Collaborative leader with experience coordinating Agile teams, integrating Salesforce CRM, and executing data governance across diverse business functions.
TECHNICAL SKILLS
Programming Language
Python (pandas, NumPy, scikit-learn), R, SQL, MDX, DAX, SAS
Visualization Tools
Tableau, Power BI, Microsoft Excel (Pivot Tables, VLOOKUP), SSRS, SSAS
Database
SQL Server, Oracle, Teradata, DB2, Redshift
Cloud Technologies
AWS (S3, Redshift), Azure (Analysis Services, Synapse), Tableau Server
Advanced Analytical Skills
Risk Modeling, Trend Analysis, KPI Tracking, Statistical Analysis, Root Cause Analysis, Data Lineage Documentation
ETL Tools
SQL, Python (ETL), Informatica, SSIS
Automation & Scheduling
Autosys, SQL Server Agent, Python Scripting
Other Skills
Data Cleaning, Data Manipulation, Data Transformation, Data Governance, Business Intelligence, Self-Service Analytics
Methodologies
Agile (Scrum), Waterfall
Tools & IDEs
Excel (Advanced), PowerPoint, SharePoint, Jira, Confluence, TFS
CRM
Salesforce CRM
EXPERIENCE
Data Analyst (SQL/Power BI/Tableau/Dashboard Developer) Mar 2017 - May 2025
First Hawaiian Bank

Architected enterprise-wide Tableau Server infrastructure with custom permission controls and SSL configuration to ensure secure self-service analytics, reducing access request bottlenecks by 60%.

Leveraged AWS S3 and Redshift to automate ingestion of large-scale treasury transaction data, enabling scalable cloud-based storage and querying, which decreased data retrieval latency by 35% for BI reports.

Employed Python libraries such as pandas, NumPy, and statsmodels to perform outlier detection and statistical anomaly checks on daily financial feeds, reducing data quality issues by 25% and improving executive trust in risk dashboards.
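As an illustrative sketch of the kind of statistical anomaly check described above (column names, values, and the z-score threshold are hypothetical, not drawn from the actual bank feeds):

```python
import pandas as pd

def flag_outliers(df: pd.DataFrame, col: str, z_thresh: float = 1.5) -> pd.DataFrame:
    """Flag rows whose value in `col` lies more than `z_thresh`
    standard deviations from the column mean (a simple z-score test)."""
    mean, std = df[col].mean(), df[col].std()
    out = df.copy()
    out["is_outlier"] = (df[col] - mean).abs() > z_thresh * std
    return out

# Hypothetical daily feed: one transaction is far outside the usual range.
feed = pd.DataFrame({"txn_amount": [100.0, 105.0, 98.0, 102.0, 99.0, 5000.0]})
flagged = flag_outliers(feed, "txn_amount")
```

In practice a production check would layer in robust statistics (median/MAD) or statsmodels-based tests, since a single extreme value inflates the mean and standard deviation it is measured against.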

Automated ETL processes using SQL and Python to import daily transactional data into Power BI models, increasing data freshness by 70% for treasury analytics.
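A minimal sketch of such a SQL + Python daily load step (table and field names are hypothetical, and SQLite stands in for the reporting database that fed the Power BI models):

```python
import sqlite3

# Hypothetical daily extract, e.g. parsed from a CSV drop.
daily_rows = [
    ("2024-01-02", "ACCT-1", 1250.00),
    ("2024-01-02", "ACCT-2", -310.50),
]

conn = sqlite3.connect(":memory:")  # stand-in for the reporting database
conn.execute("CREATE TABLE txn (txn_date TEXT, account TEXT, amount REAL)")

# Idempotent daily load: clear the day's partition, then bulk insert,
# so a rerun never double-counts transactions.
conn.execute("DELETE FROM txn WHERE txn_date = ?", ("2024-01-02",))
conn.executemany("INSERT INTO txn VALUES (?, ?, ?)", daily_rows)
conn.commit()

total = conn.execute(
    "SELECT SUM(amount) FROM txn WHERE txn_date = ?", ("2024-01-02",)
).fetchone()[0]
```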

Devised interactive Power BI dashboards with advanced DAX calculations to visualize credit risk exposures, allowing executives to identify high-risk assets 40% faster.

Spearheaded integration of Azure Analysis Services to enrich Power BI reports with real-time financial data streams, resulting in 15% greater forecasting accuracy.

Formulated custom Tableau visualizations blending Oracle, Teradata, and Excel data to monitor deposit trends, which helped the marketing team boost retention rates by 18%.

Monitored Tableau Server usage and performance metrics, creating audit trails that reduced compliance gaps and ensured 100% SOX audit readiness.

Constructed KPI dashboards for operational efficiency metrics using dual-axis charts and LOD expressions in Tableau, leading to a 30% improvement in branch-level performance monitoring.

Revamped legacy SSRS reports into interactive dashboards using Power BI, shortening reporting turnaround time by 3 business days for regulatory reporting teams.

Utilized R in combination with SQL to perform time series forecasting and statistical trend analysis on treasury liquidity data, enhancing the bank's ability to proactively manage cash reserves and resulting in a 20% improvement in liquidity planning accuracy.
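The original work used R; as a language-agnostic baseline, the trend-following idea can be sketched in Python with a simple moving-average forecast (the balances below are hypothetical):

```python
from statistics import mean

def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window`
    observations -- a simple baseline for liquidity planning."""
    if len(series) < window:
        raise ValueError("series shorter than window")
    return mean(series[-window:])

# Hypothetical daily liquidity balances (in $M).
liquidity = [102.0, 104.0, 101.0, 105.0, 107.0, 106.0]
next_day = moving_average_forecast(liquidity, window=3)
```

A real treasury forecast would use proper time series models (ARIMA, exponential smoothing) with seasonality; the moving average only shows the shape of the workflow.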
Data Analyst (ETL/SQL Developer) Dec 2015 - Mar 2017
TIAA (Retirement & Investing Division)

Developed ETL workflows using SSIS to aggregate retirement account balances across multiple custodians, reducing reconciliation time by 40%.

Analyzed fund performance using complex SQL joins and window functions, enabling product teams to optimize investment products for underperforming segments.
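A hedged sketch of the window-function pattern described above (fund names and returns are synthetic; SQLite 3.25+ stands in for the production database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fund_return (fund TEXT, qtr TEXT, ret REAL)")
conn.executemany(
    "INSERT INTO fund_return VALUES (?, ?, ?)",
    [
        ("GROWTH", "Q1", 0.04), ("GROWTH", "Q2", 0.02),
        ("INCOME", "Q1", 0.01), ("INCOME", "Q2", 0.03),
    ],
)

# Window function: each row keeps its quarterly return but gains the
# fund-level average, so under/over-performing quarters stand out
# without collapsing rows the way GROUP BY would.
rows = conn.execute(
    """
    SELECT fund, qtr, ret,
           AVG(ret) OVER (PARTITION BY fund) AS fund_avg
    FROM fund_return
    ORDER BY fund, qtr
    """
).fetchall()
```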

Scripted Python-based automation workflows to cleanse and transform monthly retirement fund inflows, cutting data preparation time by 35% and ensuring timely report readiness for compliance reviews.

Integrated Salesforce CRM data with SQL Server to enrich participant contribution analytics, enabling a 20% improvement in cross-platform engagement metrics and personalized retirement plan recommendations.

Implemented version-controlled Tableau dashboards with parameterized filters and action links, enhancing user adoption by 25% among financial advisors.

Refactored SAS-based transformation logic into SQL Server stored procedures, decreasing runtime by 50% and improving maintainability.

Visualized 401(k) plan contributions using Tableau Gantt charts and trend lines to support plan sponsors in tracking participant behavior, increasing plan insights by 30%.

Synthesized multi-source data (Oracle, Excel, flat files) into a centralized DataMart to support audit compliance, achieving 100% SLA adherence during quarterly reviews.

Facilitated Agile ceremonies using JIRA and Confluence to break down deliverables, track progress, and document data lineage, resulting in on-time delivery of 5+ high-priority analytics projects.

Instituted scheduled report delivery using Tableau and SSRS for monthly investment committee meetings, automating 90% of manual reporting tasks.

Employed R (tidyverse, ggplot2) to build retirement fund performance visualizations and statistical summaries for quarterly reports, enabling investment managers to detect underperforming sectors 15% earlier and rebalance portfolios more effectively.
Business Intelligence Developer Jul 2014 - Dec 2015
Wells Fargo (Wholesale Insurance, Home Mortgage)

Modeled mortgage and insurance claim data using star and snowflake schema to improve OLAP cube design, which accelerated report load times by 45%.

Engineered Python scripts to automate validation of mortgage data pipelines, reducing manual QA time by 40% and ensuring consistent accuracy across risk modeling inputs.

Coded complex MDX queries for custom SSAS cubes that powered actuarial risk dashboards, enabling underwriters to make faster decisions on high-value policies.

Transformed VBScript-based reporting logic into modern SSIS pipelines and SQL procedures, cutting support ticket volumes by 30%.

Consolidated disparate insurance data sources using Informatica, creating a unified staging environment that enhanced cross-policy visibility by 50%.

Administered Tableau dashboards for senior mortgage officers, including geographic heat maps and delinquency indicators, which increased executive insight by 40%.

Coordinated onshore and offshore development teams, managing sprint deliverables and facilitating stakeholder demos, leading to a 20% reduction in project backlogs.

Executed end-to-end data lineage documentation across SSIS/SSRS/Power BI environments to meet stringent internal audit requirements.

Launched automated nightly job schedules in Autosys and SQL Server Agent, ensuring 99.9% uptime for all risk and performance reports across divisions.

Integrated R scripts into SSIS workflows to conduct regression-based risk modeling on mortgage defaults, improving the accuracy of risk flags by 12%, which supported more informed underwriting decisions.
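The R-in-SSIS integration itself cannot be reproduced here; as a standalone sketch in Python with synthetic data, a regression-based default-risk score might be fit as follows (feature names, data, and the linear probability model are illustrative assumptions, not the bank's actual model):

```python
import numpy as np

# Synthetic training data: loan-to-value and debt-to-income ratios
# versus an observed default indicator (0 = repaid, 1 = defaulted).
X = np.array([
    [0.60, 0.20], [0.70, 0.25], [0.80, 0.35],
    [0.90, 0.40], [0.95, 0.45], [0.50, 0.15],
])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 0.0])

# Linear probability model via least squares -- a simple stand-in for
# the logistic regression typically used for default risk.
A = np.column_stack([np.ones(len(X)), X])  # prepend an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def risk_score(ltv: float, dti: float) -> float:
    """Predicted default risk for a hypothetical loan profile."""
    return float(coef[0] + coef[1] * ltv + coef[2] * dti)
```

Higher leverage should yield a higher score; e.g. `risk_score(0.95, 0.45)` exceeds `risk_score(0.50, 0.15)` on this training set.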
Data Analyst Dec 2013 - Jun 2014
Honeywell

Designed dynamic Excel models with pivot tables and conditional logic to track production efficiency across 10+ manufacturing lines, which led to a 22% improvement in plant-level performance review cycles.

Synthesized real-time inventory data using VLOOKUP and nested formulas to produce automated Excel dashboards for procurement teams, reducing order delays by 18% through proactive stock replenishment alerts.

Queried SQL Server databases to extract downtime and throughput data, enabling root cause analysis of recurring bottlenecks, which supported a 15% decrease in unplanned machine outages.

Developed interactive Tableau dashboards visualizing KPIs such as yield rate and asset utilization by plant and shift, which improved executive decision-making efficiency by 30% during quarterly reviews.

Programmed Python-based automation scripts to cleanse and merge equipment telemetry and ERP datasets, eliminating 90% of manual pre-processing time and improving model-ready data quality for predictive maintenance.
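A minimal sketch of such a cleanse-and-merge step using only the standard library (machine IDs, fields, and plant names are hypothetical):

```python
import csv
import io

# Hypothetical extracts: machine telemetry and ERP asset records.
telemetry_csv = "machine_id,temp_c\nM-01,71.5\nM-02,64.0\n"
erp_csv = "machine_id,plant\nM-01,Charlotte\nM-02,Greensboro\n"

def rows(text):
    """Parse a CSV string into a list of dicts keyed by header."""
    return list(csv.DictReader(io.StringIO(text)))

# Index ERP records by machine_id, then enrich each telemetry row --
# the join that was previously done by hand before modeling.
erp_by_id = {r["machine_id"]: r for r in rows(erp_csv)}
merged = [
    {**t, "plant": erp_by_id[t["machine_id"]]["plant"]}
    for t in rows(telemetry_csv)
    if t["machine_id"] in erp_by_id  # drop orphan telemetry rows
]
```

At production scale this join would typically be done in pandas or in the database, but the dictionary-index pattern is the same.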

Rectified inconsistent supplier data by standardizing units, correcting mismatched fields, and filtering duplicates, enhancing the accuracy of supplier compliance reports and reducing exception handling by 25%.
EDUCATION
Bachelor's in Engineering Dec 2014
Illinois Institute of Technology, IL