Bo Guo - Data Analyst |
[email protected] |
Location: Vancouver, British Columbia, Canada |
Relocation: Yes |
Visa: |
Resume file: Xuanbo Guo_Data Analyst_1755035898250.pdf |
Xuanbo Guo
+1 (236) 858-3330 | [email protected] | Greater Vancouver, Canada

OBJECTIVE
Business and Data Analyst with over 6 years of experience in the utilities industry, specializing in complex business process analysis, workflow optimization, and the development of scalable, efficient solutions. Adept at bridging the gap between business and technology teams to deliver strategic, data-driven outcomes.
- Skilled in gathering and transforming business requirements into functional deliverables while ensuring regulatory and organizational compliance.
- Proficient in eliciting, analyzing, and documenting complex requirements through workshops, interviews, and stakeholder engagement.
- Strong expertise in SQL query development, optimization, and debugging for big data processing.
- Experienced in .NET application development, collaborating with developers to build scalable, business-driven solutions.
- In-depth knowledge of mass-market utility operations, including billing, metering, customer engagement, and regulatory requirements; able to anticipate business implications of current industry trends and regulations.
- Extensive experience with agile methodologies, including sprint planning, user story creation, and coordination between business and technical teams; familiar with both Agile and Waterfall frameworks for IT project implementation.
- Strong background in data processing, management, and optimization across multiple operational processes, including data quality, lineage, modeling, metadata practices, and data flow analysis to improve business outcomes.
- Proficient in Tableau for dashboard development, data slicing, and reporting; skilled in creating visually compelling presentations to support strategic programs.
- Adept at data analysis, reporting, and visualization using Excel and Google Sheets.
- Known for exceptional communication, critical thinking, and stakeholder management.
- Highly skilled in bridging communication gaps between business and IT teams; able to present complex ideas effectively to both technical and non-technical audiences.
- Experienced in documenting workflows, creating test cases, and improving operational efficiency.
- Competent in identifying gaps in current data models and business processes to enhance data integrity and performance.

SKILLS
Programming and Tools: Python, MySQL, Azure, Wireshark, Windows, Linux, Virtual Machines (VM), Shell Scripting, PowerShell, Git, Jenkins, ServiceNow
Software Testing: Test Planning, Test Case Management, UI Testing, Regression Testing, Integration Testing, Functional Testing, Automation Testing (Selenium)
Documentation: Bug Reports, Performance Monitoring, Log Analysis, Crash Reporting, Knowledge Base Articles, Runbooks, Incident Reports
Monitoring & Support: Application Monitoring (e.g., Dynatrace, Nagios, Splunk), Alert Management, Root Cause Analysis (RCA), Incident & Problem Management (ITIL)
ITSM & DevOps: IT Service Management (ITSM), Change Management, Release Management, Configuration Management, CI/CD Pipelines, Ticketing Systems (JIRA, ServiceNow)
Soft Skills: Analytical Thinking, Problem Solving, Communication, Team Collaboration, Customer Support

EXPERIENCE
Microsoft (Contractor), Remote, Canada
Business Analyst | Jun 2025 - Present
- Performed end-to-end manual testing for software applications across multiple platforms, including web, mobile, Windows OS, and AI-integrated features, ensuring consistent functionality and user experience across diverse environments and device configurations.
- Partnered with developers and product managers to design detailed test plans and write comprehensive test cases aligned with evolving software requirements; actively participated in sprint planning and daily stand-ups to prioritize QA tasks within Agile workflows.
- Logged, tracked, and validated software defects using industry-standard bug tracking tools; delivered clear, reproducible bug reports that accelerated resolution timelines and improved release quality.
- Executed tests to validate data integrity, device compatibility, and edge-case scenarios, focusing on successful and timely data releases across hardware-dependent features and legacy systems.
- Identified inefficiencies in QA workflows and proactively recommended updates to testing methodologies; contributed to the adoption of new tools and testing strategies to optimize test coverage and team productivity.
- Built data pipelines to extract, transform, and load raw data and performed analytics on the transformed data using AWS services (S3, Glue, Redshift, EMR, RDS, DynamoDB, Lambda, Athena, KMS), focusing on high availability, fault tolerance, and auto-scaling with AWS CloudFormation.
- Monitored regulatory updates to ensure compliance with industry standards in all developed solutions.
- Operated within an agile framework, collaborating with cross-functional teams to keep projects aligned with business objectives; provided regular stakeholder updates on progress, highlighting risks and proposing solutions to maintain project timelines.
- Designed and reviewed test cases to validate application functionality against business requirements; conducted thorough testing phases, identifying defects early and contributing to their resolution.
- Performed detailed business process analysis and requirement elicitation through stakeholder engagement, producing well-documented specifications for development teams.
- Leveraged Excel and Google Sheets for advanced data analysis, reporting, and business insights.
- Utilized business analysis tools, including process modeling, requirements documentation, and project management software, to drive successful project outcomes.
- Managed competing priorities in a dynamic, fast-paced environment, ensuring timely, high-quality project delivery.
- Applied IT project management methodologies, including Agile and Waterfall, to align projects with stakeholder expectations and business goals.
- Led the design and deployment of custom Veeva CRM solutions, enhancing team productivity by automating critical sales processes.
- Developed and refined data analysis models using Python, enabling predictive insights that improved business decision-making.
- Employed SQL and Python for advanced data analytics and reporting, improving data accuracy and minimizing redundancy.
- Designed automated CI/CD pipelines using Jenkins, streamlining the development cycle through automated testing and deployments.

Client: AGA FinCorp Inc, Remote
Sr. Business/Data Analyst | Nov 2021 - May 2022
- Generated reports using SQL Server Reporting Services (SSRS); developed, deployed, and managed parameterized, linked, sub-, drill-down, ad-hoc, and Crystal reports (table, chart, and matrix) with thorough knowledge of the report service architecture.
- Used Reporting Services to deliver output to client requirements, including dashboards, scorecards, and graphical reports.
- Developed and maintained interactive dashboards and reports in Power BI, providing real-time insights and visualizations for stakeholders.
- Configured automated database backups, integrity checks, and deployment rollbacks using Azure DevOps Pipelines.
- Optimized SQL query performance by offloading heavy computations to PySpark, improving execution speed and resource utilization.
- Transformed complex datasets into meaningful business intelligence using Power BI's data modeling, DAX, and Power Query functions.
- Streamlined database schema changes with Azure DevOps YAML pipelines for consistent, traceable deployments.
- Migrated around 50 reports from Sisense to SSRS.
- Developed complex stored procedures in T-SQL to generate ad-hoc reports within SQL Server Reporting Services (SSRS).
- Built scalable ETL pipelines with PySpark to ingest, transform, and load data into SQL databases efficiently.
- Built predictive models using Python's scikit-learn to support business decision-making; wrote custom Python scripts to parse, clean, and aggregate data from multiple sources.
- Created RDLs for Sisense reports being converted to SSRS.
- Enabled continuous integration and versioning of SQL stored procedures and scripts through Azure DevOps Repos.
- Integrated PySpark with Hive and HDFS for seamless big-data access and transformation; built reusable PySpark modules for data ingestion, transformation, and aggregation workflows.
- Applied sound knowledge of relational and multidimensional databases, data migration, system and data integration, reporting, analytics, and data management.
- Implemented robust database testing strategies within Azure DevOps to ensure code quality and minimize deployment errors; ensured data accuracy and consistency across EDW and Data Mart environments.
- Wrote T-SQL for complex stored procedures, triggers, indexes, views, user functions, and SQL query joins to maintain data integrity and normalization.
- Provided visual/graphic analysis based on data content; created and edited SQL queries with inner, left, and right joins in SSMS.
- Created various database objects, including stored procedures, functions, tables, and views.
- Coordinated and converted 50+ reports to SSRS, serving as the key point of contact for migration and support issues.
- Modified numerous reports and stored procedures that were underperforming (slow response times, inconsistent results, poor report formatting); created reports that retrieve data via parameterized stored procedures.
Environment: Microsoft Visual Studio 2019, Sisense, Power BI, Azure DevOps, Tableau, PySpark, SSMS 2019, SSRS

Crane Worldwide Logistics, Remote
Sr. Data Analyst | Aug 2019 - Jul 2021
Responsibilities:
- Provided strategic market intelligence, enhancing business performance through identified growth and optimization opportunities.
- Implemented robust data pipelines, ensuring high platform reliability, security, and scalability.
- Collaborated cross-functionally to develop effective data visualizations, dashboards, and reports, driving data-driven decision-making and business intelligence.
- Leveraged Python, VBA, and SQL to automate processes, clean data, and enhance reporting capabilities for better operational efficiency.
- Maintained data accuracy and regulatory compliance by performing detailed data validation and integrity checks.
- Optimized SQL queries, performed indexing, and managed partitioning to improve database performance and query efficiency.
- Created and maintained reports using Excel, Google Sheets, QlikView, and other business intelligence tools, supporting KPIs and providing actionable insights.
- Developed and refined VBA macros to automate repetitive tasks and improve data processing workflows.
- Maintained and enhanced SAP BusinessObjects and other reporting systems to ensure effective data governance, compliance, and integration with relational databases such as Oracle.
- Supported AWS infrastructure to ensure high availability, scalability, and performance tuning of the cloud environment.
- Designed and implemented data transformation strategies to ensure seamless data integration, accuracy, and quality.
- Applied Scrapy for efficient web scraping, gathering external data for competitive analysis and business intelligence.
- Collaborated in project management using tools such as Trello and Lucidchart, assisting in workflow visualization and process optimization across teams.
- Drove cost reduction initiatives by optimizing data processes and ensuring the most efficient use of resources within the organization.
- Regularly performed data cleaning and ensured data reliability to provide accurate, actionable insights for business operations.
- Supported strategic decision-making through detailed analysis and reports, using data analytics and market insights to forecast trends and opportunities.
Environment: Python, Spark, Spark Streaming, Spark SQL, AWS EMR, S3, EC2, MapR, HDFS, Hive, Pig, Apache Kafka, Sqoop, Scala, PySpark, shell scripting, Linux, MySQL, NoSQL, Jenkins, Eclipse, Oracle, Git, Tableau, Power BI, Agile methodologies, GCP, Google BigQuery, Pub/Sub, Cloud Functions

EDUCATION
Master in Management Information Systems, Temple University - Philadelphia, PA (2020 - 2021)
Bachelor in Supply Chain Management, Temple University - Philadelphia, PA (2014 - 2017)

PROJECTS
Publication: NetVisionary: An Automated and Scalable Suricata-Based IDS Platform
IEEE Canadian Conference on Electrical and Computer Engineering, 2025
Abstract: This paper presents NetVisionary, an automated and scalable platform built on Suricata for efficient intrusion detection. Evaluated using IDS benchmark datasets, Suricata is selected for its superior multi-threading and low false-positive rate.

OWASP Top Ten and Cyber Security Testing, Department of Computer Science, New York Institute of Technology
This project bridges academic concepts with practical cybersecurity testing. Understanding and mitigating the OWASP Top Ten and classic vulnerabilities such as buffer overflow is critical for any cyber defender. Through hands-on testing, analysis, and secure coding practices, students gain skills directly applicable to real-world security engineering roles.