
Data Quality Engineer - Hybrid in Boston, MA - (GC & H4 needed) - C2C at Boston, Massachusetts, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2117169&uid=

From: Kiran Kumar, Panzer Solutions LLC ([email protected])

Reply to: [email protected]

Job Description:

Position: Data Quality Engineer

Duration: Long Term

Location: Hybrid in Boston, MA

Who We Are Looking For

The CASM (Cyber AI App & Attack Surface Management) Platform Product & Engineering team builds robust, scalable, enterprise-level Cyber AI applications and a platform to manage vulnerabilities, risk, security configurations, and attack surface oversight across State Street Bank. We are seeking an experienced, hands-on Senior Data and Application Quality Assurance (SQA) Engineer. This role focuses on end-to-end quality assurance testing, covering data pipelines, API integrations, backend services, user interfaces, and automation across our Cyber AI application platform. The position plays a crucial role in ensuring the integrity, reliability, and security of the CASM platform that supports State Street's cybersecurity, vulnerability, and risk management needs. We have multiple openings for this role, open to candidates with varying levels of experience.

About CASM Platform

The CASM (Cyber AI App & Attack Surface Management) team is focused on building a robust, scalable, enterprise-level CASM Platform to support cybersecurity, vulnerability management, risk management, and attack surface oversight across State Street Bank. As part of our mission, we build Cyber AI applications and leverage third-party products to manage vulnerabilities, risk exposure, cyber incidents, and continuous compliance, driving a unified platform approach for data and insights at enterprise scale.

Role Responsibilities

As a Senior Data and Application SQA Engineer on the CASM Platform, you will:

Develop Comprehensive Test Strategies: Create and implement test plans covering end-to-end (E2E) testing, including data source validation, API testing, data pipeline automation, user-plane and backend services, GUI, and web application testing.

Automate Test Workflows: Design and maintain automated test scripts for data pipelines, leveraging tools like Databricks for DQM (Data Quality Management) and ensuring data accuracy and quality from ingestion through to analysis layers.
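For illustration only, a minimal sketch of the kind of automated data-quality check this responsibility implies, written as a PySpark test of the sort that could run on Databricks. The table names, columns, and rules are assumptions, not details from this posting.

# Illustrative sketch only: a PySpark-style data-quality check of the kind a
# Databricks DQM test might automate. Table names, columns, and thresholds
# are hypothetical assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-check").getOrCreate()

RAW_TABLE = "raw.vulnerability_findings"          # assumed ingestion-layer table
CURATED_TABLE = "curated.vulnerability_findings"  # assumed analysis-layer table

raw = spark.table(RAW_TABLE)
curated = spark.table(CURATED_TABLE)

# Completeness: no rows should be lost between ingestion and analysis layers.
assert curated.count() >= raw.count(), "Row count dropped during transformation"

# Validity: key fields in the curated layer must not be null.
null_keys = curated.filter(F.col("asset_id").isNull() | F.col("cve_id").isNull()).count()
assert null_keys == 0, f"{null_keys} curated rows have null asset_id/cve_id"

# Uniqueness: (asset_id, cve_id) should identify a finding exactly once.
dupes = curated.groupBy("asset_id", "cve_id").count().filter(F.col("count") > 1).count()
assert dupes == 0, f"{dupes} duplicate (asset_id, cve_id) pairs found"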

Collaborate Across Teams: Work closely with Product Owners, Data Engineers, AI Application Engineers, and DevOps teams to understand requirements, identify test scenarios, and troubleshoot issues efficiently.

Data and API Validation: Conduct in-depth testing of data workflows, ETL/ELT jobs, and API integrations, verifying data integrity, transformation accuracy, and adherence to security protocols.
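As a hedged illustration of the API-integration testing described above, the sketch below is a pytest-style contract check against a hypothetical CASM endpoint; the URL, authentication scheme, and response fields are assumptions, not details from the posting.

# Illustrative sketch only: an API-contract test of the kind described above.
# The endpoint, auth scheme, and response fields are hypothetical assumptions.
import os
import requests

BASE_URL = os.environ.get("CASM_API_URL", "https://casm.example.com/api/v1")
TOKEN = os.environ.get("CASM_API_TOKEN", "")

def test_findings_endpoint_contract():
    resp = requests.get(
        f"{BASE_URL}/findings",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"limit": 10},
        timeout=30,
    )
    # Integrity and security checks: authorized request, well-formed payload.
    assert resp.status_code == 200
    body = resp.json()
    assert isinstance(body.get("items"), list)
    for item in body["items"]:
        # Each finding should carry the keys downstream transformations depend on.
        assert {"asset_id", "cve_id", "severity"} <= item.keys()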

Continuous Integration & Deployment (CI/CD): Implement and maintain CI/CD pipelines to automate testing, security scanning, and deployment for data and application workflows.

Test Backend Services and SQL Workflows: Execute SQL and Python-based tests to validate database operations, backend services, and data plane consistency.
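A minimal sketch of a SQL-plus-Python reconciliation test of the kind this item describes, assuming a hypothetical SQLAlchemy connection string and reporting tables that are not named in the posting.

# Illustrative sketch only: a SQL reconciliation test between a source table
# and a backend reporting rollup. The DSN, tables, and columns are assumptions.
import sqlalchemy as sa

engine = sa.create_engine("postgresql://user:pass@localhost/casm")  # assumed DSN

def test_severity_rollup_matches_source():
    with engine.connect() as conn:
        source = conn.execute(sa.text(
            "SELECT severity, COUNT(*) FROM findings GROUP BY severity"
        )).fetchall()
        rollup = conn.execute(sa.text(
            "SELECT severity, finding_count FROM reporting.severity_rollup"
        )).fetchall()
    # The reporting rollup must agree with a direct aggregation of the source table.
    assert dict(source) == dict(rollup)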

Web Application and GUI Testing: Perform web application and GUI testing, ensuring alignment with CASM platform requirements.
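For illustration, a short Selenium-based GUI smoke test in Python; the dashboard URL, page title, and element locators are assumptions, not details from the posting.

# Illustrative sketch only: a Selenium smoke test for a GUI workflow of the kind
# described above. URL, title, and locators are hypothetical assumptions.
from selenium import webdriver
from selenium.webdriver.common.by import By

def test_dashboard_renders_findings_table():
    driver = webdriver.Chrome()
    try:
        driver.get("https://casm.example.com/dashboard")  # assumed URL
        assert "CASM" in driver.title
        rows = driver.find_elements(By.CSS_SELECTOR, "table#findings tbody tr")
        # The findings table should render at least one row for a seeded test tenant.
        assert len(rows) > 0
    finally:
        driver.quit()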

Drive Quality Assurance Standards: Lead initiatives to establish testing best practices, promote data accuracy, and enhance system reliability through rigorous, continuous testing.

Monitor and Report Quality Metrics: Establish KPIs for testing effectiveness, create detailed reports on testing outcomes, and participate in defect analysis to improve testing processes.

Minimum Qualifications

Education: Bachelor's degree in Computer Science, Information Systems, or a related field, or equivalent professional experience.

Experience: 5+ years in a quality assurance role with a focus on data and application testing, including hands-on experience in data pipeline testing, API, and backend service validation.

Automation Tools: Proven experience with Databricks, test automation frameworks (e.g., Selenium, Postman), and data quality tools (Databricks DQM or similar).

Programming Languages: Proficiency in SQL and Python for writing automated tests and conducting data validation.

Cloud Knowledge: Familiarity with AWS services (e.g., S3, EC2, Lambda) and experience with CI/CD tools like Jenkins, GitLab, or AWS CodePipeline.

Security Testing: Knowledge of SCAS, SAST, DAST/WAS, and experience within secure SDLC frameworks.

Agile Methodologies: Strong understanding of Agile practices and experience in Agile/Scrum environments.
