
New Requirement Lead Data Engineer (Snowflake / dbt / Qlik) ||| Dallas, TX; Raleigh, NC; Phoenix, AZ (Onsite role from Day 1) at Dallas, Texas, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=3179650&uid=75196fd6ee1d4e498d8a1e4ffabf5835

Role - Lead Data Engineer (Snowflake / dbt / Qlik)

Location - Dallas, TX; Raleigh, NC; Phoenix, AZ (Onsite role from Day 1)
Duration - Long-term Contract
Interview - Phone and Webex

Only or - with passport Number

Implementation Partner - TCS

Experience - 12-14+ years required

Rate - $90/hr C2C (Including VMS charges)

Job Description

Required Technical Skills (must be reflected in the resume)
Snowflake (production hands-on experience)
dbt Cloud (Data Vault 2.0 + Dimensional modeling)
Airflow (MWAA)
Terraform (HashiCorp practices)
Advanced SQL & Python
AWS basics (S3, IAM, CloudWatch)
Snowflake governance automation & data classification
Iceberg / External tables / Event-driven ingestion (Kafka)
Data observability tools (Great Expectations, Monte Carlo, etc.)
BI/Semantic modeling (ThoughtSpot, Looker, Power BI)
SRE & Platform Engineering practices
SCIM/SSO integrations (Okta)

Key Responsibilities
Data Modeling & Warehousing
Implement Raw Data Vault 2.0 (Hubs/Links/Satellites) and Consumption Layer patterns using dbt Cloud.
Build optimized Snowflake objects (tables, streams, tasks, materialized views).
Optimize clustering, micro-partitioning, and query performance.
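For candidates unfamiliar with Data Vault 2.0, the core modeling idea above is that Hubs, Links, and Satellites are keyed by deterministic hash keys derived from business keys. A minimal sketch in plain Python (the normalization rules and delimiter are illustrative assumptions, loosely mirroring what a dbt surrogate-key macro does):

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Data Vault 2.0-style hash key: normalize the business key(s),
    join with a delimiter, and hash. Deterministic, so reloading the
    same source row always yields the same Hub/Link key."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# A Hub row carries the hash key plus its business key; a Link's
# hash key combines the business keys of the Hubs it connects.
customer_hk = hash_key("C-1001")
order_hk = hash_key("O-555")
customer_order_link_hk = hash_key("C-1001", "O-555")
```

Because the key is computed, not assigned, parallel loads into Hubs and Links need no lookups against a sequence or identity column.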

Data Ingestion & Orchestration
Design scalable ingestion frameworks using Qlik / Glue / ETL tools.
Author and manage Airflow (MWAA) DAGs and/or dbt Cloud orchestration.
Develop idempotent, rerunnable workflows with SLAs and lineage tracking.
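"Idempotent, rerunnable" above means a rerun for the same logical date must overwrite its own output rather than append duplicates. A toy sketch of the pattern in plain Python (the in-memory `store` stands in for a partitioned Snowflake table; names are illustrative):

```python
from datetime import date

def load_partition(store: dict, run_date: date, rows: list) -> None:
    """Idempotent load: each run (re)writes exactly one partition keyed
    by run_date, so rerunning the same date replaces the prior output
    instead of duplicating it."""
    store[run_date.isoformat()] = list(rows)

store = {}
load_partition(store, date(2026, 3, 1), [1, 2, 3])
load_partition(store, date(2026, 3, 1), [1, 2, 3])  # rerun: safe, no duplicates
```

In Airflow terms, this is the delete-insert / overwrite-by-partition pattern keyed on the DAG run's logical date.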

Security & Governance
Implement RBAC/ABAC, masking policies, row-level access controls, tagging, and data classification in Snowflake.
Enforce audit-ready controls including change management, role segregation, and evidence tracking.
Support regulatory compliance requirements.
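The masking-policy behavior expected here can be illustrated with a plain-Python sketch (role names and the redaction rule are assumptions; in Snowflake this logic would live in a `CREATE MASKING POLICY` expression keyed on `CURRENT_ROLE()`):

```python
def mask_email(value: str, role: str) -> str:
    """Role-aware column masking, analogous to a Snowflake masking
    policy: privileged roles see the raw value, all others see a
    redacted form that preserves only the domain."""
    if role in {"PII_ADMIN", "COMPLIANCE"}:
        return value
    local, _, domain = value.partition("@")
    return "***@" + domain
```

The same decision table (role -> raw vs. redacted) is what auditors typically ask to see as evidence of access segregation.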

Infrastructure as Code & DevOps
Use Terraform with Git-based CI/CD pipelines for infrastructure provisioning and promotion.
Maintain environment separation (DEV/QA/UAT/PROD).
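Environment separation implies a fixed promotion path: a change moves one stage at a time and never skips ahead to PROD. A minimal sketch of that gate in plain Python (stage names taken from the list above; the one-step rule is an illustrative assumption):

```python
ENVS = ["DEV", "QA", "UAT", "PROD"]

def promote(env: str) -> str:
    """Promotion path through separated environments: each change
    advances exactly one stage (DEV -> QA -> UAT -> PROD), never
    skipping a stage; PROD is terminal."""
    i = ENVS.index(env)
    if i == len(ENVS) - 1:
        raise ValueError("PROD is the final environment")
    return ENVS[i + 1]
```

In practice this gate is enforced by the Git-based CI/CD pipeline (branch protections plus per-environment Terraform workspaces), not application code.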

Data Quality & Observability
Embed testing within dbt (uniqueness, freshness, relationships).
Implement reconciliation checks and anomaly detection.
Monitor via Snowflake ACCOUNT_USAGE and integrate logs with SIEM/APM tools.
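A reconciliation check of the kind listed above usually reduces to comparing source and target row counts within a tolerance. A dependency-free sketch (the relative-drift rule and default tolerance are illustrative assumptions):

```python
def reconcile(source_count: int, target_count: int,
              tolerance: float = 0.0) -> bool:
    """Row-count reconciliation between a source system and its
    Snowflake target: passes when relative drift is within tolerance
    (0.0 = exact match required)."""
    if source_count == 0:
        return target_count == 0
    drift = abs(source_count - target_count) / source_count
    return drift <= tolerance
```

Tools like Great Expectations or Monte Carlo generalize this to freshness, distribution, and schema checks, but the pass/fail contract is the same.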

Cost & Performance Optimization
Right-size Snowflake warehouses, configure auto-suspend/resume.
Implement resource monitors and concurrency scaling.
Drive FinOps and cost optimization practices.
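Right-sizing decisions like those above typically come down to two signals from warehouse metering: query queuing (scale up) and sustained low utilization (scale down). A toy decision rule in plain Python (the thresholds and the size ladder are illustrative assumptions, not Snowflake defaults):

```python
SIZES = ["XSMALL", "SMALL", "MEDIUM", "LARGE"]

def right_size(current: str, avg_queued: float, avg_util: float) -> str:
    """Toy warehouse right-sizing rule: step up one size when queries
    regularly queue, step down one size when utilization stays low,
    otherwise hold."""
    i = SIZES.index(current)
    if avg_queued > 1 and i < len(SIZES) - 1:
        return SIZES[i + 1]
    if avg_util < 0.3 and i > 0:
        return SIZES[i - 1]
    return current
```

Combined with auto-suspend/auto-resume and resource monitors, a periodic job applying a rule like this is a common FinOps starting point.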

--

05:39 PM 03-Mar-26


