Tatari

Sr Data Quality Engineer - SDET

At a Glance

Location
New York, United States
Work Regime
Hybrid
Experience
5+ years
Posted
March 20, 2026

Key Requirements

Required Skills

  • Data Engineering
  • Data Science
  • Databricks
  • Microservices
  • Python
  • SQL

Domain Knowledge

  • Automation
  • Engineering

Benefits & Perks

Time Off

Unlimited PTO and sick days
Monthly Company Productivity Perk

Health Insurance

Health insurance coverage for you and your dependents

Requirements

5+ years in data quality engineering, data engineering, or backend engineering with a strong focus on automated testing and data validation

Strong proficiency in Python and SQL, including experience with relational and cloud-hosted databases

Experience building test automation and validation frameworks for data pipelines, APIs, and microservices using pytest or similar tools

Experience with data pipeline orchestration tools (e.g., Airflow, Databricks) and cloud data infrastructure

Experience with data quality frameworks such as Great Expectations is a strong plus

Comfortable working across engineering, data science, and product to communicate data quality standards and tradeoffs to both technical and non-technical stakeholders
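To make the requirements above concrete, here is a minimal sketch of the kind of pytest-style data validation framework the role describes. The `orders` table, its columns, and the quality rules are invented for illustration, and sqlite3 stands in for a cloud-hosted relational database.

```python
# Hypothetical sketch: express data quality rules as SQL predicates and
# assert on them from pytest-style test functions. All names are invented.
import sqlite3


def make_db():
    """Build an in-memory table with a few sample rows."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
    db.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [(1, 19.99, "US"), (2, 5.00, "EU"), (3, 42.50, "US")],
    )
    return db


def count_violations(db, table, predicate_sql):
    """Count rows that violate a quality rule expressed as a SQL predicate.

    `table` and `predicate_sql` are trusted, engineer-authored inputs in
    this sketch (not user input), so string formatting is acceptable here.
    """
    (n,) = db.execute(
        f"SELECT COUNT(*) FROM {table} WHERE NOT ({predicate_sql})"
    ).fetchone()
    return n


def test_amounts_are_positive():
    assert count_violations(make_db(), "orders", "amount > 0") == 0


def test_regions_are_known():
    assert count_violations(make_db(), "orders", "region IN ('US', 'EU')") == 0
```

Rules written this way run unchanged against any SQL backend, which is what lets a framework like this scale from a local fixture to production pipelines.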

Compensation & Benefits

Total compensation: $140,000-$170,000 annually

Equity compensation

Health insurance coverage for you and your dependents

401K, FSA, and commuter benefits

$150 monthly spending account

$1,000 annual continued education benefit

Responsibilities

Build and maintain data validation frameworks for critical measurement and reporting pipelines

Develop automated data quality checks across ingestion, transformation, and reporting layers

Own pytest infrastructure improvements and drive integration testing patterns across the engineering org

Partner with engineers, data scientists, and product managers to debug complex data issues and define data quality standards

Evaluate, adopt, and own data quality and lineage tooling to improve pipeline observability and traceability

Define and enforce data quality standards for new feature launches and pipeline changes
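An automated check spanning pipeline layers, as described in the responsibilities above, could look roughly like this sketch: verify that a transformation step preserved row counts and produced no NULL join keys. The table names (`raw_events`, `clean_events`) and key column are invented, with sqlite3 standing in for the warehouse.

```python
# Hypothetical cross-layer data quality check. Returns failures as strings
# so a scheduler or pytest wrapper can report them; all names are invented.
import sqlite3


def build_demo_db():
    """In-memory stand-in for an ingestion layer and its transformed output."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE raw_events (event_id TEXT, payload TEXT)")
    db.execute("CREATE TABLE clean_events (event_id TEXT, payload TEXT)")
    rows = [("a", "x"), ("b", "y"), ("c", "z")]
    db.executemany("INSERT INTO raw_events VALUES (?, ?)", rows)
    db.executemany("INSERT INTO clean_events VALUES (?, ?)", rows)
    return db


def scalar(db, sql):
    """Run a single-value query and return that value."""
    (n,) = db.execute(sql).fetchone()
    return n


def check_transformation(db, source, target, key):
    """Return human-readable failures; an empty list means the check passed."""
    failures = []
    if scalar(db, f"SELECT COUNT(*) FROM {target}") != scalar(
        db, f"SELECT COUNT(*) FROM {source}"
    ):
        failures.append(f"{target} row count differs from {source}")
    if scalar(db, f"SELECT COUNT(*) FROM {target} WHERE {key} IS NULL") > 0:
        failures.append(f"{target}.{key} contains NULLs")
    return failures
```

In practice, checks like these are typically scheduled per pipeline run (e.g. as an orchestrator task) so a failed validation blocks downstream reporting rather than silently propagating bad data.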