Constant Contact
Senior Data Engineer
At a Glance
- Location: US or Remote, Ontario, Canada
- Work Regime: Remote
- Experience: 5+ years
- Posted: March 9, 2026
Key Requirements
- Domain Knowledge: Education, Engineering, Legal
Benefits & Perks
A competitive benefits package that supports the health and well-being of employees.
Requirements
5+ years of professional experience in Data Engineering, with a focus on building large-scale, high-throughput data platforms, is required.
Proficiency in SQL and extensive experience with modern cloud data warehouses (e.g., Snowflake, Databricks, BigQuery), including advanced concepts like performance tuning, clustering, and materialized views.
Deep expertise in Python for data engineering, specifically for building and optimizing complex data processing applications.
Proven experience with a modern data pipeline orchestration tool (e.g., Apache Airflow, Prefect, or Dagster) and data transformation tool (dbt).
Demonstrable experience with streaming data technologies (e.g., Apache Kafka, Kinesis, or Spark Streaming) to handle real-time customer engagement data.
Strong understanding of DataOps principles, CI/CD practices, and using Infrastructure as Code (IaC) tools.
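The streaming requirement above can be sketched in plain Python: a minimal tumbling-window aggregation over engagement events. This is an illustrative stand-in, not Kafka/Kinesis code — the event tuples, window size, and event-type names are all assumptions for the sketch:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Count engagement events per (window_start, event_type).

    `events` is an iterable of (epoch_seconds, event_type) tuples —
    a stand-in for records consumed from a stream such as Kafka.
    """
    counts = defaultdict(int)
    for ts, event_type in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        counts[(window_start, event_type)] += 1
    return dict(counts)

# Illustrative events: three clicks and one open across two 60-second windows.
events = [(0, "click"), (10, "click"), (65, "open"), (70, "click")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'open'): 1, (60, 'click'): 1}
```

In a real pipeline the event loop would consume from a broker and flush closed windows to the warehouse; the windowing arithmetic itself is the same.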
Responsibilities
This role moves beyond data transformation and focuses on architecture, optimization, and technical leadership.
Architect Scalable Systems: Design, develop, and maintain highly scalable, fault-tolerant, and performant ELT/ETL data pipelines using modern cloud data services to process billions of customer events daily.
Technical Leadership: Serve as a technical leader within the data team, defining data engineering standards, code review processes, and architectural best practices. Mentor junior and mid-level engineers on complex data challenges and best-in-class solutions.
Data Modeling & Optimization: Lead the design and implementation of optimized data models (e.g., Data Vault, Kimball, or other dimensional models) in our cloud data warehouse (e.g., Snowflake, BigQuery) to support real-time reporting, internal analytics, and machine learning initiatives.
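As an illustration of the dimensional-modeling point above, here is a minimal Kimball-style sketch in plain Python — assigning surrogate keys to a dimension and resolving fact rows through them. The table shape, column names, and key-assignment scheme are hypothetical, chosen only to show the idea:

```python
def build_dimension(rows, natural_key):
    """Assign monotonically increasing surrogate keys to unique natural keys."""
    dim = {}
    for row in rows:
        nk = row[natural_key]
        if nk not in dim:
            dim[nk] = {"surrogate_key": len(dim) + 1, **row}
    return dim

def build_facts(events, dim, natural_key):
    """Replace the natural key on each fact row with the dimension's surrogate key."""
    return [
        {"customer_sk": dim[e[natural_key]]["surrogate_key"], "event": e["event"]}
        for e in events
    ]

customers = [{"email": "a@x.com", "plan": "pro"}, {"email": "b@x.com", "plan": "free"}]
dim = build_dimension(customers, "email")
facts = build_facts([{"email": "b@x.com", "event": "open"}], dim, "email")
print(facts)  # [{'customer_sk': 2, 'event': 'open'}]
```

In a warehouse this logic lives in SQL/dbt rather than application code, but the contract is identical: facts never carry natural keys, only surrogate keys into conformed dimensions.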
Infrastructure as Code (IaC): Implement and manage data infrastructure using Terraform or CloudFormation to ensure reproducible and reliable deployment of data environments.
Ensure Data Quality & Governance: Establish advanced monitoring, alerting, and automated testing frameworks (e.g., dbt tests, Great Expectations) to enforce high standards of data quality, integrity, and regulatory compliance (GDPR, CCPA) across all data assets.
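The testing frameworks named above (dbt tests, Great Expectations) ship `not_null` and `unique` checks; a plain-Python sketch of those two checks conveys the pattern. Note this is neither tool's actual API — the functions, rules, and column names are illustrative assumptions:

```python
def check_not_null(rows, column):
    """Return rows where `column` is missing or None (empty list = pass)."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once (empty list = pass)."""
    seen, dupes = set(), []
    for r in rows:
        v = r.get(column)
        if v in seen and v not in dupes:
            dupes.append(v)
        seen.add(v)
    return dupes

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},      # fails not_null on "email"
    {"id": 1, "email": "c@x.com"}, # fails unique on "id"
]
print(check_not_null(rows, "email"))  # [{'id': 2, 'email': None}]
print(check_unique(rows, "id"))       # [1]
```

Wiring checks like these into CI — failing the pipeline when either returns a non-empty list — is the essence of the monitoring and automated-testing responsibility described above.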