Okta
Staff Data Engineer, Federal
At a Glance
- Location: Washington, District of Columbia, United States
- Work Regime: Hybrid
- Experience: 8+ years
- Posted: March 10, 2026
Key Requirements
Required Skills
Domain Knowledge
- Automation
- Engineering
- Government
- Insurance
- Regulatory
Requirements
Experience: 8+ years in a data engineering role, with proven experience in a senior or lead capacity. Experience with Federal government compliance and security requirements is highly desirable.
Technical Expertise: Expert-level experience with SQL, ETL/ELT tools (Airflow, dbt), and MPP databases (Snowflake, Redshift). Extensive hands-on experience with AWS services (S3, Lambda, EMR, ECR, EKS).
Modern Architectures: Deep experience designing and working with lakehouse architectures (Databricks) and modern file formats (Iceberg, Delta).
Infrastructure & Automation: Mastery of Terraform for IaC and a strong track record of implementing best practices for CI/CD (GitHub/GitLab) and containerization (Docker).
Security Acumen: In-depth knowledge of security monitoring and vulnerability scanning tools (Splunk, Nessus, Qualys, Falco) and best practices for securing data platforms.
What success looks like
Compensation & Benefits
Making Social Impact
Developing Talent and Fostering Connection + Community at Okta
Some roles may require travel to one of our office locations for in-person onboarding.
Okta is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, marital status, age, physical or mental disability, or status as a protected veteran. We also consider for employment qualified applicants with arrest and conviction records, consistent with applicable laws.
Responsibilities
Lead the architecture, design, and development of highly scalable data platforms using AWS, Snowflake, dbt (Core and Cloud), and Airflow.
Own and manage the team's infrastructure as code (IaC) strategy using Terraform, ensuring consistent, secure, and reproducible environments.
Architect, develop, and maintain robust CI/CD pipelines for data platform applications using GitHub and GitLab.
Containerize applications and services using Docker to ensure portability and scalability across all environments.
Mentor other engineers and drive technical strategy for troubleshooting and resolving complex issues related to data infrastructure, security, and workflows.
Define and enforce data security and compliance standards, leading initiatives for vulnerability scanning (e.g., Nessus, Qualys) and patch management.