Tenable, Inc.

Staff Software Engineer - Event Sourcing/Stream Processing - Kafka


At a Glance

Location
Remote - California - Bay Area
Work Regime
Remote
Experience
8+ years
Posted
February 10, 2026

Key Requirements

Required Skills

AWS, DevOps, Java, Kafka, Kotlin, Microservices, PostgreSQL, SQL, Terraform

Domain Knowledge

  • Education
  • Engineering
  • Insurance
  • Medical

Requirements

8+ years of Backend Engineering experience with a focus on high-volume data processing or distributed systems.

You should understand memory management and performance within the JVM ecosystem.

Stream Processing Architecture: Proven experience with streaming platforms such as Kafka (ideally), AWS Kinesis, or similar.

You understand topics, partitions, and how to process streams of data asynchronously.
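The per-key ordering guarantee behind that requirement comes from how records are assigned to partitions. A minimal sketch in Java (Kafka's default partitioner uses a murmur2 hash; plain `hashCode` is used here purely for illustration, and `partitionFor` is a hypothetical helper, not a Kafka API):

```java
public class PartitionSketch {
    // Assign a record to a partition by hashing its key, similar in spirit
    // to Kafka's default partitioner. Records with the same key always land
    // on the same partition, which preserves per-key ordering.
    static int partitionFor(String key, int numPartitions) {
        int hash = key.hashCode();
        return (hash & Integer.MAX_VALUE) % numPartitions; // mask the sign bit
    }

    public static void main(String[] args) {
        int partitions = 6;
        // Every event for "asset-42" maps to the same partition, so a single
        // consumer of that partition sees the asset's events in order.
        int p1 = partitionFor("asset-42", partitions);
        int p2 = partitionFor("asset-42", partitions);
        System.out.println(p1 == p2); // prints true: same key, same partition
    }
}
```

This is why choosing the partition key (e.g. an asset ID) matters: it trades parallelism for ordering on exactly the entities that need it.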

Distributed Systems Knowledge: You understand the challenges of microservices, eventual consistency, and data resiliency.

You understand how to take a stream of raw data and "collapse" it into a current status.
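"Collapsing" a stream into a current status is essentially a keyed fold over the event log. A minimal sketch, assuming a hypothetical `StatusEvent` shape (this is not Tenable's actual data model) and last-write-wins semantics:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class CollapseStream {
    // Hypothetical event: which asset changed, when, and its new status.
    record StatusEvent(String assetId, long timestamp, String status) {}

    // Collapse an ordered stream of raw events into the current status per
    // asset: later events for the same key overwrite earlier ones.
    static Map<String, String> collapse(List<StatusEvent> events) {
        Map<String, String> current = new HashMap<>();
        for (StatusEvent e : events) {
            current.put(e.assetId(), e.status()); // last write wins
        }
        return current;
    }

    public static void main(String[] args) {
        List<StatusEvent> events = List.of(
            new StatusEvent("asset-1", 1, "VULNERABLE"),
            new StatusEvent("asset-2", 2, "VULNERABLE"),
            new StatusEvent("asset-1", 3, "PATCHED"));
        // asset-1 collapses to PATCHED; asset-2 stays VULNERABLE.
        System.out.println(collapse(events));
    }
}
```

At scale the same fold runs incrementally (e.g. as a compacted topic or a streaming aggregation) rather than by replaying the full history, but the semantics are the same.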

Responsibilities

We are looking for a Staff Software Engineer to join our VM Platform team.

Our team sits at the center of our Tenable One architecture; we ingest massive volumes of asset and finding data from collection teams, process it to calculate the "state of the world" for our customers, and feed it to downstream search and reporting products.

We are not just building web apps; we are solving a complex Big Data problem.

You will build and maintain the high-throughput, event-driven pipelines responsible for processing the history of assets and vulnerabilities.

You will move beyond simple CRUD operations to design systems that handle massive scale, ensuring that when we say an asset is vulnerable (or patched), that data is accurate and available in real time.