AEVEX Aerospace

Software Engineer 3 - Sensor Fusion and Perception

At a Glance

Location
Tampa, Florida, United States
Experience
5+ years
Posted
March 25, 2026

Key Requirements

Required Skills

  • Computer Vision
  • PyTorch
  • TensorFlow

Domain Knowledge

  • Aerospace
  • Defense
  • Embedded Systems
  • Robotics

Benefits & Perks

Health Insurance

A full suite of comprehensive benefits, including a 401(k) retirement plan.

Requirements

5+ years of relevant industry or research experience in robotics, autonomous systems, or navigation algorithm development.

Experience with computer vision and ML frameworks such as OpenCV, PyTorch, or TensorFlow.

Experience with embedded systems, real-time computing, and deploying algorithms to fielded platforms.

Experience leading or significantly contributing to R&D projects involving autonomy or robotics.

Experience with robotics middleware (ROS/ROS2) and simulation environments (Gazebo, CARLA).

Background in aerospace, defense, or autonomous ground/aerial vehicle development.

Responsibilities

Design, implement, and deploy robust sensor fusion and state estimation algorithms for autonomous navigation in degraded GPS environments.

Lead development of perception and localization systems that integrate data from vision, IMU, GNSS, and alternative navigation sensors.

Develop and optimize algorithms for real-time performance on embedded and edge compute platforms (e.g., Jetson, FPGA, ARM SoCs).

Perform simulation-based validation and hardware-in-the-loop testing using environments such as ROS, Gazebo, or CARLA.

Drive research into emerging techniques, including deep learning for perception and SLAM, and assess applicability to navigation problems.

Interface with internal and external collaborators to support test campaigns, data collection, and field evaluations.
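To illustrate the sensor fusion and state estimation work described above, here is a minimal sketch of a one-dimensional Kalman filter that blends a prior estimate with noisy position measurements. This is an illustrative toy only, not the team's actual stack: fielded systems fusing vision, IMU, and GNSS typically use multi-state EKF/UKF or factor-graph estimators, and the noise parameters below are arbitrary assumptions.

```python
def kalman_step(x, p, z, q=0.01, r=0.5):
    """One predict/update cycle for a scalar state.

    x: prior state estimate   p: prior variance
    z: new measurement        q: process noise   r: measurement noise
    """
    # Predict: the state carries over; uncertainty grows by process noise.
    p = p + q
    # Update: blend prediction and measurement via the Kalman gain.
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1 - k) * p
    return x, p

# Starting from a poor prior (x=0, high variance), the estimate
# converges toward the measurements and the variance shrinks.
x, p = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 1.0]:
    x, p = kalman_step(x, p, z)
print(x, p)
```

The same predict/update structure generalizes to the multi-sensor case: each sensor contributes an update step weighted by its own noise model, which is the core idea behind fusing vision, IMU, and alternative navigation sources in degraded-GPS environments.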