State Estimation Engineer

Dyna Robotics

Other Engineering
Redwood City, CA, USA
USD 200k-270k / year + Equity
Posted on Mar 11, 2026

Employment Type

Full time

Location Type

On-site

Department

Engineering

Compensation

  • $200K – $270K • Offers Equity

This is the base salary range for a full-time position in the US. Final compensation may fall outside this range depending on factors such as role, level, and location; individual pay will be determined based on job-related skills, experience, location, and relevant education or training.

Company Overview:

Dyna Robotics builds general-purpose robots powered by a proprietary embodied AI foundation model that generalizes and self-improves across varied environments with commercial-grade performance. Dyna's robots are deployed with customers across multiple industries, and its frontier model delivers industry-leading generalization and performance.

Dyna Robotics was founded by repeat founders Lindon Gao and York Yang, who sold Caper AI for $350 million, and former DeepMind research scientist Jason Ma. The company has raised over $140M, backed by top investors, including CRV and First Round. We're positioned to redefine the landscape of robotic automation. Join us to shape the next frontier of AI-driven robotics!

Learn more at dyna.co

Position Overview:

As our first dedicated State Estimation Engineer, you will own the estimation systems that enable our robots and data collection tools to perceive their state in dynamic, unstructured environments. You will build sensor fusion pipelines, calibration tools, and diagnostics infrastructure—leveraging classical techniques while exploring learning-based approaches to push the boundaries of what's possible. This is a foundational role with significant ownership and direct impact on developing the most robust robotics foundation model.

Key Responsibilities:

  • Build core estimation systems

    • Design and implement state estimation algorithms (EKF, UKF, factor graphs, optimization-based methods) for localization, pose tracking, and contact estimation

    • Develop sensor fusion pipelines integrating IMUs, encoders, force/torque sensors, cameras, and LiDAR

    • Own visual state estimation (VIO, visual SLAM) and LiDAR-based localization

  • Own the full stack from prototype to production

    • Build calibration systems, diagnostics tools, and validation benchmarks

    • Own estimation for both production robots and data collection tools

  • Push the boundaries

    • Decide when to ship a robust classical solution now versus invest in a learned approach

    • Experiment with learning-based estimation (learned dynamics, neural filtering) to complement classical methods

    • Collaborate with AI and hardware teams on sensor selection and integration
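To give a concrete flavor of the sensor-fusion work described above, here is a minimal sketch of a linear Kalman filter that fuses IMU acceleration (as a motion-model input) with encoder position measurements for a 1-D position/velocity state. This is purely illustrative; all names and noise parameters are hypothetical, and the role's real systems (EKF/UKF, factor graphs, VIO) are far richer than this toy example.

```python
import numpy as np

def kf_fuse(z_enc, accel, dt=0.01, r_enc=1e-4, q_acc=1e-3):
    """Illustrative linear Kalman filter for a 1-D [position, velocity] state:
    predict with IMU acceleration as a control input, correct with encoder
    position measurements. Parameter values are hypothetical."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
    B = np.array([[0.5 * dt**2], [dt]])     # acceleration input matrix
    H = np.array([[1.0, 0.0]])              # encoder observes position only
    Q = q_acc * (B @ B.T)                   # process noise from accel uncertainty
    R = np.array([[r_enc]])                 # encoder measurement noise
    x = np.zeros((2, 1))                    # state estimate: [position, velocity]
    P = np.eye(2)                           # state covariance
    out = []
    for z, a in zip(z_enc, accel):
        # Predict: propagate state and covariance through the motion model.
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        # Update: correct with the encoder measurement.
        y = np.array([[z]]) - H @ x         # innovation
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        out.append(x.ravel().copy())
    return np.array(out)
```

In a production pipeline the same predict/update skeleton generalizes to the EKF/UKF and factor-graph formulations named above, with higher-dimensional states and multiple asynchronous sensor streams.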

Qualifications:

  • 5+ years building state estimation systems on physical robots (not just simulation)

  • Deep expertise in sensor fusion, visual or LiDAR-based localization, and filtering/optimization methods

  • Strong foundation in probability, linear algebra, optimization, and 3D geometry

  • Proficiency in C++ for real-time systems; Python for tooling

  • Track record of taking an estimation system from prototype to production deployment

Preferred Qualifications:

  • MS or PhD with research focus on state estimation, SLAM, VIO, or sensor fusion

  • Experience with manipulation or high-DOF robotic systems

  • Hands-on experience with learning-based estimation or differentiable filtering

  • Contributions to open-source robotics or publications (ICRA, IROS, RSS)
