TL;DR: The Extended Kalman Filter (EKF) often underestimates uncertainty on nonlinear systems, can become inconsistent or even diverge, is brittle to modeling/tuning errors, and struggles with constraints and multi-modal posteriors. Modern alternatives—Unscented/σ-point filters, smoothing/factor-graph methods, particle filters, or optimization-based Moving Horizon Estimation—are usually safer and more accurate.


The Core Problems

1. Linearization error → bias, inconsistency, and divergence

EKF relies on first-order Taylor expansions of your process and measurement models. When the system is meaningfully nonlinear between updates, the approximation injects bias and the covariance becomes too optimistic—a classic route to inconsistency and eventual divergence. This has been observed repeatedly in navigation, control, and SLAM applications.
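To see how badly first-order propagation can misjudge uncertainty, here is a deliberately extreme toy: pushing x ~ N(0, 1) through f(x) = x² and comparing the EKF-style linearized moments against Monte Carlo ground truth. (The model and numbers are illustrative, not from any particular system.)

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return x ** 2  # toy nonlinear model

mu, var = 0.0, 1.0           # prior: x ~ N(0, 1)

# EKF-style first-order propagation: mean f(mu), variance J * var * J
J = 2 * mu                   # df/dx evaluated at the mean = 0
ekf_mean = f(mu)             # = 0
ekf_var = J * var * J        # = 0 -- the filter believes it is certain

# Monte Carlo ground truth
samples = f(rng.normal(mu, np.sqrt(var), 1_000_000))
true_mean = samples.mean()   # ~1  (E[x^2] = 1)
true_var = samples.var()     # ~2  (Var[x^2] = 2 for a standard normal)
```

The linearized filter reports zero variance while the true posterior variance is 2: bias plus overconfidence in one line of algebra. Real systems are rarely this pathological, but milder versions of the same effect accumulate over many updates.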

2. Broken observability from local linearizations

In coupled, partially observed systems (e.g., SLAM), evaluating Jacobians at the latest estimate can change the observable subspace of the linearized system, producing statistically inconsistent updates (the filter appears confident while being wrong). Observability-based analyses pin this down as a fundamental failure mode of EKF-SLAM.

3. Can’t represent non-Gaussian or multi-modal posteriors

Even with Gaussian process/measurement noise, nonlinear transformations produce skewed or multi-modal posteriors. EKF’s single mean/covariance is a poor fit in these cases; decades of work motivated σ-point and particle approaches precisely because EKF’s first-order moment matching is inadequate.
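A concrete case: measuring y = x² + v and observing y = 4 yields a bimodal posterior with modes near x = ±2, which no single Gaussian can represent. The sketch below (toy prior and noise values, computed by brute-force grid Bayes) makes this explicit:

```python
import numpy as np

# Prior x ~ N(0, 3^2); measurement y = x^2 + v, v ~ N(0, 0.5^2); observed y = 4.
xs = np.linspace(-8, 8, 4001)
dx = xs[1] - xs[0]
prior = np.exp(-0.5 * (xs / 3.0) ** 2)
lik = np.exp(-0.5 * ((4.0 - xs ** 2) / 0.5) ** 2)
post = prior * lik
post /= post.sum() * dx      # normalize on the grid

# The true posterior has two modes, near x = -2 and x = +2 ...
left_mode = xs[np.argmax(np.where(xs < 0, post, 0))]
right_mode = xs[np.argmax(np.where(xs > 0, post, 0))]

# ... but its mean sits between them, near 0 -- a single Gaussian centered
# there puts most of its mass in a region the true posterior rules out.
post_mean = (xs * post).sum() * dx
```

An EKF summarizing this belief with one mean and covariance would report an estimate near zero, a value the data has essentially excluded.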

4. Fragile in practice: Jacobians, tuning, and maintenance

EKF needs correct, differentiable models and hand-crafted Jacobians. Small algebra mistakes, unmodeled latencies, or mis-calibration can quietly corrupt estimates. Classic surveys note that EKF works best only when the system is “almost linear on the timescale of updates.”
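One cheap defensive habit: verify every hand-derived Jacobian against central finite differences before it ever reaches the filter. A minimal sketch for a range-bearing measurement model (the model and test point are illustrative):

```python
import numpy as np

def h(x):
    # Range-bearing measurement of a 2-D position relative to the origin.
    px, py = x
    return np.array([np.hypot(px, py), np.arctan2(py, px)])

def H_analytic(x):
    # Hand-derived measurement Jacobian.
    px, py = x
    r = np.hypot(px, py)
    return np.array([[px / r,      py / r],
                     [-py / r**2,  px / r**2]])

def H_numeric(x, eps=1e-6):
    # Central finite differences: cheap insurance against algebra slips.
    J = np.zeros((2, len(x)))
    for i in range(len(x)):
        d = np.zeros(len(x)); d[i] = eps
        J[:, i] = (h(x + d) - h(x - d)) / (2 * eps)
    return J

x0 = np.array([3.0, 4.0])
err = np.abs(H_analytic(x0) - H_numeric(x0)).max()   # should be tiny
```

A sign flip or transposed index shows up immediately as a large `err`; silent Jacobian bugs are one of the most common ways a production EKF quietly degrades.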

5. Constraints are awkward

Physical constraints (positivity, bounds, contact/no-slip, geometry) are hard to enforce in a Gaussian filter. Workarounds (projection, clamping, inflation) are ad hoc. Optimization-based estimators incorporate constraints natively.
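To see why clamping is ad hoc, consider a scalar Kalman update on a quantity that must stay non-negative (the numbers are illustrative). Clamping fixes the mean but leaves the covariance describing a belief the constraint forbids:

```python
import numpy as np

# Scalar example: a concentration c >= 0 estimated by a Kalman update.
c_est, P = 0.2, 1.0             # prior mean and variance
z, R = -1.5, 0.5                # a noisy measurement drags the estimate negative

K = P / (P + R)                 # Kalman gain
c_upd = c_est + K * (z - c_est) # standard update: ignores the constraint, goes negative
P_upd = (1 - K) * P

# Ad hoc fix: clamp the mean, leave the covariance alone.  The state is now
# legal, but the Gaussian N(0, P_upd) still puts half its mass below zero,
# so the reported uncertainty no longer describes the actual belief.
c_clamped = max(c_upd, 0.0)
```

An optimization-based estimator would instead add `c >= 0` as a bound on the decision variable, and both the estimate and its implied uncertainty respect the constraint by construction.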


Better Defaults (and When to Use Them)

Unscented / σ-point Kalman Filters (UKF)

When: You want a drop-in Kalman-style recursion with better nonlinear handling and no Jacobians.
Why better: Deterministic sigma points propagate mean/covariance through nonlinearities more accurately than first-order linearization; robustness improves with only a modest compute increase.
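The core of the UKF is the unscented transform. A minimal 1-D sketch (standard sigma points with a κ scaling parameter; the toy model f(x) = x² is the same one that defeats linearization) shows how far a handful of deterministic points gets you:

```python
import numpy as np

def unscented_transform(mu, var, f, kappa=2.0):
    # Sigma points for a 1-D state (n = 1): the mean plus/minus a scaled deviation.
    n = 1
    s = np.sqrt((n + kappa) * var)
    pts = np.array([mu, mu + s, mu - s])
    w = np.array([kappa / (n + kappa), 0.5 / (n + kappa), 0.5 / (n + kappa)])
    y = f(pts)                        # propagate points through the nonlinearity
    mean = np.dot(w, y)               # weighted sample mean
    cov = np.dot(w, (y - mean) ** 2)  # weighted sample variance
    return mean, cov

ut_mean, ut_cov = unscented_transform(0.0, 1.0, lambda x: x ** 2)
```

For x ~ N(0, 1) through x², the true moments are mean 1 and variance 2; first-order linearization returns 0 for both, while the three sigma points above recover both exactly, with no Jacobian in sight. (Exactness is special to this example; in general the UT is accurate to higher order than linearization, not exact.)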

Smoothing & Factor Graphs (Gauss–Newton, LM, √SAM, iSAM)

When: You can batch or slide a window and care about global consistency (robotics/SLAM/VIO, sensor fusion).
Why better: Solve a nonlinear least-squares problem over a trajectory or window. This respects system structure and observability and avoids EKF’s myopia; sparse linear algebra yields scalable, stable solutions.
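At its core this is nonlinear least squares solved by Gauss-Newton. A tiny illustrative "factor graph" with one unknown 2-D position and three range factors to known beacons (all values made up for the sketch, measurements noiseless for clarity):

```python
import numpy as np

# One unknown 2-D position; range "factors" to three known beacons.
beacons = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
x_true = np.array([3.0, 4.0])
ranges = np.linalg.norm(beacons - x_true, axis=1)   # noiseless measurements

def residual(x):
    return np.linalg.norm(beacons - x, axis=1) - ranges

def jacobian(x):
    d = x - beacons
    return d / np.linalg.norm(d, axis=1, keepdims=True)

x = np.array([8.0, 8.0])            # deliberately poor initial guess
for _ in range(15):                  # Gauss-Newton iterations
    J, r = jacobian(x), residual(x)
    x = x - np.linalg.solve(J.T @ J, J.T @ r)   # normal-equations step
```

Unlike a filter, nothing is marginalized away: every factor can be relinearized at the current estimate on each iteration, which is precisely what keeps smoothers consistent where the EKF's frozen linearizations drift. In real systems `J.T @ J` is large but sparse, and solvers like √SAM/iSAM exploit that structure.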

Moving Horizon Estimation (MHE)

When: You need constraints, robust loss functions, or strong nonlinearities—and can afford per-step optimization.
Why better: MHE explicitly optimizes over a finite history, enforces constraints, and often outperforms EKF on challenging systems.
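A deliberately small MHE-flavored sketch: estimate a non-negative scalar over a five-step horizon, with quadratic measurement and random-walk process costs and the constraint expressed as a bound on the decision variables. The horizon length, noise variances, and measurements are all made-up illustrative values, and `scipy` stands in for a real NLP solver:

```python
import numpy as np
from scipy.optimize import minimize

# Five-step horizon; the true state is a small non-negative constant.
y = np.array([0.1, -0.3, 0.2, -0.1, 0.05])   # noisy measurements (some negative)
Q, R = 0.01, 0.25                             # process / measurement variances

def cost(x):
    meas = np.sum((y - x) ** 2) / R           # measurement terms
    proc = np.sum(np.diff(x) ** 2) / Q        # random-walk process terms
    return meas + proc

res = minimize(cost, x0=np.maximum(y, 0.0),
               bounds=[(0.0, None)] * len(y), method="L-BFGS-B")
x_mhe = res.x   # every estimate in the window respects x >= 0 by construction
```

Where the clamped Kalman update above had to patch an illegal estimate after the fact, here the constraint is part of the problem statement, so the optimizer never produces an infeasible state in the first place.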

Particle Filters / Ensemble Methods

When: Posteriors are strongly non-Gaussian or multi-modal, state dimension is moderate, or you need full Bayesian flexibility.
Why better: Particle filters represent arbitrary distributions (at higher compute cost). Ensemble Kalman filters (EnKF) offer a practical middle ground for large-scale systems, with proven success in geosciences.
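A bootstrap particle filter update on the bimodal y = x² example makes the contrast with the EKF concrete (toy prior and noise values; one weight-and-resample step shown):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5000

# Prior particles; measurement model y = x^2 + v, v ~ N(0, 0.3^2); observed y = 4.
particles = rng.normal(0.0, 3.0, N)

# Weight by the likelihood of the observation, then normalize.
w = np.exp(-0.5 * ((4.0 - particles ** 2) / 0.3) ** 2)
w /= w.sum()

# Multinomial resampling: draw a new equally-weighted population.
particles = particles[rng.choice(N, size=N, p=w)]

# The cloud keeps both posterior modes, near -2 and +2.
frac_left = np.mean(particles < 0)       # roughly half the mass on each side
```

No parametric form is imposed, so both modes survive; the price is that the particle count needed for a good approximation grows quickly with state dimension, which is where EnKF-style ensemble methods take over.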

If You Must Stick with a Kalman-Style Filter

Consider error-state or invariant formulations (better geometry, stability) and, where feasible, second-order terms. These can reduce—but not eliminate—EKF’s structural issues.


Practical Decision Checklist

Use EKF only if most of these are true:

  • Dynamics & measurements are nearly linear between updates.
  • Models and calibrations are excellent; unmodeled effects are small.
  • You can derive and maintain correct Jacobians.
  • You don’t need hard constraints or multi-modal beliefs.
  • You’ll monitor residuals/NEES and use covariance inflation or gating if needed.
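The last item is easy to automate in simulation, where ground truth is available. A minimal NEES check (toy two-state covariance; errors drawn from the filter's own covariance, so a consistent filter should pass):

```python
import numpy as np

def nees(x_true, x_est, P):
    # Normalized Estimation Error Squared: e^T P^{-1} e.
    e = x_true - x_est
    return float(e @ np.linalg.solve(P, e))

# For a consistent filter, NEES is chi-square distributed with mean n,
# the state dimension.  Sanity check with self-consistent errors:
rng = np.random.default_rng(0)
P = np.diag([0.5, 2.0])
L = np.linalg.cholesky(P)
scores = [nees(L @ rng.standard_normal(2), np.zeros(2), P)
          for _ in range(2000)]
avg = np.mean(scores)   # should sit near n = 2
```

In practice you run the same statistic on your filter's errors: an average NEES well above the state dimension means the covariance is too optimistic, the signature failure mode described throughout this post.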

If not, reach for UKF, smoothing/factor graphs, MHE, or particle/ensemble methods—your future self (and your stability plots) will thank you.


References

  • Julier & Uhlmann, Unscented Filtering and Nonlinear Estimation, Proc. IEEE, 2004 — Foundational critique of EKF and σ-point alternative.
  • Bailey et al., Consistency of the EKF-SLAM Algorithm, 2006 — Demonstrates inconsistency in EKF-SLAM.
  • Huang, Mourikis, & Roumeliotis, Observability-based Rules for Designing Consistent EKF-SLAM Estimators, IJRR, 2010 — Observability explanation for inconsistency.
  • Haseltine & Rawlings, Critical Evaluation of Extended Kalman Filtering and Moving-Horizon Estimation, 2005 — Constraints and MHE vs EKF.
  • Dellaert & Kaess, Square Root SAM: SLAM via Square Root Information Smoothing, IJRR, 2006 — Factor-graph alternative to EKF.
  • Wan & van der Merwe, The Unscented Kalman Filter for Nonlinear Estimation, 2000 — Practical UKF introduction.
  • Arulampalam et al., A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Tracking, IEEE TSP, 2002 — Particle filters for non-Gaussian posteriors.
  • Evensen, The Ensemble Kalman Filter: Theoretical Formulation and Practical Implementation, 2003 — Ensemble methods at scale.
