A robot operating in the real world must cope with pervasive uncertainty: its sensors are noisy, its actuators are imprecise, the environment is partially observable, and other agents behave unpredictably. Bayesian inference provides the principled framework for representing, updating, and acting on uncertain beliefs about the world. From the Kalman filter in early spacecraft navigation to modern autonomous vehicles and drones, Bayesian methods are embedded in the core algorithms of robotics.
Simultaneous Localization and Mapping (SLAM)
SLAM is the problem of building a map of an unknown environment while simultaneously tracking the robot's pose within it — a chicken-and-egg problem that is naturally formulated as Bayesian inference. The robot maintains a joint posterior distribution over its pose and the map, updating this belief with each sensor observation and motion command.
The full posterior is

p(x_{1:t}, m | z_{1:t}, u_{1:t})

where x_t is the robot pose at time t, m is the map, z_{1:t} are the sensor observations, and u_{1:t} are the control inputs.
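Under the standard Markov assumptions (each observation depends only on the current pose and map, and each pose only on the previous pose and control), this posterior admits a recursive update. A sketch of the factorization, in the usual Bayes-filter form:

```latex
% Recursive update of the full SLAM posterior: the new belief is the old
% belief reweighted by the motion model and the measurement likelihood.
p(x_{1:t}, m \mid z_{1:t}, u_{1:t})
  \;\propto\;
  \underbrace{p(z_t \mid x_t, m)}_{\text{measurement update}}
  \,\underbrace{p(x_t \mid x_{t-1}, u_t)}_{\text{motion prediction}}
  \, p(x_{1:t-1}, m \mid z_{1:t-1}, u_{1:t-1})
```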
EKF-SLAM uses a Gaussian approximation to this posterior, maintaining a joint covariance matrix over the robot pose and all landmark positions. FastSLAM uses Rao-Blackwellized particle filters that sample robot trajectories while maintaining analytical landmark estimates. GraphSLAM formulates the posterior as a sparse graph optimization problem. Each variant makes different approximations to the Bayesian ideal, trading off accuracy, computational cost, and scalability.
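To make the joint-covariance idea behind EKF-SLAM concrete, here is a minimal sketch in the linear-Gaussian special case, where the Kalman equations are exact. The one-landmark setup, measurement model, and noise values are invented for illustration; real EKF-SLAM additionally linearizes nonlinear motion and measurement models:

```python
import numpy as np

# A deliberately simplified, linear-Gaussian sketch of the EKF-SLAM idea:
# the state stacks the robot position and one landmark position, and a
# single joint covariance P tracks their correlated uncertainty.

state = np.zeros(4)                     # [robot_x, robot_y, lm_x, lm_y]
P = np.diag([0.0, 0.0, 1e6, 1e6])       # robot known at start, landmark not

F = np.eye(4)                           # motion model: only the robot moves
Q = np.diag([0.05, 0.05, 0.0, 0.0])     # motion noise (robot only)
H = np.array([[-1.0, 0.0, 1.0, 0.0],    # measurement: z = landmark - robot
              [0.0, -1.0, 0.0, 1.0]])
R = 0.1 * np.eye(2)                     # measurement noise covariance

def predict(state, P, u):
    """Motion update: shift the robot by control u and inflate uncertainty."""
    state = F @ state + np.concatenate([u, [0.0, 0.0]])
    P = F @ P @ F.T + Q
    return state, P

def update(state, P, z):
    """Measurement update: standard Kalman correction of the joint state."""
    y = z - H @ state                   # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    return state + K @ y, (np.eye(4) - K @ H) @ P

state, P = predict(state, P, np.array([1.0, 0.0]))   # drive 1 m in x
state, P = update(state, P, np.array([3.1, 1.9]))    # observe the landmark
print("robot:", state[:2], "landmark:", state[2:])
```

The single joint covariance P is the point: observing the landmark also tightens the robot estimate, because the filter tracks the correlation between the two.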
Sensor Fusion
Modern robots carry multiple sensors — cameras, LiDAR, IMUs, GPS, radar, ultrasonic — each providing partial, noisy information about the environment. Bayesian sensor fusion combines these heterogeneous measurements into a coherent state estimate. The extended Kalman filter and its variants remain the workhorses of multi-sensor fusion, while particle filters handle the nonlinear, multi-modal situations that arise in cluttered environments or during sensor failure.
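The core mechanics are easiest to see in one dimension. A minimal sketch of fusing two independent Gaussian measurements of the same quantity, assuming a flat prior (the sensor names and numbers are illustrative):

```python
import numpy as np

# Minimal sketch of Bayesian fusion of two independent Gaussian measurements
# of the same scalar quantity (say, one position coordinate, from GPS and
# from LiDAR localization). With a flat prior, the posterior is Gaussian with
# precision (inverse variance) equal to the sum of the measurement precisions,
# so more certain sensors get proportionally more weight.

def fuse_gaussians(mu1, var1, mu2, var2):
    """Posterior mean and variance given two independent Gaussian readings."""
    w1, w2 = 1.0 / var1, 1.0 / var2              # precisions
    var_post = 1.0 / (w1 + w2)
    mu_post = var_post * (w1 * mu1 + w2 * mu2)   # precision-weighted mean
    return mu_post, var_post

gps = (12.4, 4.0)     # (mean, variance): GPS is coarse
lidar = (11.8, 0.25)  # LiDAR localization is much tighter
mu, var = fuse_gaussians(*gps, *lidar)
print(f"fused: {mu:.2f} m, variance {var:.3f}")  # lands near the LiDAR value
```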
Autonomous vehicles are perhaps the most demanding application of Bayesian robotics. They must simultaneously track dozens of dynamic objects (vehicles, pedestrians, cyclists) with uncertain positions and velocities, predict their future trajectories, and make safe decisions in real time. Multi-object Bayesian filtering, Bayesian occupancy grids, and probabilistic motion prediction are core components of every autonomous driving stack. The key challenge is maintaining calibrated uncertainty estimates — knowing what the car doesn't know — because overconfidence kills.
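Bayesian occupancy grids illustrate how this scales: each cell carries an independent occupancy belief, conventionally stored in log-odds so that evidence accumulates by simple addition. A minimal sketch, with an invented inverse sensor model:

```python
import numpy as np

# Sketch of a Bayesian occupancy grid update in log-odds form. Each cell
# stores l = log(p / (1 - p)); under the usual cell-independence assumption,
# every sensor reading simply adds its log-odds evidence to the cell.

L_OCC = np.log(0.7 / 0.3)     # evidence when a beam endpoint hits the cell
L_FREE = np.log(0.3 / 0.7)    # evidence when a beam passes through the cell

grid = np.zeros((100, 100))   # log-odds 0 everywhere, i.e. p = 0.5 (unknown)

def update_cell(grid, i, j, hit):
    """Accumulate log-odds evidence for one cell from one beam."""
    grid[i, j] += L_OCC if hit else L_FREE

def probabilities(grid):
    """Convert log-odds back to occupancy probabilities."""
    return 1.0 - 1.0 / (1.0 + np.exp(grid))

for _ in range(3):            # three consistent 'occupied' readings
    update_cell(grid, 50, 50, hit=True)
print(f"p(occupied) after 3 hits: {probabilities(grid)[50, 50]:.3f}")  # ~0.93
```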
Bayesian Motion Planning
Planning under uncertainty requires reasoning about the expected consequences of actions across the distribution of possible world states. Partially Observable Markov Decision Processes (POMDPs) provide the formal framework, with the robot maintaining a belief state (a posterior distribution over world states) and choosing actions that maximize expected utility. Bayesian reinforcement learning extends this to situations where the dynamics themselves are uncertain, balancing exploration (learning about the environment) and exploitation (acting on current knowledge).
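The heart of a POMDP controller is the belief update itself. A minimal sketch for a discrete state space, using an invented two-state "door open / door closed" example:

```python
import numpy as np

# Minimal sketch of a discrete POMDP belief-state update: the posterior over
# hidden states after taking action a and observing z. T_a is the transition
# matrix (rows: s, columns: s'); O_az[s'] is p(z | s', a). All numbers are
# invented for illustration.

def belief_update(b, T_a, O_az):
    """b'(s') ∝ p(z | s', a) · Σ_s p(s' | s, a) · b(s)."""
    b_pred = b @ T_a               # prediction: push belief through dynamics
    b_new = O_az * b_pred          # correction: weight by observation likelihood
    return b_new / b_new.sum()     # normalize

b = np.array([0.5, 0.5])                # uniform prior over {open, closed}
T_listen = np.eye(2)                    # 'listen' leaves the state unchanged
O_hear_open = np.array([0.85, 0.15])    # p(hear 'open' | true state)
b = belief_update(b, T_listen, O_hear_open)
print(b)                                # belief shifts to [0.85, 0.15]
```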
Object Recognition and Scene Understanding
Bayesian methods contribute to robotic perception through probabilistic object recognition, where prior knowledge about object categories and spatial relationships combines with sensor evidence to identify and localize objects. Bayesian nonparametric models handle open-world scenarios where the number of object categories is unknown a priori.
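At its simplest, this is posterior inference over a category variable: a context-dependent prior multiplied by a detector's likelihood. A toy sketch with invented numbers:

```python
import numpy as np

# Sketch of Bayesian object recognition as posterior inference over a
# category variable: p(category | evidence) ∝ p(evidence | category) · p(category).
# The prior stands in for scene context (mugs are common on desks) and the
# likelihoods for a detector's score. All numbers are invented.

categories = ["mug", "bowl", "vase"]
prior = np.array([0.6, 0.3, 0.1])       # context prior for a desk scene
likelihood = np.array([0.4, 0.5, 0.9])  # p(observed shape | category)

posterior = prior * likelihood
posterior /= posterior.sum()
for c, p in zip(categories, posterior):
    print(f"p({c} | evidence) = {p:.3f}")
# Despite the vase's higher likelihood, the context prior makes 'mug' the
# MAP estimate: prior knowledge and sensor evidence combine multiplicatively.
```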
"Robotics without probability is like aviation without aerodynamics — it might get off the ground occasionally, but it won't stay up for long." — Sebastian Thrun, pioneer of probabilistic robotics and leader of Google's self-driving car project
Current Frontiers
Bayesian deep learning is enabling learned perception models that quantify their own uncertainty. Bayesian task and motion planning combines symbolic reasoning with continuous-space Bayesian inference. And multi-robot Bayesian systems — swarms that share beliefs and coordinate actions through decentralized Bayesian inference — are moving from research to deployment in warehouse automation, search and rescue, and environmental monitoring.
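As one concrete flavor of the first of these, a common recipe is Monte Carlo dropout: leave dropout active at prediction time and treat repeated stochastic forward passes as approximate posterior samples. A toy sketch (PyTorch; the network and input are invented):

```python
import torch
import torch.nn as nn

# Toy sketch of Monte Carlo dropout (Gal & Ghahramani, 2016): keep dropout
# active at prediction time so that repeated stochastic forward passes act
# as approximate posterior samples, and the spread of outputs becomes an
# uncertainty estimate.

net = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 1),
)

def mc_dropout_predict(net, x, n_samples=50):
    """Predictive mean and std with dropout left on."""
    net.train()                      # train mode keeps Dropout stochastic
    with torch.no_grad():            # no gradients needed, just samples
        samples = torch.stack([net(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)

x = torch.randn(1, 16)
mean, std = mc_dropout_predict(net, x)
print(f"prediction {mean.item():.3f} ± {std.item():.3f}")
```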