Signal processing is concerned with extracting information from noisy measurements. Whether the signal is a radar return, a speech waveform, a seismic trace, or a biomedical recording, the fundamental problem is the same: separate the signal of interest from noise, interference, and distortion. Bayesian signal processing frames this as inference about the unknown signal given the observed data, with the posterior distribution providing both the optimal estimate and its uncertainty.
Bayesian Filtering
The Kalman filter, introduced in 1960, is the most celebrated Bayesian algorithm in engineering. It recursively estimates the state of a linear dynamical system from noisy observations, maintaining a Gaussian posterior distribution that is updated at each time step. The prediction step propagates the prior through the system dynamics; the update step incorporates the new observation via Bayes' theorem.
Prediction: x̂_{t|t-1} = F x̂_{t-1|t-1}
            P_{t|t-1} = F P_{t-1|t-1} Fᵀ + Q
Update:     K_t = P_{t|t-1} Hᵀ (H P_{t|t-1} Hᵀ + R)⁻¹
            x̂_{t|t} = x̂_{t|t-1} + K_t (y_t − H x̂_{t|t-1})
            P_{t|t} = (I − K_t H) P_{t|t-1}
Here F is the state-transition matrix, Q the process-noise covariance, H the observation matrix, and R the measurement-noise covariance.
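The predict/update recursion can be sketched in a few lines of NumPy. The constant-velocity model, noise covariances, and measurement simulation below are illustrative assumptions, not part of any particular application:

```python
import numpy as np

def kalman_step(x, P, y, F, Q, H, R):
    """One predict/update cycle of the Kalman filter."""
    # Prediction: propagate mean and covariance through the dynamics.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: Kalman gain, then posterior mean and covariance.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_post = x_pred + K @ (y - H @ x_pred)
    P_post = (np.eye(len(x)) - K @ H) @ P_pred
    return x_post, P_post

# Toy example: 1-D constant-velocity model, position-only measurements.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state: [position, velocity]
Q = 0.01 * np.eye(2)                    # process-noise covariance (assumed)
H = np.array([[1.0, 0.0]])              # observe position only
R = np.array([[0.5]])                   # measurement-noise covariance (assumed)

x, P = np.zeros(2), np.eye(2)
rng = np.random.default_rng(0)
for t in range(50):
    y = np.array([t * dt + rng.normal(0, 0.7)])  # noisy position reading
    x, P = kalman_step(x, P, y, F, Q, H, R)
print(x)  # estimated [position, velocity]; velocity should approach 1
```

Note that the posterior covariance P is part of the output: the filter reports not just the estimate but how uncertain it is.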
For nonlinear and non-Gaussian systems, the extended Kalman filter (EKF), unscented Kalman filter (UKF), and particle filters provide Bayesian filtering with varying tradeoffs between accuracy and computational cost. Particle filters, in particular, represent the posterior distribution with a set of weighted samples and can handle arbitrary nonlinearities and multimodal distributions.
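A minimal bootstrap particle filter illustrates the weighted-sample representation. The dynamics below are the classic univariate nonstationary growth model often used to benchmark particle filters; the particle count, noise levels, and resampling scheme are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 2000                          # number of particles (assumed)

def transition(x):
    # Univariate nonstationary growth model dynamics plus process noise.
    return 0.5 * x + 25 * x / (1 + x**2) + rng.normal(0, 1.0, size=x.shape)

def likelihood(y, x):
    # Observation model: y = x^2 / 20 + Gaussian noise (variance 0.5).
    return np.exp(-0.5 * (y - x**2 / 20) ** 2 / 0.5)

particles = rng.normal(0, 2, N)
x_true, estimates = 0.0, []
for t in range(30):
    x_true = 0.5 * x_true + 25 * x_true / (1 + x_true**2) + rng.normal(0, 1.0)
    y = x_true**2 / 20 + rng.normal(0, np.sqrt(0.5))
    particles = transition(particles)        # propagate through the dynamics
    w = likelihood(y, particles) + 1e-300    # weight by the new observation
    w /= w.sum()
    estimates.append(np.sum(w * particles))  # posterior-mean estimate
    idx = rng.choice(N, size=N, p=w)         # multinomial resampling
    particles = particles[idx]
```

The weighted particles approximate the full posterior, so multimodality (here, the sign ambiguity induced by the squared observation) is represented rather than averaged away.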
Bayesian Denoising
Bayesian denoising treats the clean signal as a random variable with a prior distribution and the observation as the signal plus noise. The posterior mean (MMSE estimate) or posterior mode (MAP estimate) provides the denoised signal. For images, wavelet-domain Bayesian denoising — where wavelet coefficients are given sparse priors (e.g., spike-and-slab or Laplace distributions) — achieves near-optimal performance. BM3D and related algorithms, while not always presented in Bayesian terms, can be interpreted as approximate Bayesian inference with learned priors.
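As a concrete instance: under an i.i.d. Laplace prior on the coefficients and Gaussian observation noise, the MAP estimate reduces to soft thresholding. A minimal sketch, with a synthetic sparse vector standing in for wavelet coefficients and an assumed prior scale b:

```python
import numpy as np

def soft_threshold(y, thresh):
    """MAP estimate for y = x + n, n ~ N(0, sigma^2), prior p(x) ∝ exp(-|x|/b):
    argmin_x (y - x)^2 / (2 sigma^2) + |x| / b, i.e. soft thresholding
    at sigma^2 / b."""
    return np.sign(y) * np.maximum(np.abs(y) - thresh, 0.0)

rng = np.random.default_rng(0)
x = np.zeros(200)
x[rng.choice(200, 10, replace=False)] = rng.normal(0, 5, 10)  # sparse "coefficients"
sigma = 0.5
y = x + rng.normal(0, sigma, 200)        # noisy observation

b = 2.0                                  # Laplace scale (assumed prior width)
x_hat = soft_threshold(y, sigma**2 / b)
print(np.mean((y - x) ** 2), np.mean((x_hat - x) ** 2))  # MSE before vs after
```

Shrinking the many near-zero coefficients toward zero removes most of the noise energy, at the cost of a small bias on the few large coefficients.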
Compressed sensing demonstrated that sparse signals can be recovered from far fewer measurements than traditional sampling theory requires. The Bayesian approach to compressed sensing — placing sparsity-promoting priors on the signal and computing the posterior — provides not only the recovered signal but also component-wise (e.g., per-pixel) uncertainty estimates and automatic determination of the sparsity level. Bayesian compressed sensing often matches or outperforms convex-optimization approaches such as basis pursuit while providing richer output.
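The connection to convex optimization is direct: the MAP estimate under a Laplace prior coincides with the LASSO, computable by iterative soft thresholding (ISTA). The sketch below recovers only that point estimate — fully Bayesian compressed sensing (e.g., sparse Bayesian learning) would additionally return posterior variances. Problem sizes, λ, and noise level are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, k = 100, 40, 5                         # signal length, measurements, sparsity
A = rng.normal(0, 1 / np.sqrt(m), (m, n))    # random Gaussian sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)
y = A @ x_true + rng.normal(0, 0.01, m)      # m << n noisy measurements

# ISTA: gradient step on the data fit, then the prox of the Laplace prior.
lam = 0.01                                   # sparsity weight (assumed)
step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1 / Lipschitz constant
x = np.zeros(n)
for _ in range(500):
    g = x - step * A.T @ (A @ x - y)
    x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)
print(np.linalg.norm(x - x_true))            # recovery error (small vs ||x_true||)
```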
Spectral Estimation
Bayesian spectral estimation infers the power spectral density of a signal from finite data. The Bayesian approach naturally handles short data records, irregular sampling, and missing data — all situations where classical periodogram-based methods perform poorly. Bayesian nonparametric spectral estimation uses Gaussian process priors on the log-spectrum, while parametric approaches place priors on autoregressive model parameters.
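For the parametric route, a Gaussian prior on the autoregressive coefficients is conjugate, so the posterior is available in closed form (the ridge-regression posterior). A sketch with an assumed model order, prior scale, and noise variance:

```python
import numpy as np

rng = np.random.default_rng(3)
# Simulate an AR(2) process with poles at radius 0.9 and a spectral
# peak near 0.1 cycles/sample.
a_true = np.array([2 * 0.9 * np.cos(2 * np.pi * 0.1), -0.81])
x = np.zeros(500)
for t in range(2, 500):
    x[t] = a_true @ x[t - 2:t][::-1] + rng.normal(0, 1.0)

# Conjugate Bayesian AR(p) fit: prior N(0, tau2 I) on the coefficients
# gives a Gaussian posterior with the mean below (and covariance Sigma).
p, tau2, sigma2 = 2, 10.0, 1.0               # order, prior scale, noise var (assumed)
X = np.column_stack([x[p - 1 - i:-1 - i] for i in range(p)])  # lagged regressors
y = x[p:]
Sigma = np.linalg.inv(X.T @ X / sigma2 + np.eye(p) / tau2)
a_post = Sigma @ X.T @ y / sigma2            # posterior mean of the AR coefficients

# Power spectral density implied by the posterior-mean AR model.
f = np.linspace(0, 0.5, 256)
z = np.exp(-2j * np.pi * np.outer(f, np.arange(1, p + 1)))
psd = sigma2 / np.abs(1 - z @ a_post) ** 2
print(f[np.argmax(psd)])                     # should peak near 0.1 cycles/sample
```

Because Sigma is the full posterior covariance of the coefficients, one can also propagate it through the spectrum formula (e.g., by sampling) to get uncertainty bands on the estimated spectrum.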
Bayesian Adaptive Beamforming
In array signal processing, beamforming combines signals from multiple sensors to enhance sources from desired directions and suppress interference. Bayesian beamforming estimates the posterior distribution over source locations, powers, and the noise environment, providing robust performance when the number of sources is unknown or the array calibration is imperfect.
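A minimal sketch of the grid-based idea for a single source on a half-wavelength uniform linear array: with the complex source amplitude marginalized under a Gaussian prior, the log marginal likelihood of each candidate direction is proportional to the matched-filter power, giving a posterior over direction of arrival. The array size, source angle, noise level, and the posterior's scale factor are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
M, theta_true, sigma = 8, 20.0, 0.5          # sensors, true DOA (deg), noise std

def steering(theta_deg):
    # Half-wavelength ULA steering vector for direction theta.
    d = np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * np.pi * np.arange(M) * d)

# One snapshot: source of amplitude 2 plus circular complex Gaussian noise.
y = 2.0 * steering(theta_true) + (rng.normal(0, sigma, M)
                                  + 1j * rng.normal(0, sigma, M))

# Grid posterior over DOA (flat prior on angle): up to a constant factor,
# log p(y | theta) ∝ |a(theta)^H y|^2 after marginalizing the amplitude.
grid = np.linspace(-90, 90, 721)
logpost = np.array([np.abs(np.conj(steering(th)) @ y) ** 2
                    for th in grid]) / sigma**2
post = np.exp(logpost - logpost.max())
post /= post.sum()
print(grid[np.argmax(post)])                 # posterior mode near 20 degrees
```

The normalized posterior, not just its mode, is the useful output: a broad or multimodal posterior flags an ambiguous bearing instead of silently committing to one direction.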
"The Kalman filter is simply Bayes' theorem applied recursively in time. Every engineer who uses it is doing Bayesian inference, whether they know it or not." — Simo Särkkä, author of Bayesian Filtering and Smoothing
Current Frontiers
Deep learning and Bayesian signal processing are converging through deep Kalman filters, variational sequential Monte Carlo, and neural network-parameterized state-space models. Bayesian methods for graph signal processing extend classical filtering to irregular domains like sensor networks and social graphs. And real-time Bayesian inference on edge devices is enabling intelligent signal processing in IoT, autonomous vehicles, and brain-computer interfaces.