Quantum mechanics is, at its core, a theory about probabilities. The Born rule assigns probabilities to measurement outcomes, but the interpretation of these probabilities — and the role of the observer — has been debated since the theory's inception. Bayesian approaches to quantum physics operate on two levels: as practical statistical tools for quantum state estimation, and as a foundational framework that reshapes how we understand quantum theory itself.
Quantum State Tomography
Quantum state tomography is the task of reconstructing an unknown quantum state ρ from measurement data. In a Bayesian approach, the quantum state is treated as an unknown parameter, measurements provide likelihood information, and the posterior distribution over density matrices characterizes what is known about the state after the experiment. Unlike maximum likelihood estimation, Bayesian tomography naturally produces uncertainty estimates and avoids unphysical point estimates (density matrices with negative eigenvalues). The likelihood of the data D given a candidate state ρ is

P(D | ρ) = ∏_k Tr(E_k ρ)^{n_k}

where the E_k are the POVM elements describing the measurement and n_k is the number of times outcome k was observed.
The prior P(ρ) on density matrices is typically chosen as the Hilbert-Schmidt measure, the Bures measure, or a physically motivated prior that favors states close to the expected preparation. For multi-qubit systems, the exponential growth of Hilbert-space dimension makes full tomography impractical; Bayesian compressed sensing and matrix product state priors enable efficient reconstruction from far fewer measurements.
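To make this concrete, here is a minimal sketch of Bayesian tomography for a single qubit, assuming simulated Pauli-basis counts (the numbers below are hypothetical, not from any experiment). It draws posterior samples with an independence Metropolis sampler whose proposals come from the Hilbert-Schmidt measure, so every candidate is automatically a valid density matrix; the spread of the samples then provides the uncertainty estimates mentioned above.

```python
# A minimal sketch of Bayesian single-qubit tomography with a
# Hilbert-Schmidt prior and hypothetical Pauli-basis count data.
import numpy as np

rng = np.random.default_rng(0)

# Pauli-basis POVMs: each basis contributes projectors (I ± sigma)/2.
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
povms = [((I2 + s) / 2, (I2 - s) / 2) for s in (sx, sy, sz)]

def hs_random_state(d=2):
    """Draw a density matrix from the Hilbert-Schmidt measure (Ginibre trick)."""
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

def log_likelihood(rho, counts):
    """log P(D | rho) = sum_k n_k log Tr(E_k rho)."""
    ll = 0.0
    for (Ep, Em), (n_up, n_down) in zip(povms, counts):
        p = float(np.trace(Ep @ rho).real)
        p = min(max(p, 1e-12), 1 - 1e-12)
        ll += n_up * np.log(p) + n_down * np.log(1 - p)
    return ll

# Hypothetical data: (n_+, n_-) counts for 100 shots in each Pauli basis.
counts = [(62, 38), (41, 59), (83, 17)]

# Independence Metropolis: propose fresh states from the prior and accept
# on the likelihood ratio (the prior and proposal terms cancel).
rho = hs_random_state()
ll = log_likelihood(rho, counts)
samples = []
for step in range(20000):
    prop = hs_random_state()
    ll_prop = log_likelihood(prop, counts)
    if np.log(rng.uniform()) < ll_prop - ll:
        rho, ll = prop, ll_prop
    if step >= 5000:                      # keep post-burn-in samples
        samples.append(rho)

rho_mean = np.mean(samples, axis=0)       # Bayesian mean estimate
print(np.round(rho_mean, 3))
```

Because proposals are drawn directly from the prior, every sample is a physical state by construction; no projection back onto the set of density matrices is ever needed.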
Quantum Parameter Estimation
Bayesian estimation extends to any quantum parameter: Hamiltonian parameters, decoherence rates, coupling strengths, or magnetic fields sensed by quantum sensors. Bayesian adaptive measurement strategies, in which each measurement setting is chosen to maximize the expected information gain under the current posterior, approach the quantum Cramér-Rao bound with fewer measurements than fixed schemes require. This is quantum Bayesian experimental design, and it has practical applications in quantum magnetometry, quantum computing calibration, and precision timekeeping.
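The core loop is easy to sketch. Below is a minimal grid-based example of adaptive Bayesian frequency estimation for a qubit sensor; the Ramsey-style outcome model P(1 | ω, t) = (1 − cos ωt)/2, the grids, and the "true" frequency are illustrative assumptions, not anything fixed by the text above.

```python
# A minimal sketch of Bayesian adaptive frequency estimation: each round
# picks the evolution time t with the largest expected information gain
# (mutual information between outcome and omega), then updates the posterior.
import numpy as np

rng = np.random.default_rng(1)
w_grid = np.linspace(0.0, 2.0, 400)                 # candidate frequencies
posterior = np.full(w_grid.size, 1 / w_grid.size)   # flat prior
t_candidates = np.linspace(0.1, 20.0, 200)          # candidate evolution times
w_true = 1.23                                       # hypothetical ground truth

def p1(w, t):
    """Assumed Ramsey-type outcome model P(outcome = 1 | w, t)."""
    return (1 - np.cos(w * t)) / 2

def expected_info_gain(post, t):
    """Mutual information I(outcome; w) for measurement setting t."""
    p1_w = p1(w_grid, t)
    m1 = np.sum(post * p1_w)                        # marginal P(outcome = 1)
    def H(p):                                       # binary entropy, safe at 0/1
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -p * np.log(p) - (1 - p) * np.log(1 - p)
    return H(m1) - np.sum(post * H(p1_w))

for _ in range(30):
    # Choose the most informative setting under the current posterior.
    gains = [expected_info_gain(posterior, t) for t in t_candidates]
    t_best = t_candidates[int(np.argmax(gains))]
    # Simulate the measurement and do a standard Bayesian update.
    outcome = rng.uniform() < p1(w_true, t_best)
    like = p1(w_grid, t_best) if outcome else 1 - p1(w_grid, t_best)
    posterior = posterior * like
    posterior /= posterior.sum()

w_hat = np.sum(posterior * w_grid)                  # posterior mean estimate
print(f"estimate = {w_hat:.3f}, truth = {w_true}")
```

The same pattern, with richer likelihood models and particle filters in place of the grid, underlies practical adaptive calibration and magnetometry protocols.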
QBism (Quantum Bayesianism), developed by Christopher Fuchs and Rüdiger Schack out of earlier quantum-Bayesian work with Carlton Caves, proposes that quantum probabilities are not properties of physical systems but Bayesian degrees of belief held by agents. In this view, the wave function is not a physical entity but a tool an agent uses to organize beliefs about future experiences. QBism dissolves the measurement problem by denying that wave function collapse is a physical process: it is simply a Bayesian update. While controversial, QBism has stimulated deep research into the foundations of both quantum mechanics and probability theory.
Quantum Error Correction and Verification
As quantum computers scale up, Bayesian methods become essential for characterizing and mitigating errors. Bayesian process tomography estimates the noise channel affecting quantum gates. Bayesian model selection helps identify whether errors are Markovian or non-Markovian, correlated or independent. And Bayesian inference on syndrome measurement data enables real-time adaptive error correction in fault-tolerant quantum computing architectures.
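As a small, self-contained illustration of Bayesian inference on syndrome data, the sketch below decodes the 3-qubit bit-flip repetition code under an assumed independent-flip noise model with known rate p; real decoders handle far larger codes and correlated noise, but the Bayesian structure is the same.

```python
# A minimal sketch of Bayesian syndrome decoding for the 3-qubit bit-flip
# code: condition the prior over the 8 error patterns on the observed
# syndrome, then apply the maximum a posteriori correction.
from itertools import product

p_flip = 0.05                   # assumed physical error rate

def syndrome(err):
    """Stabilizer parities Z1Z2 and Z2Z3 for error pattern err."""
    return (err[0] ^ err[1], err[1] ^ err[2])

def posterior(observed):
    post = {}
    for err in product((0, 1), repeat=3):
        if syndrome(err) != observed:
            continue            # zero likelihood: inconsistent with syndrome
        w = sum(err)            # number of flipped qubits
        post[err] = p_flip**w * (1 - p_flip)**(3 - w)   # prior P(err)
    z = sum(post.values())
    return {e: pr / z for e, pr in post.items()}

post = posterior((1, 0))        # e.g. Z1Z2 parity violated, Z2Z3 satisfied
correction = max(post, key=post.get)
print(post)                     # {(1,0,0): ~0.95, (0,1,1): ~0.05}
print(correction)               # (1, 0, 0): flip qubit 1 back
```

Extending the prior to correlated or non-Markovian noise models, and comparing their evidence on recorded syndrome histories, is exactly the Bayesian model selection task described above.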
Bayesian Approaches to Quantum Foundations
Beyond QBism, Bayesian reasoning illuminates foundational questions. Bayesian confirmation theory has been applied to assess the evidential support for competing interpretations of quantum mechanics. The de Finetti theorem for quantum states, the quantum analogue of de Finetti's classical representation theorem for exchangeable sequences, provides a Bayesian justification for the quantum state concept itself: an unknown quantum state is the Bayesian agent's summary of beliefs about an exchangeable sequence of quantum systems.
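For reference, the representation the theorem guarantees can be written in one line (a standard textbook form; the notation here is illustrative, not taken from the section above):

```latex
% Quantum de Finetti representation: an exchangeable state (permutation-
% invariant and extendable to any number of systems) is a unique mixture
% of independent, identically prepared states.
\[
  \rho^{(n)} \;=\; \int P(\rho)\, \rho^{\otimes n}\, \mathrm{d}\rho
\]
% "Learning an unknown state" is then just Bayesian updating of P(rho).
```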
"Quantum mechanics is a law of thought — a normative framework for making decisions and forming expectations in a quantum world, not a description of a reality behind the scenes." — Christopher Fuchs, founding architect of QBism
Current Frontiers
Bayesian online learning algorithms adapt quantum control strategies in real time. Bayesian optimization tunes the parameters of variational quantum eigensolvers and quantum approximate optimization algorithms. And the intersection of quantum computing with Bayesian inference itself — quantum-enhanced Bayesian inference, where quantum resources speed up posterior sampling — represents a tantalizing frontier at the boundary of computation and physics.
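As one concrete illustration of the second point, here is a minimal numpy/scipy sketch of Bayesian optimization applied to a one-parameter variational circuit, with a stand-in analytic energy function in place of a real quantum expectation value; a Gaussian-process surrogate with an RBF kernel and an expected-improvement acquisition rule chooses each next angle. All names and constants are illustrative assumptions.

```python
# A minimal sketch of Bayesian optimization for a variational circuit angle,
# using a numpy-only GP surrogate and expected improvement (minimization).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

def energy(theta):
    """Stand-in for <H> measured on a one-parameter variational circuit."""
    return -np.cos(theta) + 0.3 * np.sin(2 * theta)

def gp_posterior(X, y, Xs, ell=0.7, sigma_n=1e-3):
    """GP posterior mean/std on grid Xs, RBF kernel with unit variance."""
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)
    K = k(X, X) + sigma_n * np.eye(len(X))
    Ks = k(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.einsum('ij,ij->j', Ks, np.linalg.solve(K, Ks))
    return mu, np.sqrt(np.clip(var, 1e-12, None))

grid = np.linspace(0, 2 * np.pi, 300)
X = rng.uniform(0, 2 * np.pi, size=3)       # initial random evaluations
y = energy(X)

for _ in range(15):
    mu, sd = gp_posterior(X, y, grid)
    imp = y.min() - mu                      # improvement over best energy
    z = imp / sd
    ei = imp * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
    theta_next = grid[int(np.argmax(ei))]
    X = np.append(X, theta_next)
    y = np.append(y, energy(theta_next))

print(f"best angle = {X[np.argmin(y)]:.3f}, energy = {y.min():.4f}")
```

In a real variational quantum eigensolver loop, energy(theta) would be replaced by a noisy estimate of the Hamiltonian expectation from circuit executions, which is precisely the regime where sample-efficient Bayesian optimization pays off.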