Modern astrophysics is awash in data from telescopes, satellites, and detectors spanning the electromagnetic spectrum and beyond. The scientific questions — What is the age of the universe? How fast is it expanding? What fraction is dark matter versus dark energy? — require fitting complex physical models to noisy observations. Bayesian inference provides the natural framework: physical theory supplies the likelihood, prior knowledge constrains parameter spaces, and the posterior distribution delivers both best-fit parameters and their uncertainties.
Cosmological Parameter Estimation
The cosmic microwave background (CMB) — radiation from 380,000 years after the Big Bang — is the most precisely measured cosmological observable. The Planck satellite's analysis of the CMB power spectrum used Bayesian methods to estimate the six fundamental parameters of the ΛCDM model (the Hubble constant, baryon density, dark matter density, optical depth, scalar spectral index, and amplitude of fluctuations). The posterior distributions from CosmoMC, a widely used Bayesian MCMC sampler, define our current knowledge of cosmic composition.
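The principle behind samplers like CosmoMC can be sketched in miniature. The following toy (my own illustration, not Planck's pipeline) runs a Metropolis-Hastings random walk to get the posterior of a single parameter, the Hubble constant, from simulated galaxy distance-velocity pairs with a flat prior and a Gaussian likelihood:

```python
# A minimal sketch, assuming a linear Hubble law v = H0 * d with known
# Gaussian velocity noise. All numbers here are made up for illustration.
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "observations": 50 galaxies, true H0 = 70 km/s/Mpc
true_H0, sigma_v = 70.0, 300.0            # km/s/Mpc, km/s measurement noise
d = rng.uniform(10, 200, size=50)         # distances in Mpc
v = true_H0 * d + rng.normal(0, sigma_v, size=50)

def log_posterior(H0):
    """Flat prior on H0 in [50, 100]; Gaussian likelihood."""
    if not 50.0 <= H0 <= 100.0:
        return -np.inf
    return -0.5 * np.sum((v - H0 * d) ** 2) / sigma_v**2

# Metropolis-Hastings random walk over H0
samples, H0 = [], 60.0
logp = log_posterior(H0)
for _ in range(20000):
    prop = H0 + rng.normal(0, 0.5)        # symmetric proposal
    logp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        H0, logp = prop, logp_prop        # accept the move
    samples.append(H0)

post = np.array(samples[5000:])           # discard burn-in
print(f"H0 = {post.mean():.1f} +/- {post.std():.1f} km/s/Mpc")
```

The real Planck analysis explores a six-dimensional (or larger) space with a far more expensive likelihood, but the accept/reject logic is the same.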
The Bayes factor B₁₂ = p(d | M₁) / p(d | M₂), the ratio of the two models' marginal likelihoods (evidences) for the data d, quantifies the evidence for model M₁ relative to M₂.
Bayesian model comparison via the evidence (marginal likelihood) has been used to test extensions to ΛCDM — curvature, extra neutrino species, dark energy dynamics — finding that the simplest six-parameter model is consistently favored. Nested sampling, introduced by John Skilling and since widely adopted across astrophysics, efficiently computes the evidence integral.
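A stripped-down nested sampler fits in a few dozen lines. This sketch (a toy, not Skilling's full algorithm or a production code like MultiNest) estimates the evidence Z = ∫ L(θ) π(θ) dθ for a standard-normal likelihood under a uniform prior on [-5, 5], where Z ≈ 0.1 analytically; new live points are drawn by simple rejection from the prior, which real implementations replace with smarter constrained sampling:

```python
# Toy nested sampling: prior volume shrinks by e^(-1/nlive) per iteration,
# and each discarded point contributes (volume shell) x (its likelihood).
import numpy as np

rng = np.random.default_rng(1)

def log_like(theta):
    return -0.5 * theta**2 - 0.5 * np.log(2.0 * np.pi)

nlive = 200
live = rng.uniform(-5.0, 5.0, nlive)       # initial draws from the prior
live_logL = log_like(live)

logZ, logX = -np.inf, 0.0                  # accumulated evidence, prior volume
for i in range(1400):
    worst = int(np.argmin(live_logL))
    logX_new = -(i + 1) / nlive
    logw = np.log(np.exp(logX) - np.exp(logX_new)) + live_logL[worst]
    logZ = np.logaddexp(logZ, logw)        # dead point's evidence share
    # Replace the worst live point with a prior draw above its likelihood
    threshold = live_logL[worst]
    cand = rng.uniform(-5.0, 5.0)
    while log_like(cand) <= threshold:
        cand = rng.uniform(-5.0, 5.0)
    live[worst], live_logL[worst] = cand, log_like(cand)
    logX = logX_new

# Add the remaining live points' share of the evidence
logZ = np.logaddexp(logZ, logX - np.log(nlive) +
                    np.log(np.sum(np.exp(live_logL))))
print(f"log Z = {logZ:.3f}  (analytic: {np.log(0.1):.3f})")
```

Running two such computations for two competing models and differencing the log-evidences gives the log Bayes factor directly.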
Gravitational Wave Astronomy
The detection and characterization of gravitational waves by LIGO and Virgo is fundamentally Bayesian. Each candidate event is analyzed with Bayesian parameter estimation to infer the masses, spins, distances, and orientations of the merging compact objects. The LALInference and Bilby pipelines use nested sampling and MCMC to explore the 15-dimensional parameter space of binary black hole mergers, producing posterior distributions that quantify everything known about the source.
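The structure of such an analysis can be shown with a two-parameter toy (my own illustration, nothing like the 15-dimensional LALInference/Bilby problem): infer the amplitude and frequency of a simulated sinusoidal "signal" buried in Gaussian noise, evaluating the posterior on a grid and marginalizing each parameter:

```python
# A toy "signal in noise" posterior. Amplitude plays the role a distance-like
# parameter would (louder = closer); frequency stands in for an intrinsic
# source parameter. All values are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)

t = np.linspace(0.0, 1.0, 256)
true_A, true_f, sigma = 1.0, 10.0, 0.8
data = true_A * np.sin(2 * np.pi * true_f * t) + rng.normal(0, sigma, t.size)

A_grid = np.linspace(0.1, 2.0, 120)
f_grid = np.linspace(8.0, 12.0, 120)

# Log-likelihood on the (A, f) grid; flat priors within the grid bounds
logL = np.empty((A_grid.size, f_grid.size))
for i, A in enumerate(A_grid):
    for j, f in enumerate(f_grid):
        model = A * np.sin(2 * np.pi * f * t)
        logL[i, j] = -0.5 * np.sum((data - model) ** 2) / sigma**2

post = np.exp(logL - logL.max())
post /= post.sum()
post_A = post.sum(axis=1)                  # marginal posterior for amplitude
post_f = post.sum(axis=0)                  # marginal posterior for frequency

A_mean = np.sum(A_grid * post_A)
f_mean = np.sum(f_grid * post_f)
print(f"A = {A_mean:.2f}, f = {f_mean:.2f} Hz")
```

Grid evaluation is hopeless in 15 dimensions, which is precisely why the production pipelines rely on MCMC and nested sampling instead.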
On September 14, 2015, LIGO detected gravitational waves from a binary black hole merger (GW150914). Bayesian parameter estimation determined the component masses (36 and 29 solar masses), the distance (410 Mpc), and the final black hole mass and spin. The search's false alarm rate — less than one event per 200,000 years of observation — established the detection with overwhelming statistical confidence.
Exoplanet Detection and Characterization
Bayesian methods are essential to exoplanet science. Radial velocity planet searches use Bayesian model selection to determine the number of planets orbiting a star: are the data better explained by one planet, two, or none? Transit photometry uses Bayesian fitting to extract planet radius, orbital period, and atmospheric properties from light curves. Bayesian atmospheric retrieval codes like NEMESIS and petitRADTRANS infer the chemical composition and temperature structure of exoplanet atmospheres from transmission spectra.
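The "one planet or none?" question reduces to a Bayes factor. A hedged toy (not a real RV pipeline, and with the orbital period assumed known to keep the integral two-dimensional): compare a "no planet" model against "one planet" by integrating the likelihood over semi-amplitude and phase on a grid.

```python
# Toy radial-velocity model selection. Simulated data contain a genuine
# 5 m/s sinusoid, so the one-planet model should win decisively.
import numpy as np

rng = np.random.default_rng(3)

t = np.linspace(0.0, 30.0, 60)             # observation times in days
P, true_K, sigma = 10.0, 5.0, 2.0          # period (d), semi-amplitude, noise (m/s)
v = true_K * np.sin(2 * np.pi * t / P + 1.0) + rng.normal(0, sigma, t.size)

def log_like(model):
    return -0.5 * np.sum((v - model) ** 2 / sigma**2
                         + np.log(2 * np.pi * sigma**2))

# M0: no planet (no free parameters; noise level assumed known)
logZ0 = log_like(np.zeros_like(t))

# M1: v(t) = K sin(2*pi*t/P + phi), uniform priors K in [0, 20], phi in [0, 2*pi]
K_grid = np.linspace(0.0, 20.0, 200)
phi_grid = np.linspace(0.0, 2 * np.pi, 200)
logL = np.array([[log_like(K * np.sin(2 * np.pi * t / P + phi))
                  for phi in phi_grid] for K in K_grid])
dK, dphi = K_grid[1] - K_grid[0], phi_grid[1] - phi_grid[0]
# Evidence = average of L over the uniform prior, done in log space
m = logL.max()
logZ1 = m + np.log(np.sum(np.exp(logL - m)) * dK * dphi / (20.0 * 2 * np.pi))

lnB = logZ1 - logZ0
print(f"ln Bayes factor (planet vs none) = {lnB:.1f}")
```

Note the built-in Occam penalty: the one-planet evidence averages the likelihood over the whole prior range of K and phase, so a model with extra parameters only wins when the data genuinely demand them.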
Galaxy Evolution and Large-Scale Structure
Bayesian spectral energy distribution (SED) fitting infers the stellar mass, star formation rate, and dust content of galaxies from multi-band photometry. Bayesian photometric redshift estimation provides probability distributions over galaxy distances when spectroscopic data are unavailable. And Bayesian hierarchical models enable the study of galaxy populations, separating intrinsic scatter from measurement error.
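The scatter-versus-error separation that hierarchical models perform can be illustrated with a minimal example (my own construction, not a named survey pipeline): each galaxy's measured value is its true value plus a known per-object error, and the true values are themselves drawn from a population with unknown mean and intrinsic scatter.

```python
# Minimal hierarchical model: after integrating out each galaxy's true value,
# its marginal likelihood is N(obs_i; mu, sigma_int^2 + err_i^2), so the
# population parameters (mu, sigma_int) can be inferred on a grid.
import numpy as np

rng = np.random.default_rng(5)

# Simulate: population mean 10.5, intrinsic scatter 0.3 dex,
# per-galaxy measurement errors of 0.1-0.3 dex (all values invented)
n = 200
true_mu, true_sig = 10.5, 0.3
err = rng.uniform(0.1, 0.3, n)
obs = rng.normal(true_mu, true_sig, n) + rng.normal(0, err)

mu_grid = np.linspace(10.0, 11.0, 200)
sig_grid = np.linspace(0.01, 1.0, 200)
logpost = np.empty((mu_grid.size, sig_grid.size))
for i, mu in enumerate(mu_grid):
    for j, s in enumerate(sig_grid):
        var = s**2 + err**2                # intrinsic + measurement variance
        logpost[i, j] = -0.5 * np.sum((obs - mu) ** 2 / var + np.log(var))

post = np.exp(logpost - logpost.max())
post /= post.sum()
mu_hat = np.sum(mu_grid * post.sum(axis=1))
sig_hat = np.sum(sig_grid * post.sum(axis=0))
print(f"mu = {mu_hat:.2f}, intrinsic scatter = {sig_hat:.2f}")
```

A naive standard deviation of the observations would overestimate the intrinsic scatter, because it conflates it with measurement noise; the hierarchical likelihood disentangles the two.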
"In astrophysics, we cannot repeat experiments — each observation of a supernova, a gravitational wave, or the cosmic microwave background is unique. Bayesian inference is the only coherent framework for extracting knowledge from unrepeatable observations." — Roberto Trotta, cosmologist and author of Bayes in the Sky
Current Frontiers
Simulation-based inference (likelihood-free methods) is enabling Bayesian analysis of complex astrophysical simulations where the likelihood is intractable. Machine learning emulators — Gaussian process surrogates and neural network emulators — accelerate Bayesian parameter estimation by orders of magnitude. And the next generation of surveys (Vera Rubin Observatory, Euclid, LISA) will demand scalable Bayesian methods to handle petabytes of data.
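The emulator idea is simple enough to sketch in pure NumPy (toy kernel and hyperparameters of my own choosing, not any survey's actual emulator): fit a Gaussian process to a handful of evaluations of an "expensive" function, then predict it cheaply everywhere else.

```python
# Gaussian-process emulation of an expensive function: train on 15
# evaluations, then replace further calls with the GP posterior mean.
import numpy as np

def expensive_simulation(theta):
    # Stand-in for a slow simulator or likelihood evaluation
    return np.sin(3.0 * theta)

def rbf(a, b, length=0.3, amp=1.0):
    """Squared-exponential kernel between two 1-D point sets."""
    return amp**2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

# Train on 15 "expensive" evaluations
x_train = np.linspace(0.0, 1.5, 15)
y_train = expensive_simulation(x_train)

jitter = 1e-8 * np.eye(x_train.size)       # numerical stabilizer
K = rbf(x_train, x_train) + jitter
alpha = np.linalg.solve(K, y_train)        # precomputed once

def emulator(x):
    """GP posterior mean: a cheap surrogate for expensive_simulation."""
    return rbf(x, x_train) @ alpha

x_test = np.linspace(0.0, 1.5, 300)
err = np.max(np.abs(emulator(x_test) - expensive_simulation(x_test)))
print(f"max emulation error: {err:.5f}")
```

Each emulator call is a small matrix-vector product, so an MCMC chain that would have required millions of simulator runs needs only the handful used for training — the source of the orders-of-magnitude speedups mentioned above.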