Section on Bayesian Computation

The ISBA Section on Bayesian Computation is devoted to advancing the theory and practice of computational methods that underpin modern Bayesian inference, from MCMC and variational methods to scalable algorithms for big data.

Modern Bayesian statistics is inseparable from computation. Fitting complex hierarchical models, performing posterior inference in high dimensions, and handling massive datasets all depend on powerful computational algorithms. The Section on Bayesian Computation within ISBA provides a dedicated community for researchers who develop, analyze, and apply these methods.

Scope and Mission

The section's remit covers the full spectrum of computational approaches to Bayesian inference. This includes foundational methods such as Markov chain Monte Carlo (MCMC), sequential Monte Carlo (SMC), and importance sampling, as well as newer paradigms including variational inference, approximate Bayesian computation (ABC), and simulation-based inference. The section also addresses the computational infrastructure needed to deploy these methods, including probabilistic programming languages and high-performance computing frameworks.
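To make one of the foundational methods named above concrete, the following is a minimal sketch of self-normalized importance sampling: draw from an easy proposal, reweight by the (unnormalized) target density, and estimate posterior expectations from the weighted draws. The specific target, proposal, and settings here are illustrative assumptions, not section material.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized log target density: standard normal.
log_target = lambda x: -0.5 * x**2
# Proposal: a wider normal we can sample from directly.
draws = rng.normal(0.0, 2.0, size=50_000)
log_proposal = lambda x: -0.5 * (x / 2.0) ** 2 - np.log(2.0)

# Self-normalized importance weights (stabilized in log space).
log_w = log_target(draws) - log_proposal(draws)
w = np.exp(log_w - log_w.max())
w /= w.sum()

posterior_mean = np.sum(w * draws)  # estimates E[x] under the target
ess = 1.0 / np.sum(w**2)            # effective sample size of the weighted draws
```

The effective sample size is the standard diagnostic here: when the proposal matches the target poorly, a few weights dominate and `ess` collapses, signaling an unreliable estimate.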

The Computational Revolution

The modern era of Bayesian statistics arguably began with the computational revolution of the late 1980s and early 1990s, when MCMC methods made it possible to fit models that were previously intractable. The Section on Bayesian Computation ensures that this tradition of algorithmic innovation continues, supporting research into methods that push the boundaries of what is computationally feasible.

Activities

The section organizes workshops, invited sessions at ISBA World Meetings, and satellite events focused on computational methodology. A highlight of the section's activities is its involvement in the MCMSki conference series, which brings together researchers working on Monte Carlo methods and Bayesian computation in an informal and interactive setting.

The section also sponsors webinars and short courses aimed at disseminating best practices in Bayesian computation. Topics have ranged from practical MCMC diagnostics and convergence assessment to cutting-edge research on gradient-based sampling methods such as Hamiltonian Monte Carlo and the No-U-Turn Sampler (NUTS).
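As a toy illustration of the diagnostics such short courses cover, the sketch below runs a random-walk Metropolis sampler from several starting points and computes a split R-hat statistic (Gelman–Rubin style) to assess convergence. All names, targets, and tuning values are assumptions made for the example.

```python
import numpy as np

def rw_metropolis(log_target, x0, n_steps, step, rng):
    """Random-walk Metropolis: the simplest general-purpose MCMC sampler."""
    x, lp = x0, log_target(x0)
    chain = np.empty(n_steps)
    for t in range(n_steps):
        prop = x + step * rng.normal()
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject
            x, lp = prop, lp_prop
        chain[t] = x
    return chain

def split_rhat(chains):
    """Split R-hat: compare within-chain and between-chain variance."""
    halves = np.array([h for c in chains for h in np.split(c, 2)])
    m, n = halves.shape
    means = halves.mean(axis=1)
    B = n * means.var(ddof=1)                 # between-chain variance
    W = halves.var(axis=1, ddof=1).mean()     # within-chain variance
    var_plus = (n - 1) / n * W + B / n
    return np.sqrt(var_plus / W)

rng = np.random.default_rng(1)
log_target = lambda x: -0.5 * x**2            # standard normal target
chains = [rw_metropolis(log_target, x0, 4000, 2.4, rng)
          for x0 in (-5.0, 0.0, 5.0, 2.0)]
rhat = split_rhat([c[1000:] for c in chains])  # discard warm-up, then diagnose
```

Values of R-hat near 1 indicate the overdispersed chains have mixed into the same distribution; values well above 1 (commonly above about 1.01) flag non-convergence.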

Key Research Areas

Members of the section are active across a wide range of computational research areas:

Scalable inference addresses the challenge of applying Bayesian methods to datasets with millions or billions of observations, using techniques such as stochastic gradient MCMC and divide-and-conquer strategies.

Approximate methods, including variational Bayes and expectation propagation, trade exactness for computational speed.

Probabilistic programming provides flexible software frameworks—such as Stan, PyMC, and JAGS—that automate the mechanics of Bayesian computation, enabling practitioners to focus on modeling rather than algorithm implementation.
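To give a flavor of scalable inference, here is a minimal sketch of stochastic gradient Langevin dynamics (SGLD), one instance of stochastic gradient MCMC: each step uses a minibatch gradient of the log posterior plus injected Gaussian noise. The model, data sizes, and step size are all illustrative assumptions; in the full theory the step size decreases over iterations, whereas this sketch keeps it fixed for simplicity.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: 100,000 observations from N(theta_true, 1).
theta_true = 1.5
N = 100_000
data = rng.normal(theta_true, 1.0, size=N)

# Model: x_i ~ N(theta, 1) with prior theta ~ N(0, 10).
def grad_log_prior(theta):
    return -theta / 10.0

def grad_log_lik(theta, batch):
    return np.sum(batch - theta)

eps = 1e-5          # fixed step size (a simplification of the theory)
batch_size = 100
theta = 0.0
samples = []
for t in range(5_000):
    batch = data[rng.integers(0, N, size=batch_size)]
    # Minibatch gradient, rescaled to estimate the full-data gradient.
    grad = grad_log_prior(theta) + (N / batch_size) * grad_log_lik(theta, batch)
    # Langevin update: half-step along the gradient plus injected noise.
    theta += 0.5 * eps * grad + np.sqrt(eps) * rng.normal()
    samples.append(theta)

posterior_mean = np.mean(samples[1_000:])  # close to the data mean
```

Each iteration touches only 100 of the 100,000 observations, which is the point of the method: per-step cost is independent of the dataset size, at the price of extra minibatch noise in the resulting chain.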

"Bayesian computation is not a support service for Bayesian statistics—it is a research discipline in its own right, with deep connections to mathematics, computer science, and applied science."— Christian Robert

Community and Collaboration

The section fosters collaboration between methodological researchers and applied scientists, ensuring that computational advances are motivated by real problems and that practitioners benefit from the latest algorithmic developments. This bridge between theory and practice is one of the section's most important contributions to the Bayesian community.