High-Dimensional Bayesian Model Comparison in Cosmology with GPU-accelerated Nested Sampling and Neural Emulators

In our latest work (arXiv:2509.13307), lead author Toby Lovick and our team tackle one of the major computational bottlenecks in modern cosmology: performing robust Bayesian model comparison in high-dimensional parameter spaces. As astronomical datasets from surveys like Euclid and the Vera Rubin Observatory grow in size and complexity, our theoretical models must incorporate more parameters to capture nuisance effects and explore new physics. This has pushed traditional inference methods to their limits, particularly those designed to calculate the Bayesian evidence, a critical quantity for comparing competing cosmological models.
The Challenge: Scaling Bayesian Evidence Calculation
Bayesian inference provides a powerful framework not only for estimating parameters but also for assessing the relative merit of different physical theories. The Bayesian evidence, or marginal likelihood, naturally penalises overly complex models, offering a principled approach to model selection often referred to as Occam's razor (Trotta, 2008). Nested Sampling (Skilling, 2006) is a gold-standard algorithm for this task, calculating the evidence while simultaneously producing posterior samples.
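Concretely, for a model $M$ with parameters $\theta$ and data $D$, the evidence is the likelihood averaged over the prior,

$$\mathcal{Z} = p(D \mid M) = \int \mathcal{L}(\theta)\,\pi(\theta)\,\mathrm{d}\theta,$$

and Nested Sampling recasts this as a one-dimensional integral over the enclosed prior volume $X$, $\mathcal{Z} = \int_0^1 \mathcal{L}(X)\,\mathrm{d}X$, which it evaluates by steadily compressing a population of live points towards higher likelihood.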
However, the computational cost of traditional CPU-based implementations, such as the widely used `polychord`, scales poorly with dimensionality. For cutting-edge analyses involving dozens of parameters, these methods can become prohibitively slow, lagging behind gradient-based MCMC samplers such as Hamiltonian Monte Carlo (HMC), which, when paired with separate evidence estimators, have offered a faster alternative (Piras et al., 2024). This created a pressing need to accelerate Nested Sampling to keep it competitive and ensure robust model comparison remains feasible.
A Modern Framework: GPUs, JAX, and Nested Slice Sampling
Our paper demonstrates a solution by building a modern, end-to-end inference pipeline that leverages three key technologies:
- GPU Acceleration: We employ a highly parallelized Nested Sampling algorithm, the Nested Slice Sampler (NSS) developed by David Yallup and collaborators (Yallup et al., 2025). By vectorizing the generation of new live points, this implementation can fully exploit the massive parallelism of modern GPUs, evolving hundreds of points simultaneously rather than one at a time.
- Neural Emulators: The performance of any sampler is limited by the speed of the likelihood evaluation. We replace computationally expensive Boltzmann solvers like `CAMB` with the `JAX`-based neural emulator `CosmoPower-JAX`. This provides near-instantaneous, differentiable predictions for cosmological observables such as the matter power spectrum, which is essential for unlocking the full potential of GPU hardware.
- End-to-End `JAX` Pipeline: The entire framework, from the sampler to the likelihood, is built within the `JAX` ecosystem. This eliminates communication overhead between components (e.g., CPU and GPU) and allows for just-in-time (JIT) compilation, resulting in a highly optimised and efficient analysis tool. A minimal sketch of this batched, JIT-compiled pattern follows this list.
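To make the pattern concrete, here is a minimal, self-contained sketch of a batched, JIT-compiled likelihood evaluation in `JAX`. This is not the paper's pipeline code: the emulator is a toy stand-in, and all names (`emulate_spectrum`, `log_likelihood`, `batched_loglike`) are hypothetical.

```python
import jax
import jax.numpy as jnp

# Toy stand-in for a neural emulator such as CosmoPower-JAX:
# maps two "cosmological" parameters to a mock power spectrum.
def emulate_spectrum(theta):
    ks = jnp.linspace(0.01, 1.0, 100)
    amplitude, tilt = theta[0], theta[1]
    return amplitude * ks ** (tilt - 1.0)

# Hypothetical Gaussian likelihood against fixed mock data.
mock_data = emulate_spectrum(jnp.array([2.0, 0.96]))
sigma = 0.05

def log_likelihood(theta):
    model = emulate_spectrum(theta)
    return -0.5 * jnp.sum(((model - mock_data) / sigma) ** 2)

# The key idea: vmap evaluates every live point in one batched call,
# and jit compiles the whole computation with XLA for the GPU.
batched_loglike = jax.jit(jax.vmap(log_likelihood))

key = jax.random.PRNGKey(0)
live_points = jax.random.uniform(
    key, shape=(512, 2),
    minval=jnp.array([1.0, 0.8]),
    maxval=jnp.array([3.0, 1.1]),
)
print(batched_loglike(live_points).shape)  # (512,)
```

In a full Nested Slice Sampling run, the same vmap-over-live-points pattern applies to the slice-sampling proposal step itself, so the dead-point replacement loop stays entirely on the GPU.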
Putting the Framework to the Test
We validated our approach on two distinct cosmological problems:
- A 6D CMB Analysis: In this idealised, cosmic-variance-limited scenario, the likelihood is perfectly vectorizable. Our GPU-NS framework completed the analysis in just 12 seconds on a single GPU: a staggering 300x speed-up compared to the hour-long runtime of a traditional CPU-based approach.
- A 39D Cosmic Shear Analysis: This represents a frontier challenge for Bayesian inference. A previous analysis using `polychord` and `CAMB` was estimated to take 8 months. Our GPU-accelerated framework, combining NSS with `CosmoPower-JAX`, completed the analysis for two competing dark energy models ($\Lambda$CDM and $w_0w_a$) in just 2 days on a single A100 GPU.
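With both evidences in hand, the model comparison itself is a one-line computation via the Bayes factor (a standard relation, not a number quoted from the paper):

$$\ln B = \ln \mathcal{Z}_{w_0 w_a} - \ln \mathcal{Z}_{\Lambda\mathrm{CDM}},$$

where $\ln B > 0$ favours the evolving dark energy model and $\ln B < 0$ favours $\Lambda$CDM, typically interpreted on the Jeffreys scale.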
This result is transformative. It places Nested Sampling on an equal computational footing with the fastest HMC-based methods for high-dimensional problems. The work by Toby Lovick, David Yallup, Will Handley, and collaborators demonstrates that with modern hardware and software, we no longer have to choose between speed and the robust evidence calculation that Nested Sampling provides. This revitalises a cornerstone method of Bayesian inference, ensuring it will remain a vital tool for scrutinising our cosmological models with the next generation of astronomical data.


