12th June 2007 — Nested Sampling
Iain will discuss:
Nested sampling estimates directly how the likelihood function relates to prior mass. The evidence (alternatively the marginal likelihood, marginal density of the data, or the prior predictive) is immediately obtained by summation. It is the prime result of the computation, and is accompanied by an estimate of numerical uncertainty. Samples from the posterior distribution are an optional byproduct, obtainable for any temperature. The method relies on sampling within a hard constraint on likelihood value, as opposed to the softened likelihood of annealing methods. Progress depends only on the shape of the "nested" contours of likelihood, and not on the likelihood values. This invariance (over monotonic relabelling) allows the method to deal with a class of phase-change problems which effectively defeat thermal annealing.
Nested sampling is a recent Monte Carlo algorithm by John Skilling. It provides estimates and error bars for any quantity of interest in Bayesian inference problems. This includes the normalization of the posterior, known as the evidence or marginal likelihood, p(data|model).
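The core loop is easy to sketch: maintain N "live" points drawn from the prior, repeatedly discard the point with the lowest likelihood L_min, credit the evidence with L_min times the prior mass shed (the mass shrinks by a factor of roughly e^{-1/N} per step), and replace the discarded point with a fresh prior draw satisfying the hard constraint L > L_min. The toy sketch below, assuming a 1D problem where rejection sampling from the prior is affordable (real implementations use MCMC within the constraint), estimates the log-evidence; all function and parameter names are illustrative, not from any particular implementation.

```python
import math
import random

def logaddexp(a, b):
    """Numerically stable log(e^a + e^b)."""
    if a == -math.inf:
        return b
    if b == -math.inf:
        return a
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling(loglike, prior_sample, n_live=100, n_iter=600):
    """Toy nested sampling: returns an estimate of log Z = log p(data).

    loglike: maps a point to its log-likelihood.
    prior_sample: draws one point from the prior.
    New points come from rejection sampling against the prior under the
    hard constraint L > L_min -- fine for toy problems, far too slow in
    general (real codes explore the constrained prior with MCMC).
    """
    live = [prior_sample() for _ in range(n_live)]
    logL = [loglike(x) for x in live]
    logZ = -math.inf
    # width of the first shell of prior mass: X_0 - X_1 with X_i = e^{-i/N}
    log_width = math.log(1.0 - math.exp(-1.0 / n_live))
    for _ in range(n_iter):
        worst = min(range(n_live), key=lambda j: logL[j])
        logL_min = logL[worst]
        # Z += L_min * (X_{i-1} - X_i)
        logZ = logaddexp(logZ, logL_min + log_width)
        log_width -= 1.0 / n_live  # prior mass shrinks by ~e^{-1/N}
        # replace the worst point with a prior draw satisfying L > L_min
        while True:
            x = prior_sample()
            if loglike(x) > logL_min:
                break
        live[worst], logL[worst] = x, loglike(x)
    # final correction: remaining mass X_final spread over the live points
    log_X = -n_iter / n_live
    for j in range(n_live):
        logZ = logaddexp(logZ, logL[j] + log_X - math.log(n_live))
    return logZ
```

For example, with a unit-Gaussian likelihood and a uniform prior on [-5, 5] (prior density 1/10), the true evidence is close to 0.1, so the estimate should land near log 0.1 ≈ -2.30, with scatter on the order of 1/sqrt(n_live).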
The purpose of this reading group is to understand nested sampling and its position with respect to related approaches. Questions and discussion are encouraged.