Proposals which speed up function-space MCMC
Law, K. J. H. (2014) Proposals which speed up function-space MCMC. Journal of Computational and Applied Mathematics, 262, pp. 127–138. doi:10.1016/j.cam.2013.07.026. ISSN 0377-0427.
Research output not available from this repository.
Official URL: http://dx.doi.org/10.1016/j.cam.2013.07.026
Abstract
Inverse problems lend themselves naturally to a Bayesian formulation, in which the quantity of interest is a posterior distribution of state and/or parameters given some uncertain observations. In the common case in which the forward operator is smoothing, the inverse problem is ill-posed. Well-posedness is imposed via regularization in the form of a prior, which is often Gaussian. Under quite general conditions, it can be shown that the posterior is absolutely continuous with respect to the prior and may be well-defined on function space in terms of its density with respect to the prior. In this case, by constructing a proposal for which the prior is invariant, one can define Metropolis–Hastings schemes for MCMC which are well-defined on function space (Stuart (2010) [1], Cotter et al. [2]), and hence do not degenerate as the dimension of the underlying quantity of interest increases to infinity, e.g. under mesh refinement when approximating a PDE in finite dimensions. However, in practice, despite the attractive theoretical properties of the currently available schemes, they may still suffer from long correlation times, particularly if the data are very informative about some of the unknown parameters. In fact, in this case it may be the directions of the posterior which coincide with the (already known) prior that decorrelate the slowest. The information incorporated into the posterior through the data is often contained within some finite-dimensional subspace, in an appropriate basis, perhaps even one defined by eigenfunctions of the prior. We aim to exploit this fact and improve the mixing time of function-space MCMC by careful rescaling of the proposal. To this end, we introduce two new basic methods of increasing complexity, involving (i) characteristic function truncation of high frequencies and (ii) Hessian information to interpolate between low and high frequencies.
The second, more sophisticated version, bears some similarities with recent methods which exploit local curvature information, for example RMHMC, Girolami and Calderhead (2011) [3], and stochastic Newton, Martin et al. (2012) [4]. These ideas are illustrated with numerical experiments on the Bayesian inversion of the heat equation and Navier–Stokes equation, given noisy observations.
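The prior-invariant proposals the abstract refers to are of the preconditioned Crank–Nicolson (pCN) type introduced in Cotter et al. [2]: the proposal leaves the Gaussian prior invariant, so the Metropolis–Hastings acceptance ratio depends only on the likelihood. A minimal sketch of one such sampler in a truncated eigenbasis follows; the dimension, eigenvalue decay, and scalar observation model are illustrative assumptions, not the paper's heat-equation or Navier–Stokes experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative finite truncation of a function-space Gaussian prior N(0, C):
# d modes with covariance eigenvalues decaying like j^{-2} (an assumption).
d = 50
prior_std = 1.0 / np.arange(1, d + 1)  # sqrt of the prior eigenvalues

# Hypothetical negative log-likelihood Phi: a noisy observation of mode 0,
# standing in for "data informative about some of the unknown parameters".
y, noise_std = 0.5, 0.1

def phi(u):
    return 0.5 * ((u[0] - y) / noise_std) ** 2

def pcn_step(u, beta=0.2):
    """One pCN Metropolis-Hastings step. Because the proposal preserves the
    prior, the acceptance probability involves only Phi, and the scheme is
    well-defined independently of the discretization dimension d."""
    xi = prior_std * rng.standard_normal(d)      # draw from the prior
    v = np.sqrt(1.0 - beta**2) * u + beta * xi   # pCN proposal
    if np.log(rng.uniform()) < phi(u) - phi(v):  # accept/reject
        return v, True
    return u, False

u = prior_std * rng.standard_normal(d)
accepts = 0
n_steps = 5000
for _ in range(n_steps):
    u, accepted = pcn_step(u)
    accepts += accepted
print(f"acceptance rate: {accepts / n_steps:.2f}")
```

The paper's contribution can be read against this sketch: with a single scalar `beta`, the well-informed direction (mode 0) and the prior-dominated directions mix at very different rates, and the proposed rescalings replace the scalar step size with a mode-dependent one.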
| Item Type: | Journal Article |
|---|---|
| Divisions: | Faculty of Science, Engineering and Medicine > Science > Mathematics |
| Journal or Publication Title: | Journal of Computational and Applied Mathematics |
| Publisher: | Elsevier Science BV |
| ISSN: | 0377-0427 |
| Official Date: | 15 May 2014 |
| Volume: | 262 |
| Page Range: | pp. 127–138 |
| DOI: | 10.1016/j.cam.2013.07.026 |
| Status: | Peer Reviewed |
| Publication Status: | Published |
| Access rights to Published version: | Restricted or Subscription Access |