University of Warwick
Publications service & WRAP

An optimization-centric view on Bayes’ rule: reviewing and generalizing variational inference


Knoblauch, Jeremias, Jewson, Jack E. and Damoulas, Theodoros (2021) An optimization-centric view on Bayes’ rule: reviewing and generalizing variational inference. Journal of Machine Learning Research, 23. (In Press)

PDF: WRAP-optimization-centric-view-Bayes’-rule-2022.pdf (Accepted Version, 2,143 KB)
Embargoed item. Restricted access to repository staff only.


Abstract

We advocate an optimization-centric view of Bayesian inference. Our inspiration is the representation of Bayes’ rule as infinite-dimensional optimization (Csiszár, 1975; Donsker and Varadhan, 1975; Zellner, 1988). Equipped with this perspective, we study Bayesian inference when one does not have access to (1) well-specified priors, (2) well-specified likelihoods, or (3) infinite computing power. While these three assumptions underlie the standard Bayesian paradigm, they are typically inappropriate for modern machine learning applications. We propose addressing this through an optimization-centric generalization of Bayesian posteriors that we call the Rule of Three (RoT). The RoT can be justified axiomatically and recovers Bayesian, PAC-Bayesian and VI posteriors as special cases. While the RoT is primarily a conceptual and theoretical device, it also encompasses a novel sub-class of tractable posteriors which we call Generalized Variational Inference (GVI) posteriors. Like the RoT, GVI posteriors are specified by three arguments: a loss, a divergence and a variational family. They also possess a number of desirable properties, including modularity, Frequentist consistency and an interpretation as an approximate ELBO. We explore applications of GVI posteriors, and show that they can be used to improve robustness and posterior marginals in Bayesian Neural Networks and Deep Gaussian Processes.
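
For context, the optimization-centric representation the abstract refers to can be sketched as follows. This is a minimal sketch under assumed notation (π a prior, ℓ_n a loss over n observations, D a divergence, Π a feasible set of distributions over the parameter space Θ), mirroring the paper's three-argument form rather than quoting it verbatim. Bayes’ rule solves the infinite-dimensional problem

\[
q^{*}(\theta) \;=\; \operatorname*{arg\,min}_{q \,\in\, \mathcal{P}(\Theta)}
\left\{ \mathbb{E}_{q(\theta)}\!\left[ -\sum_{i=1}^{n} \log p(x_i \mid \theta) \right]
\;+\; \mathrm{KL}\!\left( q \,\|\, \pi \right) \right\},
\]

and the Rule of Three generalizes each of the three ingredients:

\[
P(\ell_n, D, \Pi) \;=\; \operatorname*{arg\,min}_{q \,\in\, \Pi}
\left\{ \mathbb{E}_{q(\theta)}\!\left[ \ell_n(\theta, x_{1:n}) \right]
\;+\; D\!\left( q \,\|\, \pi \right) \right\}.
\]

Taking ℓ_n as the negative log-likelihood, D as the Kullback-Leibler divergence and Π = P(Θ) recovers the standard Bayesian posterior; restricting Π to a tractable variational family yields the GVI posteriors described above, while other choices of ℓ_n and D correspond to PAC-Bayesian and standard VI posteriors.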

Item Type: Journal Article
Divisions: Faculty of Science, Engineering and Medicine > Science > Computer Science
Journal or Publication Title: Journal of Machine Learning Research
Publisher: MIT Press
ISSN: 1532-4435
Official Date: 2021
Dates:
  Date              Event
  2021              Available
  15 December 2021  Accepted
Volume: 23
Status: Peer Reviewed
Publication Status: In Press
Access rights to Published version: Restricted or Subscription Access
Funder: JK and JJ are funded by EPSRC grant EP/L016710/1 as part of the Oxford-Warwick Statistics Programme (OxWaSP). JK is additionally funded by the Facebook Fellowship Programme and the London Air Quality project at the Alan Turing Institute for Data Science and AI. TD is funded by the UKRI Turing AI Fellowship EP/V02678X/1, EPSRC grant EP/T004134/1 and the Lloyd’s Register Foundation programme on Data Centric Engineering at The Alan Turing Institute. This work was furthermore supported by The Alan Turing Institute for Data Science and AI under EPSRC grant EP/N510129/1 in collaboration with the Greater London Authority.
Related URLs:
  • Publisher


Email us: wrap@warwick.ac.uk