On Bayesian bounds

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

20 Scopus citations

Abstract

We show that several important Bayesian bounds studied in machine learning, in both the batch and the online setting, arise from an application of a simple compression lemma. In particular, we use the compression lemma to derive (i) PAC-Bayesian bounds in the batch setting, (ii) Bayesian log-loss bounds in the online setting, and (iii) Bayesian bounded-loss bounds in the online setting. Although each setting gives different semantics to the prior, the posterior, and the loss, we show that the core bound argument is the same. The paper simplifies our understanding of several important and apparently disparate results, and brings to light a powerful tool for developing similar arguments for other methods.
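The compression lemma referred to in the abstract is, in its standard form, the change-of-measure (Donsker-Varadhan) inequality: for any distributions Q and P and any real-valued function φ, E_Q[φ] ≤ KL(Q‖P) + log E_P[e^φ]. As a minimal sketch (the discrete distributions and the function φ below are illustrative choices, not values from the paper), the inequality can be checked numerically:

```python
import math

# Numeric sketch of the change-of-measure (compression) inequality:
#   E_Q[phi] <= KL(Q || P) + log E_P[exp(phi)]
# which holds for any distributions Q, P and any function phi.
# The distributions and phi below are illustrative assumptions.

def kl(q, p):
    """KL divergence KL(Q || P) for discrete distributions, in nats."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

def compression_bound(q, p, phi):
    """Return (E_Q[phi], KL(Q||P) + log E_P[exp(phi)])."""
    lhs = sum(qi * fi for qi, fi in zip(q, phi))  # expectation under Q
    log_mgf = math.log(sum(pi * math.exp(fi) for pi, fi in zip(p, phi)))
    return lhs, kl(q, p) + log_mgf

q = [0.7, 0.2, 0.1]        # "posterior"
p = [1/3, 1/3, 1/3]        # "prior"
phi = [1.5, -0.5, 2.0]     # arbitrary real-valued function

lhs, rhs = compression_bound(q, p, phi)
print(f"E_Q[phi] = {lhs:.4f} <= {rhs:.4f} = KL(Q||P) + log E_P[e^phi]")
```

In the PAC-Bayesian application, φ is instantiated as a scaled deviation between empirical and true risk, and the log moment term is controlled by concentration; the same one-line inequality underlies the online log-loss and bounded-loss bounds the abstract lists.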

Original language: English (US)
Title of host publication: ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
Pages: 81-88
Number of pages: 8
Volume: 2006
State: Published - 2006
Event: ICML 2006: 23rd International Conference on Machine Learning - Pittsburgh, PA, United States
Duration: Jun 25 2006 → Jun 29 2006

Publication series

Name: ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
Volume: 2006

Other

Other: ICML 2006: 23rd International Conference on Machine Learning
Country/Territory: United States
City: Pittsburgh, PA
Period: 6/25/06 → 6/29/06
