TY - CPAPER
T1 - On Bayesian bounds
AU - Banerjee, Arindam
PY - 2006
Y1 - 2006
N2 - We show that several important Bayesian bounds studied in machine learning, in both the batch and the online settings, arise from an application of a simple compression lemma. In particular, we use the compression lemma to derive (i) PAC-Bayesian bounds in the batch setting, (ii) Bayesian log-loss bounds, and (iii) Bayesian bounded-loss bounds in the online setting. Although each setting has different semantics for prior, posterior, and loss, we show that the core bound argument is the same. The paper simplifies our understanding of several important and apparently disparate results, and brings to light a powerful tool for developing similar arguments for other methods.
UR - http://www.scopus.com/inward/record.url?scp=33749262485&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=33749262485&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:33749262485
SN - 1595933832
SN - 9781595933836
VL - 2006
T3 - ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
SP - 81
EP - 88
BT - ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
T2 - ICML 2006: 23rd International Conference on Machine Learning
Y2 - 25 June 2006 through 29 June 2006
ER -