Stochastic boosting algorithms

Ajay Jasra, Christopher C. Holmes

Research output: Contribution to journal › Article › peer-review

7 Scopus citations


In this article we develop a class of stochastic boosting (SB) algorithms, building upon the work of Holmes and Pintore (Bayesian Stat. 8, Oxford University Press, Oxford, 2007). They introduce boosting algorithms that correspond to standard boosting (e.g. Bühlmann and Hothorn, Stat. Sci. 22:477-505, 2007) except that the optimization algorithms are randomized; this idea is placed within a Bayesian framework. We show that the inferential procedure in Holmes and Pintore (Bayesian Stat. 8, Oxford University Press, Oxford, 2007) is incorrect, and we further develop interpretational, computational and theoretical results that allow one to assess SB's potential for classification and regression problems. To fit SB, sequential Monte Carlo (SMC) methods are applied. We find that SB can provide better predictions for classification problems than the corresponding boosting algorithm. A theoretical result is also given, showing that the predictions of SB are not significantly worse than those of boosting when the latter provides the best prediction. We also investigate the method on a real case study from machine learning. © 2010 Springer Science+Business Media, LLC.
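The core idea the abstract describes, randomizing the optimization step inside boosting rather than always taking the loss-minimizing base learner, can be illustrated with a minimal sketch. The following is a hypothetical L2-boosting example in which each round *samples* a decision stump with probability decreasing in its squared-error loss, instead of selecting the argmin; it is not the authors' SMC-based algorithm, and the function name, `temperature` parameter, and quantile split grid are assumptions for illustration.

```python
import numpy as np

def stochastic_boost(X, y, n_rounds=50, temperature=1.0, lr=0.1, rng=None):
    """Sketch of boosting with a randomized fitting step (illustrative only):
    at each round a decision stump is sampled, with low-loss stumps favored,
    rather than chosen by exact minimization."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    F = np.zeros(n)          # current additive fit
    model = []               # list of (feature, threshold, left_value, right_value)
    for _ in range(n_rounds):
        r = y - F            # residuals (L2 boosting)
        cands, losses = [], []
        for j in range(d):
            # candidate thresholds: a few quantiles of the feature (an assumption)
            for t in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
                mask = X[:, j] <= t
                if mask.all() or (~mask).all():
                    continue  # skip degenerate splits
                left, right = r[mask].mean(), r[~mask].mean()
                pred = np.where(mask, left, right)
                cands.append((j, t, left, right))
                losses.append(((r - pred) ** 2).mean())
        losses = np.asarray(losses)
        # softmax over negative loss: low-loss stumps are likelier, but any can be drawn
        w = np.exp(-(losses - losses.min()) / temperature)
        j, t, left, right = cands[rng.choice(len(cands), p=w / w.sum())]
        F += lr * np.where(X[:, j] <= t, left, right)  # shrunken update
        model.append((j, t, left, right))
    return model, F
```

As `temperature` goes to zero the sampling concentrates on the loss-minimizing stump and the procedure reduces to ordinary greedy boosting; larger values inject more randomness into the fit, which is the sense in which the optimization is "randomized."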
Original language: English (US)
Journal: Statistics and Computing
Issue number: 3
State: Published - Jul 1 2011
Externally published: Yes

Bibliographical note

Generated from Scopus record by KAUST IRTS on 2019-11-20


