Resource Page

Introduction

AdaBoost is a boosting meta-algorithm that builds a strong hypothesis (for classification or regression) by calling a weak learning algorithm several times and combining the resulting weak hypotheses into an ensemble. Its main insight is to carefully reweight the training set for each weak hypothesis, concentrating on the examples misclassified in earlier rounds, so that the ensemble's overall training error is driven down. A major advantage is that the test error often keeps improving even after the training error can improve no further, which gives the algorithm notable resistance to over-fitting. Theoretical results explaining this behavior are still in a preliminary state; the most complete ones are based on VC theory and on the concept of margins in classifiers (strongly related to SVMs).
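As a concrete illustration of the reweighting idea described above, here is a minimal sketch of discrete AdaBoost with decision-stump weak learners, written in Python with NumPy. All function names are illustrative, not part of any toolbox mentioned on this page:

```python
import numpy as np

def train_stump(X, y, w):
    """Exhaustively pick the stump (feature, threshold, polarity)
    with the lowest weighted error on labels y in {-1, +1}."""
    n, d = X.shape
    best = None
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(X[:, j] < thr, pol, -pol)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, pol)
    return best

def stump_predict(X, j, thr, pol):
    return np.where(X[:, j] < thr, pol, -pol)

def adaboost(X, y, T=20):
    """Run T boosting rounds; return a list of weighted stumps."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # start with uniform example weights
    ensemble = []
    for _ in range(T):
        err, j, thr, pol = train_stump(X, y, w)
        err = max(err, 1e-10)        # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(X, j, thr, pol)
        w *= np.exp(-alpha * y * pred)   # upweight misclassified examples
        w /= w.sum()                     # renormalize to a distribution
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    """Weighted majority vote of all weak hypotheses."""
    agg = sum(alpha * stump_predict(X, j, thr, pol)
              for alpha, j, thr, pol in ensemble)
    return np.sign(agg)

# Toy 1-D data: class +1 on the middle interval, -1 outside it.
X = np.arange(10, dtype=float).reshape(-1, 1)
y = np.array([-1, -1, -1, 1, 1, 1, 1, -1, -1, -1])
ensemble = adaboost(X, y, T=3)
# All 10 points are classified correctly after 3 rounds.
```

Note how a single stump cannot separate this data, but three reweighted rounds suffice: each round's weight update forces the next stump to focus on the interval the previous ones got wrong.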

Our Resources

Tic-Tac-Toe Example (MATLAB - Requires AdaBoost Toolbox)

Circular Classes Example (MATLAB - Requires AdaBoost Toolbox)

Software

External Resources

A Short Introduction to Boosting - Freund & Schapire, 1999

Theoretical Views of Boosting and Applications - Schapire, 1999