Resource Page 


AdaBoost is a boosting meta-algorithm: it invokes a weak learning algorithm several times and combines the resulting weak hypotheses into an ensemble that forms a strong hypothesis (for classification or regression). Its main insight is to reweight the training examples before each round so that the next weak hypothesis concentrates on the examples misclassified so far, which drives down the training error of the ensemble. A notable property is that the test error often keeps improving even after the training error can improve no further, giving AdaBoost considerable resistance to over-fitting. Theoretical results explaining this behavior are still preliminary; the most complete ones rest on VC theory and on the concept of margins in classifiers (closely related to SVMs).
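The reweighting idea above can be sketched in a few lines. This is a minimal, self-contained illustration (not the Matlab toolbox linked below): it uses one-feature decision stumps as the weak learner, and all function names are our own choices for the sketch.

```python
import numpy as np

def stump_predict(X, feature, threshold, polarity):
    """Weak hypothesis: +1/-1 according to a one-feature threshold test."""
    return np.where(polarity * X[:, feature] < polarity * threshold, 1.0, -1.0)

def train_stump(X, y, w):
    """Pick the stump (feature, threshold, polarity) with lowest weighted error."""
    best, best_err = (0, 0.0, 1), np.inf
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for p in (1, -1):
                err = np.sum(w[stump_predict(X, f, t, p) != y])
                if err < best_err:
                    best_err, best = err, (f, t, p)
    return best, best_err

def adaboost(X, y, rounds=20):
    """Reweight examples each round so new stumps focus on past mistakes."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform weights
    ensemble = []                        # list of (alpha, stump) pairs
    for _ in range(rounds):
        stump, err = train_stump(X, y, w)
        err = max(err, 1e-10)            # guard against a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(X, *stump)
        w *= np.exp(-alpha * y * pred)   # upweight misclassified examples
        w /= w.sum()                     # renormalize to a distribution
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    """Strong hypothesis: sign of the alpha-weighted vote of all stumps."""
    return np.sign(sum(a * stump_predict(X, *s) for a, s in ensemble))
```

For example, on the 1-D labeling (-1, -1, +1, +1, -1, -1) no single stump is consistent with the data, yet three boosting rounds already yield an ensemble with zero training error.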

Our Resources

Presentation (PDF)

Tic Tac Toe Example (Matlab - Requires AdaBoost Toolbox)

Circular classes Example (Matlab - Requires AdaBoost Toolbox)


AdaBoost Matlab Toolbox   

AdaBoost Applet 

External Resources 

AdaBoost - Wikipedia

A Short Introduction to Boosting - Freund & Schapire, 1999

Boosting the Margin: A New Explanation for the Effectiveness of Voting Methods - Schapire et al., 1998

Theoretical Views of Boosting and Applications - Schapire, 1999

A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting - Freund & Schapire, 1995