AdaBoost in a nutshell

Post date: Apr 30, 2017 3:19:23 AM

The core equation of AdaBoost is

F(x) = \sum_{t=1}^{T} \alpha_t h_t(x),

where F is the strong classifier comprising weak classifiers h_t, and \alpha_t is the weight of h_t. The predicted label is \mathrm{sign}(F(x)).
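As a concrete sketch of this equation in Python: the strong classifier is a weighted vote of the weak classifiers, thresholded at zero. The stump functions h1–h3 below are made-up weak learners chosen purely for illustration, not anything specified in the post.

```python
def strong_classify(x, weak_classifiers, alphas):
    """Strong classifier F(x) = sum_t alpha_t * h_t(x); predict sign(F(x))."""
    F = sum(alpha * h(x) for h, alpha in zip(weak_classifiers, alphas))
    return 1 if F >= 0 else -1

# Hypothetical weak learners (decision stumps on a 2-feature input):
h1 = lambda x: 1 if x[0] > 0 else -1
h2 = lambda x: 1 if x[1] > 0 else -1
h3 = lambda x: 1 if x[0] + x[1] > 0 else -1

# h1 votes +1 but h2 and h3 outvote it: F = 0.4 - 0.3 - 0.3 = -0.2
print(strong_classify((0.5, -2.0), [h1, h2, h3], [0.4, 0.3, 0.3]))  # -> -1
```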

AdaBoost's objective is to minimize the following exponential error/loss function [1]

E = \sum_{i=1}^{N} e^{-y_i F(x_i)},

where \{(x_i, y_i)\}_{i=1}^{N} is the dataset, with labels y_i \in \{-1, +1\}. It is worth noting that this loss function takes the same form as most machine learning loss functions, e.g. the cross-entropy loss

H(y, \hat{y}) = -\sum_i y_i \log \hat{y}_i,

where \hat{y}_i is the prediction about y_i [2].
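Both losses are easy to evaluate numerically. The sketch below uses my own function names; note the label conventions differ: the exponential loss takes y_i ∈ {−1, +1} and real-valued margins F(x_i), while the cross-entropy takes a target vector and predicted probabilities.

```python
import math

def exp_loss(ys, margins):
    """AdaBoost exponential loss E = sum_i exp(-y_i * F(x_i)), y_i in {-1,+1}."""
    return sum(math.exp(-y * F) for y, F in zip(ys, margins))

def cross_entropy(ys, y_hats):
    """Cross-entropy H(y, yhat) = -sum_i y_i * log(yhat_i)."""
    return -sum(y * math.log(p) for y, p in zip(ys, y_hats))

# Correct, confident margins give small exponential loss; mistakes blow up:
print(round(exp_loss([1, 1, -1], [2.0, 0.5, -1.0]), 4))

# Cross-entropy of a one-hot target against predicted class probabilities:
print(round(cross_entropy([1, 0, 0], [0.7, 0.2, 0.1]), 4))
```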

Furthermore, at round t, writing F_t(x) = F_{t-1}(x) + \alpha_t h_t(x) and w_i^{(t)} = e^{-y_i F_{t-1}(x_i)}, the error term can be rearranged as follows:

E_t = \sum_i w_i^{(t)} e^{-y_i \alpha_t h_t(x_i)}
    = e^{-\alpha_t} \sum_i w_i^{(t)} + (e^{\alpha_t} - e^{-\alpha_t}) \sum_{y_i \neq h_t(x_i)} w_i^{(t)},

which shows that the weak classifier h_t is chosen by minimizing the second term, i.e. the weighted misclassification error \epsilon_t = \sum_{y_i \neq h_t(x_i)} w_i^{(t)}, since the first term does not depend on h_t. Its weight \alpha_t can then be found by differentiating the error with respect to \alpha_t and setting the derivative to zero, which yields \alpha_t = \frac{1}{2} \ln \frac{1 - \epsilon_t}{\epsilon_t} (with the weights normalized so that \sum_i w_i^{(t)} = 1).
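Putting the two observations together, one boosting round can be sketched as below. Using decision stumps as the weak-learner family is my own assumption (the post does not fix one), and the function name is hypothetical; the closed-form alpha comes from setting the derivative of the error to zero, as described above.

```python
import math

def best_stump_and_alpha(X, y, w):
    """One AdaBoost round (sketch): pick the decision stump minimising the
    weighted error eps = sum_{y_i != h(x_i)} w_i, then set its weight via
    alpha = 0.5 * ln((1 - eps) / eps)."""
    best = None
    for j in range(len(X[0])):                  # feature index
        for thr in sorted({x[j] for x in X}):   # candidate thresholds
            for sign in (1, -1):
                preds = [sign if x[j] > thr else -sign for x in X]
                eps = sum(wi for wi, yi, pi in zip(w, y, preds) if yi != pi)
                if best is None or eps < best[0]:
                    best = (eps, j, thr, sign)
    eps, j, thr, sign = best
    alpha = 0.5 * math.log((1 - eps) / max(eps, 1e-12))  # guard eps = 0
    return (j, thr, sign), alpha, eps

# A linearly separable toy set: the stump "x > 1.0" classifies it perfectly,
# so eps = 0 and alpha is very large (capped only by the numerical guard).
X = [(0.0,), (1.0,), (2.0,), (3.0,)]
y = [-1, -1, 1, 1]
w = [0.25, 0.25, 0.25, 0.25]
stump, alpha, eps = best_stump_and_alpha(X, y, w)
print(stump, round(alpha, 2), eps)
```

A full training loop would then reweight the examples, w_i ← w_i · exp(−α_t y_i h_t(x_i)), renormalize, and repeat.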

[1] https://en.m.wikipedia.org/wiki/AdaBoost#Derivation

[2] http://rdipietro.github.io/friendly-intro-to-cross-entropy-loss/#cross-entropy