Yale University
Department of Statistics
Seminar

Monday, April 8, 2002

Robert Schapire
AT&T Labs

The Boosting Approach to Machine Learning

Boosting is a general method for producing a very accurate
classification rule by combining rough and moderately inaccurate
"rules of thumb."  While rooted in a theoretical framework of machine
learning, boosting has been found to perform quite well empirically.
In this talk, I will introduce the boosting algorithm AdaBoost and
explain the underlying theory of boosting, including our explanation
of why boosting often does not suffer from overfitting.  I will also
describe some recent applications and extensions of boosting.
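For readers unfamiliar with AdaBoost, the sketch below illustrates the
idea described above: decision stumps stand in for the "rules of
thumb," each round reweights the training examples to emphasize past
mistakes, and the final rule is a weighted vote over the weak rules.
This is an illustrative sketch only; the stump weak learner, function
names, and toy data are choices made here for the example (assuming
NumPy is available), not material from the talk itself.

import numpy as np

def train_adaboost(X, y, n_rounds=10):
    """Combine decision stumps ("rules of thumb") into a weighted vote.

    X: (n, d) feature matrix; y: labels in {-1, +1}.
    Returns a list of (feature, threshold, polarity, alpha) stumps.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)          # distribution over examples
    stumps = []
    for _ in range(n_rounds):
        # Pick the stump with the smallest weighted training error.
        best, best_err = None, np.inf
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (+1, -1):
                    pred = np.where(X[:, j] <= thr, pol, -pol)
                    err = np.sum(w[pred != y])
                    if err < best_err:
                        best_err, best = err, (j, thr, pol)
        eps = max(best_err, 1e-12)               # weighted error of this rule
        alpha = 0.5 * np.log((1 - eps) / eps)    # vote weight of this rule
        j, thr, pol = best
        pred = np.where(X[:, j] <= thr, pol, -pol)
        w *= np.exp(-alpha * y * pred)           # upweight misclassified examples
        w /= w.sum()
        stumps.append((j, thr, pol, alpha))
    return stumps

def predict(stumps, X):
    """Final combined rule: sign of the weighted vote of the stumps."""
    score = np.zeros(len(X))
    for j, thr, pol, alpha in stumps:
        score += alpha * np.where(X[:, j] <= thr, pol, -pol)
    return np.sign(score)

if __name__ == "__main__":
    # Toy data: label is the sign of the sum of the two features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
    stumps = train_adaboost(X, y, n_rounds=20)
    acc = np.mean(predict(stumps, X) == y)
    print(f"training accuracy after 20 rounds: {acc:.2f}")

No single stump separates this toy data well, but the reweighting
scheme forces successive stumps to cover each other's mistakes, so the
weighted vote becomes far more accurate than any individual rule.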


Seminar to be held in Room 107, 24 Hillhouse Avenue, at 4:15 pm