Abstract
Boosting is one of the most successful binary classification techniques, owing to its relative immunity to overfitting and its flexible implementation. Several attempts have been made to extend the binary boosting algorithm to multiclass classification. In this article, a novel cost-sensitive multiclass boosting algorithm is proposed that naturally extends the popular binary AdaBoost algorithm and admits unequal misclassification costs. The proposed multiclass boosting algorithm achieves superior classification performance by combining weak candidate models that need only be better than random guessing. More importantly, the proposed algorithm achieves a large margin separation of the training sample while attaining an L1-norm constraint on the model complexity. Finally, the effectiveness of the proposed algorithm is demonstrated in a number of simulated and real experiments. The supplementary files are available online, including the technical proofs, the implemented R code, and the real datasets. © 2013 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
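To make the ideas in the abstract concrete, the following is a minimal, generic sketch of cost-sensitive multiclass boosting in the SAMME style: decision stumps are combined, each round's weak learner must beat random guessing (weighted error below 1 − 1/K), and misclassified points are up-weighted in proportion to an unequal-cost matrix. This is an illustrative sketch under assumed toy data and an assumed cost matrix, not the algorithm proposed in the article (the authors' actual method is in their supplementary R code).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-class data (assumption): one informative feature per class plus noise.
n, K = 300, 3
y = rng.integers(0, K, size=n)
X = (y[:, None] == np.arange(K)).astype(float) + 0.3 * rng.standard_normal((n, K))

# Hypothetical cost matrix C[j, k]: cost of predicting class k when truth is j.
C = np.ones((K, K)) - np.eye(K)
C[0, 1] = 5.0  # assumption: confusing class 0 for class 1 is extra costly

def stump_predict(X, feat, thresh, left, right):
    """Axis-aligned decision stump: label `left` below the threshold, `right` above."""
    return np.where(X[:, feat] <= thresh, left, right)

def fit_stump(X, y, w):
    """Exhaustive search for the stump minimizing weighted 0-1 error."""
    best = None
    for feat in range(X.shape[1]):
        for thresh in np.quantile(X[:, feat], np.linspace(0.1, 0.9, 9)):
            mask = X[:, feat] <= thresh
            # right-side label: weighted majority class above the threshold
            right = int(np.argmax([w[(~mask) & (y == k)].sum() for k in range(K)]))
            for left in range(K):
                pred = np.where(mask, left, right)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, feat, thresh, left, right)
    return best

T = 20
models, alphas = [], []
w = np.ones(n) / n
for t in range(T):
    err, feat, thresh, left, right = fit_stump(X, y, w)
    err = max(err, 1e-10)
    if err >= 1 - 1 / K:          # weak learner must beat K-class random guessing
        break
    alpha = np.log((1 - err) / err) + np.log(K - 1)   # SAMME model weight
    pred = stump_predict(X, feat, thresh, left, right)
    # cost-sensitive reweighting: errors are up-weighted by their misclassification cost
    w *= np.exp(alpha * (pred != y) * C[y, pred])
    w /= w.sum()
    models.append((feat, thresh, left, right))
    alphas.append(alpha)

def predict(X):
    """Weighted plurality vote over the boosted stumps."""
    votes = np.zeros((len(X), K))
    for (feat, thresh, left, right), a in zip(models, alphas):
        p = stump_predict(X, feat, thresh, left, right)
        votes[np.arange(len(X)), p] += a
    return votes.argmax(axis=1)

acc = (predict(X) == y).mean()
```

Raising an entry of `C` pushes the reweighting to spend more rounds protecting against that particular confusion, which is the behavior a cost-sensitive extension of AdaBoost is meant to deliver; the article's algorithm additionally controls the generalized margin under an L1-norm constraint, which this sketch does not attempt.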
| Original language | English |
|---|---|
| Pages (from-to) | 178-192 |
| Journal | Journal of Computational and Graphical Statistics |
| Volume | 22 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 2013 |
| Externally published | Yes |
Research Keywords
- Adaboost
- Classification tree
- Generalization
- Margin
- Misclassification cost
Fingerprint
Dive into the research topics of 'Boosting the generalized margin in cost-sensitive multiclass classification'.