
Boosting the generalized margin in cost-sensitive multiclass classification

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

Abstract

The boosting algorithm is one of the most successful binary classification techniques due to its relative immunity to overfitting and flexible implementation. Several attempts have been made to extend the binary boosting algorithm to multiclass classification. In this article, a novel cost-sensitive multiclass boosting algorithm is proposed that naturally extends the popular binary AdaBoost algorithm and admits unequal misclassification costs. The proposed multiclass boosting algorithm achieves superior classification performance by combining weak candidate models that only need to be better than random guessing. More importantly, the proposed algorithm achieves a large margin separation of the training sample while attaining an L1-norm constraint on the model complexity. Finally, the effectiveness of the proposed algorithm is demonstrated in a number of simulated and real experiments. The supplementary files are available online, including the technical proofs, the implemented R code, and the real datasets. © 2013 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
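To illustrate the kind of algorithm the abstract describes, the following is a minimal sketch of cost-sensitive multiclass boosting in the SAMME style (a well-known multiclass extension of AdaBoost), where per-example weights are scaled by a user-supplied misclassification cost matrix. This is an assumption-laden illustration, not the authors' actual algorithm: the function names, the decision-stump weak learner, and the specific cost-weighted update rule are all choices made here for a self-contained example. Note the "better than random guessing" stopping condition (weighted error below 1 - 1/K for K classes), which mirrors the weak-learner requirement stated in the abstract.

```python
import numpy as np

def stump_fit(X, y, w, K):
    """Best weighted decision stump: one feature, one threshold, two leaf classes."""
    n, d = X.shape
    best_err, best = np.inf, None
    for j in range(d):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            # each leaf predicts its weighted-majority class
            cl = int(np.argmax([w[left & (y == k)].sum() for k in range(K)]))
            cr = int(np.argmax([w[~left & (y == k)].sum() for k in range(K)]))
            pred = np.where(left, cl, cr)
            err = w[pred != y].sum()
            if err < best_err:
                best_err, best = err, (j, t, cl, cr)
    return best, best_err

def stump_predict(stump, X):
    j, t, cl, cr = stump
    return np.where(X[:, j] <= t, cl, cr)

def cost_sensitive_samme(X, y, cost, K, rounds=10):
    """Illustrative cost-sensitive SAMME boosting (NOT the paper's exact method).

    cost[i, j] is the cost of predicting class j when the truth is class i;
    mistakes with larger costs are up-weighted more aggressively.
    """
    n = len(y)
    w = np.ones(n) / n
    models = []
    for _ in range(rounds):
        stump, err = stump_fit(X, y, w, K)
        err = max(err, 1e-10)
        if err >= 1 - 1.0 / K:
            break  # weak learner no better than random guessing: stop
        # SAMME model weight; the log(K - 1) term makes the K-class case work
        alpha = np.log((1 - err) / err) + np.log(K - 1)
        pred = stump_predict(stump, X)
        # cost-weighted reweighting of misclassified examples
        w = w * np.exp(alpha * (pred != y) * cost[y, pred])
        w /= w.sum()
        models.append((alpha, stump))
    return models

def predict(models, X, K):
    """Weighted vote over the boosted stumps."""
    votes = np.zeros((len(X), K))
    for alpha, stump in models:
        votes[np.arange(len(X)), stump_predict(stump, X)] += alpha
    return votes.argmax(axis=1)
```

On three well-separated 1-D clusters (one per class), three rounds suffice for the ensemble to fit the training sample even though each stump alone can only distinguish two classes; an asymmetric `cost` matrix would instead bias the fit toward avoiding the expensive mistakes.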
Original language: English
Pages (from-to): 178-192
Journal: Journal of Computational and Graphical Statistics
Volume: 22
Issue number: 1
DOIs
Publication status: Published - 2013
Externally published: Yes

Research Keywords

  • Adaboost
  • Classification tree
  • Generalization
  • Margin
  • Misclassification cost

