Abstract
This paper studies support vector machine (SVM) classification algorithms, focusing on the 1-norm soft margin classifier. Consistency is considered in two forms. When the regularization error decays to zero, Bayes-risk consistency is proved and learning rates are derived by means of uniform-convergence techniques. The main difficulty we overcome here is bounding the offset. For consistency with the hypothesis space, we present a counterexample. COPYRIGHT 2006 EUDOXUS PRESS, LLC.
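The 1-norm soft margin classifier analyzed in the abstract minimizes a regularized hinge-loss objective over a function of the form f(x) = w·x + b, where b is the offset whose bound the authors identify as the main difficulty. A minimal sketch of that objective, trained by plain subgradient descent on a linear kernel (all function names and hyperparameters here are illustrative, not the paper's algorithm):

```python
import numpy as np

def train_soft_margin_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Subgradient descent on the 1-norm soft margin objective:
    (lam/2)*||w||^2 + (1/n) * sum_i max(0, 1 - y_i*(w.x_i + b)).
    Labels y are expected in {-1, +1}."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0  # the offset term; bounding it is the paper's stated difficulty
    for _ in range(epochs):
        margins = y * (X @ w + b)
        active = margins < 1  # points inside or violating the margin
        grad_w = lam * w - (y[active, None] * X[active]).sum(axis=0) / n
        grad_b = -y[active].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def predict(w, b, X):
    """Sign of the decision function gives the class label."""
    return np.sign(X @ w + b)
```

On a toy two-cluster dataset this recovers a separating hyperplane; the paper's analysis concerns how the misclassification error of such minimizers approaches the Bayes risk as the sample size grows and the regularization error decays.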
| Original language | English |
|---|---|
| Pages (from-to) | 99-119 |
| Journal | Journal of Computational Analysis and Applications |
| Volume | 8 |
| Issue number | 2 |
| Publication status | Published - 2006 |
Research Keywords
- Bayes-risk consistency
- Consistency with hypothesis space
- Mercer kernel
- Misclassification error
- Regularization error
- Support vector machine classification
Cite this

Analysis of support vector machine classification. Journal of Computational Analysis and Applications, 8(2), 99-119 (2006).