Abstract
Support vector classification (SVC) is a well-known statistical technique for classification problems in machine learning and other fields. An important question for SVC is the selection of covariates (or features) for the model. Many studies have considered model selection methods. As is well-known, selecting one winning model over others can entail considerable instability in predictive performance due to model selection uncertainties. This paper advocates model averaging as an alternative approach, where estimates obtained from different models are combined in a weighted average. We propose a model weighting scheme and provide the theoretical underpinning for the proposed method. In particular, we prove that our proposed method yields a model average estimator that achieves the smallest hinge risk among all feasible combinations asymptotically. To remedy the computational burden due to a large number of feasible models, we propose a screening step to eliminate the uninformative features before combining the models. Results from real data applications and a simulation study show that the proposed method generally yields more accurate estimates than existing methods. © 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
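To make the weighting idea concrete, below is a minimal illustrative sketch, not the authors' implementation: candidate models are linear SVCs fitted on feature subsets, their out-of-fold decision values are obtained by cross-validation, and simplex-constrained weights are chosen to minimize the cross-validated hinge risk of the combined decision value. The use of `LinearSVC`, `KFold`, `scipy.optimize.minimize`, the subset enumeration, and all parameter choices are assumptions for illustration; the paper's actual weighting scheme, screening rule, and optimality theory are in the article itself.

```python
# Illustrative sketch only: hinge-risk-based model averaging of linear SVCs.
# Candidate construction, screening, and the optimizer are simplifying assumptions.
import numpy as np
from itertools import combinations
from scipy.optimize import minimize
from sklearn.svm import LinearSVC
from sklearn.model_selection import KFold

def cv_decision_values(X, y, feature_sets, n_splits=5, seed=0):
    """Out-of-fold decision values f_m(x) for each candidate feature subset."""
    n, M = len(y), len(feature_sets)
    F = np.zeros((n, M))
    for train, test in KFold(n_splits, shuffle=True, random_state=seed).split(X):
        for m, S in enumerate(feature_sets):
            clf = LinearSVC(C=1.0).fit(X[np.ix_(train, S)], y[train])
            F[test, m] = clf.decision_function(X[np.ix_(test, S)])
    return F

def average_weights(F, y):
    """Simplex weights minimizing the cross-validated hinge risk of sum_m w_m f_m(x),
    with labels y in {-1, +1}."""
    M = F.shape[1]
    hinge = lambda w: np.mean(np.maximum(0.0, 1.0 - y * (F @ w)))
    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
    res = minimize(hinge, np.full(M, 1.0 / M), bounds=[(0, 1)] * M, constraints=cons)
    return res.x

# Toy usage: enumerate small feature subsets as candidate models. A screening step
# would shrink the feature pool first, which keeps this enumeration manageable.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200))
feature_sets = [list(s) for r in (1, 2) for s in combinations(range(4), r)]
F = cv_decision_values(X, y, feature_sets)
w = average_weights(F, y)
print(np.round(w, 3))
```

Constraining the weights to the simplex and scoring them on out-of-fold decision values mirrors the cross-validation flavour of the proposed method, but the exact candidate set, loss evaluation, and weight optimization in the paper may differ.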
| Original language | English |
|---|---|
| Article number | 117 |
| Journal | Statistics and Computing |
| Volume | 33 |
| Issue number | 5 |
| Online published | 8 Aug 2023 |
| DOIs | |
| Publication status | Published - Oct 2023 |
Research Keywords
- Asymptotic optimality
- Binary classification
- Model selection
- Weight choice
Projects
- GRF: Statistical Inference after Model Averaging
  WAN, T.-K. A. (Principal Investigator / Project Coordinator) & Zhang, X. (Co-Investigator)
  1/11/19 → 16/10/23
  Project: Research (Finished)