Direct convex relaxations of sparse SVM
Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review
Author(s)
Antoni B. Chan, Nuno Vasconcelos, Gert R. G. Lanckriet
Detail(s)
| Original language | English |
| --- | --- |
| Title of host publication | ACM International Conference Proceeding Series |
| Pages | 145-153 |
| Volume | 227 |
| Publication status | Published - 2007 |
| Externally published | Yes |
Publication series
| Name | |
| --- | --- |
| Volume | 227 |
Conference
| Title | 24th International Conference on Machine Learning, ICML 2007 |
| --- | --- |
| Place | United States |
| City | Corvallis, OR |
| Period | 20 - 24 June 2007 |
Abstract
Although support vector machines (SVMs) for binary classification give rise to a decision rule that relies on only a subset of the training data points (the support vectors), that rule will in general depend on all available features in the input space. We propose two direct, novel convex relaxations of a non-convex sparse SVM formulation that explicitly constrains the cardinality of the vector of feature weights. One relaxation results in a quadratically-constrained quadratic program (QCQP), while the second is based on a semidefinite programming (SDP) relaxation. The QCQP formulation can be interpreted as applying an adaptive soft-threshold to the SVM hyperplane, while the SDP formulation learns a weighted inner product (i.e., a kernel) that results in a sparse hyperplane. Experimental results show an increase in sparsity while preserving generalization performance, compared with both a standard SVM and a linear programming SVM.
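The abstract's interpretation of the QCQP relaxation as an adaptive soft-threshold on the SVM hyperplane can be illustrated with a minimal sketch. This is not the paper's QCQP formulation (which selects the threshold adaptively via the optimization); it only shows plain elementwise soft-thresholding applied to a hypothetical weight vector, zeroing small-magnitude feature weights to induce sparsity:

```python
import numpy as np

def soft_threshold(w, t):
    """Elementwise soft-threshold: shrink each weight toward zero by t,
    setting weights with magnitude below t exactly to zero."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

# Hypothetical hyperplane weights, e.g. from a trained linear SVM.
w = np.array([0.9, -0.05, 0.4, 0.02, -0.6])

w_sparse = soft_threshold(w, 0.1)
print(w_sparse)  # → [ 0.8  0.   0.3  0.  -0.5]
```

Here the two small-magnitude features are eliminated entirely while the remaining weights are shrunk by the threshold, which is the sparsifying effect the QCQP relaxation achieves with a data-driven threshold.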
Citation Format(s)
Direct convex relaxations of sparse SVM. / Chan, Antoni B.; Vasconcelos, Nuno; Lanckriet, Gert R. G.
ACM International Conference Proceeding Series. Vol. 227, 2007, p. 145-153.