Direct convex relaxations of sparse SVM

Research output: Chapters, Conference Papers, Creative and Literary Works; RGC 32 - Refereed conference paper (with host publication); peer-review

70 Scopus Citations

Author(s)

Chan, Antoni B.; Vasconcelos, Nuno; Lanckriet, Gert R. G.

Detail(s)

Original language: English
Title of host publication: ACM International Conference Proceeding Series
Pages: 145-153
Volume: 227
Publication status: Published - 2007
Externally published: Yes

Publication series

Name: ACM International Conference Proceeding Series
Volume: 227

Conference

Title: 24th International Conference on Machine Learning, ICML 2007
Place: United States
City: Corvalis, OR
Period: 20 - 24 June 2007

Abstract

Although support vector machines (SVMs) for binary classification give rise to a decision rule that relies on only a subset of the training data points (the support vectors), that rule is, in general, based on all available features of the input space. We propose two novel, direct convex relaxations of a non-convex sparse SVM formulation that explicitly constrains the cardinality of the vector of feature weights. One relaxation results in a quadratically-constrained quadratic program (QCQP), while the second is based on a semidefinite programming (SDP) relaxation. The QCQP formulation can be interpreted as applying an adaptive soft-threshold to the SVM hyperplane, while the SDP formulation learns a weighted inner product (i.e., a kernel) that results in a sparse hyperplane. Experimental results show an increase in sparsity while preserving generalization performance, compared with both a standard SVM and a linear-programming SVM.
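
As a concrete anchor for the abstract (a minimal sketch, not taken from the paper itself): the non-convex sparse SVM it refers to can be written as minimizing (1/2)‖w‖² + C Σᵢ ξᵢ subject to yᵢ(wᵀxᵢ + b) ≥ 1 − ξᵢ, ξᵢ ≥ 0, and the cardinality constraint ‖w‖₀ ≤ r. The Python sketch below replaces that cardinality constraint with a generic ℓ₁-norm budget ‖w‖₁ ≤ t, a standard convex surrogate rather than either of the paper's two relaxations; the synthetic data, the budget value t, and the use of cvxpy are all illustrative assumptions.

# Minimal sketch, assuming an l1-norm budget as a convex surrogate for the
# cardinality constraint; this is NOT the paper's QCQP or SDP relaxation.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 30                                  # samples, features
X = rng.standard_normal((n, d))
y = np.sign(X[:, 0] - X[:, 1] + 0.1 * rng.standard_normal(n))  # two informative features

C, t = 1.0, 2.0                                 # soft-margin trade-off and l1 budget (hypothetical values)

w = cp.Variable(d)                              # feature weights
b = cp.Variable()                               # bias
xi = cp.Variable(n, nonneg=True)                # hinge-loss slacks

constraints = [cp.multiply(y, X @ w + b) >= 1 - xi,  # margin constraints
               cp.norm1(w) <= t]                     # convex surrogate for card(w) <= r
problem = cp.Problem(cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(xi)),
                     constraints)
problem.solve()

# Count the features the relaxed classifier actually uses.
print("nonzero weights:", int(np.sum(np.abs(w.value) > 1e-4)))

Shrinking the budget t drives more coordinates of w exactly to zero, which loosely mirrors the sparsifying soft-threshold effect the abstract attributes to the QCQP formulation.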

Citation Format(s)

Direct convex relaxations of sparse SVM. / Chan, Antoni B.; Vasconcelos, Nuno; Lanckriet, Gert R. G.
ACM International Conference Proceeding Series. Vol. 227, 2007, p. 145-153.
