Sparse additive support vector machines in bounded variation space
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
Wang, Yue; Lian, Heng
Detail(s)
Original language | English |
---|---|
Article number | iaae003 |
Number of pages | 29 |
Journal / Publication | Information and Inference |
Volume | 13 |
Issue number | 1 |
Online published | 8 Feb 2024 |
Publication status | Published - Mar 2024 |
Abstract
We propose the total variation penalized sparse additive support vector machine (TVSAM) for performing classification in high-dimensional settings, using a mixed ℓ1-type functional regularization scheme to induce sparsity and smoothness simultaneously. We establish a representer theorem for TVSAM, which turns the infinite-dimensional problem into a finite-dimensional one, thereby ensuring computational feasibility. Even for the least squares loss, our result fills a gap in the literature when compared with the existing representer theorem. Theoretically, we derive risk bounds for TVSAM under both exact sparsity and near sparsity, with arbitrarily specified internal knots. In the process, we develop an important interpolation inequality for the space of functions of bounded variation, relying on analytic techniques such as mollification and partition of unity. An efficient implementation based on the alternating direction method of multipliers is employed. © 2024 Oxford University Press. All rights reserved.
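The abstract's closing sentence points to an implementation based on the alternating direction method of multipliers (ADMM). The paper's actual TVSAM solver (hinge loss, additive components, mixed ℓ1-type functional penalty) is not reproduced here; as an illustrative sketch only, the code below applies the same ADMM splitting idea to the simplest total-variation-penalized problem, 1-D TV denoising with a squared loss. The names `admm_tv` and `soft_threshold` are hypothetical helpers for this sketch, not from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_tv(y, lam, rho=1.0, n_iter=300):
    """ADMM sketch for  min_x 0.5*||x - y||^2 + lam * ||D x||_1,
    where D is the first-difference operator, so ||D x||_1 is the discrete
    total variation of x. Splitting z = D x yields the three-step iteration:
    a quadratic x-update, a soft-thresholding z-update, and a dual update."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)                     # (n-1) x n first differences
    A_inv = np.linalg.inv(np.eye(n) + rho * D.T @ D)   # x-update linear system
    z = np.zeros(n - 1)
    u = np.zeros(n - 1)                                # scaled dual variable
    for _ in range(n_iter):
        x = A_inv @ (y + rho * D.T @ (z - u))          # quadratic x-update
        z = soft_threshold(D @ x + u, lam / rho)       # prox of the TV term
        u += D @ x - z                                 # dual ascent
    return x

# Toy demo: denoise a noisy piecewise-constant signal
rng = np.random.default_rng(0)
signal = np.concatenate([np.zeros(20), np.ones(20)])
y = signal + 0.1 * rng.standard_normal(40)
x_hat = admm_tv(y, lam=0.5)
print(np.sum(np.abs(np.diff(x_hat))) < np.sum(np.abs(np.diff(y))))  # total variation is reduced
```

In TVSAM the analogous TV penalty acts on each additive component function rather than on a raw signal, and the squared loss above stands in for the hinge loss; the ADMM structure (quadratic update, proximal update, dual update) carries over.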
Research Area(s)
- additive models, empirical norm penalty, high dimensionality, SVM, total variation penalty
Citation Format(s)
Sparse additive support vector machines in bounded variation space. / Wang, Yue; Lian, Heng.
In: Information and Inference, Vol. 13, No. 1, iaae003, 03.2024.