Sparse additive support vector machines in bounded variation space

Yue Wang, Heng Lian*

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

Abstract

We propose the total variation penalized sparse additive support vector machine (TVSAM) for classification in high-dimensional settings, using a mixed l1-type functional regularization scheme to induce sparsity and smoothness simultaneously. We establish a representer theorem for TVSAM, which reduces the infinite-dimensional optimization problem to a finite-dimensional one, thereby ensuring computational feasibility. Even for the least squares loss, our result fills a gap in the literature relative to existing representer theorems. Theoretically, we derive risk bounds for TVSAM under both exact sparsity and near sparsity, with arbitrarily specified internal knots. In the process, we develop an important interpolation inequality for the space of functions of bounded variation, relying on analytic techniques such as mollification and partition of unity. An efficient implementation based on the alternating direction method of multipliers (ADMM) is employed. © 2024 Oxford University Press. All rights reserved.
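The abstract does not include implementation details. As an illustrative sketch only (not the paper's TVSAM algorithm), the fragment below shows the two ingredients the abstract names, a total variation penalty and an ADMM solver, on the simplest instance: one-dimensional TV-penalized least squares. The split `z = Dx` and the soft-thresholding update are the standard ADMM mechanics for l1-type penalties; all function names here are assumptions for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def tv(x):
    """Discrete total variation: l1 norm of first-order differences."""
    return np.abs(np.diff(x)).sum()

def tv_denoise_admm(y, lam=1.0, rho=1.0, n_iter=200):
    """ADMM sketch for  min_x 0.5*||x - y||^2 + lam * TV(x).

    Uses the split z = D x, where D is the (n-1) x n first-difference
    matrix; the x-update solves a fixed linear system, the z-update is
    soft-thresholding, and u is the scaled dual variable.
    """
    n = y.size
    D = np.diff(np.eye(n), axis=0)      # first-difference operator
    A = np.eye(n) + rho * D.T @ D       # x-update system matrix (fixed)
    x = y.copy()
    z = D @ x
    u = np.zeros(n - 1)
    for _ in range(n_iter):
        x = np.linalg.solve(A, y + rho * D.T @ (z - u))
        z = soft_threshold(D @ x + u, lam / rho)
        u = u + D @ x - z
    return x

# Usage: denoise a noisy piecewise-constant signal.
rng = np.random.default_rng(0)
y = np.concatenate([np.zeros(50), np.ones(50)]) + 0.1 * rng.standard_normal(100)
x_hat = tv_denoise_admm(y, lam=0.5)
```

The TVSAM problem replaces the squared loss with the SVM hinge loss and applies the penalty per additive component, but the splitting pattern above is the same reason ADMM is a natural solver for such mixed l1-type penalties.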
Original language: English
Article number: iaae003
Journal: Information and Inference
Volume: 13
Issue number: 1
Online published: 8 Feb 2024
DOIs
Publication status: Published - Mar 2024

Funding

NSFC (12371297 to H.L.) at CityU Shenzhen Research Institute; NSF of Jiangxi Province under Grant 20223BCJ25017; Hong Kong RGC general research fund 11300519, 11300721 and 11311822; CityU internal grant 7006014.

Research Keywords

  • additive models
  • empirical norm penalty
  • high dimensionality
  • SVM
  • total variation penalty

RGC Funding Information

  • RGC-funded
