Statistical Rates of Convergence for Functional Partially Linear Support Vector Machines for Classification

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

3 Scopus Citations

Detail(s)

Original language: English
Pages (from-to): 1-24
Journal / Publication: Journal of Machine Learning Research
Volume: 23
Online published: 22 May 2022
Publication status: Published - 2022

Abstract

In this paper, we consider the learning rate of support vector machines with both a functional predictor and a high-dimensional multivariate vector predictor. As in the literature on learning in reproducing kernel Hilbert spaces, a source condition and a capacity condition are used to characterize the convergence rate of the estimator. It is highly non-trivial to establish the possibly faster rate for the linear part. Using a key basic inequality that compares losses at two carefully constructed points, we establish a learning rate for the linear part that matches the rate obtained when the functional part is known. The proof relies on empirical process theory and a Rademacher complexity bound in the semi-nonparametric setting as analytic tools, on Young's inequality for operators, and on a novel "approximate convexity" assumption.
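To fix ideas, a plausible formulation consistent with the title and abstract (the paper's exact conditions, norms, and penalty may differ) is a partially linear decision rule with functional slope β in an RKHS ℋ and Euclidean coefficient vector θ, estimated by regularized hinge-loss minimization:

    \[
      f(X, Z) = \langle \beta, X \rangle_{\mathcal{H}} + \theta^{\top} Z,
      \qquad
      (\hat{\beta}, \hat{\theta})
        = \arg\min_{\beta \in \mathcal{H},\, \theta \in \mathbb{R}^{p}}
          \frac{1}{n} \sum_{i=1}^{n} \bigl(1 - Y_i f(X_i, Z_i)\bigr)_{+}
          + \lambda \lVert \beta \rVert_{\mathcal{H}}^{2},
      \qquad Y_i \in \{-1, +1\}.
    \]

As a concrete illustration, the Python sketch below approximates the functional part by its first few functional principal component scores and fits a linear SVM on the combined design. This truncated-basis surrogate is only illustrative, not the authors' RKHS-based estimator; the synthetic data and all parameter choices (grid size, number of scores, the slope function beta, theta) are made-up assumptions.

    # Minimal sketch: functional partially linear SVM via basis truncation.
    # The functional term <beta, X> is approximated by k FPC scores; this is
    # an illustrative surrogate, NOT the estimator analyzed in the paper.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)
    n, T, p, k = 200, 100, 5, 4            # samples, grid points, dim of Z, scores
    grid = np.linspace(0, 1, T)

    # Synthetic data: curves X(t), vector covariates Z, labels in {-1, +1}.
    X = rng.standard_normal((n, T)).cumsum(axis=1) / np.sqrt(T)  # rough paths
    Z = rng.standard_normal((n, p))
    beta = np.sin(2 * np.pi * grid)                   # hypothetical slope function
    theta = np.array([1.0, -0.5, 0.0, 0.0, 0.25])     # hypothetical linear part
    f = X @ beta / T + Z @ theta                      # <beta, X> + theta^T Z
    y = np.where(f + 0.1 * rng.standard_normal(n) > 0, 1, -1)

    # Functional part: truncate to k principal component scores (a sieve cut-off).
    scores = PCA(n_components=k).fit_transform(X)

    # Joint design: functional scores alongside the linear covariates Z.
    design = np.hstack([scores, Z])

    # Regularized hinge-loss fit; C acts as the inverse of the penalty weight.
    clf = LinearSVC(C=1.0, loss="hinge", max_iter=20000).fit(design, y)
    print("training accuracy:", clf.score(design, y))
    print("estimated coefficients on Z:", clf.coef_[0][k:])

In this surrogate, the truncation level k plays the role that the regularization parameter plays for the RKHS norm in the displayed objective; the paper's rates concern the exact penalized estimator rather than any such finite-dimensional approximation.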

Research Area(s)

  • Convergence rate, Prediction risk, Rademacher complexity, Support vector classification
