Improved analysis of supervised learning in the RKHS with random features: Beyond least squares

Jiamin Liu, Lei Wang, Heng Lian*

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

Abstract

We consider kernel-based supervised learning using random Fourier features, focusing on statistical error bounds and generalization properties under general loss functions. Beyond the least squares loss, existing results provide only a worst-case analysis with rate $n^{-1/2}$ and a number of features at least comparable to $n$, and a refined analysis that achieves an almost $n^{-1}$ rate when the kernel's eigenvalues decay exponentially and the number of features is again at least comparable to $n$. For the least squares loss, the results are much richer: optimal rates can be achieved under the source and capacity assumptions with fewer than $n$ features. In this paper, for both losses with Lipschitz derivative and Lipschitz losses, we establish faster rates with a number of features much smaller than $n$, matching the rates and feature counts known for the least squares loss. More specifically, in the attainable case (the true function lies in the RKHS), we obtain the rate $n^{-2\xi/(2\xi+\gamma)}$, the same as that of the standard method without approximation, using $o(n)$ features, where $\xi$ characterizes the smoothness of the true function and $\gamma$ characterizes the decay rate of the eigenvalues of the integral operator. Our results thus answer an important open question regarding random features. © 2025 Elsevier Ltd
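To make the setting concrete, the following is a minimal sketch (not the authors' code) of a random-feature estimator for one of the Lipschitz losses covered by the paper, the logistic loss: a Gaussian kernel is approximated by m ≪ n random Fourier features and a regularized logistic model is fit in the feature space. The bandwidth, the penalty strength, the synthetic data, and the choice m = n^0.6 are illustrative assumptions, not values prescribed by the paper.

```python
# Minimal sketch of kernel logistic regression with random Fourier features,
# using m << n features. All concrete values below are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic binary classification data (illustrative only).
n, d = 2000, 5
X = rng.normal(size=(n, d))
y = (np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(int)

# Random Fourier features for the Gaussian kernel k(x, x') = exp(-||x - x'||^2 / (2 s^2)):
# phi(x) = sqrt(2 / m) * cos(W^T x + b), with W ~ N(0, s^{-2} I) and b ~ Uniform[0, 2*pi].
s = 1.0                # kernel bandwidth (assumed)
m = int(n ** 0.6)      # number of random features, o(n) (assumed exponent)
W = rng.normal(scale=1.0 / s, size=(d, m))
b = rng.uniform(0.0, 2.0 * np.pi, size=m)

def rff(X):
    """Map inputs to the m-dimensional random Fourier feature space."""
    return np.sqrt(2.0 / m) * np.cos(X @ W + b)

# Fit a regularized logistic model in the random-feature space; this plays the
# role of the Lipschitz-loss estimator analyzed in the paper.
clf = LogisticRegression(C=10.0, max_iter=1000).fit(rff(X), y)

# Evaluate on fresh data drawn from the same distribution.
X_test = rng.normal(size=(1000, d))
y_test = (np.sin(X_test[:, 0]) + 0.5 * X_test[:, 1] + 0.3 * rng.normal(size=1000) > 0).astype(int)
print(f"m = {m} features for n = {n} samples; test accuracy = {clf.score(rff(X_test), y_test):.3f}")
```

The point of the sketch is the scaling: the feature map is m-dimensional with m growing strictly slower than n, which is the regime in which the paper shows the same rates as the exact kernel method.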
Original language: English
Article number: 107091
Journal: Neural Networks
Volume: 184
Online published: 8 Jan 2025
DOIs
Publication status: Published - Apr 2025

Funding

The work of Jiamin Liu was supported in part by the NSFC at University of Science and Technology Beijing under Grant 12401332. The research of Heng Lian is supported by NSFC 12371297 at CityU Shenzhen Research Institute, by NSF of Jiangxi Province under Grant 20223BCJ25017, by Hong Kong RGC General Research Fund grants 11300519, 11300721 and 11311822, and by CityU internal grant 7006014.

Research Keywords

  • Logistic regression
  • Quantile regression
  • Regression and classification
  • Reproducing kernel Hilbert space
  • Source and capacity conditions
