Optimal prediction of quantile functional linear regression in reproducing kernel Hilbert spaces

Rui Li, Wenqi Lu, Zhongyi Zhu, Heng Lian*

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

8 Citations (Scopus)

Abstract

Quantile functional linear regression has previously been studied using functional principal component analysis. Here we consider an alternative penalized estimator based on the reproducing kernel Hilbert space (RKHS) framework. The motivation is that, for functional linear (mean) regression, Cai and Yuan (2012) have already shown that the RKHS-based approach performs better when the coefficient function does not align well with the eigenfunctions of the covariance kernel. We establish the optimal convergence rate of the estimator in prediction risk, using Rademacher complexity to bound the relevant empirical processes. Some Monte Carlo studies are carried out for illustration.
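
For orientation, the display below is a minimal sketch of the quantile functional linear model and an RKHS-penalized check-loss estimator as commonly formulated in this literature; the notation (response Y, functional covariate X, coefficient function β_τ, quantile level τ, RKHS H(K), penalty parameter λ) is assumed here for illustration and is not taken verbatim from the paper.

\[
Q_\tau(Y \mid X) = \alpha_\tau + \int_0^1 X(t)\,\beta_\tau(t)\,dt,
\]
\[
(\hat\alpha_\tau, \hat\beta_\tau)
= \operatorname*{arg\,min}_{\alpha \in \mathbb{R},\ \beta \in \mathcal{H}(K)}
\ \frac{1}{n}\sum_{i=1}^{n} \rho_\tau\!\Big(Y_i - \alpha - \int_0^1 X_i(t)\,\beta(t)\,dt\Big)
\ +\ \lambda\,\|\beta\|_{\mathcal{H}(K)}^2,
\]
where \(\rho_\tau(u) = u\,(\tau - \mathbf{1}\{u<0\})\) is the check loss. Under this kind of setup, the prediction error at a new covariate \(X^*\) is naturally measured through \(\int_0^1 X^*(t)\,(\hat\beta_\tau(t)-\beta_\tau(t))\,dt\), which is the quantity whose convergence rate the paper studies.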
Original language: English
Pages (from-to): 162-170
Journal: Journal of Statistical Planning and Inference
Volume: 211
Online published: 3 Jul 2020
DOIs
Publication status: Published - Mar 2021

Research Keywords

  • Convergence rate
  • Prediction risk
  • Quantile regression
  • Rademacher complexity
