Learning Rate for Convex Support Tensor Machines

Heng Lian*

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

5 Citations (Scopus)

Abstract

Tensors are increasingly encountered in prediction problems. We extend previous results on high-dimensional least-squares convex tensor regression to classification problems with the hinge loss and establish the asymptotic statistical properties of the resulting estimator. For a general convex decomposable penalty, the learning rate depends on both the intrinsic dimension of the problem and the Rademacher complexity of the class of linear functions of the tensor predictors.
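The paper itself is theoretical, but as a rough illustration of the kind of estimator it analyzes, the sketch below (hypothetical, not taken from the paper) fits a support tensor machine by minimizing a hinge loss plus an overlapped nuclear norm, one common example of a convex decomposable penalty for tensors, using plain subgradient descent in NumPy. The function names and step sizes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization of a tensor."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def nuclear_subgrad(M):
    """A subgradient of the nuclear norm at M: U @ Vt from the thin SVD."""
    U, _, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ Vt

def fit_support_tensor_machine(X, y, lam=0.1, lr=0.01, n_iter=500):
    """Subgradient descent on
       (1/n) * sum_i max(0, 1 - y_i <B, X_i>) + lam * sum_m ||B_(m)||_*
    X: array of shape (n, d1, ..., dk); y: labels in {-1, +1}."""
    n, shape = X.shape[0], X.shape[1:]
    B = np.zeros(shape)
    for _ in range(n_iter):
        scores = np.tensordot(X, B, axes=B.ndim)        # <B, X_i> for each sample
        violated = (y * scores) < 1                     # active hinge terms
        # Subgradient of the empirical hinge risk w.r.t. B
        grad = -np.tensordot(y * violated, X, axes=(0, 0)) / n
        # Add a subgradient of the overlapped nuclear norm (sum over unfoldings)
        for m in range(B.ndim):
            G_m = nuclear_subgrad(unfold(B, m))
            folded = G_m.reshape((shape[m],) + tuple(np.delete(shape, m)))
            grad += lam * np.moveaxis(folded, 0, m)
        B -= lr * grad
    return B
```

As a usage sketch, `fit_support_tensor_machine(X, y)` with `X` of shape `(n, d1, d2)` and labels `y` in `{-1, +1}` returns a coefficient tensor `B`; new samples are classified by the sign of `np.tensordot(X_new, B, axes=B.ndim)`.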
Original language: English
Article number: 9174817
Pages (from-to): 3755-3760
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 32
Issue number: 8
Online published: 24 Aug 2020
DOIs
Publication status: Published - Aug 2021

Research Keywords

  • Empirical processes
  • high-dimensional regression
  • hinge loss
  • intrinsic dimension
  • Rademacher complexity
