Convergence of unregularized online learning algorithms

Research output: Journal Publications and Reviews; RGC 21 - Publication in refereed journal; peer-review

10 Scopus Citations

Author(s)

  • Yunwen Lei
  • Lei Shi
  • Zheng-Chu Guo

Detail(s)

Original language: English
Journal / Publication: Journal of Machine Learning Research
Volume: 18
Issue number: 1
Publication status: Published - Apr 2018

Abstract

In this paper we study the convergence of online gradient descent algorithms in reproducing kernel Hilbert spaces (RKHSs) without regularization. We establish a sufficient condition and a necessary condition for the convergence of excess generalization errors in expectation. A sufficient condition for almost sure convergence is also given. With high probability, we provide explicit convergence rates of the excess generalization errors for both the averaged iterates and the last iterate, which in turn also imply convergence rates with probability one. To the best of our knowledge, this is the first high-probability convergence rate for the last iterate of online gradient descent algorithms in the general convex setting. Without any boundedness assumptions on the iterates, our results are derived by a novel use of two measures of the algorithm's one-step progress, measured respectively by generalization errors and by distances in RKHSs, where the variances of the involved martingales are cancelled out by the descent property of the algorithm.
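The algorithm studied in the abstract, unregularized online gradient descent in an RKHS, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian kernel, the least-squares loss, and the step-size schedule `eta` are assumptions made here for concreteness. Each step performs the RKHS gradient update f_{t+1} = f_t - eta(t) * (f_t(x_t) - y_t) * K(x_t, ·), storing the iterate as a kernel expansion over the points seen so far.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Illustrative kernel choice; any Mercer kernel works here.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def online_kernel_gd(stream, eta, kernel=gaussian_kernel):
    """Unregularized online gradient descent for least squares in an RKHS.

    stream: iterable of (x, y) examples, seen one at a time.
    eta:    step-size schedule, a function of the iteration index t >= 1.

    The iterate f_t lives in the RKHS and is represented by its kernel
    expansion coefficients over the examples processed so far.
    """
    points, coeffs = [], []
    for t, (x, y) in enumerate(stream, start=1):
        # Evaluate the current iterate f_t at the new point x.
        fx = sum(c * kernel(p, x) for p, c in zip(points, coeffs))
        # Gradient of (f(x) - y)^2 / 2 in the RKHS is (f(x) - y) * K(x, .),
        # so the step appends one new expansion term; no regularization
        # (no shrinkage of past coefficients) is applied.
        points.append(x)
        coeffs.append(-eta(t) * (fx - y))
    return points, coeffs

def predict(points, coeffs, x, kernel=gaussian_kernel):
    # Evaluate the final iterate (the "last iterate" of the abstract) at x.
    return sum(c * kernel(p, x) for p, c in zip(points, coeffs))
```

Note that without regularization the expansion keeps one coefficient per example and past coefficients are never shrunk, which is why the paper's analysis cannot rely on boundedness of the iterates.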

Research Area(s)

  • Convergence analysis, Learning theory, Online learning, Reproducing kernel Hilbert space

Citation Format(s)

Convergence of unregularized online learning algorithms. / Lei, Yunwen; Shi, Lei; Guo, Zheng-Chu.
In: Journal of Machine Learning Research, Vol. 18, No. 1, 04.2018.

