Convergence of online mirror descent
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
Yunwen Lei; Ding-Xuan Zhou
Related Research Unit(s)
Detail(s)
| Original language | English |
| --- | --- |
| Pages (from-to) | 343-373 |
| Journal / Publication | Applied and Computational Harmonic Analysis |
| Volume | 48 |
| Issue number | 1 |
| Online published | 22 May 2018 |
| Publication status | Published - Jan 2020 |
Link(s)
Abstract
In this paper we consider online mirror descent (OMD), a class of scalable online learning algorithms that exploit the geometric structure of data through mirror maps. Necessary and sufficient conditions are presented in terms of the step size sequence {η_t}_t for the convergence of OMD with respect to the expected Bregman distance induced by the mirror map. The condition is lim_{t→∞} η_t = 0 and ∑_{t=1}^∞ η_t = ∞ in the case of positive variances. It reduces to ∑_{t=1}^∞ η_t = ∞ in the case of zero variance, for which linear convergence may be achieved by taking a constant step size sequence. A sufficient condition for almost sure convergence is also given. We establish tight error bounds under mild conditions on the mirror map, the loss function, and the regularizer. Our results are achieved by a novel analysis of the one-step progress of OMD using smoothness and strong convexity of the mirror map and the loss function.
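The step-size condition in the abstract can be made concrete with a small sketch. The following is a hypothetical Python illustration (not the paper's code) of OMD with the negative-entropy mirror map on the probability simplex, i.e. the exponentiated-gradient update, driven by noisy least-squares gradients. The step sizes η_t = η_1 / t^θ with θ ∈ (0, 1] satisfy lim_{t→∞} η_t = 0 and ∑_{t=1}^∞ η_t = ∞, matching the stated condition for the positive-variance case; the loss, data model, and parameter names are illustrative assumptions.

```python
import numpy as np

def omd_entropy(grad_fn, d, T, eta0=1.0, theta=0.5):
    """Sketch of online mirror descent with the negative-entropy mirror map
    on the probability simplex (exponentiated-gradient update).

    Step sizes eta_t = eta0 / t**theta with theta in (0, 1] satisfy
    eta_t -> 0 and sum_t eta_t = infinity (the positive-variance condition)."""
    w = np.full(d, 1.0 / d)           # start from the uniform distribution
    for t in range(1, T + 1):
        g = grad_fn(w)                # stochastic (sub)gradient of the loss at w
        eta = eta0 / t ** theta       # diminishing, non-summable step size
        w = w * np.exp(-eta * g)      # gradient step in the dual (log) coordinates
        w /= w.sum()                  # Bregman projection back to the simplex
    return w

# Toy usage: noisy linear regression toward a target on the simplex.
rng = np.random.default_rng(0)
d = 5
w_star = rng.dirichlet(np.ones(d))    # ground-truth weights on the simplex

def grad_fn(w):
    x = rng.standard_normal(d)
    y = x @ w_star + 0.1 * rng.standard_normal()   # noisy label -> positive variance
    return (x @ w - y) * x                          # gradient of 0.5 * (x @ w - y) ** 2

w_hat = omd_entropy(grad_fn, d=d, T=5000)
print(np.round(w_hat, 3), np.round(w_star, 3))
```

With the quadratic mirror map Φ(w) = ½‖w‖² the same update reduces to online gradient descent; the entropy mirror map is chosen here only to show how the mirror map shapes the update geometry.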
Research Area(s)
- Bregman distance, Convergence analysis, Learning theory, Mirror descent, Online learning
Citation Format(s)
Convergence of online mirror descent. / Lei, Yunwen; Zhou, Ding-Xuan.
In: Applied and Computational Harmonic Analysis, Vol. 48, No. 1, 01.2020, p. 343-373.