Optimal Stochastic and Online Learning with Individual Iterates

Yunwen Lei, Peng Yang, Ke Tang*, Ding-Xuan Zhou

*Corresponding author for this work

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

4 Citations (Scopus)

Abstract

Stochastic composite mirror descent (SCMD) is a simple and efficient method able to capture both the geometric and composite structures of optimization problems in machine learning. Existing strategies require taking either an average or a random selection of iterates to achieve optimal convergence rates, which, however, can either destroy the sparsity of solutions or slow down the practical training speed. In this paper, we propose a theoretically sound strategy for selecting an individual iterate of the vanilla SCMD, which achieves optimal rates for both convex and strongly convex problems in a non-smooth learning setting. Outputting an individual iterate preserves the sparsity of solutions, which is crucial for a proper interpretation in sparse learning problems. We report experimental comparisons with several baseline methods to show the effectiveness of our method in achieving a fast training speed as well as in outputting sparse solutions.
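To make the setting concrete, here is a minimal sketch of stochastic composite mirror descent on an ℓ1-regularized least-squares problem, using the Euclidean distance-generating function (under which SCMD reduces to proximal SGD with a soft-thresholding step). All function names, the step-size schedule, and the demo data are illustrative assumptions; the paper's specific individual-iterate selection rule is not reproduced here — the sketch simply returns the last iterate to show why individual iterates stay sparse while averaged iterates generally do not.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1; zeros out small coordinates,
    # which is what keeps individual iterates sparse.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def scmd_l1(X, y, lam=0.05, T=3000, eta0=0.05, seed=0):
    """Illustrative SCMD (Euclidean mirror map) for
    (1/2)(x_i^T w - y_i)^2 + lam * ||w||_1.
    Returns the final iterate rather than an average."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, T + 1):
        i = rng.integers(n)
        grad = (X[i] @ w - y[i]) * X[i]   # gradient of the smooth part
        eta = eta0 / np.sqrt(t)           # standard O(1/sqrt(t)) step size
        # Composite step: gradient step on the smooth part, then the
        # proximal map of the l1 term.
        w = soft_threshold(w - eta * grad, eta * lam)
    return w

# Tiny noise-free demo with a 3-sparse ground truth.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true
w = scmd_l1(X, y)
print("nonzeros above 0.5:", np.count_nonzero(np.abs(w) > 0.5))
```

Averaging such iterates would smear the zeroed coordinates back to small nonzero values, destroying the sparsity pattern; returning a single well-chosen iterate avoids this, which is the practical motivation the abstract describes.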
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 32 (NeurIPS 2019)
Editors: H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, R. Garnett
Publication status: Published - Dec 2019
Event: 33rd Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver Convention Center, Vancouver, Canada
Duration: 8 Dec 2019 – 14 Dec 2019
https://europe.naverlabs.com/updates/neurips-2019/
https://nips.cc/
https://nips.cc/Conferences/2019/Schedule?type=Poster
https://nips.cc/Conferences/2019/ScheduleMultitrack?event=13891
http://papers.nips.cc/book/advances-in-neural-information-processing-systems-32-2019

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 32
ISSN (Print): 1049-5258

Conference

Conference: 33rd Conference on Neural Information Processing Systems (NeurIPS 2019)
Abbreviated title: NeurIPS 2019
Place: Canada
City: Vancouver
Period: 8/12/19 – 14/12/19

Research Keywords

  • Subgradient methods
  • Algorithms
