Abstract
Stochastic composite mirror descent (SCMD) is a simple and efficient method able to capture both the geometric and composite structures of optimization problems in machine learning. Existing strategies require taking either an average or a random selection of the iterates to achieve optimal convergence rates; however, this can either destroy the sparsity of solutions or slow down practical training. In this paper, we propose a theoretically sound strategy to select an individual iterate of the vanilla SCMD, which achieves optimal rates for both convex and strongly convex problems in a non-smooth learning setting. Outputting an individual iterate preserves the sparsity of solutions, which is crucial for a proper interpretation in sparse learning problems. We report experimental comparisons with several baseline methods to show the effectiveness of our method in achieving a fast training speed as well as in outputting sparse solutions.
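To illustrate the composite setting the abstract refers to, the following is a minimal sketch of stochastic composite gradient descent on a Lasso-type objective. It uses the Euclidean mirror map (the simplest special case of mirror descent) and an l1 regularizer handled by a proximal soft-thresholding step; the function names, step-size schedule, and problem setup are illustrative assumptions, not the paper's algorithm. Note that it returns the last iterate rather than an average, since averaging soft-thresholded iterates generally destroys sparsity.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrink each coordinate toward zero,
    # setting small coordinates exactly to zero (this is the source of sparsity).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def scmd_lasso(X, y, lam=0.1, T=1000, seed=0):
    """Illustrative stochastic composite (proximal) gradient method for
    min_w (1/2) E_i (x_i^T w - y_i)^2 + lam * ||w||_1,
    with the Euclidean mirror map. Returns the final iterate."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, T + 1):
        i = rng.integers(n)               # sample one data point
        eta = 1.0 / np.sqrt(t)            # standard step size for non-smooth convex problems
        grad = (X[i] @ w - y[i]) * X[i]   # stochastic gradient of the smooth part
        w = soft_threshold(w - eta * grad, eta * lam)  # composite (proximal) step
    return w
```

With the full mirror-descent machinery, the gradient step would act on the dual variable via the mirror map's gradient; the Euclidean case above keeps the sketch short while preserving the composite structure.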
| Original language | English |
|---|---|
| Title of host publication | Advances in Neural Information Processing Systems 32 (NeurIPS 2019) |
| Editors | H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, R. Garnett |
| Publication status | Published - Dec 2019 |
| Event | 33rd Conference on Neural Information Processing Systems (NeurIPS 2019) - Vancouver Convention Center, Vancouver, Canada Duration: 8 Dec 2019 → 14 Dec 2019 https://nips.cc/ |
Publication series
| Name | Advances in Neural Information Processing Systems |
|---|---|
| Volume | 32 |
| ISSN (Print) | 1049-5258 |
Conference
| Conference | 33rd Conference on Neural Information Processing Systems (NeurIPS 2019) |
|---|---|
| Abbreviated title | NeurIPS 2019 |
| Place | Canada |
| City | Vancouver |
| Period | 8/12/19 → 14/12/19 |
| Internet address | https://nips.cc/ |
Research Keywords
- SUBGRADIENT METHODS
- ALGORITHMS