Stability and Generalization for Markov Chain Stochastic Gradient Methods

Puyu Wang, Yunwen Lei, Yiming Ying*, Ding-Xuan Zhou

*Corresponding author for this work

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

10 Citations (Scopus)

Abstract

Recently, a large amount of work has been devoted to the study of Markov chain stochastic gradient methods (MC-SGMs), mainly focusing on their convergence analysis for solving minimization problems. In this paper, we provide a comprehensive generalization analysis of MC-SGMs for both minimization and minimax problems through the lens of algorithmic stability within the framework of statistical learning theory. For empirical risk minimization (ERM) problems, we establish optimal excess population risk bounds for both smooth and non-smooth cases by introducing on-average argument stability. For minimax problems, we develop a quantitative connection between on-average argument stability and generalization error which extends the existing results for uniform stability [38]. We further develop the first nearly optimal convergence rates for convex-concave problems, both in expectation and with high probability, which, combined with our stability results, show that optimal generalization bounds can be attained for both smooth and non-smooth cases. To the best of our knowledge, this is the first generalization analysis of SGMs when the gradients are sampled from a Markov process. © 2022 Neural Information Processing Systems Foundation. All rights reserved.
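The object of the abstract can be illustrated with a minimal sketch (the setup, constants, and dataset here are hypothetical illustrations, not taken from the paper): stochastic gradient descent on a least-squares problem where the sample index evolves as an ergodic Markov chain over the dataset, rather than being drawn i.i.d. at each step.

```python
import numpy as np

# Minimal MC-SGM sketch (hypothetical example, not the paper's setup):
# SGD on noiseless least squares, with the sample index following a
# lazy random-walk Markov chain over the n data points.
rng = np.random.default_rng(0)

n, d = 20, 3
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true  # noiseless targets, so the minimizer is w_true

# Transition matrix over indices: with prob. 0.5 jump to a uniformly
# random index, with prob. 0.5 stay put. The chain is ergodic and its
# stationary distribution is uniform, so gradients are asymptotically
# unbiased even though consecutive samples are correlated.
P = 0.5 * np.full((n, n), 1.0 / n) + 0.5 * np.eye(n)

w = np.zeros(d)
i = 0  # initial state of the index chain
for _ in range(2000):
    i = rng.choice(n, p=P[i])          # Markovian index, not i.i.d.
    grad = (X[i] @ w - y[i]) * X[i]    # gradient of 0.5*(x_i^T w - y_i)^2
    w -= 0.1 * grad                    # small constant step size
```

The only difference from standard SGD is the line updating `i`: the next index depends on the current one through `P`, which is exactly the sampling regime whose stability and generalization the paper analyzes.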
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 35
Subtitle of host publication: 36th Conference on Neural Information Processing Systems (NeurIPS 2022)
Editors: S. Koyejo, S. Mohamed, A. Agarwal, D. Belgrave, K. Cho, A. Oh
Publisher: Neural Information Processing Systems (NeurIPS)
Number of pages: 14
ISBN (Print): 9781713871088
Publication status: Published - Nov 2022
Event: 36th Conference on Neural Information Processing Systems (NeurIPS 2022) - Hybrid, New Orleans Convention Center, New Orleans, United States
Duration: 28 Nov 2022 – 9 Dec 2022
https://neurips.cc/
https://nips.cc/Conferences/2022
https://proceedings.neurips.cc/paper_files/paper/2022

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 35
ISSN (Print): 1049-5258

Conference

Conference: 36th Conference on Neural Information Processing Systems (NeurIPS 2022)
Abbreviated title: NIPS '22
Place: United States
City: New Orleans
Period: 28/11/22 – 9/12/22
