Abstract
This paper analyzes the generalization performance of distributed algorithms in the framework of learning theory. Taking distributed kernel ridge regression (DKRR) as an example, we derive its optimal learning rates in expectation and provide theoretically optimal ranges for the number of local processors. To bridge the gap between theory and experiments, we also deduce optimal learning rates for DKRR in probability, which more essentially reflect the generalization performance and limitations of DKRR. Furthermore, we propose a communication strategy to improve the learning performance of DKRR and demonstrate the power of communications in DKRR via both theoretical assessments and numerical experiments.
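For readers unfamiliar with the setting, below is a minimal sketch of the communication-free divide-and-conquer DKRR baseline that the paper builds on: each local processor solves a kernel ridge regression on its own data block, and the global estimator averages the local predictions. The Gaussian kernel, function names, and parameter values here are illustrative assumptions, not the paper's code, and the paper's communication strategy (which improves on this baseline) is not reproduced.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between two sample sets.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def local_krr(X, y, lam, sigma=1.0):
    # Local KRR estimator on one data block:
    # solve (K + lam * n * I) alpha = y, the standard KRR normal equations.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    return X, alpha

def dkrr_predict(models, X_test, sigma=1.0):
    # Global DKRR estimator: average the m local KRR predictions.
    preds = [gaussian_kernel(X_test, Xj, sigma) @ aj for Xj, aj in models]
    return np.mean(preds, axis=0)

# Toy usage: m local processors, each holding an equal share of the data.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(600, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(600)
m, lam = 4, 1e-3
models = [local_krr(Xj, yj, lam) for Xj, yj in
          zip(np.array_split(X, m), np.array_split(y, m))]
y_hat = dkrr_predict(models, X)
print("train MSE:", np.mean((y_hat - y) ** 2))
```

The averaging step is what makes the number of local processors m critical: too many processors means each local sample is too small for the local estimators to be accurate, which is the trade-off the paper's learning-rate analysis quantifies.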
| Field | Value |
|---|---|
| Original language | English |
| Article number | 93 |
| Number of pages | 38 |
| Journal | Journal of Machine Learning Research |
| Volume | 21 |
| Online published | Apr 2020 |
| Publication status | Published - 2020 |
Research Keywords
- Communication
- Distributed learning
- Kernel ridge regression
- Learning theory
Publisher's Copyright Statement
- This full text is made available under CC-BY 4.0. https://creativecommons.org/licenses/by/4.0/
Projects
- GRF: Approximation Theory of Structured Deep Nets and Learning (Finished)
  ZHOU, D. (Principal Investigator / Project Coordinator)
  1/01/18 → 4/05/21
  Project: Research