Distributed Kernel Ridge Regression with Communications
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
Lin, Shao-Bo; Wang, Di; Zhou, Ding-Xuan
Detail(s)
Original language | English |
---|---|
Article number | 93 |
Number of pages | 38 |
Journal / Publication | Journal of Machine Learning Research |
Volume | 21 |
Online published | Apr 2020 |
Publication status | Published - 2020 |
Link(s)
Link to Scopus | https://www.scopus.com/record/display.uri?eid=2-s2.0-85087340474&origin=recordpage |
Permanent Link | https://scholars.cityu.edu.hk/en/publications/publication(3c210e9a-5bd3-45fa-8167-b8e484f74d67).html |
Abstract
This paper analyzes the generalization performance of distributed algorithms in the framework of learning theory. Taking distributed kernel ridge regression (DKRR) as an example, we derive its optimal learning rates in expectation and provide theoretically optimal ranges for the number of local processors. To bridge the gap between theory and experiments, we also derive optimal learning rates for DKRR in probability, which more faithfully reflect the generalization performance and limitations of DKRR. Furthermore, we propose a communication strategy to improve the learning performance of DKRR and demonstrate the power of communications in DKRR through both theoretical assessments and numerical experiments.
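The one-shot (communication-free) baseline that the abstract builds on can be sketched as follows: partition the data across m local processors, solve a kernel ridge regression on each subset, and average the local predictors. This is a minimal illustration of divide-and-conquer KRR, not the paper's communication strategy; the Gaussian kernel, bandwidth, and regularization parameter below are assumptions chosen for the example.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def krr_fit(X, y, lam=1e-2, sigma=1.0):
    # Local KRR: solve (K + n*lam*I) alpha = y on one data subset.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return X, alpha

def krr_predict(model, Xtest, sigma=1.0):
    X, alpha = model
    return gaussian_kernel(Xtest, X, sigma) @ alpha

def dkrr_predict(X, y, Xtest, m=4, lam=1e-2, sigma=1.0):
    # Divide-and-conquer DKRR: fit m local estimators and average
    # their predictions (a single aggregation, no further communication).
    parts = np.array_split(np.arange(X.shape[0]), m)
    preds = [krr_predict(krr_fit(X[p], y[p], lam, sigma), Xtest, sigma)
             for p in parts]
    return np.mean(preds, axis=0)

# Toy regression problem: noisy samples of sin(pi * x) on [-1, 1].
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)
Xtest = np.linspace(-1, 1, 50)[:, None]
yhat = dkrr_predict(X, y, Xtest, m=4)
print(yhat.shape)  # (50,)
```

The paper's communication strategy goes beyond this baseline by exchanging additional information between local processors and the global machine to improve the averaged estimator; the sketch above only shows the non-communicating starting point.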
Research Area(s)
- Communication, Distributed learning, Kernel ridge regression, Learning theory
Citation Format(s)
Distributed Kernel Ridge Regression with Communications. / Lin, Shao-Bo; Wang, Di; Zhou, Ding-Xuan.
In: Journal of Machine Learning Research, Vol. 21, 93, 2020.