TY - JOUR
T1 - Learning and approximation by Gaussians on Riemannian manifolds
AU - Ye, Gui-Bo
AU - Zhou, Ding-Xuan
PY - 2008/10
Y1 - 2008/10
N2 - Learning function relations or understanding structures of data lying in manifolds embedded in huge dimensional Euclidean spaces is an important topic in learning theory. In this paper we study the approximation and learning by Gaussians of functions defined on a d-dimensional connected compact C^∞ Riemannian submanifold of R^n which is isometrically embedded. We show that the convolution with the Gaussian kernel with variance σ provides the uniform approximation order of O(σ^s) when the approximated function is Lipschitz of order s ∈ (0, 1]. The uniform normal neighborhoods of a compact Riemannian manifold play a central role in deriving the approximation order. This approximation result is used to investigate the regression learning algorithm generated by the multi-kernel least square regularization scheme associated with Gaussian kernels with flexible variances. When the regression function is Lipschitz of order s, our learning rate is ((log^2 m)/m)^(s/(8s + 4d)) where m is the sample size. When the manifold dimension d is smaller than the dimension n of the underlying Euclidean space, this rate is much faster compared with those in the literature. By comparing approximation orders, we also show the essential difference between approximation schemes with flexible variances and those with a single variance. © 2007 Springer Science+Business Media, Inc.
AB - Learning function relations or understanding structures of data lying in manifolds embedded in huge dimensional Euclidean spaces is an important topic in learning theory. In this paper we study the approximation and learning by Gaussians of functions defined on a d-dimensional connected compact C^∞ Riemannian submanifold of R^n which is isometrically embedded. We show that the convolution with the Gaussian kernel with variance σ provides the uniform approximation order of O(σ^s) when the approximated function is Lipschitz of order s ∈ (0, 1]. The uniform normal neighborhoods of a compact Riemannian manifold play a central role in deriving the approximation order. This approximation result is used to investigate the regression learning algorithm generated by the multi-kernel least square regularization scheme associated with Gaussian kernels with flexible variances. When the regression function is Lipschitz of order s, our learning rate is ((log^2 m)/m)^(s/(8s + 4d)) where m is the sample size. When the manifold dimension d is smaller than the dimension n of the underlying Euclidean space, this rate is much faster compared with those in the literature. By comparing approximation orders, we also show the essential difference between approximation schemes with flexible variances and those with a single variance. © 2007 Springer Science+Business Media, Inc.
KW - Approximation
KW - Gaussian kernels
KW - Learning theory
KW - Multi-kernel least square regularization scheme
KW - Reproducing kernel Hilbert spaces
KW - Riemannian manifolds
UR - http://www.scopus.com/inward/record.url?scp=55049127622&partnerID=8YFLogxK
U2 - 10.1007/s10444-007-9049-0
DO - 10.1007/s10444-007-9049-0
M3 - Publication in refereed journal
VL - 29
SP - 291
EP - 310
JO - Advances in Computational Mathematics
JF - Advances in Computational Mathematics
SN - 1019-7168
IS - 3
ER -