TY - JOUR
T1 - Cross-validation based adaptation for regularization operators in learning theory
AU - CAPONNETTO, Andrea
AU - YAO, Yuan
PY - 2010/4
Y1 - 2010/4
N2 - We consider learning algorithms induced by regularization methods in the regression setting. We show that previously obtained error bounds for these algorithms, using a priori choices of the regularization parameter, can be attained using a suitable a posteriori choice based on cross-validation. In particular, these results prove adaptation of the rate of convergence of the estimators to the minimax rate induced by the "effective dimension" of the problem. We also show universal consistency for this broad class of methods, which includes regularized least squares, truncated SVD, Landweber iteration, and the ν-method.
KW - Error bounds
KW - Learning theory
KW - Regression
KW - Statistical adaptation
UR - http://www.scopus.com/inward/record.url?scp=77951538361&partnerID=8YFLogxK
U2 - 10.1142/S0219530510001564
DO - 10.1142/S0219530510001564
M3 - RGC 21 - Publication in refereed journal
SN - 0219-5305
VL - 8
SP - 161
EP - 183
JO - Analysis and Applications
JF - Analysis and Applications
IS - 2
ER -