Concentration estimates for learning with l1-regularizer and data dependent hypothesis spaces

Lei Shi, Yun-Long Feng, Ding-Xuan Zhou

Research output: Journal Publications and Reviews · RGC 21 - Publication in refereed journal · peer-review

120 Citations (Scopus)

Abstract

We consider the regression problem by learning with a regularization scheme in a data dependent hypothesis space and an ℓ1-regularizer. The data-dependent nature of the kernel-based hypothesis space provides flexibility for the learning algorithm. The regularization scheme is essentially different from the standard one in a reproducing kernel Hilbert space: the kernel is not necessarily symmetric or positive semi-definite, and the regularizer is the ℓ1-norm of a function expansion involving samples. These differences lead to additional difficulty in the error analysis. In this paper we apply concentration techniques with ℓ2-empirical covering numbers to improve the learning rates for the algorithm. Sparsity of the algorithm is studied based on our error analysis. We also show that a function space involved in the error analysis, induced by the ℓ1-regularizer and non-symmetric kernel, has nice behavior in terms of the ℓ2-empirical covering numbers of its unit ball. © 2011 Elsevier Inc.
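To make the scheme concrete: the abstract describes minimizing an empirical squared error plus the ℓ1-norm of the coefficient vector of a kernel expansion over the sample points, where the kernel need not be symmetric or positive semi-definite. A minimal sketch of such an estimator, solved here by ISTA (proximal gradient); the Gaussian kernel, the regularization parameter `lam`, and the iteration count are illustrative assumptions, not the paper's choices:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (coordinatewise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_kernel_regression(X, y, kernel, lam=0.1, n_iter=1000):
    """Sketch of the data-dependent scheme: minimize over coefficients c
        (1/m) * sum_i (f_c(x_i) - y_i)^2 + lam * ||c||_1,
    where f_c(x) = sum_j c_j * kernel(x_j, x). The kernel is not required
    to be symmetric or positive semi-definite.
    """
    m = len(y)
    # K[i, j] = kernel(x_j, x_i), so (K @ c)[i] = f_c(x_i); no symmetry needed.
    K = np.array([[kernel(xj, xi) for xj in X] for xi in X])
    # Step size 1/L, with L the Lipschitz constant of the smooth part's gradient.
    step = 0.5 * m / (np.linalg.norm(K, 2) ** 2)
    c = np.zeros(m)
    for _ in range(n_iter):
        grad = (2.0 / m) * K.T @ (K @ c - y)
        c = soft_threshold(c - step * grad, step * lam)
    return c, K
```

The ℓ1 penalty drives many coefficients `c_j` exactly to zero, which is the sparsity property the error analysis in the paper quantifies.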
Original language: English
Pages (from-to): 286-302
Journal: Applied and Computational Harmonic Analysis
Volume: 31
Issue number: 2
Publication status: Published - Sept 2011

Research Keywords

  • ℓ1-regularizer and sparsity
  • ℓ2-empirical covering number
  • Concentration estimate for error analysis
  • Data dependent hypothesis space
  • Learning theory
