Learning sparse conditional distribution: An efficient kernel-based approach

Research output: Journal Publications and Reviews (RGC: 21, 22, 62) · Publication in refereed journal · peer-reviewed


Author(s): Chen, Fang; He, Xin; Wang, Junhui


Detail(s)

Original language: English
Pages (from-to): 1610-1635
Journal / Publication: Electronic Journal of Statistics
Volume: 15
Issue number: 1
Online published: 26 Mar 2021
Publication status: Published - 2021


Abstract

This paper proposes a novel method to recover the sparse structure of a conditional distribution, which plays a crucial role in subsequent statistical analyses such as prediction, forecasting, and conditional distribution estimation. Unlike most existing methods, which often require explicit model assumptions or suffer from a heavy computational burden, the proposed method gains its advantage by exploiting desirable properties of the reproducing kernel Hilbert space (RKHS). It can be implemented efficiently by optimizing its dual form and is particularly attractive for large-scale datasets. The asymptotic consistency of the proposed method is established under mild conditions. Its effectiveness is also supported by a variety of simulated examples and a real-life supermarket dataset from Northern China.
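The abstract does not spell out the estimator, so the following is only a minimal, generic sketch of the broader idea of RKHS-based sparse learning: fit a kernel ridge regression in an RKHS and score each input variable by the empirical norm of the fitted function's partial derivative in that coordinate, so that near-zero scores flag variables the conditional distribution appears not to depend on. All function names, the Gaussian kernel choice, and the tuning values (`gamma`, `lam`) are illustrative assumptions, not the authors' actual procedure.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=0.1):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def variable_importance(X, y, gamma=0.1, lam=1e-2):
    # Illustrative RKHS-based relevance scores (NOT the paper's method):
    # fit kernel ridge regression, then measure how much the fitted
    # function f varies along each coordinate via its partial derivatives.
    n, p = X.shape
    K = gaussian_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    scores = np.empty(p)
    for j in range(p):
        # d/dx_j k(x_b, x) = -2*gamma*(x_j - x_{b,j}) * k(x_b, x),
        # so the derivative of f at each training point is a weighted sum.
        diff = X[:, None, j] - X[None, :, j]        # (eval point, train point)
        grad = (-2.0 * gamma * diff * K) @ alpha    # df/dx_j at each eval point
        scores[j] = np.sqrt((grad ** 2).mean())     # empirical L2 norm
    return scores

# Toy example: only the first two of five variables carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200)
scores = variable_importance(X, y)
```

On the toy data, the scores for the two active variables should dominate those of the three noise variables, mimicking how a sparse-structure recovery procedure would screen out irrelevant inputs before downstream analysis.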

Research Area(s)

  • Conditional distribution, consistency, parallel computing, RKHS, sparse learning, variable selection, quantile regression, likelihood, returns, lasso

Citation Format(s)

Learning sparse conditional distribution: An efficient kernel-based approach. / Chen, Fang; He, Xin; Wang, Junhui.
In: Electronic Journal of Statistics, Vol. 15, No. 1, 2021, p. 1610-1635.

