PENALIZED JACKKNIFE EMPIRICAL LIKELIHOOD IN HIGH DIMENSIONS

Research output: Journal Publications and Reviews · RGC 21 - Publication in refereed journal · peer-review


Author(s)

Li, Zhouping; Xu, Jinfeng; Zhao, Na et al.

Detail(s)

Original language: English
Pages (from-to): 1219-1232
Journal / Publication: Statistica Sinica
Volume: 33
Publication status: Published - May 2023

Abstract

The jackknife empirical likelihood (JEL) is an attractive approach for statistical inference with nonlinear statistics, such as U-statistics. However, most contemporary problems involve high-dimensional model selection, and the feasibility of this approach in theory and practice remains largely unexplored in situations in which the number of parameters diverges to infinity. In this paper, we propose a penalized JEL method that preserves the main advantages of the JEL and leads to reliable variable selection based on estimating equations with a U-statistic structure in high-dimensional settings. Under certain regularity conditions, we establish the asymptotic theory and oracle property for the JEL and its penalized version when the numbers of estimating equations and parameters increase with the sample size. Simulation studies and a real-data analysis are used to examine the performance of the proposed methods and illustrate their practical utility. © 2023 Institute of Statistical Science. All rights reserved.
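
For orientation, the following is a minimal sketch of the standard jackknife empirical likelihood construction for a scalar U-statistic; the paper's exact estimating-equation formulation is not reproduced here, and the generic penalty $p_\lambda$ below is an illustrative assumption (e.g., a SCAD-type penalty) rather than the authors' specific choice.

Given a U-statistic $U_n$ estimating $\theta$, the jackknife pseudo-values are
$$ \hat{V}_i = n\,U_n - (n-1)\,U_{n-1}^{(-i)}, \qquad i = 1, \dots, n, $$
where $U_{n-1}^{(-i)}$ is the U-statistic recomputed with the $i$-th observation deleted. The JEL ratio applies standard empirical likelihood to these pseudo-values as if they were independent:
$$ R(\theta) = \max\Big\{ \prod_{i=1}^{n} n p_i \;:\; p_i \ge 0,\ \sum_{i=1}^{n} p_i = 1,\ \sum_{i=1}^{n} p_i \hat{V}_i = \theta \Big\}. $$
A penalized variant for variable selection then minimizes the penalized log-JEL ratio
$$ -\log R(\beta) + n \sum_{j=1}^{p} p_\lambda(|\beta_j|) $$
over the parameter vector $\beta$, so that small coefficients are shrunk to exactly zero while large ones are retained.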

Research Area(s)

  • Estimating equations, high-dimensional data analysis, jackknife empirical likelihood, penalized likelihood, U-statistics, variable selection

Citation Format(s)

PENALIZED JACKKNIFE EMPIRICAL LIKELIHOOD IN HIGH DIMENSIONS. / Li, Zhouping; Xu, Jinfeng; Zhao, Na et al.
In: Statistica Sinica, Vol. 33, 05.2023, p. 1219-1232.
