On generalized ridge regression estimators under collinearity and balanced loss

Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › 21 - Publication in refereed journal › peer-review

19 Scopus Citations

Author(s)

Related Research Unit(s)

Detail(s)

Original language: English
Pages (from-to): 455-467
Journal / Publication: Applied Mathematics and Computation
Volume: 129
Issue number: 2-3
Publication status: Published - 10 Jul 2002

Abstract

In regression analysis, ridge estimators are often used to alleviate the problem of multicollinearity. Ridge estimators have traditionally been evaluated using the risk under quadratic loss criterion, which places sole emphasis on estimators' precision. Here, we consider the balanced loss function (A. Zellner, in: S.S. Gupta, J.O. Berger (Eds.), Statistical Decision Theory and Related Topics, vol. V, Springer, New York, 1994, p. 377) which incorporates a measure for the goodness of fit of the model as well as estimation precision. By adopting this loss we derive and numerically evaluate the risks of the feasible generalized ridge and the almost unbiased feasible generalized ridge estimators. We show that in the case of severe multicollinearity, the feasible generalized ridge estimator often produces the greatest risk reductions, even if a relatively heavy weight is given to goodness of fit in the balanced loss function. © 2002 Elsevier Science Inc. All rights reserved.
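For orientation, the two objects the abstract refers to can be sketched in the notation commonly used in the balanced-loss and generalized ridge literature. The symbols below (the weight w, the ridge matrix K, and the estimator \hat{\beta}(K)) are generic illustrations of Zellner's balanced loss and the generalized ridge estimator, not notation reproduced from this article.

% Linear model y = X\beta + \varepsilon. Zellner's (1994) balanced loss for an
% estimator \hat\beta trades off goodness of fit against estimation precision,
% with w in [0,1] the weight placed on goodness of fit:
\[
  L(\hat{\beta}, \beta)
    = w\,(y - X\hat{\beta})^{\top}(y - X\hat{\beta})
      + (1 - w)\,(\hat{\beta} - \beta)^{\top} X^{\top} X\, (\hat{\beta} - \beta).
\]
% The generalized ridge estimator shrinks the least-squares solution with
% component-specific ridge parameters collected in a diagonal matrix K; the
% "feasible" version in the paper replaces K by a data-based estimate.
\[
  \hat{\beta}(K) = (X^{\top} X + K)^{-1} X^{\top} y,
  \qquad K = \operatorname{diag}(k_1, \ldots, k_p), \quad k_i \ge 0.
\]

Setting w = 1 recovers the pure goodness-of-fit criterion, while w = 0 reduces the loss to the usual weighted quadratic loss in the estimation error, which is the sense in which the balanced loss "incorporates a measure for the goodness of fit of the model as well as estimation precision".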

Research Area(s)

  • Balanced loss, Ridge regression, Risk