TY - JOUR
T1 - Unified formulation of linear discriminant analysis methods and optimal parameter selection
AU - An, Senjian
AU - Liu, Wanquan
AU - Venkatesh, Svetha
AU - Yan, Hong
PY - 2011/2
Y1 - 2011/2
N2 - In the last decade, many variants of classical linear discriminant analysis (LDA) have been developed to tackle the under-sampled problem in face recognition. However, choosing among these variants is not easy, since they involve eigenvalue decompositions that make cross-validation computationally expensive. In this paper, we propose to solve this problem by unifying these LDA variants in one framework: principal component analysis (PCA) plus constrained ridge regression (CRR). In CRR, one selects a target (also called a class indicator) for each class and finds a projection that maps the class centers to their class targets while minimizing the within-class distances, with a penalty on the transform norm as in ridge regression. Under this framework, many existing LDA methods can be viewed as PCA plus CRR with particular regularization parameters and class indicators, and selecting the best LDA method reduces to choosing the best member of the CRR family. The latter can be done by comparing their leave-one-out (LOO) errors, and we present an efficient algorithm, requiring computations similar to the training process of CRR, to evaluate the LOO errors. Experiments on the Yale Face B, Extended Yale B and CMU-PIE databases are conducted to demonstrate the effectiveness of the proposed methods. © 2010 Elsevier Ltd. All rights reserved.
KW - Constrained ridge regression
KW - Face recognition
KW - Linear discriminant analysis
KW - Model selection
KW - Principal component analysis
KW - Under-sampled problem
UR - http://www.scopus.com/inward/record.url?scp=77957979412&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-77957979412&origin=recordpage
U2 - 10.1016/j.patcog.2010.08.026
DO - 10.1016/j.patcog.2010.08.026
M3 - RGC 21 - Publication in refereed journal
SN - 0031-3203
VL - 44
SP - 307
EP - 319
JO - Pattern Recognition
JF - Pattern Recognition
IS - 2
ER -