TY - JOUR
T1 - Exactly Robust Kernel Principal Component Analysis
AU - Fan, Jicong
AU - Chow, Tommy W. S.
PY - 2020/3
Y1 - 2020/3
AB - Robust principal component analysis (RPCA) can recover low-rank matrices when they are corrupted by sparse noise. In practice, however, many matrices are of high rank and hence cannot be recovered by RPCA. We propose a novel method, robust kernel principal component analysis (RKPCA), to decompose a partially corrupted matrix into a sparse matrix plus a high- or full-rank matrix with low latent dimensionality. RKPCA can be applied to many problems, such as noise removal and subspace clustering, and is still the only unsupervised nonlinear method robust to sparse noise. Our theoretical analysis shows that, with high probability, RKPCA achieves high recovery accuracy. The optimization of RKPCA involves nonconvex and nondifferentiable problems. We propose two nonconvex optimization algorithms for RKPCA: alternating direction method of multipliers (ADMM) with backtracking line search and proximal linearized minimization (PLM) with adaptive step size (AdSS). Comparative studies on noise removal and robust subspace clustering corroborate the effectiveness and superiority of RKPCA.
KW - high rank
KW - kernel
KW - low rank
KW - noise removal
KW - robust principal component analysis (RPCA)
KW - sparse
KW - subspace clustering
UR - http://www.scopus.com/inward/record.url?scp=85065127246&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85065127246&origin=recordpage
U2 - 10.1109/TNNLS.2019.2909686
DO - 10.1109/TNNLS.2019.2909686
M3 - RGC 21 - Publication in refereed journal
SN - 2162-237X
VL - 31
SP - 749
EP - 761
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 3
ER -