Exactly Robust Kernel Principal Component Analysis

Jicong Fan, Tommy W. S. Chow*

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

53 Citations (Scopus)

Abstract

Robust principal component analysis (RPCA) can recover low-rank matrices that are corrupted by sparse noise. In practice, however, many matrices are of high rank and hence cannot be recovered by RPCA. We propose a novel method called robust kernel principal component analysis (RKPCA) to decompose a partially corrupted matrix into a sparse matrix plus a high- or full-rank matrix with low latent dimensionality. RKPCA can be applied to many problems, such as noise removal and subspace clustering, and remains the only unsupervised nonlinear method robust to sparse noise. Our theoretical analysis shows that, with high probability, RKPCA achieves high recovery accuracy. The optimization of RKPCA involves nonconvex and nondifferentiable problems. We propose two nonconvex optimization algorithms for RKPCA: the alternating direction method of multipliers with backtracking line search, and proximal linearized minimization with adaptive step size (AdSS). Comparative studies in noise removal and robust subspace clustering corroborate the effectiveness and superiority of RKPCA.
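For readers unfamiliar with the decomposition the abstract refers to, the sketch below implements the classical linear RPCA baseline that RKPCA generalizes: a corrupted matrix X is split into a low-rank part L and a sparse part S by alternating singular-value thresholding and soft-thresholding in an inexact augmented Lagrangian loop. This is only a reference implementation of standard RPCA under common default choices (the function names and the lambda = 1/sqrt(max(m, n)) default are illustrative assumptions); it is not the paper's ADMM-with-backtracking or PLM-with-AdSS algorithm, which instead handle the nonlinear (kernel) model with low latent dimensionality described in the abstract.

```python
import numpy as np

def soft_threshold(M, tau):
    """Elementwise soft-thresholding: proximal operator of the L1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def svt(M, tau):
    """Singular-value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def rpca_baseline(X, lam=None, mu=1.0, rho=1.05, n_iter=200):
    """Classical linear RPCA (not RKPCA): decompose X ~= L + S with L low rank
    and S sparse, via a simple inexact augmented Lagrangian scheme."""
    m, n = X.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))   # common default weight for the sparse term
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    Y = np.zeros_like(X)                 # dual variable for the constraint X = L + S
    for _ in range(n_iter):
        L = svt(X - S + Y / mu, 1.0 / mu)              # low-rank update
        S = soft_threshold(X - L + Y / mu, lam / mu)   # sparse update
        Y = Y + mu * (X - L - S)                       # dual ascent
        mu = min(mu * rho, 1e6)                        # gradually tighten the penalty
    return L, S

# Toy usage: low-rank data corrupted by sparse spikes.
rng = np.random.default_rng(0)
X_clean = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 80))
mask = rng.random(X_clean.shape) < 0.05
X_noisy = X_clean + 10.0 * mask * rng.standard_normal(X_clean.shape)
L_hat, S_hat = rpca_baseline(X_noisy)
print("relative recovery error:",
      np.linalg.norm(L_hat - X_clean) / np.linalg.norm(X_clean))
```

The point of contrast with the abstract: when the clean component is of high or full rank but lies on a low-dimensional nonlinear manifold, the nuclear-norm step above fails, which is the regime RKPCA is designed for.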
Original language: English
Pages (from-to): 749-761
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 31
Issue number: 3
Online published: 29 Apr 2019
DOIs
Publication status: Published - Mar 2020

Research Keywords

  • high rank
  • kernel
  • low rank
  • noise removal
  • robust principal component analysis (RPCA)
  • sparse
  • subspace clustering
