Matrix completion by least-square, low-rank, and sparse self-representations

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

Original language: English
Pages (from-to): 290-305
Journal / Publication: Pattern Recognition
Online published: 15 May 2017
Publication status: Published - Nov 2017


Conventional matrix completion methods are generally based on rank minimization. These methods assume that the given matrix is of low rank and that the data points are drawn from a single low-dimensional subspace. Therefore, they are not effective in completing matrices whose data are drawn from multiple subspaces. In this paper, we establish a novel matrix completion framework based on self-representation. Specifically, we provide matrix completion algorithms based on least-square, low-rank, and sparse self-representations. The underlying idea is that each data point can be efficiently reconstructed by other data points belonging to a common subspace, so the missing entries are recovered to fit that common subspace. The proposed algorithms in effect maximize the weighted correlations among the columns of the given matrix. We prove that the proposed algorithms are approximations to rank minimization of the incomplete matrix. In addition, they are able to complete high-rank or even full-rank matrices when the data are drawn from multiple subspaces. Comparative studies are conducted on synthetic datasets, natural image inpainting tasks, a temperature prediction task, and collaborative filtering tasks. The results show that the proposed algorithms often outperform other state-of-the-art algorithms on these tasks.
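The idea described in the abstract can be illustrated with a minimal sketch of the least-square variant, under assumptions not taken from the paper: a ridge-penalized self-representation objective min_C ||X − XC||_F² + λ||C||_F², a crude zeroed diagonal in place of the exact diag(C) = 0 constraint, and a simple alternation between solving for the coefficients C and refilling the missing entries from XC. The function name, penalty λ, and iteration scheme are all illustrative choices, not the authors' algorithm.

```python
import numpy as np

def lsr_complete(X_obs, mask, lam=0.1, n_iters=100):
    """Toy least-square self-representation completion (illustrative sketch).

    X_obs : matrix with arbitrary values at the missing entries
    mask  : boolean array, True where an entry is observed
    lam   : ridge penalty on the coefficient matrix C (assumed)
    """
    X = X_obs.copy().astype(float)
    X[~mask] = 0.0                       # initialize missing entries
    n = X.shape[1]
    for _ in range(n_iters):
        # Step 1: coefficients C = argmin ||X - XC||_F^2 + lam ||C||_F^2,
        # whose closed form is (X^T X + lam I)^{-1} X^T X.
        G = X.T @ X
        C = np.linalg.solve(G + lam * np.eye(n), G)
        np.fill_diagonal(C, 0.0)         # forbid trivial self-reconstruction
        # Step 2: refill only the missing entries so each column fits
        # the subspace spanned by the other columns; observed entries stay fixed.
        R = X @ C
        X[~mask] = R[~mask]
    return X
```

On a low-rank toy matrix with a modest fraction of missing entries, the refilled entries typically move much closer to the ground truth than the zero initialization, while the observed entries are left untouched.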

Research Area(s)

  • Collaborative filtering, Image inpainting, Low-rank and sparse representations, Matrix completion, Missing value