
Matrix completion by least-square, low-rank, and sparse self-representations

Jicong Fan, Tommy W.S. Chow*

*Corresponding author for this work

Research output: Journal Publications and Reviews · RGC 21 - Publication in refereed journal · peer-review

Abstract

Conventional matrix completion methods are generally based on rank minimization. These methods assume that the given matrix is of low rank and that the data points are drawn from a single low-dimensional subspace. Therefore, they are not effective in completing matrices whose data are drawn from multiple subspaces. In this paper, we establish a novel matrix completion framework based on self-representation. Specifically, matrix completion algorithms based on least-square, low-rank, and sparse self-representations are provided. The underlying idea is that one data point can be efficiently reconstructed by other data points belonging to a common subspace, and the missing entries are recovered so as to fit that common subspace. The proposed algorithms in effect maximize the weighted correlations among the columns of a given matrix. We prove that the proposed algorithms are approximations of rank minimization for the incomplete matrix. In addition, they are able to complete high-rank or even full-rank matrices when the data are drawn from multiple subspaces. Comparative studies are conducted on synthetic datasets, natural image inpainting tasks, a temperature prediction task, and collaborative filtering tasks. The results show that the proposed algorithms often outperform other state-of-the-art algorithms on these tasks.
Original language: English
Pages (from-to): 290-305
Journal: Pattern Recognition
Volume: 71
Online published: 15 May 2017
DOIs
Publication status: Published - Nov 2017

Research Keywords

  • Collaborative filtering
  • Image inpainting
  • Low-rank and sparse representations
  • Matrix completion
  • Missing value

Policy Impact

  • Cited in Policy Documents
