Matrix Completion with Noise

by Emmanuel J. Candès , Yaniv Plan
Citations: 255 (13 self)

BibTeX

@MISC{Candès_matrixcompletion,
    author = {Emmanuel J. Candès and Yaniv Plan},
    title = {Matrix Completion with Noise},
    year = {}
}


Abstract

On the heels of compressed sensing, a remarkable new field has very recently emerged. This field addresses a broad range of problems of significant practical interest, namely, the recovery of a data matrix from what appears to be incomplete, and perhaps even corrupted, information. In its simplest form, the problem is to recover a matrix from a small sample of its entries, and comes up in many areas of science and engineering including collaborative filtering, machine learning, control, remote sensing, and computer vision to name a few. This paper surveys the novel literature on matrix completion, which shows that under some suitable conditions, one can recover an unknown low-rank matrix from a nearly minimal set of entries by solving a simple convex optimization problem, namely, nuclear-norm minimization subject to data constraints. Further, this paper introduces novel results showing that matrix completion is provably accurate even when the few observed entries are corrupted with a small amount of noise. A typical result is that one can recover an unknown n × n matrix of low rank r from just about nr log² n noisy samples with an error which is proportional to the noise level. We present numerical results which complement our quantitative analysis and show that, in practice, nuclear norm minimization accurately fills in the many missing entries of large low-rank matrices from just a few noisy samples. Some analogies between matrix completion and compressed sensing are discussed throughout.
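The nuclear-norm minimization described in the abstract can be sketched numerically. Below is a minimal NumPy illustration, not the authors' own algorithm: it uses iterative singular-value soft-thresholding (in the style of SoftImpute) to approximate the nuclear-norm-regularized completion of a matrix observed on a random subset of noisy entries. The function name `soft_impute` and the threshold heuristic `tau` are assumptions made for this sketch.

```python
import numpy as np

def soft_impute(M_obs, mask, tau=None, max_iter=300, tol=1e-5):
    """Approximate low-rank matrix completion via iterative singular-value
    soft-thresholding (a SoftImpute-style sketch, not the paper's solver).

    M_obs : observed matrix, zeros at unobserved positions.
    mask  : boolean array, True where an entry was observed.
    tau   : soft-threshold level; the default is a rough heuristic.
    """
    X = np.zeros_like(M_obs)
    if tau is None:
        # Heuristic: shrink relative to the spectral norm of the observations.
        tau = 0.05 * np.linalg.norm(M_obs, 2)
    for _ in range(max_iter):
        # Keep observed data, fill unobserved entries with the current estimate.
        Y = np.where(mask, M_obs, X)
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        s_thr = np.maximum(s - tau, 0.0)   # soft-threshold the singular values
        X_new = (U * s_thr) @ Vt
        if np.linalg.norm(X_new - X) <= tol * max(1.0, np.linalg.norm(X)):
            return X_new
        X = X_new
    return X
```

On a synthetic rank-2 matrix with roughly the nr log² n sampling rate the abstract mentions, this iteration recovers the missing entries up to an error on the order of the noise level, illustrating the qualitative behavior the survey describes.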

