Estimation of (near) low-rank matrices with noise and high-dimensional scaling

by Sahand Negahban, Martin J. Wainwright
Citations: 95 (14 self)
BibTeX

@MISC{Negahban_estimationof,
    author = {Sahand Negahban and Martin J. Wainwright},
    title = {Estimation of (near) low-rank matrices with noise and high-dimensional scaling},
    year = {}
}


Abstract

We study an instance of high-dimensional statistical inference in which the goal is to use N noisy observations to estimate a matrix Θ∗ ∈ R^{k×p} that is assumed to be either exactly low rank, or “near” low-rank, meaning that it can be well-approximated by a matrix with low rank. We consider an M-estimator based on regularization by the trace or nuclear norm over matrices, and analyze its performance under high-dimensional scaling. We provide non-asymptotic bounds on the Frobenius norm error that hold for a general class of noisy observation models, and apply to both exactly low-rank and approximately low-rank matrices. We then illustrate their consequences for a number of specific learning models, including low-rank multivariate or multi-task regression, system identification in vector autoregressive processes, and recovery of low-rank matrices from random projections. Simulations show excellent agreement with the high-dimensional scaling of the error predicted by our theory.
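The nuclear-norm-regularized M-estimator described in the abstract can be sketched with a standard proximal gradient method, where the proximal step is singular-value soft-thresholding. This is a generic illustration, not the authors' code: the multivariate regression observation model Y = XΘ∗ + W, the function names, and all parameter values below are assumptions chosen for the demo.

```python
import numpy as np

def svt(M, tau):
    """Singular-value soft-thresholding: the prox operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def nuclear_norm_estimator(X, Y, lam, iters=500):
    """Minimize (1/2N) ||Y - X @ Theta||_F^2 + lam * ||Theta||_*
    by proximal gradient descent."""
    N, k = X.shape
    p = Y.shape[1]
    # Step size from the Lipschitz constant of the smooth part's gradient.
    L = np.linalg.norm(X, 2) ** 2 / N
    step = 1.0 / L
    Theta = np.zeros((k, p))
    for _ in range(iters):
        grad = X.T @ (X @ Theta - Y) / N       # gradient of the least-squares term
        Theta = svt(Theta - step * grad, step * lam)  # proximal (shrinkage) step
    return Theta

# Demo: recover a rank-2 matrix Theta_star from N noisy linear observations.
rng = np.random.default_rng(0)
k, p, r, N = 20, 15, 2, 400
Theta_star = rng.standard_normal((k, r)) @ rng.standard_normal((r, p))
X = rng.standard_normal((N, k))
Y = X @ Theta_star + 0.1 * rng.standard_normal((N, p))
Theta_hat = nuclear_norm_estimator(X, Y, lam=0.05)
err = np.linalg.norm(Theta_hat - Theta_star) / np.linalg.norm(Theta_star)
```

In the well-conditioned regime of the demo (N much larger than the effective degrees of freedom r(k + p)), the relative Frobenius error `err` is small, matching the qualitative behavior the paper's bounds predict.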

Keyphrases

high-dimensional scaling, low-rank matrix, low rank, non-asymptotic bound, random projection, multi-task regression, Frobenius norm error, system identification, excellent agreement, high-dimensional statistical inference, vector autoregressive process, low-rank multivariate, noisy observation
