CiteSeerX

Results 1 - 10 of 1,688

Low-Rank Matrix Completion

by Ryan Kennedy, 2013
"... While datasets are frequently represented as matrices, real-word data is imperfect and entries are often missing. In many cases, the data are very sparse and the matrix must be filled in before any subsequent work can be done. This optimization problem, known as matrix completion, can be made well-d ..."
is convex and can be optimized efficiently, there has been a significant amount of research over the past few years to develop optimization algorithms that perform well. In this report, we review several methods for low-rank matrix completion. The first paper we review presents an iterative algorithm to
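The kind of iterative scheme such reviews cover can be sketched in a few lines. The toy example below (assuming NumPy; the matrix sizes, shrinkage threshold tau, and iteration count are arbitrary choices, not taken from the report) fills missing entries by repeatedly soft-thresholding singular values and re-imposing the observed entries, in the spirit of SVT/SoftImpute-style methods:

```python
import numpy as np

def complete_matrix(M_obs, mask, tau=2.0, n_iters=200):
    """Illustrative sketch: fill missing entries (mask == True where observed)
    by iterative singular-value soft-thresholding, keeping observations fixed."""
    X = np.where(mask, M_obs, 0.0)                      # zeros in the holes
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        s = np.maximum(s - tau, 0.0)                    # shrink singular values
        X = (U * s) @ Vt                                # low-rank update
        X[mask] = M_obs[mask]                           # re-impose observed entries
    return X

# Toy usage: a rank-2 matrix with roughly 60% of its entries observed.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(A.shape) < 0.6
A_hat = complete_matrix(np.where(mask, A, 0.0), mask)
print("relative error:", np.linalg.norm(A_hat - A) / np.linalg.norm(A))
```

The connection to the convex formulation mentioned in the snippet: soft-thresholding of singular values is the proximal operator of the nuclear norm, the usual convex surrogate for rank.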

Low-Rank Matrix Approximation with Stability

by Dongsheng Li, Chao Chen, Qin Lv, Junchi Yan, Li Shang, Stephen M. Chu
"... Abstract Low-rank matrix approximation has been widely adopted in machine learning applications with sparse data, such as recommender systems. However, the sparsity of the data, incomplete and noisy, introduces challenges to the algorithm stability -small changes in the training data may significan ..."
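For concreteness, a generic baseline in this recommender setting is a regularized factorization fit only on the observed entries; the L2 penalty is the usual first line of defense against instability. The sketch below (NumPy; the ratings, rank, and hyperparameters are invented for illustration, and this is not the paper's stability-oriented method) shows the idea:

```python
import numpy as np

def factorize(ratings, n_users, n_items, rank=2, lam=0.1, lr=0.01, epochs=50, seed=0):
    """Toy regularized factorization R ~= P @ Q.T fit by SGD on (user, item, value)
    triples. The L2 penalty `lam` is what keeps the fit from swinging wildly
    when a few training ratings change."""
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((n_users, rank))
    Q = 0.1 * rng.standard_normal((n_items, rank))
    for _ in range(epochs):
        for u, i, r in ratings:
            pu, qi = P[u].copy(), Q[i].copy()
            err = r - pu @ qi
            P[u] += lr * (err * qi - lam * pu)   # gradient step on user factor
            Q[i] += lr * (err * pu - lam * qi)   # gradient step on item factor
    return P, Q

# Toy usage: three users, four items, a handful of observed ratings.
obs = [(0, 0, 5.0), (0, 2, 3.0), (1, 1, 4.0), (2, 3, 2.0), (2, 0, 4.0)]
P, Q = factorize(obs, n_users=3, n_items=4)
print("predicted rating (user 1, item 2):", P[1] @ Q[2])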

Decentralized Low-Rank Matrix Completion

by Qing Ling, Yangyang Xu, Wotao Yin, Zaiwen Wen
"... This paper introduces algorithms for the decentralized lowrank matrix completion problem. Assume a low-rank matrix W = [W1, W2,..., WL]. In a network, each agent ℓ observes some entries of Wℓ. In order to recover the unobserved entries of W via decentralized computation, we factorize the unknown mat ..."
Cited by 5 (3 self)
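The factorization behind this setup is easy to simulate on one machine. In the toy sketch below (NumPy; the network is reduced to plain averaging of the agents' copies of X, and all sizes and step sizes are made up), each agent takes a gradient step on its own observed block using a local copy of the shared left factor, and the copies are then averaged as a stand-in for the consensus step. This illustrates the decentralized factorization idea rather than reproducing the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
m, r, L = 30, 3, 4                 # rows, rank, number of agents
nc = 20                            # columns held by each agent

# Ground-truth low-rank matrix W = [W1, ..., WL], split column-wise.
X_true = rng.standard_normal((m, r))
W_blocks = [X_true @ rng.standard_normal((r, nc)) for _ in range(L)]
masks = [rng.random((m, nc)) < 0.5 for _ in range(L)]   # entries each agent observes

# Each agent keeps a local copy of the shared factor X and its own block factor Y_l.
X_loc = [0.1 * rng.standard_normal((m, r)) for _ in range(L)]
Y_loc = [0.1 * rng.standard_normal((r, nc)) for _ in range(L)]

lr = 0.01
for _ in range(2000):
    for l in range(L):
        # gradient of 0.5 * || P_Omega_l(X_l @ Y_l - W_l) ||^2 on observed entries
        R = masks[l] * (X_loc[l] @ Y_loc[l] - W_blocks[l])
        gX, gY = R @ Y_loc[l].T, X_loc[l].T @ R
        X_loc[l] -= lr * gX
        Y_loc[l] -= lr * gY
    # "consensus": average the local copies of X (stand-in for neighbor averaging)
    X_avg = sum(X_loc) / L
    X_loc = [X_avg.copy() for _ in range(L)]

err = sum(np.linalg.norm(masks[l] * (X_loc[l] @ Y_loc[l] - W_blocks[l])) for l in range(L))
print("observed-entry residual:", err)
```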

Sparse and Low-Rank Matrix Decompositions

by Venkat Chandrasekaran, Alan S. Willsky, et al., 2009
"... Suppose we are given a matrix that is formed by adding an unknown sparse matrix to an unknown low-rank matrix. Our goal is to decompose the given matrix into its sparse and low-rank components. Such a problem arises in a number of applications in model and system identification, but obtaining an ex ..."
Cited by 31 (2 self)
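The decomposition itself is easy to prototype even though the paper's focus is on when it is identifiable. A common heuristic, sketched below (NumPy; the thresholds tau and lam and the toy data are arbitrary, and this is not the paper's certificate-based analysis), alternates between shrinking singular values, which pushes one component toward low rank, and soft-thresholding entries, which pushes the other toward sparsity:

```python
import numpy as np

def soft(x, t):
    """Entrywise soft-thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def split_sparse_lowrank(C, tau=10.0, lam=1.0, n_iters=100):
    """Heuristic decomposition C ~= L + S with L low-rank and S sparse,
    by alternating singular-value and entrywise soft-thresholding."""
    S = np.zeros_like(C)
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(C - S, full_matrices=False)
        L = (U * np.maximum(s - tau, 0.0)) @ Vt   # shrink singular values
        S = soft(C - L, lam)                       # shrink entries
    return L, S

# Toy example: rank-2 matrix plus a few large sparse corruptions.
rng = np.random.default_rng(2)
L0 = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
S0 = np.zeros((50, 40))
idx = rng.random(S0.shape) < 0.05
S0[idx] = 10.0 * rng.standard_normal(int(idx.sum()))
L_hat, S_hat = split_sparse_lowrank(L0 + S0)
print("rank of L_hat:", np.linalg.matrix_rank(L_hat),
      "nonzeros in S_hat:", int((np.abs(S_hat) > 1e-8).sum()))
```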

Concentration-based guarantees for low-rank matrix reconstruction

by Rina Foygel, Nathan Srebro, Sham Kakade, Ulrike Von Luxburg - 24th Annual Conference on Learning Theory (COLT), 2011
"... We consider the problem of approximately reconstructing a partially-observed, approximately low-rank matrix. This problem has received much attention lately, mostly using the trace-norm as a surrogate to the rank. Here we study low-rank matrix reconstruction using both the trace-norm, as well as the ..."
Cited by 19 (5 self)

Information theoretic bounds for low-rank matrix completion

by Sriram Vishwanath - in 2010 IEEE International Symposium on Information Theory (ISIT 2010), 2010
"... Abstract—This paper studies the low-rank matrix completion problem from an information theoretic perspective. The comple-tion problem is rephrased as a communication problem of an (uncoded) low-rank matrix source over an erasure channel. The paper then uses achievability and converse arguments to pr ..."
Cited by 7 (1 self)

Exact Low-rank Matrix Recovery via Nonconvex

by Lingchen Kong, Naihua Xiu
"... The low-rank matrix recovery (LMR) arises in many fields such as signal and image processing, statistics, computer vision, system identification and control, and it is NP-hard. It is known that under some restricted isometry property (RIP) conditions we can obtain the exact low-rank matrix solution ..."
Cited by 2 (0 self)

Robust Low-Rank Matrix Completion by Riemannian Optimization

by Léopold Cambier, P.-A. Absil
"... Low-rank matrix completion is the problem where one tries to recover a low-rank matrix from noisy observations of a subset of its entries. In this paper, we propose RMC, a new method to deal with the problem of robust low-rank matrix completion, i.e., matrix completion where a fraction of the observ ..."
Cited by 1 (1 self)
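What makes a completion method robust is largely how residuals on the observed entries are penalized. As a simple contrast with the squared-loss sketches above, the toy code below (NumPy; all parameters invented, and nothing here involves the Riemannian optimization the paper actually uses) fits a factorized model with a Huber-type gradient so that a few grossly corrupted observations exert only bounded influence:

```python
import numpy as np

def huber_grad(r, delta=1.0):
    """Gradient of the Huber loss with respect to the residual r (entrywise)."""
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def robust_complete(M_obs, mask, rank=3, lr=0.01, n_iters=2000, seed=3):
    """Illustrative sketch: factorized completion U @ V.T fit with a Huber loss
    on observed entries, so gross outliers have bounded influence."""
    rng = np.random.default_rng(seed)
    m, n = M_obs.shape
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(n_iters):
        R = mask * huber_grad(U @ V.T - M_obs)   # clipped residual on observed entries
        gU, gV = R @ V, R.T @ U
        U -= lr * gU
        V -= lr * gV
    return U @ V.T

# Toy usage: rank-3 matrix, half observed, with a few grossly corrupted observations.
rng = np.random.default_rng(4)
A = rng.standard_normal((40, 3)) @ rng.standard_normal((3, 30))
mask = rng.random(A.shape) < 0.5
M_obs = np.where(mask, A, 0.0)
corrupt = mask & (rng.random(A.shape) < 0.05)
M_obs[corrupt] += 20.0                           # outliers among the observations
A_hat = robust_complete(M_obs, mask)
print("error on clean matrix:", np.linalg.norm(A_hat - A) / np.linalg.norm(A))
```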

Local Low-Rank Matrix Approximation

by Joonseok Lee, Seungyeon Kim, Guy Lebanon, Yoram Singer
"... Matrix approximation is a common tool in recommendation systems, text mining, and computer vision. A prevalent assumption in constructing matrix approximations is that the partially observed matrix is of low-rank. We propose a new matrix approximation model where we assume instead that the matrix is ..."
Cited by 5 (1 self)

Low-Rank Matrix Recovery With Poisson Noise

by Yao Xie, Yuejie Chi, Robert Calderbank
"... Estimating an image M ∗ ∈ Rm1×m2+ from its linear mea-surements under Poisson noise is an important problem arises from applications such as optical imaging, nuclear medicine and x-ray imaging [1]. When the image M ∗ has a low-rank structure, we can use a small number of linear measurements to reco ..."
Cited by 1 (1 self)
to recover M∗, also known as low-rank matrix recovery. This is related to compressed sensing, where the goal is to develop efficient data acquisition systems by exploiting sparsity of underlying signals. While there has been much success for low-rank matrix recovery and completion under Gaussian noise