Results 1–10 of 309
Exact Matrix Completion via Convex Optimization, 2008
"... We consider a problem of considerable practical interest: the recovery of a data matrix from a sampling of its entries. Suppose that we observe m entries selected uniformly at random from a matrix M. Can we complete the matrix and recover the entries that we have not seen? We show that one can perfectly recover most low-rank matrices from what appears to be an incomplete set of entries. We prove that if the number m of sampled entries obeys m ≥ C n^{1.2} r log n for some positive numerical constant C, then with very high probability, most n × n matrices of rank r can be perfectly recovered ..."
Cited by 873 (26 self)
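The guarantee above is proved for nuclear-norm minimization. As a loose illustration (not the paper's own solver), singular value thresholding (SVT) is a well-known iterative scheme for nuclear-norm-regularized completion; all parameters below are illustrative, not tuned:

```python
import numpy as np

def svt_complete(M_obs, mask, tau=5.0, step=1.2, iters=300):
    """Singular value thresholding: iteratively shrink singular values,
    then push the estimate back toward the observed entries.
    M_obs: matrix with observed entries filled in (zeros elsewhere);
    mask: 1.0 where an entry was observed, 0.0 otherwise."""
    Y = np.zeros_like(M_obs)
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt   # shrink singular values
        Y = Y + step * mask * (M_obs - X)                # dual step on observed entries
    return X
```

On a small low-rank test matrix with most entries observed, the iterates drive the residual on the observed entries toward zero while keeping the estimate low-rank.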
Compressed sensing of simultaneous low-rank and joint-sparse matrices, 2012
"... In this paper we consider recovery of a high-dimensional data matrix from a set of incomplete and noisy linear measurements. We introduce a new model which efficiently restricts the degrees of freedom of the data and, at the same time, is generic enough to find a variety of applications, namely in multi-channel signal compressed sensing (e.g. sensor networks, hyperspectral imaging) and compressive sparse principal component analysis (sPCA). We assume data matrices to have a simultaneous low-rank and joint-sparse structure and, based on this, we propose a novel approach for efficient compressed sensing ..."
Cited by 6 (4 self)
SpaRCS: Recovering low-rank and sparse matrices from compressive measurements, 2011
"... We consider the problem of recovering a matrix M that is the sum of a low-rank matrix L and a sparse matrix S from a small set of linear measurements of the form y = A(M) = A(L + S). This model subsumes three important classes of signal recovery problems: compressive sensing, affine rank minimization ..."
Cited by 46 (4 self)
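SpaRCS itself is a greedy algorithm operating on compressive measurements y = A(L + S). As a much simpler, fully observed analogue of the L + S model, alternating projections (hard thresholding for the sparse part, truncated SVD for the low-rank part) can be sketched as follows; this is a toy illustration, not the authors' method:

```python
import numpy as np

def altproj_lps(M, rank, thresh, iters=25):
    """Alternating projections for a fully observed M = L + S:
    hard-threshold the residual to estimate S, then take a rank-r
    truncated SVD of M - S to estimate L."""
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(iters):
        R = M - L
        S = np.where(np.abs(R) > thresh, R, 0.0)          # sparse estimate
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank]   # low-rank estimate
    return L, S
```

On a rank-1 matrix corrupted by one large outlier, the iteration separates the two components almost exactly after a handful of steps.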
LOW-RANK OPTIMIZATION ON THE CONE OF POSITIVE SEMIDEFINITE MATRICES
"... We propose an algorithm for solving optimization problems defined on a subset of the cone of symmetric positive semidefinite matrices. This algorithm relies on the factorization X = YY^T, where the number of columns of Y fixes an upper bound on the rank of the positive semidefinite matrix X ... is evaluated on two applications: the maximal cut of a graph and the problem of sparse principal component analysis. Key words: low-rank constraints, cone of symmetric positive definite matrices, Riemannian quotient manifold, sparse principal component analysis, maximum-cut algorithms, large-scale algorithms ..."
Cited by 31 (6 self)
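The appeal of the factorization X = YY^T is that positive semidefiniteness and the rank bound come for free when optimizing over Y; a minimal numerical check of both properties:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 2                        # p columns of Y cap rank(X) at p
Y = rng.standard_normal((n, p))
X = Y @ Y.T                        # symmetric and PSD by construction

assert np.allclose(X, X.T)                      # symmetry
assert np.linalg.eigvalsh(X).min() > -1e-10     # no negative eigenvalues
assert np.linalg.matrix_rank(X) == p            # rank equals p for generic Y
```

Any unconstrained update to Y therefore stays feasible for the PSD cone, which is what lets the cited algorithm work on a quotient manifold instead of enforcing X ⪰ 0 explicitly.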
Low-rank approximation of tensors, 2014
"... In many applications such as data compression, imaging or genomic data analysis, it is important to approximate a given tensor by a tensor that is sparsely representable. For matrices, i.e. 2-tensors, such a representation can be obtained via the singular value decomposition, which allows one to compute best rank-k approximations. For very big matrices a low-rank approximation using the SVD is not computationally feasible. In this case different approximations are available. It seems that variants of the CUR decomposition are most suitable. For d-mode tensors T ∈ ⊗_{i=1}^d R^{n_i}, with d > 2, many ..."
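A toy version of the CUR idea mentioned above, with the column and row indices given explicitly (practical variants select them by leverage-score sampling):

```python
import numpy as np

def cur_approx(A, cols, rows):
    """Toy CUR decomposition: A ~= C @ U @ R with U = C^+ A R^+.
    C holds sampled columns of A, R holds sampled rows; only these
    (plus the small core U) need to be stored."""
    C = A[:, cols]
    R = A[rows, :]
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)
    return C, U, R
```

If A has rank r and the sampled columns and rows span its column and row spaces, C @ U @ R reproduces A exactly, since C C^+ and R^+ R are then projections onto those spaces.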
Matrix Completion with Noise
"... On the heels of compressed sensing, a remarkable new field has very recently emerged. This field addresses a broad range of problems of significant practical interest, namely, the recovery of a data matrix from what appears to be incomplete, and perhaps even corrupted, information. In its simplest ... numerical results which complement our quantitative analysis and show that, in practice, nuclear-norm minimization accurately fills in the many missing entries of large low-rank matrices from just a few noisy samples. Some analogies between matrix completion and compressed sensing are discussed throughout ..."
Cited by 255 (13 self)
Drawing Large Graphs by Low-Rank Stress Majorization
"... Optimizing a stress model is a natural technique for drawing graphs: one seeks an embedding into R^d which best preserves the induced graph metric. Current approaches to solving the stress model for a graph with |V| nodes and |E| edges require the full all-pairs shortest paths (APSP) matrix, which takes O(|V|^2 log|E| + |V||E|) time and O(|V|^2) space. We propose a novel algorithm based on a low-rank approximation to the required matrices. The crux of our technique is an observation that it is possible to approximate the full APSP matrix, even when only a small subset of its entries ..."
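The paper approximates the APSP matrix with a low-rank model; as a loosely related and much simpler illustration of estimating all pairwise distances from only a few computed columns, here is the classical landmark (pivot) upper bound d(u, v) ≤ min_p d(u, p) + d(p, v) on an unweighted graph; this is a toy stand-in, not the authors' method:

```python
from collections import deque

def bfs_dist(adj, src):
    """Distances from src in an unweighted graph (one APSP column)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def landmark_upper_bounds(adj, pivots):
    """Approximate every pairwise distance from the BFS columns of a
    few pivot nodes: d(u, v) <= min over pivots p of d(u, p) + d(p, v)."""
    cols = {p: bfs_dist(adj, p) for p in pivots}
    nodes = list(adj)
    return {(u, v): min(cols[p][u] + cols[p][v] for p in pivots)
            for u in nodes for v in nodes}
```

The bound is exact whenever some shortest u-v path passes through a pivot, and only |pivots| BFS columns are computed instead of the full |V| × |V| matrix.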
On Identity Testing of Tensors, Low-rank Recovery and Compressed Sensing, 2011
"... We study the problem of obtaining efficient, deterministic, black-box polynomial identity testing algorithms for depth-3 set-multilinear circuits (over arbitrary fields). This class of circuits has an efficient, deterministic, white-box polynomial identity testing algorithm (due to Raz and Shpilka [...]) ... for tensors of degree 2 (matrices), and obtain quasi-polynomial-sized hitting sets for arbitrary tensors (but this second hitting set is less explicit). We also show connections to the task of performing low-rank recovery of matrices, which is studied in the field of compressed sensing. Low-rank recovery asks ..."
Cited by 16 (5 self)
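The deterministic hitting sets in the paper derandomize tests of the following classical form: for a degree-2 tensor (matrix) M, the bilinear polynomial x^T M y is identically zero iff M = 0, and by the Schwartz–Zippel lemma random evaluations detect a nonzero polynomial with high probability. A sketch of that randomized baseline (not the paper's deterministic construction):

```python
import random

def matrix_poly_is_zero(M, trials=20, p=10**9 + 7):
    """Randomized black-box identity test for the bilinear form
    x^T M y over the prime field F_p (Schwartz-Zippel baseline)."""
    n, m = len(M), len(M[0])
    for _ in range(trials):
        x = [random.randrange(1, p) for _ in range(n)]
        y = [random.randrange(1, p) for _ in range(m)]
        val = sum(x[i] * M[i][j] * y[j]
                  for i in range(n) for j in range(m)) % p
        if val != 0:
            return False  # found a witness: the polynomial is nonzero
    return True
```

Each trial errs with probability at most deg/p, so a handful of trials suffices; the hard part, addressed by the cited work, is replacing the random points with a small explicit hitting set.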
Uniqueness of low-rank matrix completion by rigidity theory, 2009
"... The problem of completing a low-rank matrix from a subset of its entries is often encountered in the analysis of incomplete data sets exhibiting an underlying factor model, with applications in collaborative filtering, computer vision and control. Most recent work had been focused on constr ... the completion matrix, which serves as the analogue of the rigidity matrix. Key words: low-rank matrices, missing values, rigidity theory, rigid graphs, iterative methods ..."
Cited by 20 (1 self)