Results 1–10 of 5,391
Low-rank matrix completion by Riemannian optimization
ANCHP-MATHICSE, Mathematics Section, École Polytechnique Fédérale de Lausanne
"... The matrix completion problem consists of finding or approximating a low-rank matrix based on a few samples of this matrix. We propose a novel algorithm for matrix completion that minimizes the least-squares distance on the sampling set over the Riemannian manifold of fixed-rank matrices. The algorit ..."
Cited by 39 (3 self)
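The objective this abstract describes (a least-squares fit on the sampled entries, constrained to fixed rank) can be sketched with plain gradient descent on a factorized parameterization X = U Vᵀ. This is a simplified stand-in for illustration only, not the authors' Riemannian algorithm; the function name and parameters are hypothetical:

```python
import numpy as np

def complete_lowrank(M, mask, rank, lr=0.01, iters=2000, seed=0):
    """Fit a rank-`rank` matrix to the observed entries of M.

    Minimizes 0.5 * ||mask * (U V^T - M)||_F^2 by gradient descent on
    the factors U and V -- a crude stand-in for Riemannian optimization
    on the fixed-rank manifold.
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(iters):
        R = mask * (U @ V.T - M)   # residual on the sampling set only
        U, V = U - lr * (R @ V), V - lr * (R.T @ U)
    return U @ V.T

# Recover a rank-1 matrix from roughly 60% of its entries.
rng = np.random.default_rng(1)
M = np.outer(rng.standard_normal(8), rng.standard_normal(8))
mask = (rng.random(M.shape) < 0.6).astype(float)
X = complete_lowrank(M, mask, rank=1)
```

The Riemannian approach in the paper replaces this naive factor descent with retractions along the fixed-rank manifold, but the loss being minimized is the same masked least-squares distance.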
Regression on fixed-rank positive semidefinite matrices: a Riemannian approach
JMLR
"... The paper addresses the problem of learning a regression model parameterized by a fixed-rank positive semidefinite matrix. The focus is on the nonlinear nature of the search space and on scalability to high-dimensional problems. The mathematical developments rely on the theory of gradient descent al ..."
Cited by 18 (7 self)
Robust Low-Rank Matrix Completion by Riemannian Optimization
"... Low-rank matrix completion is the problem where one tries to recover a low-rank matrix from noisy observations of a subset of its entries. In this paper, we propose RMC, a new method to deal with the problem of robust low-rank matrix completion, i.e., matrix completion where a fraction of the observed entries are corrupted by non-Gaussian noise, typically outliers. The method relies on the idea of smoothing the ℓ1 norm and using Riemannian optimization to deal with the low-rank constraint. We first state the algorithms as the successive minimization of smooth approximations of the ℓ1 norm ..."
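The smoothing idea in this abstract (replacing the nonsmooth ℓ1 norm by differentiable approximations that tighten progressively) can be illustrated with a pseudo-Huber-style function. The specific smoothing RMC uses may differ, so treat this as a generic sketch:

```python
import numpy as np

def smoothed_abs(x, eps):
    """Differentiable approximation of |x|: sqrt(x^2 + eps^2) - eps.

    Satisfies |x| - eps <= smoothed_abs(x, eps) <= |x|, so driving
    eps -> 0 recovers the l1 penalty, as in successive smoothing.
    """
    return np.sqrt(x * x + eps * eps) - eps

x = np.linspace(-2.0, 2.0, 401)
for eps in (1.0, 0.1, 0.01):
    gap = np.max(np.abs(smoothed_abs(x, eps) - np.abs(x)))
    print(f"eps={eps}: max deviation from |x| = {gap:.4f}")
```

Each smoothed problem is differentiable, so standard Riemannian gradient methods apply on the fixed-rank manifold; shrinking eps between solves approaches the original robust ℓ1 objective.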
Linear regression under fixed-rank constraints: a Riemannian approach
In 28th International Conference on Machine Learning (ICML), 2011
"... In this paper, we tackle the problem of learning a linear regression model whose parameter is a fixed-rank matrix. We study the Riemannian manifold geometry of the set of fixed-rank matrices and develop efficient line-search algorithms. The proposed algorithms have many applications, scale to high-di ..."
Cited by 12 (4 self)
LOW-RANK OPTIMIZATION WITH TRACE NORM PENALTY
"... The paper addresses the problem of low-rank trace norm minimization. We propose an algorithm that alternates between fixed-rank optimization and rank-one updates. The fixed-rank optimization is characterized by an efficient factorization that makes the trace norm differentiable in the sear ..."
Cited by 18 (5 self)
Line-search methods and rank increase on low-rank matrix varieties
"... Based on an explicit characterization of tangent cones one can devise line-search methods to minimize functions on the variety of matrices with rank bounded by some fixed value, thereby extending the Riemannian optimization techniques from the smooth manifold of fixed rank to its closure. ..."
Cited by 1 (0 self)
LOW-RANK OPTIMIZATION ON THE CONE OF POSITIVE SEMIDEFINITE MATRICES
"... We propose an algorithm for solving optimization problems defined on a subset of the cone of symmetric positive semidefinite matrices. This algorithm relies on the factorization X = YY^T, where the number of columns of Y fixes an upper bound on the rank of the positive semidefinite matrix X ..."
Cited by 28 (6 self)
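The key property of the factorization X = YY^T mentioned in this abstract, namely that X is positive semidefinite by construction and its rank is bounded by the number of columns of Y, is easy to verify numerically. A minimal illustration, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 2                 # p columns of Y bound the rank of X
Y = rng.standard_normal((n, p))
X = Y @ Y.T                 # PSD by construction: v^T X v = ||Y^T v||^2 >= 0

print("smallest eigenvalue of X:", np.linalg.eigvalsh(X).min())
print("rank of X:", np.linalg.matrix_rank(X))
```

Optimizing over Y instead of X therefore enforces both the semidefinite constraint and the rank bound for free, at the cost of making the problem nonconvex in Y.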
Directional Statistics and Shape Analysis
, 1995
"... There have been various developments in shape analysis in the last decade. We describe here some relationships of shape analysis with directional statistics. For shape, rotations are to be integrated out or to be optimized over whilst they are the basis for directional statistics. However, various c ..."
Cited by 775 (31 self)