A Closed Form Solution to Robust Subspace Estimation and Clustering
"... We consider the problem of fitting one or more subspaces to a collection of data points drawn from the subspaces and corrupted by noise/outliers. We pose this problem as a rank minimization problem, where the goal is to decompose the corrupted data matrix as the sum of a clean, selfexpressive, low ..."
Abstract

Cited by 43 (3 self)
We consider the problem of fitting one or more subspaces to a collection of data points drawn from the subspaces and corrupted by noise/outliers. We pose this problem as a rank minimization problem, where the goal is to decompose the corrupted data matrix as the sum of a clean, self-expressive, low-rank dictionary plus a matrix of noise/outliers. Our key contribution is to show that, for noisy data, this non-convex problem can be solved very efficiently and in closed form from the SVD of the noisy data matrix. Remarkably, this is true for both one and multiple subspaces. An important difference with respect to existing methods is that our framework results in a polynomial thresholding of the singular values with minimal shrinkage. Indeed, a particular case of our framework for a single subspace leads to classical PCA, which requires no shrinkage. In the case of multiple subspaces, our framework provides an affinity matrix that can be used to cluster the data according to the subspaces. In the case of data corrupted by outliers, a closed-form solution appears elusive. We thus use an augmented Lagrangian optimization framework, which requires a combination of our proposed polynomial thresholding operator with the more traditional shrinkage-thresholding operator.
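The contrast this abstract draws, hard truncation of singular values (classical PCA, no shrinkage) versus the soft shrinkage used by nuclear-norm methods, can be sketched directly from the SVD. The snippet below is illustrative only; `svd_threshold` is a hypothetical helper and does not reproduce the paper's polynomial thresholding operator.

```python
import numpy as np

def svd_threshold(X, k=None, tau=None):
    """Reconstruct X after thresholding its singular values.

    k:   keep only the top-k singular values, untouched (hard truncation,
         i.e. classical PCA / best rank-k approximation, no shrinkage).
    tau: subtract tau from every singular value, flooring at zero
         (the soft shrinkage used by nuclear-norm based methods).
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    if k is not None:
        s = np.where(np.arange(len(s)) < k, s, 0.0)  # truncate the tail
    if tau is not None:
        s = np.maximum(s - tau, 0.0)                 # shrink every value
    return (U * s) @ Vt
```

Hard truncation leaves an exactly low-rank matrix whose retained singular values are unchanged, while soft thresholding biases every singular value downward.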
Multi-task low-rank affinity pursuit for image segmentation
 In ICCV
"... This paper investigates how to boost regionbased image segmentation by pursuing a new solution to fuse multiple types of image features. A collaborative image segmentation framework, called multitask lowrank affinity pursuit, is presented for such a purpose. Given an image described with mul ..."
Abstract

Cited by 28 (1 self)
This paper investigates how to boost region-based image segmentation by pursuing a new solution to fuse multiple types of image features. A collaborative image segmentation framework, called multi-task low-rank affinity pursuit, is presented for this purpose. Given an image described with multiple types of features, we aim at inferring a unified affinity matrix that implicitly encodes the segmentation of the image. This is achieved by seeking the sparsity-consistent low-rank affinities from the joint decompositions of multiple feature matrices into pairs of sparse and low-rank matrices, the latter of which is expressed as the product of the image feature matrix and its corresponding image affinity matrix. The inference process is formulated as a constrained nuclear norm and ℓ2,1-norm minimization problem, which is convex and can be solved efficiently with the Augmented Lagrange Multiplier method. Compared to previous methods, which are usually based on a single type of feature, the proposed method seamlessly integrates multiple types of features to jointly produce the affinity matrix within a single inference step, and produces more accurate and reliable segmentation results. Experiments on the MSRC dataset and the Berkeley segmentation dataset validate the superiority of using multiple features over a single feature, as well as the superiority of our method over conventional feature-fusion methods. Moreover, our method is shown to be very competitive when compared to other state-of-the-art methods.
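The ℓ2,1-norm named in this abstract sums the ℓ2 norms of a matrix's columns, and its proximal operator, the basic step an Augmented Lagrange Multiplier solver applies to the sparse term, shrinks each column's norm and zeroes weak columns. A minimal sketch with hypothetical helper names, not code from the paper:

```python
import numpy as np

def l21_norm(E):
    # The l2,1 norm: sum of the l2 norms of the columns of E.
    return np.linalg.norm(E, axis=0).sum()

def prox_l21(E, tau):
    # Proximal operator of tau * ||.||_{2,1}: shrink each column's
    # l2 norm by tau, zeroing any column whose norm falls below tau.
    norms = np.linalg.norm(E, axis=0)
    scale = np.maximum(norms - tau, 0.0) / np.where(norms > 0.0, norms, 1.0)
    return E * scale
```

Zeroing entire columns is what makes the ℓ2,1 penalty suitable for column-wise (sample-wise) corruptions, as opposed to the entrywise sparsity of the ℓ1 norm.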
A Simple Prior-free Method for Non-Rigid Structure-from-Motion Factorization
"... This paper proposes a simple “priorfree ” method for solving nonrigid structurefrommotion factorization problems. Other than using the basic lowrank condition, our method does not assume any extra prior knowledge about the nonrigid scene or about the camera motions. Yet, it runs reliably, produ ..."
Abstract

Cited by 19 (0 self)
This paper proposes a simple “prior-free” method for solving non-rigid structure-from-motion factorization problems. Other than using the basic low-rank condition, our method does not assume any extra prior knowledge about the non-rigid scene or about the camera motions. Yet, it runs reliably, produces an optimal result, and does not suffer from the inherent basis-ambiguity issue that plagues many conventional non-rigid factorization techniques. Our method is easy to implement, involving no more than a semidefinite program (SDP) of small and fixed size, a linear least-squares problem, and a trace-norm minimization. Extensive experiments have demonstrated that it outperforms most existing linear methods for non-rigid factorization. This paper offers not only new theoretical insight but also a practical, everyday solution to non-rigid structure-from-motion.
ROBUST COMPUTATION OF LINEAR MODELS, OR HOW TO FIND A NEEDLE IN A HAYSTACK
"... Abstract. Consider a dataset of vectorvalued observations that consists of a modest number of noisy inliers, which are explained well by a lowdimensional subspace, along with a large number of outliers, which have no linear structure. This work describes a convex optimization problem, called reape ..."
Abstract

Cited by 18 (5 self)
Abstract. Consider a dataset of vector-valued observations that consists of a modest number of noisy inliers, which are explained well by a low-dimensional subspace, along with a large number of outliers, which have no linear structure. This work describes a convex optimization problem, called reaper, that can reliably fit a low-dimensional model to this type of data. The paper provides an efficient algorithm for solving the reaper problem, and it documents numerical experiments which confirm that reaper can dependably find linear structure in synthetic and natural data. In addition, when the inliers are contained in a low-dimensional subspace, there is a rigorous theory that describes when reaper can recover the subspace exactly.
Robust Subspace Clustering
, 2013
"... Subspace clustering refers to the task of finding a multisubspace representation that best fits a collection of points taken from a highdimensional space. This paper introduces an algorithm inspired by sparse subspace clustering (SSC) [17] to cluster noisy data, and develops some novel theory demo ..."
Abstract

Cited by 16 (1 self)
Subspace clustering refers to the task of finding a multi-subspace representation that best fits a collection of points taken from a high-dimensional space. This paper introduces an algorithm inspired by sparse subspace clustering (SSC) [17] to cluster noisy data, and develops some novel theory demonstrating its correctness. In particular, the theory uses ideas from geometric functional analysis to show that the algorithm can accurately recover the underlying subspaces under minimal requirements on their orientation and on the number of samples per subspace. Synthetic as well as real data experiments complement our theoretical study, illustrating our approach and demonstrating its effectiveness.
Fixed-rank representation for unsupervised visual learning
"... Subspace clustering and feature extraction are two of the most commonly used unsupervised learning techniques in computer vision and pattern recognition. Stateoftheart techniques for subspace clustering make use of recent advances in sparsity and rank minimization. However, existing techniques a ..."
Abstract

Cited by 15 (2 self)
Subspace clustering and feature extraction are two of the most commonly used unsupervised learning techniques in computer vision and pattern recognition. State-of-the-art techniques for subspace clustering make use of recent advances in sparsity and rank minimization. However, existing techniques are computationally expensive and may result in degenerate solutions that degrade clustering performance in the case of insufficient data sampling. To partially solve these problems, and inspired by existing work on matrix factorization, this paper proposes fixed-rank representation (FRR) as a unified framework for unsupervised visual learning. FRR is able to reveal the structure of multiple subspaces in closed form when the data is noiseless. Furthermore, we prove that under some suitable conditions, even with insufficient observations, FRR can still reveal the true subspace memberships. To achieve robustness to outliers and noise, a sparse regularizer is introduced into the FRR framework. Beyond subspace clustering, FRR can be used for unsupervised feature extraction. As a non-trivial by-product, a fast numerical solver is developed for FRR. Experimental results on both synthetic data and real applications validate our theoretical analysis and demonstrate the benefits of FRR for unsupervised visual learning.
Exact subspace segmentation and outlier detection by low-rank representation
 Journal of Machine Learning Research
, 2011
"... In this work, we address the following matrix recovery problem: suppose we are given a set of data points containing two parts, one part consists of samples drawn from a union of multiple subspaces and the other part consists of outliers. We do not know which data points are outliers, or how many ou ..."
Abstract

Cited by 13 (5 self)
In this work, we address the following matrix recovery problem: suppose we are given a set of data points containing two parts, one consisting of samples drawn from a union of multiple subspaces and the other consisting of outliers. We do not know which data points are outliers, or how many outliers there are; neither the rank nor the number of the subspaces is known. Can we detect the outliers and segment the samples into their correct subspaces, efficiently and exactly? We utilize the so-called Low-Rank Representation (LRR) method to solve this problem, and prove that under mild technical conditions, any solution to LRR exactly recovers the row space of the samples and detects the outliers as well. Since the subspace membership is provably determined by the row space, this further implies that LRR can perform exact subspace segmentation and outlier detection in an efficient way.
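In the noiseless, outlier-free case, the LRR program min ||Z||_* subject to X = XZ has a known closed-form minimizer: the "shape interaction matrix" VVᵀ built from the skinny SVD X = USVᵀ, whose row space equals that of the data. A small sketch of that special case (`lrr_noiseless` is an illustrative helper, not the full method with outliers):

```python
import numpy as np

def lrr_noiseless(X, tol=1e-10):
    # Skinny SVD of the data; the minimum-nuclear-norm solution of
    # X = X Z is the shape interaction matrix Z = V V^T, an orthogonal
    # projector onto the row space of X.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    r = int((s > tol * s[0]).sum())   # numerical rank of X
    V = Vt[:r].T
    return V @ V.T
```

Because Z is a projector onto the row space, its block structure (for data from independent subspaces) is what supplies the segmentation.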
Noisy sparse subspace clustering
 In International Conference on Machine Learning
, 2013
"... This paper considers the problem of subspace clustering under noise. Specifically, we study the behavior of Sparse Subspace Clustering (SSC) when either adversarial or random noise is added to the unlabelled input data points, which are assumed to lie in a union of lowdimensional subspaces. We sh ..."
Abstract

Cited by 12 (2 self)
This paper considers the problem of subspace clustering under noise. Specifically, we study the behavior of Sparse Subspace Clustering (SSC) when either adversarial or random noise is added to the unlabelled input data points, which are assumed to lie in a union of low-dimensional subspaces. We show that a modified version of SSC is provably effective in correctly identifying the underlying subspaces, even with noisy data. This extends the theoretical guarantees of the algorithm to the practical setting and provides justification for the success of SSC in a class of real applications.
A novel M-estimator for robust PCA
"... We study the basic problem of robust subspace recovery. That is, we assume a data set that some of its points are sampled around a fixed subspace and the rest of them are spread in the whole ambient space, and we aim to recover the fixed underlying subspace. We first estimate “robust inverse sample ..."
Abstract

Cited by 8 (4 self)
We study the basic problem of robust subspace recovery. That is, we assume a data set in which some points are sampled around a fixed subspace and the rest are spread throughout the ambient space, and we aim to recover the fixed underlying subspace. We first estimate a “robust inverse sample covariance” by solving a convex minimization procedure; we then recover the subspace from the bottom eigenvectors of this matrix (their number corresponds to the number of eigenvalues close to 0). We guarantee exact subspace recovery under some conditions on the underlying data. Furthermore, we propose a fast iterative algorithm, which linearly converges to the matrix minimizing the convex problem. We also quantify the effect of noise and regularization and discuss many other practical and theoretical issues for improving the subspace recovery in various settings. When the sum of terms in the convex energy function (that we minimize) is replaced with the sum of squares of terms, the new minimizer is a scaled version of the inverse sample covariance (when it exists). We thus interpret our minimizer and its subspace (spanned by its bottom eigenvectors) as robust versions of the empirical inverse covariance and the PCA subspace, respectively. We compare our method with many other algorithms for robust PCA on synthetic and real data sets and demonstrate state-of-the-art speed and accuracy.
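The closing remark, that the sum-of-squares energy yields a scaled inverse sample covariance, points to a simple non-robust baseline: take the bottom eigenvectors of the inverse covariance (equivalently, the top principal directions). A sketch of that baseline only, with a hypothetical helper name; the paper's contribution is a robust convex estimate that replaces the sample covariance here:

```python
import numpy as np

def subspace_from_inverse_covariance(X, d, ridge=1e-8):
    # X: n x D data matrix. Return an orthonormal basis for the d-dim
    # subspace spanned by the bottom-d eigenvectors of the (regularized)
    # inverse sample covariance -- the non-robust analogue of the
    # paper's estimator.
    C = np.cov(X, rowvar=False) + ridge * np.eye(X.shape[1])
    w, V = np.linalg.eigh(np.linalg.inv(C))   # eigenvalues ascending
    return V[:, :d]                           # bottom eigenvectors of C^{-1}
```

The bottom eigenvectors of the inverse covariance are exactly the top eigenvectors of the covariance, so without the robust estimate this reduces to PCA, which is what makes the robust replacement the interesting step.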
Rank/Norm Regularization with Closed-Form Solutions: Application to Subspace Clustering
"... When data is sampled from an unknown subspace, principal component analysis (PCA) provides an effective way to estimate the subspace and hence reduce the dimension of the data. At the heart of PCA is the EckartYoungMirsky theorem, which characterizes the best rank k approximation of a matrix. In t ..."
Abstract

Cited by 6 (1 self)
When data is sampled from an unknown subspace, principal component analysis (PCA) provides an effective way to estimate the subspace and hence reduce the dimension of the data. At the heart of PCA is the Eckart-Young-Mirsky theorem, which characterizes the best rank-k approximation of a matrix. In this paper, we prove a generalization of the Eckart-Young-Mirsky theorem under all unitarily invariant norms. Using this result, we obtain closed-form solutions for a set of rank/norm regularized problems, and derive closed-form solutions for a general class of subspace clustering problems (where data is modelled by unions of unknown subspaces). From these results we obtain new theoretical insights and promising experimental results.
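The Eckart-Young-Mirsky theorem that this abstract builds on says that truncating the SVD to its top-k terms gives the best rank-k approximation (in Frobenius norm, and more generally in any unitarily invariant norm), with residual equal to the norm of the discarded singular-value tail. A brief numerical sketch of the Frobenius-norm case:

```python
import numpy as np

def best_rank_k(X, k):
    # Eckart-Young-Mirsky: the truncated SVD is the best rank-k
    # approximation of X under any unitarily invariant norm.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]
```

In Frobenius norm the optimal residual is the root sum of squares of the discarded singular values, which gives a quick numerical check of the theorem.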