Results 1 - 2 of 2
Local Identification of Overcomplete Dictionaries, 2015
"... This paper presents the first theoretical results showing that stable identification of over-complete µ-coherent dictionaries Φ ∈ Rd×K is locally possible from training signals with sparsity levels S up to the order O(µ−2) and signal to noise ratios up to O( d). In particular the dictionary is recov ..."
Abstract - Cited by 5 (0 self)
This paper presents the first theoretical results showing that stable identification of overcomplete µ-coherent dictionaries Φ ∈ ℝ^(d×K) is locally possible from training signals with sparsity levels S up to the order O(µ⁻²) and signal-to-noise ratios up to O(√d). In particular, the dictionary is recoverable as the local maximum of a new maximization criterion that generalizes the K-means criterion. For this maximization criterion, asymptotic exact recovery is shown for sparsity levels up to O(µ⁻¹), and stable recovery for sparsity levels up to O(µ⁻²) as well as signal-to-noise ratios up to O(√d). These asymptotic results translate to finite-sample recovery results that hold with high probability as long as the sample size N scales as O(K³dS ε̃⁻²), where the recovery precision ε̃ can go down to the asymptotically achievable precision. Further, to actually find the local maxima of the new criterion, a very simple Iterative Thresholding and K (signed) Means algorithm (ITKM), with complexity O(dKN) per iteration, is presented and its local efficiency is demonstrated in several experiments.
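The ITKM iteration described above can be sketched in a few lines: threshold the atom responses to the S largest magnitudes per signal, then update each atom as the signed mean of the signals on its support. This is an illustrative NumPy sketch, not the paper's reference implementation; the random initialization and iteration count are assumptions.

```python
import numpy as np

def itkm(Y, K, S, n_iter=50, seed=0):
    """Sketch of Iterative Thresholding and K (signed) Means (ITKM).

    Y : (d, N) array of training signals; K atoms; sparsity level S.
    Initialization and stopping rule here are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    d, N = Y.shape
    Phi = rng.standard_normal((d, K))
    Phi /= np.linalg.norm(Phi, axis=0)            # unit-norm atoms
    for _ in range(n_iter):
        corr = Phi.T @ Y                          # (K, N) responses; O(dKN)
        # thresholding: keep the S largest-magnitude responses per signal
        idx = np.argsort(-np.abs(corr), axis=0)[:S]        # (S, N)
        mask = np.zeros_like(corr, dtype=bool)
        mask[idx, np.arange(N)] = True
        signs = np.sign(corr) * mask              # signed support indicators
        Phi_new = Y @ signs.T                     # K signed means (unnormalized)
        norms = np.linalg.norm(Phi_new, axis=0)
        nz = norms > 1e-12
        Phi[:, nz] = Phi_new[:, nz] / norms[nz]   # renormalize updated atoms
    return Phi
```

Each iteration costs O(dKN) for the correlation step, matching the complexity stated in the abstract; the sorting step adds O(KN log K), which is usually negligible.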
Complete Dictionary Recovery Using Nonconvex Optimization
"... We consider the problem of recovering a complete (i.e., square and invertible) dictionary A0, from Y = A0X0 with Y ∈ Rn×p. This recovery set-ting is central to the theoretical understanding of dictionary learning. We give the first efficient al-gorithm that provably recoversA0 whenX0 has O (n) nonze ..."
Abstract
We consider the problem of recovering a complete (i.e., square and invertible) dictionary A0 from Y = A0 X0 with Y ∈ ℝ^(n×p). This recovery setting is central to the theoretical understanding of dictionary learning. We give the first efficient algorithm that provably recovers A0 when X0 has O(n) nonzeros per column, under a suitable probability model for X0. Prior results provide recovery guarantees only when X0 has O(√n) nonzeros per column. Our algorithm is based on nonconvex optimization with a spherical constraint, and hence is naturally phrased in the language of manifold optimization. Our proofs give a geometric characterization of the high-dimensional objective landscape, which shows that with high probability there are no spurious local minima. Experiments with synthetic data corroborate our theory. The full version of this paper is available online.
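The spherical-constraint formulation above can be illustrated with a toy Riemannian gradient descent on a smooth sparsity surrogate f(q) = mean_i µ·log cosh(q·yᵢ/µ), minimized over the unit sphere; each minimizer corresponds (up to sign) to one row of the inverse dictionary. This is a hedged stand-in for the paper's method (which uses a trust-region solver, not plain gradient descent); the function names, step size, and surrogate parameter µ are assumptions for illustration.

```python
import numpy as np

def sphere_descent(Y, mu=0.1, step=0.1, n_iter=500, seed=0):
    """Minimize f(q) = mean_i mu*log(cosh(q.y_i / mu)) over the unit sphere
    by projected (Riemannian) gradient descent -- a toy sketch of
    nonconvex dictionary recovery with a spherical constraint.

    Y : (n, p) observation matrix. Step size and iteration count are
    illustrative assumptions, not tuned values from the paper.
    """
    rng = np.random.default_rng(seed)
    n, p = Y.shape
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)                      # start on the sphere
    for _ in range(n_iter):
        z = q @ Y                               # (p,) inner products
        grad = Y @ np.tanh(z / mu) / p          # Euclidean gradient of f
        rgrad = grad - (grad @ q) * q           # project onto tangent space at q
        q = q - step * rgrad                    # descend along the sphere
        q /= np.linalg.norm(q)                  # retract back to the sphere
    return q
```

The tangent-space projection and renormalization are what make this manifold optimization rather than unconstrained descent; the abstract's geometric claim is that, with high probability, this landscape has no spurious local minima, so such local methods succeed.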