
Sparse and spurious: dictionary learning with noise and outliers (2014)

by R Jenatton, F Bach, R Gribonval

Results 1 - 2 of 2

Local Identification of Overcomplete Dictionaries

by Karin Schnass , 2015
Abstract - Cited by 5 (0 self)
This paper presents the first theoretical results showing that stable identification of overcomplete µ-coherent dictionaries Φ ∈ R^{d×K} is locally possible from training signals with sparsity levels S up to the order O(µ⁻²) and signal-to-noise ratios up to O(√d). In particular, the dictionary is recoverable as the local maximum of a new maximization criterion that generalizes the K-means criterion. For this criterion, asymptotic exact recovery is shown for sparsity levels up to O(µ⁻¹), and stable recovery for sparsity levels up to O(µ⁻²) and signal-to-noise ratios up to O(√d). These asymptotic results translate to finite-sample recovery results that hold with high probability as long as the sample size N scales as O(K³dS ε̃⁻²), where the recovery precision ε̃ can go down to the asymptotically achievable precision. Further, to actually find the local maxima of the new criterion, a very simple Iterative Thresholding and K (signed) Means algorithm (ITKM), with complexity O(dKN) per iteration, is presented and its local efficiency is demonstrated in several experiments.
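The ITKM scheme named in the abstract alternates a thresholding step (keep, for each signal, the S atoms with the largest correlations) with a signed-means update of each atom. A minimal numpy sketch under that reading; the function and variable names are mine, and the paper's exact update rules may differ in detail:

```python
import numpy as np

def itkm(Y, K, S, n_iter=50, rng=None):
    """Sketch of Iterative Thresholding and K signed Means (ITKM).

    Y : (d, N) array of training signals, K atoms, sparsity level S.
    Illustrative only; not the paper's reference implementation.
    """
    rng = np.random.default_rng(rng)
    d, N = Y.shape
    # random unit-norm initial dictionary
    Phi = rng.standard_normal((d, K))
    Phi /= np.linalg.norm(Phi, axis=0)
    for _ in range(n_iter):
        G = Phi.T @ Y                              # (K, N) correlations: O(dKN)
        # thresholding: indices of the S largest |<phi_k, y_n>| per signal
        idx = np.argsort(-np.abs(G), axis=0)[:S]   # (S, N)
        Phi_new = np.zeros_like(Phi)
        for n in range(N):
            for k in idx[:, n]:
                # signed mean: add each signal with the sign of its correlation
                Phi_new[:, k] += np.sign(G[k, n]) * Y[:, n]
        # renormalize updated atoms; untouched atoms keep their previous value
        norms = np.linalg.norm(Phi_new, axis=0)
        nz = norms > 1e-12
        Phi[:, nz] = Phi_new[:, nz] / norms[nz]
    return Phi
```

The correlation step dominates, matching the O(dKN) per-iteration cost stated in the abstract.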

Complete Dictionary Recovery Using Nonconvex Optimization

by Ju Sun, Qing Qu, John Wright
Abstract
We consider the problem of recovering a complete (i.e., square and invertible) dictionary A0 from Y = A0X0 with Y ∈ R^{n×p}. This recovery setting is central to the theoretical understanding of dictionary learning. We give the first efficient algorithm that provably recovers A0 when X0 has O(n) nonzeros per column, under a suitable probability model for X0. Prior results provide recovery guarantees only when X0 has O(√n) nonzeros per column. Our algorithm is based on nonconvex optimization with a spherical constraint, and hence is naturally phrased in the language of manifold optimization. Our proofs give a geometric characterization of the high-dimensional objective landscape, which shows that with high probability there are no spurious local minima. Experiments with synthetic data corroborate our theory. The full version of this paper is available online.
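The sphere-constrained nonconvex approach can be illustrated as follows: search for a direction q on the unit sphere such that qᵀY is as sparse as possible, by minimizing a smooth ℓ1 surrogate with Riemannian gradient descent. This is a sketch under the standard Bernoulli-Gaussian setup; the surrogate h_µ(z) = µ·log cosh(z/µ), the step size, and the function name are my assumptions, not the paper's exact algorithm (which analyzes the landscape and uses a second-order method):

```python
import numpy as np

def recover_sparse_row(Ybar, n_iter=500, step=0.05, mu=0.1, rng=None):
    """Sketch: find one sparse vector in row(Ybar) by minimizing
    f(q) = mean_i h_mu(q^T y_i), h_mu(z) = mu*log(cosh(z/mu)),
    over the unit sphere ||q|| = 1. Illustrative, not the paper's method."""
    rng = np.random.default_rng(rng)
    n, p = Ybar.shape
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)                 # random point on the sphere
    for _ in range(n_iter):
        z = q @ Ybar                       # (p,) current correlations
        grad = Ybar @ np.tanh(z / mu) / p  # Euclidean gradient of f at q
        rgrad = grad - (q @ grad) * q      # project onto tangent space of sphere
        q = q - step * rgrad               # Riemannian gradient step
        q /= np.linalg.norm(q)             # retract back to the sphere
    return q
```

Once one sparse row of X0 is found this way, the remaining rows can be obtained by repeating from fresh initializations (or by deflation), and A0 then follows by solving a linear system, as the citation context below indicates.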

Citation Context

"...compactly, X0 ∼ i.i.d. BG(θ). Since Y = A0X0 and A0 is complete, row(Y) = row(X0)* and rows of X0 are sparse vectors in the known subspace row(Y). Following Spielman et al. (2012), we use stability (Gribonval et al., 2014) ... this fact to first recover the rows of X0, and subsequently recover A0 by solving a system of linear equations. In fact, for X0 ∼ i.i.d. BG(θ), rows of X0 are the n s..." (* row(·) denotes the row space.)


Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University