On the Identifiability of Overcomplete Dictionaries via the Minimisation Principle Underlying K-SVD (2013)

by Karin Schnass
Results 11 - 13 of 13

Error Bounds for Maximum Likelihood Matrix Completion Under Sparse Factor Models

by Akshay Soni, Swayambhoo Jain, Jarvis Haupt, Stefano Gonella
"... Abstract—This paper examines a general class of matrix completion tasks where entry wise observations of the matrix are subject to random noise or corruption. Our particular focus here is on settings where the matrix to be estimated follows a sparse factor model, in the sense that it may be expresse ..."
Abstract - Add to MetaCart
Abstract—This paper examines a general class of matrix completion tasks where entrywise observations of the matrix are subject to random noise or corruption. Our particular focus here is on settings where the matrix to be estimated follows a sparse factor model, in the sense that it may be expressed as the product of two matrices, one of which is sparse. We analyze the performance of a sparsity-penalized maximum likelihood approach to such problems to provide a general-purpose estimation result applicable to any of a number of noise/corruption models, and describe its implications in two stylized scenarios: one characterized by additive Gaussian noise, and the other by highly quantized one-bit observations. We also provide some supporting empirical evidence to validate our theoretical claims in the Gaussian setting.
Index Terms—Complexity regularization, matrix completion, maximum likelihood, sparse estimation
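The setting the abstract describes can be sketched numerically: a matrix expressed as a dictionary times a sparse factor, observed entrywise on a random subset with additive Gaussian noise, scored by a sparsity-penalized negative log-likelihood. This is a minimal illustration with hypothetical dimensions and penalty weight, not the authors' estimator or code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse factor model: X = D @ A, where the factor A is sparse.
n, r, m = 20, 5, 30                       # hypothetical dimensions
D = rng.standard_normal((n, r))
A = rng.standard_normal((r, m)) * (rng.random((r, m)) < 0.3)  # ~30% nonzeros
X = D @ A

# Entrywise observations on a random subset Omega, corrupted by
# additive Gaussian noise of standard deviation sigma.
sigma = 0.1
mask = rng.random((n, m)) < 0.5           # Omega: roughly half the entries
Y = X + sigma * rng.standard_normal((n, m))

def penalized_nll(Dhat, Ahat, lam):
    """Sparsity-penalized Gaussian negative log-likelihood on Omega
    (up to additive constants): squared residual on the observed
    entries plus an l1 penalty on the sparse factor."""
    resid = (Y - Dhat @ Ahat) * mask
    return (resid ** 2).sum() / (2 * sigma ** 2) + lam * np.abs(Ahat).sum()

lam = 0.1
# The true factors fit the observations far better than a random factor.
good = penalized_nll(D, A, lam)
bad = penalized_nll(D, rng.standard_normal((r, m)), lam)
print(good < bad)
```

Swapping the Gaussian residual term for a Bernoulli log-likelihood would give the one-bit observation scenario the abstract mentions.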

unknown title

by Francis Bach , 2014
"... ar ..."
Abstract - Add to MetaCart
Abstract not found

Citation Context

...e function based on ℓ1 minimization under equality constraints [Gribonval and Schnass, 2010, Geng et al., 2011], for which there is no known efficient heuristic implementation, or on an ℓ0 criterion [Schnass, 2013] à la K-SVD [Aharon et al., 2006]. More algorithmic approaches have also recently emerged [Spielman et al., 2012, Arora et al., 2013] demonstrating the existence of provably good algorithms of polyn...
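The K-SVD criterion referenced here [Aharon et al., 2006] updates one dictionary atom at a time via a rank-1 SVD of the residual restricted to the signals that use that atom. The following is a sketch of that single-atom step with hypothetical dimensions and random data, not the referenced authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data Y, dictionary D (unit-norm atoms), sparse codes X.
n, K, m = 10, 15, 40
Y = rng.standard_normal((n, m))
D = rng.standard_normal((n, K))
D /= np.linalg.norm(D, axis=0)
X = rng.standard_normal((K, m)) * (rng.random((K, m)) < 0.2)  # sparse codes

def ksvd_atom_update(Y, D, X, k):
    """K-SVD-style update of atom k: best rank-1 fit to the residual
    over the signals whose code actually uses atom k."""
    users = np.nonzero(X[k])[0]
    if users.size == 0:
        return D, X
    # Residual with atom k's contribution added back in.
    E = Y[:, users] - D @ X[:, users] + np.outer(D[:, k], X[k, users])
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    D, X = D.copy(), X.copy()
    D[:, k] = U[:, 0]                # new unit-norm atom
    X[k, users] = s[0] * Vt[0]       # matched coefficients
    return D, X

D2, X2 = ksvd_atom_update(Y, D, X, 3)
before = np.linalg.norm(Y - D @ X)
after = np.linalg.norm(Y - D2 @ X2)
print(after <= before)               # the rank-1 refit never worsens the fit
```

Because the old atom-coefficient pair is itself a rank-1 candidate, the SVD-based refit can only decrease the residual, which is what makes the alternating K-SVD iteration monotone in the fitting term.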

Structured Dictionary Learning for Classification (submitted to IEEE Transactions on Signal Processing)

by Yuanming Suo, Minh Dao, Umamahesh Srinivas, Vishal Monga, Trac D. Tran
"... Abstract—Sparsity driven signal processing has gained tremen-dous popularity in the last decade. At its core, the assumption is that the signal of interest is sparse with respect to either a fixed transformation or a signal dependent dictionary. To better cap-ture the data characteristics, various d ..."
Abstract - Add to MetaCart
Abstract—Sparsity-driven signal processing has gained tremendous popularity in the last decade. At its core, the assumption is that the signal of interest is sparse with respect to either a fixed transformation or a signal-dependent dictionary. To better capture the data characteristics, various dictionary learning methods have been proposed for both reconstruction and classification tasks. For classification particularly, most approaches proposed so far have focused on designing explicit constraints on the sparse code to improve classification accuracy while simply adopting the l0-norm or l1-norm for sparsity regularization. Motivated by the success of structured sparsity in the area of compressed sensing, we propose a structured dictionary learning framework (StructDL) that incorporates structure information at both the group and task levels in the learning process. Its benefits are two-fold: (i) the label consistency between dictionary atoms and training data is implicitly enforced; and (ii) the classification performance is more robust in the cases of a small dictionary size or limited training data than with other techniques. Using the subspace model, we derive the conditions for StructDL to guarantee performance and show theoretically that StructDL is superior to l0-norm or l1-norm regularized dictionary learning for classification. Extensive experiments have been performed on both synthetic simulations and real-world applications, such as face recognition and object classification, to demonstrate the validity of the proposed DL framework.
Index Terms—dictionary learning, structured sparsity, sparse representation, compressed sensing, multitask
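The group-level structure the abstract invokes is commonly encoded with an l2,1 ("group lasso") penalty, which sums the l2 norms of predefined coefficient groups so that whole groups are driven to zero together. A minimal sketch, with hypothetical groups and values, of that penalty (not StructDL's actual objective):

```python
import numpy as np

def group_l21(x, groups):
    """l2,1 penalty: sum of l2 norms over predefined index groups of x.
    A group contributes nothing only if all of its entries are zero."""
    return sum(np.linalg.norm(x[g]) for g in groups)

# First group entirely zero, second group active: only the second
# group contributes, which is the mechanism behind group sparsity.
x = np.array([0.0, 0.0, 0.0, 1.0, -2.0, 0.5])
groups = [[0, 1, 2], [3, 4, 5]]
print(group_l21(x, groups))  # == sqrt(1 + 4 + 0.25)
```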

Citation Context

...eless or small Gaussian noise contaminated training data, using l1- or l0-norm regularization in DL leads to a dictionary D, which is a local minimum around the ground truth with high probability [37]–[39]. However, little theoretical effort is focused on analyzing the discrimination power of the learned dictionary, which we will explore in this section. The DL problem is non-convex, making the direct ...


Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University