Results 1–8 of 8
Sparse and Redundant Representation Modeling – What Next?, 2012
Abstract

Cited by 9 (1 self)
Signal processing relies heavily on data models; these are mathematical constructions imposed on the data source that force a dimensionality reduction of some sort. The vast activity in signal processing during the past several decades is essentially driven by an evolution of these models and their use in practice. In that respect, the past decade has certainly been the era of sparse and redundant representations, a popular and highly effective data model. This very appealing model led to a long series of intriguing theoretical and numerical questions, and to many innovative ideas that harness this model to real engineering problems. The new entries recently added to the IEEE SPL EDICS reflect the popularity of this model and its impact on signal processing research and practice. Despite the huge success of this model so far, this field ...
Learning efficient sparse and low rank models, CoRR
Abstract

Cited by 5 (3 self)
Parsimony, including sparsity and low rank, has been shown to successfully model data in numerous machine learning and signal processing tasks. Traditionally, such modeling approaches rely on an iterative algorithm that minimizes an objective function with parsimony-promoting terms. The inherently sequential structure and data-dependent complexity and latency of iterative optimization constitute a major limitation in many applications requiring real-time performance or involving large-scale data. Another limitation encountered by these modeling techniques is the difficulty of their inclusion in discriminative learning scenarios. In this work, we propose to move the emphasis from the model to the pursuit algorithm, and develop a process-centric view of parsimonious modeling, in which a learned deterministic fixed-complexity pursuit process is used in lieu of iterative optimization. We show a principled way to construct learnable pursuit process architectures for structured sparse and robust low rank models, derived from the iteration of proximal descent algorithms. These architectures learn to approximate the exact parsimonious representation at a fraction of the complexity of the standard optimization methods. We also show that appropriate training regimes allow parsimonious models to be naturally extended to discriminative settings. State-of-the-art results are demonstrated on several challenging problems in image and audio processing, with several orders of magnitude speedup compared to the exact optimization algorithms.
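As a concrete illustration of the process-centric idea, the sketch below (plain NumPy, illustrative names, not the authors' code) unrolls a fixed number of ISTA iterations, the proximal descent method for the ℓ1-regularized synthesis problem, into a feed-forward pass. In a learned pursuit network of the kind described, the matrices W_e, W_s and the thresholds would become trainable parameters.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the l1 norm (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def unrolled_ista(y, D, lam, n_layers=500):
    """Fixed number of ISTA iterations for min_x 0.5*||y - Dx||^2 + lam*||x||_1.
    Each pass x <- S(W_s x + W_e y) is one 'layer'; unrolling the loop into
    n_layers identical stages gives a fixed-complexity pursuit process."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    W_e = D.T / L                            # "encoder" matrix
    W_s = np.eye(D.shape[1]) - D.T @ D / L   # "state" matrix
    x = np.zeros(D.shape[1])
    for _ in range(n_layers):
        x = soft_threshold(W_s @ x + W_e @ y, lam / L)
    return x

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms
x_true = np.zeros(50)
x_true[[3, 17]] = [1.5, -2.0]                # 2-sparse ground truth
y = D @ x_true
x_hat = unrolled_ista(y, D, lam=0.05)
```

In a learned variant, the loop body stays the same but W_e, W_s and the threshold are fit by backpropagation, so far fewer layers suffice.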
CoSaMP and SP for the Cosparse Analysis Model
Abstract

Cited by 4 (1 self)
CoSaMP and Subspace Pursuit (SP) are two recovery algorithms that find the sparsest representation of a given signal under a given dictionary in the presence of noise. These two methods were conceived in the context of synthesis sparse representation modeling. The cosparse analysis model is a recent construction that stands as an interesting alternative to the synthesis approach. This new model characterizes signals by the space they are orthogonal to. Despite the similarity between the two, the cosparse analysis model is markedly different from the synthesis one. In this paper we propose analysis versions of the CoSaMP and SP algorithms, and demonstrate their performance on the compressed sensing problem.
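To make the model concrete: a signal x is ℓ-cosparse with respect to an analysis operator Ω when ℓ entries of Ωx are zero, and the indices of those rows form the cosupport. The NumPy sketch below (illustrative names, not the paper's implementation) shows cosupport detection and the nullspace projection step that analysis pursuits build on.

```python
import numpy as np

def cosupport(Omega, x, ell):
    """Indices of the ell rows of Omega to which x is closest to orthogonal."""
    return np.argsort(np.abs(Omega @ x))[:ell]

def cosparse_projection(Omega, v, ell):
    """Thresholding projection: select the ell smallest analysis coefficients
    of v and project v onto the nullspace of the corresponding rows of Omega."""
    Lam = cosupport(Omega, v, ell)
    A = Omega[Lam]
    # Projection onto null(A): v - pinv(A) A v
    return v - np.linalg.pinv(A) @ (A @ v), Lam

rng = np.random.default_rng(1)
Omega = rng.standard_normal((15, 10))
x = rng.standard_normal(10)
x -= np.linalg.pinv(Omega[:6]) @ (Omega[:6] @ x)   # make x orthogonal to rows 0..5
x_hat, Lam = cosparse_projection(Omega, x, 6)      # recovers cosupport {0,...,5}
```

Since x already lies in the nullspace of the six selected rows, the projection leaves it unchanged; analysis pursuits iterate this kind of step on corrupted inputs.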
On MAP and MMSE Estimators for the Cosparse Analysis Model
Abstract

Cited by 3 (0 self)
The sparse synthesis model for signals has become very popular in the last decade, leading to improved performance in many signal processing applications. This model assumes that a signal may be described as a linear combination of few columns (atoms) of a given synthesis matrix (dictionary). The cosparse analysis model is a recently introduced counterpart, whereby signals are assumed to be orthogonal to many rows of a given analysis dictionary. These rows are called the cosupport. The analysis model has already led to a series of contributions that address the pursuit problem: identifying the cosupport of a corrupted signal in order to restore it. While all the existing work adopts a deterministic point of view towards the design of such pursuit algorithms, this paper introduces a Bayesian estimation point of view, starting with a random generative model for cosparse analysis signals. This is followed by a derivation of Oracle, Minimum Mean Squared Error (MMSE), and Maximum A-Posteriori (MAP) estimators. We present a comparison between the deterministic formulations and these estimators, drawing some connections between the two. We develop practical approximations to the MAP and MMSE estimators, and demonstrate the proposed reconstruction algorithms in several synthetic and real image experiments, showing their potential and applicability.
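For intuition, the Oracle estimator referenced above takes a simple closed form in the denoising case y = x + e: when the true cosupport Λ is known, x is known to lie in the nullspace of the Λ rows of Ω, so the oracle orthogonally projects y onto that subspace. A minimal NumPy sketch (illustrative names; the paper's estimators cover more general settings):

```python
import numpy as np

def oracle_denoiser(Omega, Lam, y):
    """Oracle estimate for y = x + noise given the true cosupport Lam:
    orthogonal projection of y onto null(Omega[Lam])."""
    A = Omega[Lam]
    return y - np.linalg.pinv(A) @ (A @ y)

rng = np.random.default_rng(2)
Omega = rng.standard_normal((15, 10))
Lam = np.arange(6)
x = rng.standard_normal(10)
x -= np.linalg.pinv(Omega[Lam]) @ (Omega[Lam] @ x)   # true 6-cosparse signal
noise = 0.1 * rng.standard_normal(10)
x_hat = oracle_denoiser(Omega, Lam, x + noise)       # error never exceeds ||noise||
```

Because orthogonal projection is non-expansive and x is a fixed point of it, the oracle error equals the norm of the noise component inside the subspace; the MMSE estimator averages such projections over all plausible cosupports.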
THEME: Audio, Speech, and Language Processing
Abstract
Speech and sound data modeling and processing, in collaboration with the Institut de recherche en informatique et systèmes aléatoires (IRISA).
Greedy-Like Algorithms for the Cosparse Analysis Model
Abstract
The cosparse analysis model has been introduced recently as an interesting alternative to the standard sparse synthesis approach. A prominent question brought up by this new construction is the analysis pursuit problem: the need to find a signal belonging to this model, given a set of corrupted measurements of it. Several pursuit methods have already been proposed, based on ℓ1 relaxation and on a greedy approach. In this work we pursue this question further, and propose a new family of pursuit algorithms for the cosparse analysis model, mimicking the greedy-like methods: compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP), iterative hard thresholding (IHT), and hard thresholding pursuit (HTP). Assuming the availability of a near-optimal projection scheme that finds the nearest cosparse subspace to any vector, we provide performance guarantees for these algorithms. Our theoretical study relies on a restricted isometry property adapted to the context of the cosparse analysis model. We explore the empirical performance of these algorithms by adopting a plain thresholding projection, demonstrating their good performance.
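As one illustration, the analysis variant of IHT can be sketched in a few lines: a gradient step on 0.5*||y - Mx||^2 followed by the plain thresholding projection onto the set of ℓ-cosparse signals. The NumPy sketch below uses illustrative names, and the demo runs the easy noiseless case M = I rather than a compressive operator.

```python
import numpy as np

def cosparse_project(Omega, v, ell):
    """Plain thresholding projection: null out the ell smallest analysis
    coefficients by projecting v onto the nullspace of those rows of Omega."""
    Lam = np.argsort(np.abs(Omega @ v))[:ell]
    A = Omega[Lam]
    return v - np.linalg.pinv(A) @ (A @ v)

def analysis_iht(y, M, Omega, ell, n_iter=50):
    """Analysis IHT (sketch): gradient step on 0.5*||y - Mx||^2,
    then projection onto the ell-cosparse set."""
    mu = 1.0 / np.linalg.norm(M, 2) ** 2   # step size from the spectral norm
    x = np.zeros(M.shape[1])
    for _ in range(n_iter):
        x = cosparse_project(Omega, x + mu * M.T @ (y - M @ x), ell)
    return x

rng = np.random.default_rng(3)
Omega = rng.standard_normal((15, 10))
x_true = rng.standard_normal(10)
x_true -= np.linalg.pinv(Omega[:6]) @ (Omega[:6] @ x_true)  # 6-cosparse signal
M = np.eye(10)                                              # toy: identity operator
y = M @ x_true
x_hat = analysis_iht(y, M, Omega, ell=6)
```

With a compressive M the same loop applies unchanged, but recovery then depends on the RIP-type conditions and near-optimal projections that the paper's guarantees are stated in terms of.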
On the role of total variation in compressed sensing
, 2014
"... the role of total variation in compressed sensing structure dependence ..."