Results 1–10 of 26
Structured compressed sensing: From theory to applications. IEEE Trans. Signal Process., 2011
Abstract

Cited by 98 (15 self)
Compressed sensing (CS) is an emerging field that has attracted considerable research interest over the past few years. Previous review articles in CS limit their scope to standard discrete-to-discrete measurement architectures using matrices of a randomized nature and signal models based on standard sparsity. In recent years, CS has worked its way into several new application areas. This, in turn, necessitates a fresh look at many of the basics of CS. The random matrix measurement operator must be replaced by more structured sensing architectures that correspond to the characteristics of feasible acquisition hardware. The standard sparsity prior has to be extended to include a much richer class of signals and to encode broader data models, including continuous-time signals. In our overview, the theme is exploiting signal and measurement structure in compressive sensing. The prime focus is bridging theory and practice; that is, to pinpoint the potential of structured CS strategies to emerge from the math to the hardware. Our summary highlights new directions as well as relations to more traditional CS, with the hope of serving both as a review for practitioners wanting to join this emerging field and as a reference for researchers that attempts to put some of the existing ideas in the perspective of practical applications.
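The standard sparsity prior that this survey sets out to generalize is typically enforced in recovery algorithms through soft thresholding, the proximal operator of the l1 norm. A minimal sketch (the threshold value below is an arbitrary illustrative choice):

```python
# Soft thresholding: shrink each coefficient toward zero by t and zero
# out anything whose magnitude is below t. This is the basic building
# block of l1-based sparse recovery.

def soft_threshold(x, t):
    """Apply the l1 proximal operator elementwise to a list of floats."""
    return [max(abs(v) - t, 0.0) * (1 if v >= 0 else -1) for v in x]

coeffs = [3.0, -0.4, 0.0, 1.2, -2.5]
print(soft_threshold(coeffs, 1.0))  # small coefficients become exactly 0
```

Large coefficients survive (shrunk by t), small ones are set exactly to zero, which is what produces sparse solutions.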
Compressive Imaging using Approximate Message Passing and a Markov-Tree Prior. Proc. Asilomar Conf. on Signals, Systems, and Computers, 2010
Abstract

Cited by 43 (8 self)
Abstract—We propose a novel algorithm for compressive imaging that exploits both the sparsity and the persistence across scales found in the 2D wavelet transform coefficients of natural images. Like other recent works, we model wavelet structure using a hidden Markov tree (HMT) but, unlike other works, ours is based on loopy belief propagation (LBP). For LBP, we adopt a recently proposed "turbo" message passing schedule that alternates between exploitation of HMT structure and exploitation of compressive-measurement structure. For the latter, we leverage Donoho, Maleki, and Montanari's recently proposed approximate message passing (AMP) algorithm. Experiments on a large image database show that our turbo LBP approach maintains state-of-the-art reconstruction performance at half the complexity.
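The AMP iteration that the turbo schedule builds on can be sketched in a few lines. This is a simplified illustration, not the authors' implementation: it uses a plain soft-threshold denoiser, and the problem sizes and the 1.5-sigma threshold rule are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 200, 100, 10                         # signal length, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)   # normalized Gaussian sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true                                 # noiseless compressive measurements

def eta(u, t):
    """Soft-threshold denoiser (l1 proximal operator)."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

x = np.zeros(n)
z = y.copy()
for _ in range(30):
    r = x + A.T @ z                            # pseudo-data for the denoiser
    t = 1.5 * np.sqrt(np.mean(z ** 2))         # heuristic threshold from residual energy
    x_new = eta(r, t)
    # Onsager correction: (1/delta) * z * mean(eta'), with eta' = 1 on the support
    onsager = z * np.count_nonzero(x_new) / m
    z = y - A @ x_new + onsager
    x = x_new

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(rel_err)
```

The Onsager correction term is what distinguishes AMP from plain iterative thresholding and is what makes the effective noise in the pseudo-data behave as if Gaussian.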
Breaking the coherence barrier: A new theory for compressed sensing. arXiv:1302.0561, 2014
Abstract

Cited by 17 (9 self)
This paper provides an important extension of compressed sensing which bridges a substantial gap between
Compressive Sensing MRI with Wavelet Tree Sparsity
Abstract

Cited by 5 (2 self)
In Compressive Sensing Magnetic Resonance Imaging (CS-MRI), one can reconstruct an MR image of good quality from only a small number of measurements, which can significantly reduce MR scanning time. According to structured sparsity theory, the number of measurements can be further reduced to O(K + log n) for tree-sparse data, instead of O(K + K log n) for standard K-sparse data of length n. However, few existing algorithms have exploited this in CS-MRI; most model the problem with total variation and wavelet sparse regularization. On the other hand, some algorithms have been proposed for tree-sparse regularization, but few of them have validated the benefit of wavelet tree structure in CS-MRI. In this paper, we propose a fast convex optimization algorithm to improve CS-MRI. Wavelet sparsity, gradient sparsity, and tree sparsity are all considered in our model for real MR images. The original complex problem is decomposed into three simpler subproblems, each of which can be efficiently solved with an iterative scheme. Numerous experiments show that the proposed algorithm outperforms state-of-the-art CS-MRI algorithms and yields better reconstruction results on real MR images than general tree-based solvers or algorithms.
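The gap between the two measurement bounds quoted above can be made concrete by plugging in representative numbers. The image size and sparsity level below are hypothetical, and the big-O constants are suppressed, so the ratio is only indicative:

```python
import math

n = 256 * 256   # coefficients in a 256x256 image (hypothetical size)
K = 2000        # assumed number of significant wavelet coefficients

standard = K + K * math.log2(n)   # O(K + K log n): standard K-sparse bound
tree     = K + math.log2(n)       # O(K + log n): tree-sparse bound

print(standard, tree, standard / tree)
```

With these numbers the tree-sparse bound is roughly an order of magnitude smaller, which is the motivation for exploiting wavelet tree structure in CS-MRI.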
Boltzmann machine and mean-field approximation for structured sparse decompositions, 2011
A Bayesian max-product EM algorithm for reconstructing structured sparse signals. In Proc. Conf. Inform. Sci. Syst., 2012
Cited by 3 (2 self)
The quest for optimal sampling: Computationally efficient, structure-exploiting sampling strategies for compressed sensing. Compressed Sensing and Its Applications, 2014
"... measurements for compressed sensing ..."
Hierarchical Infinite Divisibility for Multiscale Shrinkage
Abstract

Cited by 2 (2 self)
Abstract—A new shrinkage-based construction is developed for a compressible vector x ∈ R^n, for cases in which the components of x are naturally associated with a tree structure. Important examples are when x corresponds to the coefficients of a wavelet or block-DCT representation of data. The method we consider in detail, and for which numerical results are presented, is based on the gamma distribution. The gamma distribution is a heavy-tailed distribution that is infinitely divisible, and these characteristics are leveraged within the model. We further demonstrate that the general framework is appropriate for many other types of infinitely divisible heavy-tailed distributions. Bayesian inference is carried out by approximating the posterior with samples from an MCMC algorithm, as well as by constructing a variational approximation to the posterior. We also consider expectation-maximization (EM) for a MAP (point) solution. State-of-the-art results are obtained for compressive sensing and denoising applications, the latter with spiky (non-Gaussian) noise.
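The infinite divisibility that the construction leverages means a Gamma(k, θ) variable can be written as a sum of n i.i.d. Gamma(k/n, θ) variables for any n. A quick Monte Carlo moment check of this property, with arbitrary parameter values:

```python
import random

random.seed(0)
k, theta, n, N = 2.0, 1.0, 4, 20000

# Draw N samples, each a sum of n iid Gamma(k/n, theta) variables.
# By infinite divisibility the sum is distributed as Gamma(k, theta),
# so its mean should be k*theta and its variance k*theta**2.
sums = [sum(random.gammavariate(k / n, theta) for _ in range(n))
        for _ in range(N)]

mean = sum(sums) / N
var = sum((s - mean) ** 2 for s in sums) / N
print(mean, var)   # both should be close to 2.0
```

Matching the first two moments is of course only a sanity check, not a proof; the distributional identity itself follows from the additivity of gamma shape parameters.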
Image Compressive Sensing Recovery via Collaborative Sparsity
Abstract

Cited by 1 (0 self)
Abstract—Compressive sensing (CS) has drawn considerable attention as a joint sampling and compression approach. Its theory shows that when a signal is sparse enough in some domain, it can be decoded from far fewer measurements than the Nyquist sampling theory suggests. One of the most challenging problems in CS is therefore to seek a domain in which a signal exhibits a high degree of sparsity and hence can be recovered faithfully. Most conventional CS recovery approaches, however, exploit a set of fixed bases (e.g., DCT, wavelet, and gradient domains) for the entirety of a signal; these bases ignore the non-stationarity of natural signals and cannot achieve a high enough degree of sparsity, resulting in poor rate-distortion performance. In this paper, we propose a new framework for image compressive sensing recovery via collaborative sparsity, which enforces local 2D sparsity and nonlocal 3D sparsity simultaneously in an adaptive hybrid space-transform domain, thereby substantially utilizing the intrinsic sparsity of natural images and greatly confining the CS solution space. In addition, an efficient augmented-Lagrangian-based technique is developed to solve the resulting optimization problem. Experimental results on a wide range of natural images demonstrate the efficacy of the new CS recovery strategy. Index Terms—Augmented Lagrangian, compressive sensing (CS), image recovery, sparsity.
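The augmented-Lagrangian machinery this abstract mentions can be illustrated on a much simpler problem: ADMM applied to an l1-regularized least-squares model. This is a generic sketch, not the paper's collaborative-sparsity solver, and all sizes and parameters below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, lam, rho = 40, 80, 0.1, 1.0              # sizes and penalties (illustrative)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[:5] = [1.0, -2.0, 1.5, 3.0, -1.0]       # sparse ground truth
y = A @ x_true

# Minimize 0.5*||A x - y||^2 + lam*||x||_1 via the augmented Lagrangian
# with the splitting x = z; rho is the penalty parameter.
P = np.linalg.inv(A.T @ A + rho * np.eye(n))   # cached factor for every x-update
q = A.T @ y

x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
for _ in range(200):
    x = P @ (q + rho * (z - u))                                    # quadratic step
    z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0)  # shrinkage step
    u = u + x - z                                                  # dual update

print(np.linalg.norm(z - x_true) / np.linalg.norm(x_true))
```

The same pattern, alternating an easy quadratic subproblem with a cheap shrinkage step under a running dual variable, is what augmented-Lagrangian CS solvers scale up to richer regularizers.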
unknown title
Abstract
This appendix presents the MCMC sampling, mean-field variational Bayesian (VB) [1], and Expectation-Maximization (EM) procedures for posterior distribution inference. The Generalized Inverse Gaussian (GIG) distribution is denoted by: GIG(x; a, b, p) = (a/b)^{p/2} …
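In the standard parameterization, the GIG density whose leading factor appears above is f(x) = (a/b)^{p/2} x^{p-1} exp(-(a x + b/x)/2) / (2 K_p(sqrt(a b))). For p = ±1/2 the Bessel factor has the closed form K_{1/2}(z) = sqrt(pi/(2z)) e^{-z}, so the density can be checked with the standard library alone. A sketch under that assumption:

```python
import math

def gig_pdf_half(x, a, b):
    """GIG density with p = -1/2 (the inverse-Gaussian special case).

    Uses the closed form K_{-1/2}(z) = K_{1/2}(z) = sqrt(pi/(2z)) * exp(-z)
    to avoid needing a general Bessel-K implementation.
    """
    p = -0.5
    z = math.sqrt(a * b)
    bessel = math.sqrt(math.pi / (2 * z)) * math.exp(-z)
    norm = (a / b) ** (p / 2) / (2 * bessel)
    return norm * x ** (p - 1) * math.exp(-(a * x + b / x) / 2)

# Sanity check: integrate the density over (0, 50] with the trapezoid rule;
# the mass outside this interval is negligible for a = b = 1.
a, b, h = 1.0, 1.0, 0.001
area = sum(h * 0.5 * (gig_pdf_half(i * h, a, b) + gig_pdf_half((i + 1) * h, a, b))
           for i in range(1, 50001))
print(area)   # should be very close to 1
```

For p = -1/2 and a = b = 1 this reduces to the inverse-Gaussian density, which is why the normalization comes out exactly right without any special-function library.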