Results 1–8 of 8
Compressive Sensing, 2010
Abstract

Cited by 25 (8 self)
Compressive sensing is a new type of sampling theory, which predicts that sparse signals and images can be reconstructed from what was previously believed to be incomplete information. As a main feature, efficient algorithms such as ℓ1-minimization can be used for recovery. The theory has many potential applications in signal processing and imaging. This chapter gives an introduction and overview on both theoretical and numerical aspects of compressive sensing.
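The ℓ1-minimization recovery mentioned in this abstract can be sketched as a linear program: minimize ||x||_1 subject to Ax = b, using the standard split x = u − v with u, v ≥ 0. A minimal illustration (the function name and problem sizes below are hypothetical, not from the paper):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Recover a sparse x from b = A @ x by l1-minimization,
    min ||x||_1 s.t. A x = b, cast as a linear program via
    the split x = u - v with u >= 0, v >= 0."""
    n, N = A.shape
    c = np.ones(2 * N)           # objective: sum(u) + sum(v) = ||x||_1
    A_eq = np.hstack([A, -A])    # equality constraint: A u - A v = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
    u, v = res.x[:N], res.x[N:]
    return u - v

# Illustrative sizes: a 3-sparse length-100 signal, 40 Gaussian measurements
rng = np.random.default_rng(0)
N, n = 100, 40
A = rng.standard_normal((n, N)) / np.sqrt(n)
x_true = np.zeros(N)
x_true[[5, 37, 80]] = [1.0, -2.0, 0.5]
x_hat = basis_pursuit(A, A @ x_true)
print(np.max(np.abs(x_hat - x_true)))
```

For this sparsity level the recovery error is at the level of LP solver tolerance, consistent with the exact-recovery claim in the abstract.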
Precise Undersampling Theorems
Abstract

Cited by 18 (2 self)
Undersampling Theorems state that we may gather far fewer samples than the usual sampling theorem while exactly reconstructing the object of interest – provided the object in question obeys a sparsity condition, the samples measure appropriate linear combinations of signal values, and we reconstruct with a particular nonlinear procedure. While there are many ways to crudely demonstrate such undersampling phenomena, we know of only one approach which precisely quantifies the true sparsity-undersampling tradeoff curve of standard algorithms and standard compressed sensing matrices. That approach, based on combinatorial geometry, predicts the exact location in the sparsity-undersampling domain where standard algorithms exhibit phase transitions in performance. We review the phase transition approach here and describe the broad range of cases where it applies. We also mention exceptions and state challenge problems for future research. Sample result: one can efficiently reconstruct a k-sparse signal of length N from n measurements, provided n ≳ 2k · log(N/n), for (k, n, N) large, k ≪ N.
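The sample result n ≳ 2k · log(N/n) can be turned around to estimate the largest sparsity a given measurement budget supports. A small illustrative helper (the function name and the chosen sizes are hypothetical, not from the paper):

```python
import math

def max_sparsity(n, N):
    # Invert the sample result n >~ 2 k log(N/n): k ~ n / (2 log(N/n)).
    return int(n / (2 * math.log(N / n)))

# e.g. n = 1,000 measurements of a length-100,000 signal
print(max_sparsity(1000, 100000))
```

With N/n = 100 the logarithmic factor is about 4.6, so roughly one in nine measurements converts into a recoverable nonzero.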
Compressed sensing: how sharp is the restricted isometry property?
, 2009
Abstract

Cited by 11 (2 self)
Compressed sensing is a recent technique by which signals can be measured at a rate proportional to their information content, combining the important task of compression directly into the measurement process. Since its introduction in 2004 there have been hundreds of manuscripts on compressed sensing, a large fraction of which have focused on the design and analysis of algorithms to recover a signal from its compressed measurements. The Restricted Isometry Property (RIP) has become a ubiquitous property assumed in their analysis. We present the best known bounds on the RIP, and in the process illustrate the way in which the combinatorial nature of compressed sensing is controlled. Our quantitative bounds on the RIP allow precise statements as to how aggressively a signal can be undersampled, the essential question for practitioners.
IMPROVED BOUNDS ON RESTRICTED ISOMETRY CONSTANTS FOR GAUSSIAN MATRICES
Abstract

Cited by 9 (1 self)
Abstract. The Restricted Isometry Constants (RIC) of a matrix A measure how close to an isometry the action of A is on vectors with few nonzero entries, measured in the ℓ2 norm. Specifically, the upper and lower RIC of a matrix A of size n × N are the maximum and the minimum deviation from unity (one) of the largest and smallest, respectively, squared singular values of all (N choose k) matrices formed by taking k columns from A. Calculation of the RIC is intractable for most matrices due to its combinatorial nature; however, many random matrices typically have bounded RIC in some range of problem sizes (k, n, N). We provide the best known bound on the RIC for Gaussian matrices, which is also the smallest known bound on the RIC for any large rectangular matrix. Improvements over prior bounds are achieved by exploiting similarity of singular values for matrices which share a substantial number of columns. Key words. Wishart matrices, compressed sensing, sparse approximation, restricted isometry constant, phase transitions, Gaussian matrices, singular values of random matrices.
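The combinatorial definition in this abstract can be checked directly on tiny matrices by enumerating all (N choose k) column submatrices. A brute-force sketch (the function name and sizes are illustrative; this is only feasible for small N, which is exactly the intractability the abstract points out):

```python
import itertools
import numpy as np

def ric(A, k):
    """Brute-force upper/lower restricted isometry constants of A at
    sparsity k: the max deviation above / below 1 of the squared singular
    values over all (N choose k) column submatrices."""
    n, N = A.shape
    upper = lower = 0.0
    for cols in itertools.combinations(range(N), k):
        s = np.linalg.svd(A[:, cols], compute_uv=False)
        upper = max(upper, s[0] ** 2 - 1.0)   # largest squared singular value above 1
        lower = max(lower, 1.0 - s[-1] ** 2)  # smallest squared singular value below 1
    return upper, lower

# Tiny Gaussian example with n = 6, N = 10, k = 2 (45 submatrices)
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 10)) / np.sqrt(6)
U, L = ric(A, 2)
print(U, L)
```

The 1/sqrt(n) column scaling matches the normalization under which Gaussian matrices act as approximate isometries on sparse vectors.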
Optimal phase transitions in compressed sensing
 IEEE Trans. Inf. Theory
Abstract

Cited by 4 (2 self)
Abstract—Compressed sensing deals with efficient recovery of analog signals from linear encodings. This paper presents a statistical study of compressed sensing by modeling the input signal as an i.i.d. process with known distribution. Three classes of encoders are considered, namely optimal nonlinear, optimal linear, and random linear encoders. Focusing on optimal decoders, we investigate the fundamental tradeoff between measurement rate and reconstruction fidelity gauged by error probability and noise sensitivity in the absence and presence of measurement noise, respectively. The optimal phase-transition threshold is determined as a functional of the input distribution and compared to suboptimal thresholds achieved by popular reconstruction algorithms. In particular, we show that Gaussian sensing matrices incur no penalty on the phase-transition threshold with respect to optimal nonlinear encoding. Our results also provide a rigorous justification of previous results based on replica heuristics in the weak-noise regime. Index Terms—Compressed sensing, joint source-channel coding, minimum mean-square error (MMSE) dimension, phase transition, random matrix, Rényi information dimension, Shannon theory.
On Support Sizes of Restricted Isometry Constants
Abstract

Cited by 1 (1 self)
A generic tool for analyzing sparse approximation algorithms is the restricted isometry property (RIP) introduced by Candès and Tao. For qualitative comparison of sufficient conditions derived from an RIP analysis, the support size of the RIP constants is generally reduced as much as possible with the goal of achieving a support size of twice the sparsity of the target signal. Using a quantitative comparison via phase transitions for Gaussian measurement matrices, three examples from the literature of such support size reduction are investigated. In each case, utilizing a larger support size for the RIP constants results in a sufficient condition for exact sparse recovery satisfied, with high probability, by a significantly larger subset of Gaussian matrices. Key words: Compressed sensing, restricted isometry constants, restricted isometry property, sparse
Geometry of Simplicity: Compressed Sensing Encoder–Decoder Pair, Compressed Sensing Applications
Abstract
When it is “costly” to acquire information, use CS. Transform the workload from the sensor to computing resources. Reduced sampling is possible by exploiting simplicity. ◮ Transformative applications:
BOUNDS OF RESTRICTED ISOMETRY CONSTANTS IN EXTREME ASYMPTOTICS: FORMULAE FOR GAUSSIAN MATRICES By Bubacarr Bah and Jared Tanner
, 2011
Abstract
Abstract Restricted Isometry Constants (RICs) provide a measure of how far from an isometry a matrix can be when acting on sparse vectors. This, and related quantities, provide a mechanism by which standard eigen-analysis can be applied to topics relying on sparsity. RIC bounds have been presented for a variety of random matrices and matrix dimension and sparsity ranges. We provide explicit formulae for RIC bounds of n × N Gaussian matrices with sparsity k in three settings: a) n/N fixed and k/n approaching zero, b) k/n fixed and n/N approaching zero, and c) n/N approaching zero with k/n decaying inverse-logarithmically in N/n. In these three settings the RICs a) decay to zero, b) become unbounded (or approach inherent bounds), and c) approach a nonzero constant. Implications of these results for RIC-based analysis of compressed sensing algorithms are presented.