## Sampling Bounds for Sparse Support Recovery in the Presence of Noise

Citations: 15 (1 self)

### BibTeX

```bibtex
@MISC{Reeves_samplingbounds,
  author = {Galen Reeves and others},
  title  = {Sampling Bounds for Sparse Support Recovery in the Presence of Noise},
  year   = {}
}
```

### Abstract

It is well known that the support of a sparse signal can be recovered from a small number of random projections. However, in the presence of noise, all known sufficient conditions require that the per-sample signal-to-noise ratio (SNR) grow without bound with the dimension of the signal. If the noise is due to quantization of the samples, this means that an unbounded rate per sample is needed. In this paper, it is shown that an unbounded SNR is also a necessary condition for perfect recovery, but that any fraction (less than one) of the support can be recovered with bounded SNR. This means that a finite rate per sample is sufficient for partial support recovery. Necessary and sufficient conditions are given for both stochastic and non-stochastic signal models. This problem arises in settings such as compressive sensing, model selection, and signal denoising.

### Citations

1865 | Compressed sensing - Donoho - 2006

1778 | Atomic decomposition by basis pursuit - Chen, Donoho, et al. - 1998

Citation Context: ...ess setting, perfect support recovery requires m = k + 1 samples using optimal, but computationally expensive, recovery algorithms [7], and requires m = O(k log(n/k)) samples using linear programming [8]–[10]. In the presence of noise, Compressive Sensing [4], [5] shows that for m = O(k log(n/k)) samples, quadratic programming can provide a signal estimate x̂ that is stable; that is, ||x̂ − x|| is bo...

888 | Near-optimal signal recovery from random projections: Universal encoding strategies - Candès, Tao

Citation Context: ...setting, perfect support recovery requires m = k + 1 samples using optimal, but computationally expensive, recovery algorithms [7], and requires m = O(k log(n/k)) samples using linear programming [8]–[10]. In the presence of noise, Compressive Sensing [4], [5] shows that for m = O(k log(n/k)) samples, quadratic programming can provide a signal estimate x̂ that is stable; that is, ||x̂ − x|| is bounded...

803 | Stable signal recovery from incomplete and inaccurate measurements - Candès, Romberg, et al.

Citation Context: ...ing. Typically, optimal estimation is computationally hard, but for certain tasks, such as approximating x in the ℓ2 sense, efficient relaxations have been shown to produce near-optimal solutions [4], [5]. The task considered in this paper is estimation of the support K = {i ∈ {1, …, n} : x_i ≠ 0} (2), where k = |K| is the number of non-zero elements of x. Our goal is to give fundamental (information...

323 | Just relax: Convex programming methods for identifying sparse signals - Tropp

Citation Context: ...Sensing [4], [5] shows that for m = O(k log(n/k)) samples, quadratic programming can provide a signal estimate x̂ that is stable; that is, ||x̂ − x|| is bounded with respect to ||w||. The papers [4], [11]–[13] give sufficient conditions for the support of x̂ to be contained inside the support of x. The work in [1], [3], [14] addresses the asymptotic performance of a particular quadratic program, the L...

319 | Stable recovery of sparse overcomplete representations in the presence of noise - Donoho, Elad, et al.

Citation Context: ...llenging. Typically, optimal estimation is computationally hard, but for certain tasks, such as approximating x in the ℓ2 sense, efficient relaxations have been shown to produce near-optimal solutions [4], [5]. The task considered in this paper is estimation of the support K = {i ∈ {1, …, n} : x_i ≠ 0} (2), where k = |K| is the number of non-zero elements of x. Our goal is to give fundamental (inform...

240 | On model selection consistency of lasso - Zhao, Yu

Citation Context: ...lem arises in settings such as compressive sensing, model selection, and signal denoising. I. INTRODUCTION The task of support recovery (also known as recovery of sparsity [1], [2] or model selection [3]) is to determine which elements of some unknown sparse signal x ∈ R^n are nonzero based on a set of noisy linear observations. This problem arises in areas such as compressive sensing, graphical mod...

187 | Distribution of Eigenvalues for Some Sets of Random Matrices - Marchenko, Pastur - 1967

Citation Context: ...)−1) > α²/e³ (11), where the Lambert-W function W(z) is the inverse function of f(z) = z e^z. Furthermore, the asymptotic spectrum of the random matrix Φ_U^T Φ_U is given by the Marchenko-Pastur law [21], and the following lemma follows from results in [22]. Lemma 2: For the Gaussian signal class G, (1/n) I(x; y|K) → V(SNR(G); ρ/Ω) as n → ∞ (12), where V(γ; r) = log(1 + γ − F(γ, r)) + (1/(rγ)) F(γ, r) + (1/r) log(1 + rγ − F(γ, r)) and F(γ, r) = (1/4)(√(γ(1 +...
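The Marchenko-Pastur behavior quoted in this context is easy to check numerically. The sketch below is my own illustration (not code from the paper; the dimensions m and k are arbitrary choices): it draws an m × k matrix Φ_U with i.i.d. N(0, 1/m) entries and compares the eigenvalues of Φ_U^T Φ_U against the Marchenko-Pastur support edges (1 ± √r)² with r = k/m.

```python
# Illustrative sketch (not from the paper): for an m x k matrix Phi_U with
# i.i.d. N(0, 1/m) entries, the eigenvalues of Phi_U^T Phi_U concentrate on
# the Marchenko-Pastur interval [(1 - sqrt(r))^2, (1 + sqrt(r))^2], r = k/m.
import numpy as np

rng = np.random.default_rng(0)
m, k = 4000, 1000                 # sample and support sizes (arbitrary)
r = k / m                         # aspect ratio, rho/Omega in the paper's notation

Phi_U = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, k))
eigs = np.linalg.eigvalsh(Phi_U.T @ Phi_U)

lo, hi = (1 - np.sqrt(r)) ** 2, (1 + np.sqrt(r)) ** 2
print(eigs.min(), eigs.max())     # empirically close to the MP edges lo, hi
print(eigs.mean())                # close to 1, the mean of the MP law
```

At these sizes the empirical spectrum already hugs the predicted interval, which is what lets Lemma 2 replace a random-matrix quantity with the deterministic limit V.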

181 | Random matrix theory and wireless communications - Tulino, Verdú - 2004

Citation Context: ...is the inverse function of f(z) = z e^z. Furthermore, the asymptotic spectrum of the random matrix Φ_U^T Φ_U is given by the Marchenko-Pastur law [21], and the following lemma follows from results in [22]. Lemma 2: For the Gaussian signal class G, (1/n) I(x; y|K) → V(SNR(G); ρ/Ω) as n → ∞ (12), where V(γ; r) = log(1 + γ − F(γ, r)) + (1/(rγ)) F(γ, r) + (1/r) log(1 + rγ − F(γ, r)) and F(γ, r) = (1/4)(√(γ(1 +...

169 | Sharp thresholds for high-dimensional and noisy sparsity recovery using ℓ1-constrained quadratic programming - Wainwright - 2009

Citation Context: ...tic signal models. This problem arises in settings such as compressive sensing, model selection, and signal denoising. I. INTRODUCTION The task of support recovery (also known as recovery of sparsity [1], [2] or model selection [3]) is to determine which elements of some unknown sparse signal x ∈ R^n are nonzero based on a set of noisy linear observations. This problem arises in areas such as compre...

84 | Recovery of exact sparse representations in the presence of bounded noise - Fuchs - 2005

48 | Necessary and sufficient conditions for sparsity pattern recovery - Fletcher, Rangan, et al. - 2009

Citation Context: ...t al. [15] lower bound the probability of success, and Wainwright [2] gives necessary and sufficient conditions for an exhaustive search algorithm. Since the submission of this paper, Fletcher et al. [16] have generalized the necessary conditions given below in Theorem 1 for all scalings of (n, k, m). More generally, support recovery with respect to some distortion measure has also been considered [17...

39 | Spectrum-blind minimum-rate sampling and reconstruction of multiband signals - Feng, Bresler - 1996

Citation Context: ...cribes an optimal estimation algorithm. A. Related Work In the noiseless setting, perfect support recovery requires m = k + 1 samples using optimal, but computationally expensive, recovery algorithms [7], and requires m = O(k log(n/k)) samples using linear programming [8]–[10]. In the presence of noise, Compressive Sensing [4], [5] shows that for m = O(k log(n/k)) samples, quadratic programming can p...

31 | Denoising by sparse approximation: Error bounds based on rate-distortion theory - Fletcher, Rangan, et al. - 2006

26 | Measurements vs. bits: Compressed sensing meets information theory - Sarvotham, Baron, et al. - 2006

Citation Context: ...16] have generalized the necessary conditions given below in Theorem 1 for all scalings of (n, k, m). More generally, support recovery with respect to some distortion measure has also been considered [17]–[20]. Aeron et al. [19] derive bounds similar to Theorems 2 and 3 in this paper for the special setting in which each element of x has only a finite number of values. B. Partial Support Recovery Give...

19 | Theoretic Limits on Noisy Compressive Sampling - Akçakaya, Tarokh, et al. - 2010

Citation Context: ...ave generalized the necessary conditions given below in Theorem 1 for all scalings of (n, k, m). More generally, support recovery with respect to some distortion measure has also been considered [17]–[20]. Aeron et al. [19] derive bounds similar to Theorems 2 and 3 in this paper for the special setting in which each element of x has only a finite number of values. B. Partial Support Recovery Given the...

16 | Sparse signal sampling using noisy linear projections - Reeves - 2008

Citation Context: ...ork [1], [2] where Φ is chosen such that E[⟨φ_i, x⟩²] = ||x||², and hence Φ amplifies the signal x. Section II-A gives a brief summary of relevant research (a more extensive summary is given in [6]), Section II-B describes our error metric, and Section II-C describes an optimal estimation algorithm. A. Related Work In the noiseless setting, perfect support recovery requires m = k + 1 samples usi...

13 | On sensing capacity of sensor networks for the class of linear observation, fixed SNR models - Aeron, Zhao, et al. - 2007

Citation Context: ...necessary conditions given below in Theorem 1 for all scalings of (n, k, m). More generally, support recovery with respect to some distortion measure has also been considered [17]–[20]. Aeron et al. [19] derive bounds similar to Theorems 2 and 3 in this paper for the special setting in which each element of x has only a finite number of values. B. Partial Support Recovery Given the true support K and...

9 | On the necessary density for spectrum-blind nonuniform sampling subject to quantization - Gastpar, Bresler

Citation Context: ...→ ∞ or x_min → ∞. Another line of research has considered information-theoretic bounds on the asymptotic performance of optimal support recovery algorithms. For perfect support recovery, Gastpar et al. [15] lower bound the probability of success, and Wainwright [2] gives necessary and sufficient conditions for an exhaustive search algorithm. Since the submission of this paper, Fletcher et al. [16] have...

9 | Rate-distortion bounds for sparse approximation - Fletcher, Rangan, et al. - 2007

5 | “Recovery of exact sparse representations in the presence of bounded noise” - 2005

Citation Context: ...ng [4], [5] shows that for m = O(k log(n/k)) samples, quadratic programming can provide a signal estimate x̂ that is stable; that is, ||x̂ − x|| is bounded with respect to ||w||. The papers [4], [11]–[13] give sufficient conditions for the support of x̂ to be contained inside the support of x. The work in [1], [3], [14] addresses the asymptotic performance of a particular quadratic program, the Lasso....

3 | “Information-theoretic limitations on sparsity recovery in the high-dimensional and noisy setting” - 2009

Citation Context: ...d to the performance of a maximum-likelihood (ML) decoder which uses no information about the assumed signal class X. This is the same estimator studied (for the special case of α = 0) in Wainwright [2] and is given by K̂_ML(y) = arg min_{|U| = k} ||[I_m − Φ_U (Φ_U^T Φ_U)⁻¹ Φ_U^T] y||², where Φ_U corresponds to the columns of Φ indexed by U. We remark that ML decoding is computationally expensive for any pr...
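The exhaustive-search ML decoder quoted in this context can be sketched directly. The code below is an illustrative toy implementation, not the paper's code, and the problem sizes are arbitrary: for each candidate support U of size k it computes the least-squares residual of y against the columns Φ_U, which equals the projection residual ||[I_m − Φ_U(Φ_U^T Φ_U)⁻¹Φ_U^T] y||², and returns the minimizer.

```python
# Toy sketch of the exhaustive-search ML support decoder (illustrative only).
from itertools import combinations
import numpy as np

def ml_support(y, Phi, k):
    """Return the size-k support U minimizing the projection residual of y."""
    m, n = Phi.shape
    best_U, best_res = None, np.inf
    for U in combinations(range(n), k):
        Phi_U = Phi[:, U]
        # Least-squares residual ||y - Phi_U coef||^2 equals the projection
        # residual ||(I_m - Phi_U (Phi_U^T Phi_U)^{-1} Phi_U^T) y||^2.
        coef, *_ = np.linalg.lstsq(Phi_U, y, rcond=None)
        res = float(np.sum((y - Phi_U @ coef) ** 2))
        if res < best_res:
            best_U, best_res = set(U), res
    return best_U

rng = np.random.default_rng(1)
n, k, m = 8, 2, 4                     # tiny sizes so the search is feasible
Phi = rng.normal(size=(m, n))
x = np.zeros(n); x[[1, 5]] = [1.0, -2.0]
print(ml_support(Phi @ x, Phi, k))    # noiseless case: recovers {1, 5}
```

The search visits all C(n, k) candidate supports, which is exactly why the quoted passage calls ML decoding computationally expensive for any practical problem size.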

3 | Consistent neighborhood selection for high-dimensional graphs with lasso - Meinshausen, Buhlmann

Citation Context: ...stable; that is, ||x̂ − x|| is bounded with respect to ||w||. The papers [4], [11]–[13] give sufficient conditions for the support of x̂ to be contained inside the support of x. The work in [1], [3], [14] addresses the asymptotic performance of a particular quadratic program, the Lasso. Results are formulated in terms of scaling conditions for (n, k, m) and the magnitude of the smallest non-zero compo...

2 | Sensing Capacity, Diversity and Sparsity: Fundamental Tradeoffs - Aeron, Zhao, et al. - 2007