
## Orthogonal Matching Pursuit with Replacement

Citations: 9 (1 self)

### Citations

3542 | Compressed sensing
- Donoho
- 2006
Citation Context: ...small number of measurements. It has important applications in imaging, computer vision and machine learning (see, for example, [9, 24, 14]). In this paper, we focus on the compressed sensing setting [3, 7] where we want to design a measurement matrix A ∈ R^{m×n} such that a sparse vector x* ∈ R^n with ‖x*‖_0 := |supp(x*)| ≤ k < n can be efficiently recovered from the measurements b = Ax* ∈ R^m. Initial ... |

1367 | Decoding by linear programming - Candes, Tao - 2005 |

747 | CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Needell, Tropp
- 2009
Citation Context: ...tion. Several other iterative approaches have been proposed that include Iterative Soft Thresholding (IST) [17], Iterative Hard Thresholding (IHT) [1], Compressive Sampling Matching Pursuit (CoSaMP) [19], Subspace Pursuit (SP) [4], Iterative Thresholding with Inversion (ITI) [16], Hard Thresholding Pursuit (HTP) [10] and many others. In the family of iterative hard thresholding algorithms, we can iden... |

683 | The restricted isometry property and its implications for compressed sensing
- Candès
- 2008
Citation Context: ...such that, for all x with ‖x‖_0 ≤ k, we have (1 − δ_k)‖x‖² ≤ ‖Ax‖² ≤ (1 + δ_k)‖x‖². Several random matrix ensembles are known to satisfy δ_{2k} < δ with high probability provided one chooses m ≥ O((k/δ²) log(n/k)) measurements. It was shown in [2] that ℓ1-minimization recovers all k-sparse vectors provided A satisfies δ_{2k} < 0.414, although the condition has recently been improved to δ_{2k} < 0.473 [11]. Note that, in compressed sensing, the goal i... |

618 | Orthogonal matching pursuit: Recursive function approximation with applications to wavelet decomposition
- Pati, Rezaiifar, et al.
- 1993
Citation Context: ...le, when the dimensionality is in the millions. This has sparked a huge interest in other iterative methods for sparse recovery. An early classic iterative method is Orthogonal Matching Pursuit (OMP) [21, 6] that greedily chooses elements to add to the support. It is a natural, easy-to-implement and fast method but unfortunately lacks strong theoretical guarantees. Indeed, it is known that, if run for k... |
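The greedy add-then-orthogonalize loop described in this context can be sketched in a few lines of numpy. This is a generic OMP sketch, not the authors' code; the helper name `omp` is mine:

```python
import numpy as np

def omp(A, b, k):
    """Sketch of Orthogonal Matching Pursuit: greedily add one support
    element per iteration, then re-solve least squares on the support."""
    n = A.shape[1]
    support = []
    x = np.zeros(n)
    residual = b.copy()
    for _ in range(k):
        # greedy step: pick the column most correlated with the residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # orthogonalization: least squares on the chosen columns
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        x = np.zeros(n)
        x[support] = coef
        residual = b - A @ x
    return x, sorted(support)
```

Because of the least-squares re-fit, the residual stays orthogonal to all selected columns, so no selected column is ever re-picked.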

319 | Iterative hard thresholding for compressed sensing
- Blumensath, Davies
- 2009
Citation Context: ...n the ones required by other methods like ℓ1-minimization. Several other iterative approaches have been proposed that include Iterative Soft Thresholding (IST) [17], Iterative Hard Thresholding (IHT) [1], Compressive Sampling Matching Pursuit (CoSaMP) [19], Subspace Pursuit (SP) [4], Iterative Thresholding with Inversion (ITI) [16], Hard Thresholding Pursuit (HTP) [10] and many others. In the family... |

292 | Single-pixel imaging via compressive sampling - Duarte, Davenport, et al. - 2008 |

284 | Subspace pursuit for compressive sensing signal reconstruction - Dai, Milenkovic - 2008 |

202 | A unified framework for high-dimensional analysis of M-estimators with decomposable regularizers
- Negahban, Ravikumar, et al.
- 2012
Citation Context: ...to δ_{2k} < 0.473 [11]. Note that, in compressed sensing, the goal is to recover all, or most, k-sparse signals using the same measurement matrix A. Hence, weaker conditions such as restricted convexity [20] studied in the statistical literature (where the aim is to recover a single sparse vector from noisy linear measurements) typically do not suffice. In fact, if RIP is not satisfied then multiple spa... |

160 | Message passing algorithms for compressed sensing: II. Analysis and validation
- Donoho, Maleki, et al.
- 2010
Citation Context: ...s RIP guarantees, ℓ1-minimization can guarantee recovery using just O(k log(n/k)) measurements, but it has been observed in practice that ℓ1-minimization is too expensive in large scale applications [8], for example, when the dimensionality is in the millions. This has sparked a huge interest in other iterative methods for sparse recovery. An early classic iterative method is Orthogonal Matching Pur... |

143 | Sparse representation for computer vision and pattern recognition
- Wright, Ma, et al.
- 2010
Citation Context: ...algorithms, such that almost all sparse signals can be recovered from a small number of measurements. It has important applications in imaging, computer vision and machine learning (see, for example, [9, 24, 14]). In this paper, we focus on the compressed sensing setting [3, 7] where we want to design a measurement matrix A ∈ R^{m×n} such that a sparse vector x* ∈ R^n with ‖x*‖_0 := |supp(x*)| ≤ k < n can be... |

99 | Adaptive forward-backward greedy algorithm for learning sparse representations
- Zhang
- 2008
Citation Context: ...A second least squares problem is then solved on the reduced support. These algorithms typically enlarge and reduce the support set by k or 2k elements. An exception is the two-stage algorithm FoBa [25] that adds and removes single elements from the support. However, it differs from our proposed methods as its analysis requires very restrictive RIP conditions (δ_{8k} < 0.1 as quoted in [14]) and the co... |

96 | Guaranteed rank minimization via singular value projection - Jain, Meka, et al. |

93 | Multi-label prediction via compressed sensing
- Hsu, Kakade, et al.
- 2009
Citation Context: ...algorithms, such that almost all sparse signals can be recovered from a small number of measurements. It has important applications in imaging, computer vision and machine learning (see, for example, [9, 24, 14]). In this paper, we focus on the compressed sensing setting [3, 7] where we want to design a measurement matrix A ∈ R^{m×n} such that a sparse vector x* ∈ R^n with ‖x*‖_0 := |supp(x*)| ≤ k < n can be... |

77 | Greedy adaptive approximation
- Davis, Mallat, et al.
- 1997
Citation Context: ...le, when the dimensionality is in the millions. This has sparked a huge interest in other iterative methods for sparse recovery. An early classic iterative method is Orthogonal Matching Pursuit (OMP) [21, 6] that greedily chooses elements to add to the support. It is a natural, easy-to-implement and fast method but unfortunately lacks strong theoretical guarantees. Indeed, it is known that, if run for k... |

76 | Analysis of orthogonal matching pursuit using the restricted isometry property
- Davenport, Wakin
- 2010
Citation Context: ...set. As a result, the residual becomes orthogonal to the columns of A that correspond to the current support set. Thus, the least squares step is also referred to as orthogonalization by some authors [5]. Let us briefly explain some of our notation. We use the MATLAB notation: A\b := argmin_z ‖Az − b‖². The hard thresholding operator H_k(·) sorts its argument vector in decreasing order (in absolute... |
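The two notational devices in this context, the MATLAB-style shorthand A\b and the hard thresholding operator H_k(·), admit a direct numpy sketch (the helper names `ls_solve` and `hard_threshold` are mine, not the paper's):

```python
import numpy as np

def ls_solve(A, b):
    # MATLAB-style A\b := argmin_z ||A z - b||^2
    z, *_ = np.linalg.lstsq(A, b, rcond=None)
    return z

def hard_threshold(v, k):
    """H_k(v): keep the k entries of v largest in absolute value, zero the rest."""
    out = np.zeros_like(v)
    top = np.argsort(np.abs(v))[-k:]  # indices of the k largest |v_i|
    out[top] = v[top]
    return out
```

For example, `hard_threshold(np.array([3.0, -5.0, 1.0, 4.0]), 2)` keeps only the -5.0 and 4.0 entries.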

65 | Optimally tuned iterative reconstruction algorithms for compressed sensing, Selected Topics in Signal Processing
- Maleki, Donoho
- 2010
Citation Context: ...opose a novel partial hard-thresholding operator that leads to a general family of iterative algorithms. While one extreme of the family yields well known hard thresholding algorithms like ITI and HTP [17, 10], the other end of the spectrum leads to a novel algorithm that we call Orthogonal Matching Pursuit with Replacement (OMPR). OMPR, like the classic greedy algorithm OMP, adds exactly one coordinate to... |
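The "add one coordinate, remove one coordinate" replacement idea attributed to OMPR in this context can be illustrated with a single hypothetical step. This is a loose sketch under my own simplifications, not the paper's exact OMPR update rule:

```python
import numpy as np

def ompr_step(A, b, x, eta=1.0):
    """One illustrative replacement step: after a gradient step on
    ||Ax - b||^2 / 2, swap the most promising coordinate outside the
    support for the least useful one inside it, then re-solve least
    squares on the new support. A sketch, not the paper's algorithm."""
    n = A.shape[1]
    support = set(np.flatnonzero(x).tolist())
    z = x + eta * (A.T @ (b - A @ x))  # gradient step
    j_in = max((j for j in range(n) if j not in support), key=lambda j: abs(z[j]))
    j_out = min(support, key=lambda j: abs(z[j]))
    new_support = sorted((support - {j_out}) | {j_in})
    coef, *_ = np.linalg.lstsq(A[:, new_support], b, rcond=None)
    x_new = np.zeros(n)
    x_new[new_support] = coef
    return x_new
```

Because exactly one element enters and one leaves, the sparsity level of the iterate never changes, which is the property that distinguishes this family from methods that enlarge the support by k or 2k elements per step.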

62 | Hard thresholding pursuit: an algorithm for compressive sensing
- Foucart
Citation Context: ...opose a novel partial hard-thresholding operator that leads to a general family of iterative algorithms. While one extreme of the family yields well known hard thresholding algorithms like ITI and HTP [17, 10], the other end of the spectrum leads to a novel algorithm that we call Orthogonal Matching Pursuit with Replacement (OMPR). OMPR, like the classic greedy algorithm OMP, adds exactly one coordinate to... |

62 | Sparse recovery with orthogonal matching pursuit under RIP
- Zhang
Citation Context: ...ong theoretical guarantees. Indeed, it is known that, if run for k iterations, OMP cannot uniformly recover all k-sparse vectors assuming a RIP condition of the form δ_{2k} ≤ δ [22, 18]. However, Zhang [26] showed that OMP, if run for 30k iterations, recovers the optimal solution when δ_{31k} ≤ 1/3; a significantly more restrictive condition than the ones required by other methods like ℓ1-minimization. S... |

43 | Trading accuracy for sparsity in optimization problems with sparsity constraints - Shalev-Shwartz, Srebro, et al. - 2010 |

33 | On the impossibility of uniform sparse reconstruction using greedy methods. Sampling Theory
- Rauhut
- 2008
Citation Context: ...t unfortunately lacks strong theoretical guarantees. Indeed, it is known that, if run for k iterations, OMP cannot uniformly recover all k-sparse vectors assuming a RIP condition of the form δ_{2k} ≤ δ [22, 18]. However, Zhang [26] showed that OMP, if run for 30k iterations, recovers the optimal solution when δ_{31k} ≤ 1/3; a significantly more restrictive condition than the ones required by other methods li... |

28 | A note on guaranteed sparse recovery via ℓ1-minimization - Foucart - 2010 |

19 | Remarks on the restricted isometry property in orthogonal matching pursuit algorithm, 2011. preprint arXiv - Mo, Shen |

15 | Coherence analysis of iterative thresholding algorithms
- Maleki
- 2009
Citation Context: ...ative Soft Thresholding (IST) [17], Iterative Hard Thresholding (IHT) [1], Compressive Sampling Matching Pursuit (CoSaMP) [19], Subspace Pursuit (SP) [4], Iterative Thresholding with Inversion (ITI) [16], Hard Thresholding Pursuit (HTP) [10] and many others. In the family of iterative hard thresholding algorithms, we can identify two major subfamilies [17]: one- and two-stage algorithms. As their nam... |

6 | .878-approximation algorithms for MAX CUT and MAX 2SAT - Goemans, Williamson - 1994 |

2 | Similarity search in high dimensions using hashing
- Gionis, Indyk, et al.
- 1999
Citation Context: ...o r_t = Ax^t − b, i.e., this may be viewed as the similarity search task for queries of the form r_t and −r_t from a database of N vectors {A_1, ..., A_N}. To this end, we use locality sensitive hashing (LSH) [12], a well known data structure for approximate nearest neighbor retrieval. Note that while LSH is designed for nearest neighbor search (in terms of Euclidean distances) and in general might not have any... |

1 | Decoding by linear programming
- Candes, Tao
Citation Context: ...small number of measurements. It has important applications in imaging, computer vision and machine learning (see, for example, [9, 24, 14]). In this paper, we focus on the compressed sensing setting [3, 7] where we want to design a measurement matrix A ∈ R^{m×n} such that a sparse vector x* ∈ R^n with ‖x*‖_0 := |supp(x*)| ≤ k < n can be efficiently recovered from the measurements b = Ax* ∈ R^m. Initial ... |

1 | Subspace pursuit for compressive sensing signal reconstruction
- Dai, Milenkovic
Citation Context: ...e approaches have been proposed that include Iterative Soft Thresholding (IST) [17], Iterative Hard Thresholding (IHT) [1], Compressive Sampling Matching Pursuit (CoSaMP) [19], Subspace Pursuit (SP) [4], Iterative Thresholding with Inversion (ITI) [16], Hard Thresholding Pursuit (HTP) [10] and many others. In the family of iterative hard thresholding algorithms, we can identify two major subfamilies ... |

1 | Single-pixel imaging via compressive sampling
- Laska, Sun, et al.
- 2008
Citation Context: ...algorithms, such that almost all sparse signals can be recovered from a small number of measurements. It has important applications in imaging, computer vision and machine learning (see, for example, [9, 24, 14]). In this paper, we focus on the compressed sensing setting [3, 7] where we want to design a measurement matrix A ∈ R^{m×n} such that a sparse vector x* ∈ R^n with ‖x*‖_0 := |supp(x*)| ≤ k < n can be... |

1 | A note on guaranteed sparse recovery via ℓ1-minimization. Applied and Computational Harmonic Analysis - Foucart - 2010 |

1 | .878-approximation algorithms for MAX CUT and MAX 2SAT
- Goemans, Williamson
Citation Context: ...or our problem, we use the following hash function: h_u(a) = sign(u^T a), where u ~ N(0, I) is a random hyperplane generated from the standard multivariate Gaussian distribution. It can be shown that [13] Pr[h_u(a_1) = h_u(a_2)] = 1 − (1/π) cos⁻¹(a_1^T a_2 / (‖a_1‖ ‖a_2‖)). Now, an ℓ-bit hash key is created by randomly sampling hash functions h_{u_i}, i.e., g(a) = [h_{u_1}(a), h_{u_2}(a), ..., h_{u_ℓ}(a)], where each u_i is ... |
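The sign-of-random-projection hash and the collision probability quoted in this context can be checked empirically with a short numpy sketch (the `hash_key` helper is my own naming):

```python
import numpy as np

def hash_key(U, a):
    """g(a) = [h_{u_1}(a), ..., h_{u_l}(a)] with h_u(a) = sign(u^T a),
    packed as a tuple of bits usable as a dictionary key."""
    return tuple(bool(u @ a >= 0) for u in U)

rng = np.random.default_rng(0)
d, l = 16, 8
U = rng.standard_normal((l, d))  # l random hyperplanes u_i ~ N(0, I)
a1, a2 = rng.standard_normal(d), rng.standard_normal(d)

# Empirical check of Pr[h_u(a1) = h_u(a2)] = 1 - (1/pi) * angle(a1, a2)
trials = 40000
Us = rng.standard_normal((trials, d))
agree = np.mean((Us @ a1 >= 0) == (Us @ a2 >= 0))
theta = np.arccos(a1 @ a2 / (np.linalg.norm(a1) * np.linalg.norm(a2)))
predicted = 1.0 - theta / np.pi
```

Vectors with a small angle between them agree on most hash bits and therefore tend to land in the same bucket, which is what makes the key `g(a)` useful for the approximate nearest-neighbor lookups described above.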

1 | Guaranteed rank minimization via singular value projection
- Jain, Meka, et al.
- 2010
Citation Context: ...orithm at the other extreme of l = k has appeared at least three times in the recent literature: as Iterative (hard) Thresholding with Inversion (ITI) in [16], as SVP-Newton (in its matrix avatar) in [15], and as Hard Thresholding Pursuit (HTP) in [10]. Let us call it IHT-Newton as the least squares step can be viewed as a Newton step for the quadratic objective. The above general result for the OMPR... |

1 | Trading accuracy for sparsity in optimization problems with sparsity constraints
- Shalev-Shwartz, Srebro, et al.
- 2010
Citation Context: ...e RIP conditions (δ_{8k} < 0.1 as quoted in [14]) and the connection to locality sensitive hashing (see below) is not made. Another algorithm with replacement steps was studied by Shalev-Shwartz et al. [23]. However, the algorithm and the setting under which it is analyzed are different from ours. In this paper, we present and provide a unified analysis for a family of one-stage iterative hard threshold... |