Results 1 – 7 of 7
Signal recovery from random measurements via Orthogonal Matching Pursuit
IEEE Trans. Inform. Theory, 2007
Abstract

Cited by 292 (9 self)
Abstract. This technical report demonstrates theoretically and empirically that a greedy algorithm called Orthogonal Matching Pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal. This is a massive improvement over previous results for OMP, which require O(m^2) measurements. The new results for OMP are comparable with recent results for another algorithm called Basis Pursuit (BP). The OMP algorithm is faster and easier to implement, which makes it an attractive alternative to BP for signal recovery problems.
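The greedy scheme the abstract describes is simple enough to state in a few lines. Below is a minimal NumPy sketch of OMP, not the authors' reference code; the function name and the test setup are my own:

```python
import numpy as np

def omp(A, y, m):
    """Orthogonal Matching Pursuit (sketch).

    Greedily recovers an m-sparse vector x from measurements y = A @ x:
    at each step, select the column of A most correlated with the current
    residual, then re-fit all selected columns by least squares.
    """
    n, d = A.shape
    residual = y.astype(float).copy()
    support = []
    for _ in range(m):
        # Greedy selection: column with the largest absolute correlation.
        correlations = np.abs(A.T @ residual)
        correlations[support] = 0.0  # never reselect a column
        support.append(int(np.argmax(correlations)))
        # Orthogonal step: least-squares fit on the current support.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x_hat = np.zeros(d)
    x_hat[support] = coeffs
    return x_hat
```

In the regime the paper studies, a random Gaussian measurement matrix with on the order of m ln d rows suffices for this loop to find the true support with high probability.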
Signal recovery from partial information via Orthogonal Matching Pursuit
Submitted to IEEE Trans. Inform. Theory, 2005
Abstract

Cited by 149 (8 self)
Abstract. This article demonstrates theoretically and empirically that a greedy algorithm called Orthogonal Matching Pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal. This is a massive improvement over previous results for OMP, which require O(m^2) measurements. The new results for OMP are comparable with recent results for another algorithm called Basis Pursuit (BP). The OMP algorithm is much faster and much easier to implement, which makes it an attractive alternative to BP for signal recovery problems.
Subadditivity of the entropy and its relation to Brascamp–Lieb type inequalities. Preprint. Available at http://arxiv.org/abs/0710.0870
Abstract

Cited by 5 (3 self)
We prove a general duality result showing that a Brascamp–Lieb type inequality is equivalent to an inequality expressing subadditivity of the entropy, with a complete correspondence of best constants and cases of equality. This opens up a new approach to the proof of Brascamp–Lieb type inequalities, via subadditivity of the entropy. We illustrate the utility of this approach by proving a general inequality expressing the subadditivity property of the entropy on R^n, and fully determining the cases of equality. As a consequence of the duality mentioned above, we obtain a simple new proof of the classical Brascamp–Lieb inequality, and also a fully explicit determination of all of the cases of equality. We also deduce several other consequences of the general subadditivity inequality, including a generalization of Hadamard’s inequality for determinants. Finally, we also prove a second duality theorem relating superadditivity of the Fisher information and a sharp convolution type inequality for the fundamental eigenvalues of Schrödinger operators. Though we focus mainly on the case of random variables in R^n in this paper, we discuss extensions to other settings as well.
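For readers skimming the listing, the two sides of this duality can be stated schematically (my notation; the precise hypotheses on the matrices B_i and exponents c_i are in the paper):

```latex
% Brascamp--Lieb type inequality with best constant C:
\int_{\mathbb{R}^n} \prod_{i=1}^{k} f_i(B_i x)^{c_i}\,dx
  \;\le\; C \,\prod_{i=1}^{k} \Bigl( \int_{\mathbb{R}^{n_i}} f_i \Bigr)^{c_i}.

% Equivalent subadditivity of the entropy, with the same constant C:
h(X) \;\le\; \sum_{i=1}^{k} c_i\, h(B_i X) \;+\; \log C.
```

As a sanity check, taking the B_i to be the coordinate projections with c_i = 1 and C = 1 recovers the classical subadditivity h(X) ≤ Σ_i h(X_i).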
THE GEOMETRY OF EUCLIDEAN CONVOLUTION INEQUALITIES AND ENTROPY
Abstract
Abstract. The goal of this note is to show that some convolution type inequalities from Harmonic Analysis and Information Theory, such as Young’s convolution inequality (with sharp constant), Nelson’s hypercontractivity of the Hermite semigroup, or Shannon’s inequality, can be reduced to a simple geometric study of frames of R^2. We shall derive directly entropic inequalities, which were recently proved to be dual to the Brascamp–Lieb convolution type inequalities.
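As a reference point, the sharp form of Young's convolution inequality mentioned here (due to Beckner and Brascamp–Lieb) reads, in my transcription:

```latex
\|f * g\|_{r} \;\le\; \bigl(C_p\, C_q\, C_{r'}\bigr)^{n}\,
  \|f\|_{p}\,\|g\|_{q},
\qquad \frac{1}{p} + \frac{1}{q} = 1 + \frac{1}{r},
\qquad C_p^{2} = \frac{p^{1/p}}{p'^{\,1/p'}},
```

where p' denotes the conjugate exponent of p (and likewise q', r').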
A GENERALIZATION OF THE LÖWNER–JOHN ELLIPSOID THEOREM
, 2013
Abstract
Abstract. We address the following generalization P of the Löwner–John ellipsoid problem. Given a (not necessarily convex) compact set K ⊂ R^n and an even integer d ∈ N, find a homogeneous polynomial g of degree d such that K ⊂ G := {x : g(x) ≤ 1} and G has minimum volume among all such sets. We show that P is a convex optimization problem even if neither K nor G is convex! We next show that P has a unique optimal solution, and a characterization with at most (n+d−1 choose d) contact points in K ∩ G is also provided. This is the analogue for d > 2 of the Löwner–John theorem in the quadratic case d = 2, but importantly, we require neither the set K nor the sublevel set G to be convex. More generally, there is also a homogeneous polynomial g of even degree d and a point a ∈ R^n such that K ⊂ G_a := {x : g(x − a) ≤ 1} and G_a has minimum volume among all such sets (but uniqueness is not guaranteed). Finally, we also outline a numerical scheme to approximate as closely as desired the optimal value and an optimal solution. It consists of solving a hierarchy of convex optimization problems with strictly convex objective functions and Linear Matrix Inequality (LMI) constraints.
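In symbols, problem P as described above is (my transcription), together with the standard identity that makes the convexity claim plausible:

```latex
% Problem P: minimum-volume degree-d sublevel set containing K.
\mathbf{P}:\quad
\min_{g}\ \operatorname{vol}\bigl(\{x \in \mathbb{R}^n : g(x) \le 1\}\bigr)
\quad\text{s.t.}\quad g \ \text{homogeneous of degree } d,\quad
g(x) \le 1 \ \ \forall\, x \in K.

% For positive g homogeneous of degree d, the volume is a convex
% functional of the coefficients of g, via:
\operatorname{vol}\bigl(\{x : g(x) \le 1\}\bigr)
  \;=\; \frac{1}{\Gamma\!\bigl(1 + n/d\bigr)}
        \int_{\mathbb{R}^n} e^{-g(x)}\,dx.
```

A quick sanity check: for d = 2 and g(x) = |x|^2 the integral is π^{n/2}, and dividing by Γ(1 + n/2) gives the usual volume of the unit ball.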
SIGNAL RECOVERY FROM RANDOM MEASUREMENTS VIA ORTHOGONAL MATCHING PURSUIT: THE GAUSSIAN CASE
2007
Abstract
Abstract. This report demonstrates theoretically and empirically that a greedy algorithm called Orthogonal Matching Pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal. This is a massive improvement over previous results, which require O(m^2) measurements. The new results for OMP are comparable with recent results for another approach called Basis Pursuit (BP). In some settings, the OMP algorithm is faster and easier to implement, so it is an attractive alternative to BP for signal recovery problems.