Signal recovery from random measurements via Orthogonal Matching Pursuit. IEEE Trans. Inform. Theory, 2007.
Abstract. This technical report demonstrates theoretically and empirically that a greedy algorithm called Orthogonal Matching Pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal. This is a massive improvement over previous results for OMP, which require O(m²) measurements. The new results for OMP are comparable with recent results for another algorithm called Basis Pursuit (BP). The OMP algorithm is faster and easier to implement, which makes it an attractive alternative to BP for signal recovery problems.
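The abstract states the recovery guarantee but not the procedure. As an illustrative sketch (not code from the report; the function name and arguments are chosen here for exposition), the greedy loop behind OMP can be written as:

```python
import numpy as np

def omp(Phi, y, m):
    """Recover an m-sparse x from y = Phi @ x by Orthogonal Matching Pursuit."""
    d = Phi.shape[1]
    residual = y.astype(float).copy()
    support = []
    for _ in range(m):
        # Greedy step: pick the column most correlated with the residual.
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        # Orthogonalization step: least-squares fit on the chosen columns,
        # so the new residual is orthogonal to their span.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(d)
    x_hat[support] = coef
    return x_hat
```

With Gaussian measurement rows and a number of measurements a modest constant times m ln d, this loop typically identifies the true support in m iterations, which is the behavior the O(m ln d) result describes.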
Signal recovery from partial information via Orthogonal Matching Pursuit. Submitted to IEEE Trans. Inform. Theory, 2005.
Abstract. This article demonstrates theoretically and empirically that a greedy algorithm called Orthogonal Matching Pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal. This is a massive improvement over previous results for OMP, which require O(m²) measurements. The new results for OMP are comparable with recent results for another algorithm called Basis Pursuit (BP). The OMP algorithm is much faster and much easier to implement, which makes it an attractive alternative to BP for signal recovery problems.
Smallest singular value of random matrices and geometry of random polytopes. Adv. Math., 2005.
The Littlewood-Offord problem and invertibility of random matrices. Adv. Math.
Abstract. We prove two basic conjectures on the distribution of the smallest singular value of random n × n matrices with independent entries. Under minimal moment assumptions, we show that the smallest singular value is of order n^{-1/2}, which is optimal for Gaussian matrices. Moreover, we give an optimal estimate on the tail probability. This comes as a consequence of a new and essentially sharp estimate in the Littlewood-Offord problem: for i.i.d. random variables X_k and real numbers a_k, determine the probability p that the sum Σ_k a_k X_k lies near some number v. For arbitrary coefficients a_k of the same order of magnitude, we show that they essentially lie in an arithmetic progression of length 1/p.
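The n^{-1/2} scaling of the smallest singular value is easy to observe numerically. The following sketch (illustrative only, not from the paper; the function name is chosen here) estimates the median of √n · s_min(A) for Gaussian A, which should stay of constant order as n grows:

```python
import numpy as np

def scaled_smin_median(n, trials=50, seed=0):
    """Median of sqrt(n) * s_min over `trials` random n x n Gaussian matrices."""
    rng = np.random.default_rng(seed)
    vals = []
    for _ in range(trials):
        A = rng.standard_normal((n, n))
        # Singular values are returned in decreasing order; take the last one.
        vals.append(np.sqrt(n) * np.linalg.svd(A, compute_uv=False)[-1])
    return float(np.median(vals))
```

If s_min were of a different order than n^{-1/2}, this scaled median would drift toward 0 or infinity with n; instead it hovers around a constant, consistent with the stated result.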
Non-asymptotic theory of random matrices: extreme singular values. Proceedings of the International Congress of Mathematicians, 2010.
On the singularity probability of random Bernoulli matrices, to appear. Department of Mathematics, UCSD, La Jolla, CA 92093 (kcostell@ucsd.edu); Department of Mathematics, Rutgers, Piscataway, NJ 08854 (vanvu@math.rutgers.edu).
Abstract. Let n be a large integer and M_n be a random n by n matrix whose entries are i.i.d. Bernoulli random variables (each entry is ±1 with probability 1/2). We show that the probability that M_n is singular is at most (3/4 + o(1))^n, improving an earlier estimate of Kahn, Komlós and Szemerédi [11], as well as earlier work by the authors [17]. The key new ingredient is the application of Freiman-type inverse theorems and other tools from additive combinatorics.
Inverse Littlewood-Offord theorems and the condition number of random discrete matrices. Annals of Mathematics.
Abstract. Consider a random sum η_1 v_1 + ... + η_n v_n, where η_1, ..., η_n are i.i.d. random signs and v_1, ..., v_n are integers. The Littlewood-Offord problem asks to maximize concentration probabilities such as P(η_1 v_1 + ... + η_n v_n = 0) subject to various hypotheses on the v_1, ..., v_n. In this paper we develop an inverse Littlewood-Offord theory (somewhat in the spirit of Freiman's inverse theory) in additive combinatorics, which starts with the hypothesis that a concentration probability is large, and concludes that almost all of the v_1, ..., v_n are efficiently contained in a generalized arithmetic progression. As an application we give a new bound on the magnitude of the least singular value of a random Bernoulli matrix, which in turn provides upper tail estimates on the condition number.
Complexity measures of sign matrices. In Proceedings of the 39th ACM Symposium on the Theory of Computing. ACM, 2007.
In this paper we consider four previously known parameters of sign matrices from a complexity-theoretic perspective. The main technical contributions are tight (or nearly tight) inequalities that we establish among these parameters. Several new open problems are raised as well.
Invertibility of random matrices: Norm of the inverse. Annals of Mathematics.
Abstract. Let A be an n × n matrix whose entries are independent copies of a centered random variable satisfying the subgaussian tail estimate. We prove that the operator norm of A^{-1} does not exceed Cn^{3/2} with probability close to 1. 1. Introduction. Let A be an n × n matrix whose entries are independent identically distributed random variables. The spectral properties of such matrices, in particular invertibility, have been extensively studied (see, e.g., [M] and the survey [DS]). While A is almost surely invertible whenever its ...
Random matrices: The circular law. 2008.
Let x be a complex random variable with mean zero and bounded variance σ². Let N_n be a random matrix of order n with entries being i.i.d. copies of x. Let λ_1, ..., λ_n be the eigenvalues of ...
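As an illustrative numerical companion (not part of the paper; the function name is chosen here), one basic consequence of the circular law is easy to check: after scaling N_n by 1/(σ√n), the eigenvalues fill the unit disk, so the spectral radius is close to 1 for large n. A sketch with Gaussian entries (σ = 1):

```python
import numpy as np

def scaled_spectral_radius(n, seed=0):
    """Largest |eigenvalue| of (1/sqrt(n)) * N_n, with N_n having i.i.d. N(0,1) entries."""
    rng = np.random.default_rng(seed)
    N = rng.standard_normal((n, n))
    eigenvalues = np.linalg.eigvals(N / np.sqrt(n))
    return float(np.max(np.abs(eigenvalues)))
```

Plotting the scaled eigenvalues in the complex plane for n of a few hundred shows them spread nearly uniformly over the unit disk, which is the picture the circular law makes precise.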