Results 1–10 of 79
COMBINING GEOMETRY AND COMBINATORICS: A UNIFIED APPROACH TO SPARSE SIGNAL RECOVERY
Abstract

Cited by 161 (15 self)
Abstract. There are two main algorithmic approaches to sparse signal recovery: geometric and combinatorial. The geometric approach starts with a geometric constraint on the measurement matrix Φ and then uses linear programming to decode information about x from Φx. The combinatorial approach constructs Φ and a combinatorial decoding algorithm to match. We present a unified approach to these two classes of sparse signal recovery algorithms. The unifying elements are the adjacency matrices of high-quality unbalanced expanders. We generalize the notion of the Restricted Isometry Property (RIP), crucial to compressed sensing results for signal recovery, from the Euclidean norm to the ℓp norm for p ≈ 1, and then show that unbalanced expanders are essentially equivalent to RIP-p matrices. From known deterministic constructions for such matrices, we obtain new deterministic measurement matrix constructions and algorithms for signal recovery which, compared to previous deterministic algorithms, are superior in either the number of measurements or in noise tolerance.
Extractors: Optimal up to Constant Factors
 STOC'03
, 2003
Abstract

Cited by 55 (12 self)
This paper provides the first explicit construction of extractors which are simultaneously optimal up to constant factors in both seed length and output length. More precisely, for every n, k, our extractor uses a random seed of length O(log n) to transform any random source on n bits with min-entropy k into a distribution on (1 − α)k bits that is ɛ-close to uniform. Here α and ɛ can be taken to be any positive constants. (In fact, ɛ can be almost polynomially small.) Our improvements are obtained via three new techniques, each of which may be of independent interest. The first is a general construction of mergers [22] from locally decodable error-correcting codes. The second introduces new condensers that have constant seed length (and retain a constant fraction of the min-entropy in the random source). The third is a way to augment the “win-win repeated condensing” paradigm of [17] with error reduction techniques like those of [15], so that our constant seed-length condensers can be used without error accumulation.
Compressive Sensing
, 2010
Abstract

Cited by 50 (13 self)
Compressive sensing is a new type of sampling theory, which predicts that sparse signals and images can be reconstructed from what was previously believed to be incomplete information. As a main feature, efficient algorithms such as ℓ1-minimization can be used for recovery. The theory has many potential applications in signal processing and imaging. This chapter gives an introduction to and an overview of both theoretical and numerical aspects of compressive sensing.
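As a small illustration of the recovery principle this abstract describes (the code and problem sizes are our own, not taken from the chapter), the following Python sketch recovers a sparse vector from incomplete Gaussian measurements by casting ℓ1-minimization (basis pursuit) as a linear program via the standard split x = u − v with u, v ≥ 0:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, s = 40, 100, 5  # measurements, ambient dimension, sparsity (toy sizes)

# Gaussian measurement matrix; satisfies the RIP with high probability
Phi = rng.standard_normal((m, n)) / np.sqrt(m)

# An s-sparse signal to recover
x = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x[support] = rng.standard_normal(s)
y = Phi @ x  # m << n "incomplete" measurements

# Basis pursuit: min ||x||_1 s.t. Phi x = y, as an LP via x = u - v, u, v >= 0
# (linprog's default bounds are already u, v >= 0)
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([Phi, -Phi]), b_eq=y)
x_hat = res.x[:n] - res.x[n:]

print("recovery error:", np.linalg.norm(x_hat - x))
```

With these sizes the number of measurements is well above the ℓ1 phase transition for 5-sparse signals, so the linear program typically returns x exactly (up to solver tolerance).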
Extractors for a constant number of polynomially small min-entropy independent sources
 In Proceedings of the 38th Annual ACM Symposium on Theory of Computing
, 2006
Abstract

Cited by 48 (9 self)
We consider the problem of randomness extraction from independent sources. We construct an extractor that can extract from a constant number of independent sources of length n, each of which has min-entropy n^γ for an arbitrarily small constant γ > 0. Our extractor is obtained by composing seeded extractors in simple ways. We introduce a new technique to condense independent somewhere-random sources, which looks like a useful way to manipulate independent sources. Our techniques are different from those used in recent work [BIW04, BKS+05, Raz05, Bou05] for this problem in the sense that they do not rely on any results from additive number theory. Using Bourgain’s extractor [Bou05] as a black box, we obtain a new extractor for 2 independent block-sources with few blocks, even when the min-entropy is as small as polylog(n). We also show how to modify the 2-source disperser for linear min-entropy of Barak et al. [BKS+05] and the 3-source extractor of Raz [Raz05] to get dispersers/extractors with exponentially small error and linear output length, where previously both were constant. In terms of Ramsey hypergraphs, for every constant 1 > γ > 0 our construction gives a family of explicit O(1/γ)-uniform hypergraphs on N vertices that avoid cliques and independent sets of size 2^{(log N)^γ}.
Simulating Independence: New Constructions of Condensers, Ramsey Graphs, Dispersers, and Extractors
 In Proceedings of the 37th Annual ACM Symposium on Theory of Computing
, 2005
Abstract

Cited by 47 (10 self)
We present new explicit constructions of deterministic randomness extractors, dispersers and related objects. More precisely, a distribution X over binary strings of length n is called a δ-source if it assigns probability at most 2^{−δn} to any string of length n, and for any δ > 0 we construct the following poly(n)-time computable functions: 2-source disperser: D: ({0,1}^n)^2 → {0,1} such that for any two independent δ-sources X1, X2 we have that the support of D(X1, X2) is {0,1}. Bipartite Ramsey graph: Let N = 2^n. A corollary is that the function D is a 2-coloring of the edges of K_{N,N} (the complete bipartite graph over two sets of N vertices) such that any induced subgraph of size N^δ by N^δ is not monochromatic. 3-source extractor: E: ({0,1}^n)^3 → {0,1} such that for any three independent δ-sources X1, X2, X3 we have that E(X1, X2, X3) is (o(1)-close to being) an unbiased random bit. No previous explicit construction was known for either of these for any δ < 1/2, and these results constitute major progress on long-standing open problems. A component in these results is a new construction of condensers that may be of independent interest.
Cryptography in NC0
, 2006
Abstract

Cited by 47 (11 self)
We study the parallel time complexity of basic cryptographic primitives such as one-way functions (OWFs) and pseudorandom generators (PRGs). Specifically, we study the possibility of implementing instances of these primitives by NC0 functions, namely by functions in which each output bit depends on a constant number of input bits. Despite previous efforts in this direction, there has been no convincing theoretical evidence supporting this possibility, which was posed as an open question in several previous works. We essentially settle this question by providing strong positive evidence for the possibility of cryptography in NC0. Our main result is that every “moderately easy” OWF (resp., PRG), say computable in NC1, can be compiled into a corresponding OWF (resp., “low-stretch” PRG) in which each output bit depends on at most 4 input bits. The existence of OWFs and PRGs in NC1 is a relatively mild assumption, implied by most number-theoretic or algebraic intractability assumptions commonly used in cryptography. A similar compiler can also be obtained for other cryptographic primitives such as one-way permutations, encryption, signatures, commitment, and collision-resistant hashing. Our techniques can also be applied to obtain (unconditional) constructions of “non-cryptographic” PRGs. In particular, we obtain ɛ-biased generators and a PRG for space-bounded computation in which each output bit depends on only 3 input bits. Our results make use of the machinery of randomizing polynomials (Ishai and Kushilevitz, 41st FOCS, 2000), which was originally motivated by questions in the domain of information-theoretic secure multiparty computation.
Extracting Randomness via Repeated Condensing
 In Proceedings of the 41st Annual IEEE Symposium on Foundations of Computer Science
, 2000
Abstract

Cited by 46 (14 self)
On an input probability distribution with some min-entropy, an extractor outputs a distribution with a (near) maximum entropy rate (namely the uniform distribution). A natural weakening of this concept is a condenser, whose output distribution has a higher entropy rate than the input distribution (without losing much of the initial entropy). In this paper we construct efficient explicit condensers. The condenser constructions combine (variants or more efficient versions of) ideas from several works, including the block extraction scheme of [NZ96], the observation made in [SZ94, NT99] that a failure of the block extraction scheme is also useful, the recursive “win-win” case analysis of [ISW99, ISW00], and the error correction of random sources used in [Tre99]. As a natural byproduct (via repeated iteration of condensers), we obtain new extractor constructions. The new extractors give significant qualitative improvements over previous ones for sources of arbitrary min-entropy; they...
2-source dispersers for subpolynomial entropy and Ramsey graphs beating the Frankl–Wilson construction
 Proceedings of STOC06
, 2006
Abstract

Cited by 29 (6 self)
The main result of this paper is an explicit disperser for two independent sources on n bits, each of entropy k = n^{o(1)}. Put differently, setting N = 2^n and K = 2^k, we construct explicit N × N Boolean matrices for which no K × K submatrix is monochromatic. Viewed as adjacency matrices of bipartite graphs, this gives an explicit construction of K-Ramsey bipartite graphs of size N. This greatly improves the previous bound of k = o(n) of Barak, Kindler, Shaltiel, Sudakov and Wigderson [4]. It also significantly improves the 25-year record of k = Õ(√n) on the special case of Ramsey graphs, due to Frankl and Wilson [9]. The construction uses (besides “classical” extractor ideas) almost all of the machinery developed in the last couple of years for extraction from independent sources, including:
• Bourgain’s extractor for 2 independent sources of some entropy rate < 1/2 [5]
• Raz’s extractor for 2 independent sources, one of which has any entropy rate > 1/2 [18]
• Rao’s extractor for 2 independent block-sources of entropy n^{Ω(1)} [17]
Dense error correction via ℓ1 minimization
, 2009
Abstract

Cited by 22 (5 self)
This paper studies the problem of recovering a non-negative sparse signal x ∈ R^n from highly corrupted linear measurements y = Ax + e ∈ R^m, where e is an unknown error vector whose nonzero entries may be unbounded. Motivated by an observation from face recognition in computer vision, this paper proves that for highly correlated (and possibly overcomplete) dictionaries A, any non-negative, sufficiently sparse signal x can be recovered by solving an ℓ1-minimization problem: min ‖x‖1 + ‖e‖1 subject to y = Ax + e. More precisely, if the fraction ρ of errors is bounded away from one and the support of x grows sublinearly in the dimension m of the observation, then as m goes to infinity, the above ℓ1-minimization succeeds for all signals x and almost all sign-and-support patterns of e. This result suggests that accurate recovery of sparse signals is possible and computationally feasible even with nearly 100% of the observations corrupted. The proof relies on a careful characterization of the faces of a convex polytope spanned together by the standard cross-polytope and a set of i.i.d. Gaussian vectors with nonzero mean and small variance, which we call the “cross-and-bouquet” model. Simulations and experimental results corroborate the findings, and suggest extensions to the result.
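The extended ℓ1 program in this abstract can also be sketched numerically. The toy instance below is our own (it uses a random Gaussian A rather than the correlated dictionaries the paper's theorem actually addresses, and small hand-picked sizes): it recovers a non-negative sparse x together with a sparse, large-magnitude error vector e from y = Ax + e by linear programming, splitting only e as e = p − q since x is constrained non-negative.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
m, n = 100, 10  # observation dimension, signal dimension (toy sizes)
A = rng.standard_normal((m, n))

# Non-negative sparse signal and a sparse error with large (unbounded) entries
x = np.zeros(n)
x[[1, 4, 7]] = [2.0, 0.5, 1.3]
e = np.zeros(m)
bad = rng.choice(m, size=10, replace=False)
e[bad] = 50.0 * rng.standard_normal(10)  # gross corruptions on 10% of entries

y = A @ x + e

# min ||x||_1 + ||e||_1  s.t.  y = A x + e,  with x >= 0 and e = p - q, p, q >= 0
# (linprog's default bounds already enforce non-negativity of all variables)
c = np.ones(n + 2 * m)
A_eq = np.hstack([A, np.eye(m), -np.eye(m)])
res = linprog(c, A_eq=A_eq, b_eq=y)
x_hat = res.x[:n]
e_hat = res.x[n:n + m] - res.x[n + m:]

print("signal recovery error:", np.linalg.norm(x_hat - x))
```

With 100 observations, a 3-sparse non-negative signal, and gross errors on 10% of the entries, the LP typically returns the true (x, e) pair exactly up to solver tolerance.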