Results 1–10 of 112
Pseudo-Random Generation from One-Way Functions
 Proc. 20th STOC
, 1988
Abstract

Cited by 725 (21 self)
Pseudorandom generators are fundamental to many theoretical and applied aspects of computing. We show how to construct a pseudorandom generator from any one-way function. Since it is easy to construct a one-way function from a pseudorandom generator, this result shows that there is a pseudorandom generator iff there is a one-way function.
The NP-completeness column: an ongoing guide
 Journal of Algorithms
, 1985
Abstract

Cited by 188 (0 self)
This is the nineteenth edition of a (usually) quarterly column that covers new developments in the theory of NP-completeness. The presentation is modeled on that used by M. R. Garey and myself in our book "Computers and Intractability: A Guide to the Theory of NP-Completeness," W. H. Freeman & Co., New York, 1979 (hereinafter referred to as "[G&J]"; previous columns will be referred to by their dates). A background equivalent to that provided by [G&J] is assumed, and, when appropriate, cross-references will be given to that book and the list of problems (NP-complete and harder) presented there. Readers who have results they would like mentioned (NP-hardness, PSPACE-hardness, polynomial-time solvability, etc.) or open problems they would like publicized, should ...
Reductions in Streaming Algorithms, with an Application to Counting Triangles in Graphs
Abstract

Cited by 116 (5 self)
We introduce reductions in the streaming model as a tool in the design of streaming algorithms. We develop the concept of list-efficient streaming algorithms that are essential to the design of efficient streaming algorithms through reductions. Our results include a suite of list-efficient streaming algorithms for basic statistical primitives. Using the reduction paradigm along with these tools, we design streaming algorithms for approximately counting the number of triangles in a graph presented as a stream. A specific highlight of our work is the first algorithm for the number of distinct elements in a data stream that achieves arbitrary approximation factors. (Independently, Trevisan [Tre01] has solved this problem via a different approach; our algorithm has the advantage of being list-efficient.)
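The distinct-elements primitive mentioned above can be illustrated with a standard k-minimum-values sketch. This is not the paper's list-efficient algorithm, only a minimal sketch of the general idea: hash every stream item into (0, 1], keep the k smallest distinct hash values, and read off an estimate from how tightly packed they are.

```python
import hashlib


def kmv_estimate(stream, k=64):
    """Estimate the number of distinct elements in a one-pass stream.

    Keeps the k smallest distinct hash values; if the k-th smallest
    is v, the stream has roughly (k - 1) / v distinct elements.
    """
    def h(x):
        # Deterministic hash of the item into (0, 1].
        d = hashlib.md5(str(x).encode()).digest()
        return (int.from_bytes(d[:8], "big") + 1) / 2**64

    smallest = set()
    for item in stream:
        smallest.add(h(item))
        if len(smallest) > k:
            smallest.discard(max(smallest))
    if len(smallest) < k:
        return len(smallest)  # saw fewer than k distinct items: exact
    return int((k - 1) / max(smallest))


# 10,000 stream items, but only 1,000 distinct values.
stream = [i % 1000 for i in range(10_000)]
est = kmv_estimate(stream)
assert 500 <= est <= 2000  # within a factor of 2 of the true count
```

With k kept values the relative error is roughly 1/sqrt(k), so larger k trades space for accuracy; the paper's contribution is achieving arbitrary approximation factors list-efficiently, which this simple sketch does not attempt.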
On The Power Of Two-Point Based Sampling
 Journal of Complexity
, 1989
Abstract

Cited by 90 (16 self)
The purpose of this note is to present a new sampling technique and to demonstrate some of its properties. The new technique consists of picking two elements at random, and deterministically generating (from them) a long sequence of pairwise independent elements. The sequence is guaranteed to intersect, with high probability, any set of non-negligible density.

1. Introduction

In recent years the role of randomness in computation has become more and more dominant. Randomness was used to speed up sequential computations (e.g. primality testing, testing polynomial identities, etc.), but its effect on parallel and distributed computation is even more impressive. In either case the solutions are typically presented such that they are guaranteed to produce the desired result with some non-negligible probability. It is implicitly suggested that if a higher degree of confidence is required, the algorithm should be run several times, each time using different coin tosses. Since the coin tosses f...
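The construction described above has a standard instance over a prime field: expand two random seeds a, b in Z_p into the sequence x_i = (a + i*b) mod p, which is pairwise independent. A minimal sketch under that assumption (the note's exact generator may differ):

```python
import itertools


def two_point_sequence(a, b, length, p):
    """From two seeds a, b in Z_p, deterministically expand a
    pairwise-independent sequence x_i = (a + i*b) mod p."""
    return [(a + i * b) % p for i in range(length)]


# Pairwise independence: for fixed indices i != j (mod p, p prime),
# the map (a, b) -> (x_i, x_j) is a bijection on Z_p x Z_p, so every
# value pair is equally likely over the random choice of (a, b).
p = 7
i, j = 1, 3
pairs = set()
for a, b in itertools.product(range(p), repeat=2):
    seq = two_point_sequence(a, b, 5, p)
    pairs.add((seq[i], seq[j]))
assert len(pairs) == p * p  # all 49 value pairs occur exactly once
```

Because each element of a set S of density d is hit with probability d, and the hits for distinct indices are pairwise independent, Chebyshev's inequality gives the high-probability intersection claimed in the abstract while spending only two random elements.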
The Complexity of Perfect Zero-Knowledge
, 1987
Abstract

Cited by 86 (3 self)
A Perfect Zero-Knowledge interactive proof system convinces a verifier that a string is in a language without revealing any additional knowledge in an information-theoretic sense. We show that for any language that has a perfect zero-knowledge proof system, its complement has a short interactive protocol. This result implies that there are no perfect zero-knowledge protocols for NP-complete languages unless the polynomial-time hierarchy collapses. This paper demonstrates that knowledge complexity can be used to show that a language is easy to prove.

1 Introduction

Interactive protocols and zero-knowledge, as described by Goldwasser, Micali and Rackoff [GMR], have in recent years proven themselves to be important models of computation in both complexity and cryptography. Interactive proof systems are a randomized extension of NP which give us a greater understanding of what an infinitely powerful machine can prove to a probabilistic polynomial-time one. Recent results about interactive...
Oracles and Queries that are Sufficient for Exact Learning
 Journal of Computer and System Sciences
, 1996
Abstract

Cited by 83 (5 self)
We show that the class of all circuits is exactly learnable in randomized expected polynomial time using weak subset and weak superset queries. This is a consequence of the following result which we consider to be of independent interest: circuits are exactly learnable in randomized expected polynomial time with equivalence queries and the aid of an NP-oracle. We also show that circuits are exactly learnable in deterministic polynomial time with equivalence queries and a Σ_3 oracle. The hypothesis class for the above learning algorithms is the class of circuits of larger (but polynomially related) size. Also, the algorithms can be adapted to learn the class of DNF formulas with a hypothesis class consisting of depth-3 formulas (by the work of Angluin [A90], this is optimal in the sense that the hypothesis class cannot be reduced to DNF formulas, i.e. depth-2 formulas).
Statistical Zero-Knowledge Languages Can Be Recognized in Two Rounds
 Journal of Computer and System Sciences
, 1991
Abstract

Cited by 64 (2 self)
Recently, a hierarchy of probabilistic complexity classes generalizing NP has emerged in the work of Babai [B], Goldwasser, Micali, and Rackoff [GMR1], and Goldwasser and Sipser [GS]. The class IP is defined through the computational model of an interactive prover-verifier pair. Both Turing machines in a pair receive a common input and exchange messages. Every move of the verifier, as well as its final determination of whether to accept or reject w, is the result of random polynomial-time computations on the input and all messages sent so far. The prover has no resource bounds. A language L is in IP if there is a prover-verifier pair such that: 1) when w ∈ L, the verifier accepts with probability at least 1 − 2^{−|w|}, and 2) when w ∉ L, the verifier interacting with any prover accepts with probability at most 2^{−|w|}. Such a prover-verifier pair is called an interactive proof for L. In addition to defining interactive proofs, Goldwasser, Micali, and Rackoff...
The Computational Complexity of Universal Hashing
 Theoretical Computer Science
, 2002
Abstract

Cited by 58 (3 self)
Any implementation of Carter-Wegman universal hashing from n-bit strings to m-bit strings requires a time-space tradeoff of TS = Ω(nm). The bound holds in the general boolean branching program model, and thus in essentially any model of computation. As a corollary, computing a+b*c in any field F requires a quadratic time-space tradeoff, and the bound holds for any representation of the elements of the field. Other lower bounds on the...
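The family whose implementations the lower bound constrains is the classic Carter-Wegman construction h_{a,b}(x) = ((a*x + b) mod p) mod m, with p prime and 1 <= a < p. A minimal sketch that exhaustively checks the defining universality property on a small instance:

```python
import itertools


def carter_wegman(a, b, p, m):
    """One function from the Carter-Wegman universal family
    h_{a,b}(x) = ((a*x + b) mod p) mod m, p prime, 1 <= a < p."""
    return lambda x: ((a * x + b) % p) % m


# Universality: for any fixed x != y, at most a 1/m fraction of the
# (a, b) seed pairs cause a collision. Check exhaustively for p=13, m=4.
p, m = 13, 4
x, y = 3, 7
collisions = sum(
    1
    for a, b in itertools.product(range(1, p), range(p))
    if carter_wegman(a, b, p, m)(x) == carter_wegman(a, b, p, m)(y)
)
total = (p - 1) * p  # 156 seed pairs in the family
assert collisions <= total // m  # at most 39 of 156 pairs collide
```

The lower bound above says that no matter how cleverly such a function is implemented, evaluating it on n-bit inputs with m-bit outputs costs TS = Ω(nm); the sketch only illustrates what the family computes, not an optimal implementation.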