Results 1–10 of 38
Multiparty Communication Complexity
, 1989
Abstract

Cited by 614 (21 self)
A given Boolean function has its input distributed among many parties. The aim is to determine which parties to talk to and what information to exchange with each of them in order to evaluate the function while minimizing the total communication. This paper shows that it is possible to obtain the Boolean answer deterministically with only a polynomial increase in communication with respect to the information lower bound given by the nondeterministic communication complexity of the function.
Complexity Measures and Decision Tree Complexity: A Survey
 Theoretical Computer Science
, 2000
Abstract

Cited by 122 (15 self)
We discuss several complexity measures for Boolean functions: certificate complexity, sensitivity, block sensitivity, and the degree of a representing or approximating polynomial. We survey the relations and biggest gaps known between these measures, and show how they give bounds for the decision tree complexity of Boolean functions on deterministic, randomized, and quantum computers.

1 Introduction. Computational Complexity is the subfield of Theoretical Computer Science that aims to understand "how much" computation is necessary and sufficient to perform certain computational tasks. For example, given a computational problem, it tries to establish tight upper and lower bounds on the length of the computation (or on other resources, like space). Unfortunately, for many practically relevant computational problems no tight bounds are known. An illustrative example is the well-known P versus NP problem: for all NP-complete problems the current upper and lower bounds lie exponentially ...
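Two of the measures the survey relates, sensitivity and block sensitivity, can be computed by brute force for small functions. The sketch below is our own illustration (not code from the survey); inputs are bitmasks in {0,1}^n and `f` maps a mask to 0 or 1.

```python
def sensitivity(f, n):
    """Max over inputs x of the number of single bits whose flip changes f(x)."""
    return max(sum(1 for i in range(n) if f(x) != f(x ^ (1 << i)))
               for x in range(2 ** n))

def block_sensitivity(f, n):
    """Max over x of the size of the largest family of pairwise-disjoint
    blocks of bits whose joint flip changes f(x).  Exhaustive search, so
    only practical for small n."""
    def max_disjoint(blocks, used=0):
        best = 0
        for i, b in enumerate(blocks):
            if b & used == 0:
                best = max(best, 1 + max_disjoint(blocks[i + 1:], used | b))
        return best
    best = 0
    for x in range(2 ** n):
        sensitive = [b for b in range(1, 2 ** n) if f(x ^ b) != f(x)]
        best = max(best, max_disjoint(sensitive))
    return best

OR3 = lambda x: int(x != 0)   # OR of 3 bits
print(sensitivity(OR3, 3), block_sensitivity(OR3, 3))   # -> 3 3
```

For OR the two measures coincide (every bit of the all-zeros input is sensitive); the survey's interest is in functions where they diverge.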
Dispersers, Deterministic Amplification, and Weak Random Sources.
, 1989
Abstract

Cited by 93 (11 self)
We use a certain type of expanding bipartite graphs, called disperser graphs, to design procedures for picking highly correlated samples from a finite set, with the property that the probability of hitting any sufficiently large subset is high. These procedures require a relatively small number of random bits and are robust with respect to the quality of the random bits. Using these sampling procedures to sample random inputs of polynomial time probabilistic algorithms, we can simulate the performance of some probabilistic algorithms with fewer random bits or with low quality random bits. We obtain the following results: 1. The error probability of an RP or BPP algorithm that operates with a constant error bound and requires n random bits can be made exponentially small (i.e., 2^{-n}) with only (3 + ε)n random bits, as opposed to standard amplification techniques that require Ω(n^2) random bits for the same task. This result is nearly optimal, since the informati...
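For context, the "standard amplification" baseline the abstract contrasts against is independent repetition with a majority vote, whose exact error is easy to compute. This is our own illustration of that baseline, not the disperser construction itself.

```python
from math import comb

def majority_error(p_err, k):
    """Exact probability that the majority vote over k independent runs is
    wrong, when each run errs independently with probability p_err (k odd)."""
    assert k % 2 == 1
    return sum(comb(k, i) * p_err ** i * (1 - p_err) ** (k - i)
               for i in range(k // 2 + 1, k + 1))

# A BPP-style algorithm with error 1/3: majority over k runs drives the error
# down exponentially in k, but each run consumes n fresh random bits, so the
# total cost is k*n bits -- versus (3 + eps)n bits with dispersers.
print(majority_error(1/3, 1))     # 0.333...
print(majority_error(1/3, 101))   # far below the single-run error
```

Driving the error down to 2^{-n} this way needs k = Θ(n) repetitions, hence the Ω(n^2) random bits mentioned above.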
Polynomial degree vs. quantum query complexity
 Proceedings of FOCS’03
Abstract

Cited by 57 (8 self)
The degree of a polynomial representing (or approximating) a function f is a lower bound for the quantum query complexity of f. This observation has been a source of many lower bounds on quantum algorithms. It has been an open problem whether this lower bound is tight. We exhibit a function with polynomial degree M and quantum query complexity Ω(M^{1.321...}). This is the first superlinear separation between polynomial degree and quantum query complexity. The lower bound is shown by a new, more general version of the quantum adversary method.
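The degree in question is that of the unique multilinear polynomial agreeing with f on {0,1}^n; for small n it can be computed exactly by Möbius inversion over subsets of variables. This brute-force sketch is our own illustration, not code from the paper.

```python
def degree(f, n):
    """Degree of the unique multilinear polynomial representing f on {0,1}^n:
    coefficient of monomial S is sum over submasks T of S of (-1)^{|S\\T|} f(T)."""
    deg = 0
    for S in range(2 ** n):                 # S is a bitmask of variables
        coeff, T = 0, S
        while True:                         # enumerate all submasks T of S
            sign = -1 if bin(S ^ T).count('1') % 2 else 1
            coeff += sign * f(T)
            if T == 0:
                break
            T = (T - 1) & S
        if coeff != 0:
            deg = max(deg, bin(S).count('1'))
    return deg

OR2     = lambda x: int(x != 0)              # x0 OR x1  -> 1 - (1-x0)(1-x1)
PARITY3 = lambda x: bin(x).count('1') % 2    # x0 XOR x1 XOR x2
print(degree(OR2, 2), degree(PARITY3, 3))    # -> 2 3
```

Both OR and parity have full degree n; the paper's point is that quantum query complexity can exceed even this degree superlinearly.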
Finding a Better-than-Classical Quantum AND/OR Algorithm using Genetic Programming
, 1999
Abstract

Cited by 32 (3 self)
This paper documents the discovery of a new, better-than-classical quantum algorithm for the depth-two AND/OR tree problem. We describe the genetic programming system that was constructed specifically for this work, the quantum computer simulator that is used to evaluate the fitness of evolving quantum algorithms, and the newly discovered algorithm.

1 Introduction. Quantum computers use the dynamics of atomic-scale objects to store and manipulate information. The behavior of atomic-scale objects is governed by quantum mechanics rather than by classical physics, and the quantum mechanical properties of these systems can be harnessed to compute certain functions more efficiently than is possible on any classical computer [1]. For example, Shor's quantum factoring algorithm finds the prime factors of an n-digit number in time O(n^2 log(n) log log(n)) [2], while the best known classical factoring algorithms require time O(2^{n^{1/3} log(n)^{2/3}}) and many researchers doubt the existence...
All quantum adversary methods are equivalent
 THEORY OF COMPUTING
, 2006
Abstract

Cited by 29 (5 self)
The quantum adversary method is one of the most versatile lower-bound methods for quantum algorithms. We show that all known variants of this method are equivalent: spectral adversary (Barnum, Saks, and Szegedy, 2003), weighted adversary (Ambainis, 2003), strong weighted adversary (Zhang, 2005), and the Kolmogorov complexity adversary (Laplante and Magniez, 2004). We also present a few new equivalent formulations of the method. This shows that there is essentially one quantum adversary method. From our approach, all known limitations of these versions of the quantum adversary method easily follow.
Any AND-OR formula of size N can be evaluated in time N^{1/2+o(1)} on a quantum computer
 In Proceedings of 48th IEEE FOCS
, 2007
Abstract

Cited by 23 (12 self)
For any AND-OR formula of size N, there exists a bounded-error N^{1/2+o(1)}-time quantum algorithm, based on a discrete-time quantum walk, that evaluates this formula on a black-box input. Balanced, or “approximately balanced,” formulas can be evaluated in O(√N) queries, which is optimal. It follows that the (2 − o(1))th power of the quantum query complexity is a lower bound on the formula size, almost solving in the positive an open problem posed by Laplante, Lee and Szegedy.
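As a classical point of comparison (our own sketch, not the paper's algorithm): a recursive evaluator of an AND-OR formula on a black-box input may read up to all N leaves, whereas the quantum walk above needs only N^{1/2+o(1)} time.

```python
def eval_formula(node, leaves):
    """node is ('LEAF', i), ('AND', children), or ('OR', children);
    leaves is the black-box input, indexed by leaf number."""
    kind, payload = node
    if kind == 'LEAF':
        return bool(leaves[payload])
    vals = (eval_formula(child, leaves) for child in payload)
    return all(vals) if kind == 'AND' else any(vals)  # short-circuits

# Depth-two AND of ORs on N = 4 leaves:
tree = ('AND', [('OR', [('LEAF', 0), ('LEAF', 1)]),
                ('OR', [('LEAF', 2), ('LEAF', 3)])])
print(eval_formula(tree, [0, 1, 1, 0]))   # -> True
print(eval_formula(tree, [0, 0, 1, 1]))   # -> False
```

Even with short-circuiting, worst-case classical (deterministic) evaluation of a balanced formula reads Θ(N) leaves, which is the gap the quantum algorithm closes to √N.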
Two Applications of Information Complexity
, 2003
Abstract

Cited by 18 (1 self)
We show the following new lower bounds in two concrete complexity models: (1) In the two-party communication complexity model, we show that the tribes function on n inputs [6] has two-sided error randomized complexity Ω(n), while its nondeterministic complexity and co-nondeterministic complexity are both Θ(√n). This separation between randomized and nondeterministic complexity is the best possible and it settles an open problem in Kushilevitz and Nisan [17], which was also posed by Beame and Lawry [5].
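The tribes function mentioned above is, in its standard definition, an OR of ANDs over disjoint blocks ("tribes") of the input bits. The sketch below is our own, not code from the paper.

```python
def tribes(bits, width):
    """True iff some block ('tribe') of `width` consecutive bits is all ones."""
    blocks = [bits[i:i + width] for i in range(0, len(bits), width)]
    return any(all(b) for b in blocks)

print(tribes([1, 1, 0, 0], 2))   # -> True   (first tribe is all ones)
print(tribes([1, 0, 0, 1], 2))   # -> False  (no tribe is all ones)
```

In the two-party setting the n input bits are split between the players, and the lower bounds above concern how much they must communicate to evaluate this function.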