Results 1–10 of 48
Complexity Measures and Decision Tree Complexity: A Survey
Theoretical Computer Science, 2000
Cited by 123 (14 self)
We discuss several complexity measures for Boolean functions: certificate complexity, sensitivity, block sensitivity, and the degree of a representing or approximating polynomial. We survey the relations and biggest gaps known between these measures, and show how they give bounds for the decision tree complexity of Boolean functions on deterministic, randomized, and quantum computers.

1 Introduction. Computational Complexity is the subfield of Theoretical Computer Science that aims to understand "how much" computation is necessary and sufficient to perform certain computational tasks. For example, given a computational problem it tries to establish tight upper and lower bounds on the length of the computation (or on other resources, like space). Unfortunately, for many practically relevant computational problems no tight bounds are known. An illustrative example is the well-known P versus NP problem: for all NP-complete problems the current upper and lower bounds lie exponentially ...
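The measures named above are concrete enough to compute by brute force on small functions. As an illustrative sketch (ours, not the survey's), here is a Python enumeration of sensitivity — the maximum, over inputs x, of the number of single-bit flips that change f(x):

```python
from itertools import product

def sensitivity(f, n):
    """Brute-force sensitivity: the max over inputs x of the number of
    single-bit flips that change f(x). f maps a 0/1 tuple to 0/1."""
    best = 0
    for x in product((0, 1), repeat=n):
        flips = sum(1 for i in range(n)
                    if f(x) != f(x[:i] + (1 - x[i],) + x[i + 1:]))
        best = max(best, flips)
    return best

# OR is fully sensitive at the all-zero input: any flip turns 0 into 1.
OR = lambda x: int(any(x))
print(sensitivity(OR, 3))  # 3
```

Block sensitivity and certificate complexity admit similar (more expensive) enumerations; the survey's subject is how far apart these quantities can be.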
Exponential lower bound for 2-query locally decodable codes via a quantum argument
Journal of Computer and System Sciences, 2003
Cited by 123 (18 self)
A locally decodable code encodes n-bit strings x in m-bit codewords C(x) in such a way that one can recover any bit x_i from a corrupted codeword by querying only a few bits of that word. We use a quantum argument to prove that LDCs with 2 classical queries require exponential length: m = 2^{Ω(n)}. Previously this was known only for linear codes (Goldreich et al. 02).
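The exponential length is in fact achieved: the classic Hadamard code, with m = 2^n, is 2-query locally decodable. A small Python sketch (illustrative, not from the paper) of encoding and 2-query decoding:

```python
import random

def hadamard_encode(x):
    """Codeword position a holds the inner product <x, a> mod 2."""
    n = len(x)
    return [sum(x[i] & ((a >> i) & 1) for i in range(n)) % 2
            for a in range(2 ** n)]

def decode_bit(codeword, n, i, rng):
    """Two queries, at a random a and at a XOR e_i; their XOR equals x_i.
    Each query is individually uniform over positions, so any single
    corrupted position is hit with probability only 2 / 2^n."""
    a = rng.randrange(2 ** n)
    return codeword[a] ^ codeword[a ^ (1 << i)]

rng = random.Random(0)
x = [1, 0, 1, 1]
cw = hadamard_encode(x)                                          # m = 2^4 = 16
print(all(decode_bit(cw, 4, i, rng) == x[i] for i in range(4)))  # True
```

The lower bound above says this blow-up to m = 2^n is unavoidable for 2 classical queries.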
Randomizing Polynomials: A New Representation with Applications to Round-Efficient Secure Computation
In Proc. 41st FOCS, 2000
Cited by 47 (17 self)
Motivated by questions about secure multiparty computation, we introduce and study a new natural representation of functions by polynomials, which we term randomizing polynomials. "Standard" low-degree polynomials over a finite field are easy to compute with a small number of communication rounds in virtually any setting for secure computation. However, most Boolean functions cannot be evaluated by a polynomial whose degree is smaller than their input size. We get around this barrier by relaxing the requirement of evaluating f into a weaker requirement of randomizing f: mapping the inputs of f along with independent random inputs into a vector of outputs, whose distribution depends only on the value of f. We show that degree-3 polynomials are sufficient to randomize any function f, relating the efficiency of such a randomization to the branching program size of f. On the other hand, by characterizing the exact class of Boolean functions ...
On the Power of Number-Theoretic Operations with Respect to Counting
In Proceedings 10th Structure in Complexity Theory, 1995
Cited by 32 (9 self)
We investigate function classes ⟨#P⟩_f which are defined as the closure of #P under the operation f and a set of known closure properties of #P, e.g. summation over an exponential range. First, we examine operations f under which #P is closed (i.e., ⟨#P⟩_f = #P) in every relativization. We obtain the following complete characterization of these operations: #P is closed under f in every relativization if and only if f is a finite sum of binomial coefficients over constants. Second, we characterize operations f with respect to their power in the counting context in the unrelativized case. For closure properties f of #P, we have ⟨#P⟩_f = #P. The other end of the range is marked by operations f for which ⟨#P⟩_f corresponds to the counting hierarchy. We call these operations counting hard and give general criteria for hardness. For many operations f we show that ⟨#P⟩_f corresponds to some subclass C of the counting hierarchy. This will then imply that #P is closed under f if and only if ...
Circuit Complexity before the Dawn of the New Millennium
1997
Cited by 30 (3 self)
The 1980s saw rapid and exciting development of techniques for proving lower bounds in circuit complexity. This pace has slowed recently, and there has even been work indicating that quite different proof techniques must be employed to advance beyond the current frontier of circuit lower bounds. Although this has engendered pessimism in some quarters, there have in fact been many positive developments in the past few years showing that significant progress is possible on many fronts. This paper is a (necessarily incomplete) survey of the state of circuit complexity as we await the dawn of the new millennium.
Polylogarithmic independence fools AC0 circuits
 Electronic Colloquium on Computational Complexity
Cited by 29 (0 self)
We prove that poly-sized AC0 circuits cannot distinguish a polylogarithmically independent distribution from the uniform one. This settles the 1990 conjecture by Linial and Nisan [LN90]. The only prior progress on the problem was by Bazzi [Baz07], who showed that O(log^2 n)-independent distributions fool poly-size DNF formulas. Razborov [Raz08] later gave a much simpler proof of Bazzi's theorem.
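Bounded independence is easy to see on a toy scale. In this hypothetical three-bit example (ours, not the paper's), the distribution (x, y, x⊕y) with x, y uniform is pairwise independent yet far from uniform as a whole — exactly the kind of gap the theorem says AC0 circuits cannot exploit once the independence is polylogarithmic:

```python
from itertools import product
from collections import Counter

# Support of the distribution (x, y, x XOR y), with x and y uniform bits.
samples = [(x, y, x ^ y) for x, y in product((0, 1), repeat=2)]

# Every pair of coordinates is exactly uniform over {0,1}^2 ...
for i, j in ((0, 1), (0, 2), (1, 2)):
    counts = Counter((s[i], s[j]) for s in samples)
    assert all(c == 1 for c in counts.values())  # each of 4 patterns once

# ... but the triple is only 2-wise, not 3-wise, independent:
print(len(set(samples)))  # 4 of the 8 possible triples occur
```

A parity gate distinguishes this distribution perfectly, which is consistent with the theorem: parity is precisely what AC0 cannot compute.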
New Degree Bounds for Polynomial Threshold Functions
Cited by 26 (4 self)
We give new upper and lower bounds on the degree of real multivariate polynomials which sign-represent Boolean functions. Our upper bounds for Boolean formulas yield the first known subexponential-time learning algorithms for formulas of super-constant depth. Our lower bounds for constant-depth circuits and intersections of halfspaces are the first new degree lower bounds since 1968, improving results of Minsky and Papert. The lower bounds are proved constructively; we give explicit dual solutions to the necessary linear programs.
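Sign-representation is easy to state executably. A small illustrative check (our sketch, not the paper's): a degree-1 polynomial threshold function already sign-represents OR, which is why the interesting lower bounds concern harder targets such as intersections of halfspaces.

```python
from itertools import product

def sign_represents(p, f, n):
    """p sign-represents the +/-1-valued f if p is never zero and
    sign(p(x)) = f(x) on every x in {0,1}^n."""
    for x in product((0, 1), repeat=n):
        v = p(x)
        if v == 0 or (v > 0) != (f(x) == 1):
            return False
    return True

OR = lambda x: 1 if any(x) else -1
ptf = lambda x: 2 * sum(x) - 1        # degree-1 threshold: negative only at 0^n
print(sign_represents(ptf, OR, 4))    # True
```

By Minsky and Papert's classic argument, no such degree-1 (indeed, sub-n-degree) polynomial exists for parity.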
A Lower Bound on the Mod 6 Degree of the OR Function
Computational Complexity, 1995
Cited by 20 (1 self)
We examine the computational power of modular counting, where the modulus m is not a prime power, in the setting of polynomials in Boolean variables over Z_m. In particular, we say that a polynomial P weakly represents a Boolean function f (both have n variables) if for any inputs x and y in {0,1}^n we have P(x) ≠ P(y) whenever f(x) ≠ f(y).
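The definition invites a brute-force check on small n (our illustrative sketch, not from the paper). Note that for n < m the degree-1 polynomial x_1 + ... + x_n already weakly represents OR over Z_m, which is why a lower bound on the mod-6 degree must grow with n:

```python
from itertools import product

def weakly_represents(P, f, n, m):
    """P weakly represents f over Z_m: f(x) != f(y) implies P(x) != P(y) mod m."""
    by_fval = {}  # f-value -> set of residues P takes on that level set
    for x in product((0, 1), repeat=n):
        by_fval.setdefault(f(x), set()).add(P(x) % m)
    classes = list(by_fval.values())
    return all(a.isdisjoint(b)
               for i, a in enumerate(classes) for b in classes[i + 1:])

OR = lambda x: int(any(x))
linear = lambda x: sum(x)                       # degree-1 polynomial
print(weakly_represents(linear, OR, 5, 6))      # True: sums 1..5 are nonzero mod 6
print(weakly_represents(linear, OR, 6, 6))      # False: 1^6 collides with 0^6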
Relating Polynomial Time to Constant Depth
Theoretical Computer Science, 1998
Cited by 14 (3 self)
Going back to the seminal paper [FSS84] by Furst, Saxe, and Sipser, analogues between polynomial-time classes and constant-depth circuit classes have been considered in a number of papers. Oracles separating polynomial-time classes have been obtained by diagonalization making essential use of lower bounds for circuit classes. In this note we show how separating oracles can be obtained uniformly from circuit lower bounds, without the need to carry out a particular diagonalization. Our technical tool is the leaf language approach to the definition of complexity classes.