Results 11–20 of 34
Short PCPs verifiable in polylogarithmic time
 in Proceedings of the 20th IEEE Conference on Computational Complexity
, 2004
Abstract

Cited by 14 (4 self)
We show that every language in NP has a probabilistically checkable proof of proximity (i.e., proofs asserting that an instance is “close” to a member of the language), where the verifier’s running time is polylogarithmic in the input size and the length of the probabilistically checkable proof is only polylogarithmically larger than the length of the classical proof. (Such a verifier can only query polylogarithmically many bits of the input instance and the proof. Thus it needs oracle access to the input as well as the proof, and cannot guarantee that the input is in the language — only that it is close to some string in the language.) If the verifier is restricted further in its query complexity and only allowed q queries, then the proof size blows up by a factor of 2^((log n)^(c/q)), where the constant c depends only on the language (and is independent of q). Our results thus give efficient (in the sense of running time) versions of the shortest known PCPs, due to Ben-Sasson et al. (STOC ’04) and Ben-Sasson and Sudan (STOC ’05), respectively. The time complexity of the verifier and the size of the proof were the original emphases in the definition of holographic proofs, due to Babai et al.
Secure Commitment Against A Powerful Adversary: A security primitive based on average intractability (Extended Abstract)
, 1992
Abstract

Cited by 13 (5 self)
Secure commitment is a primitive enabling information hiding, which is one of the most basic tools in cryptography. Specifically, it is a two-party partial-information game between a "committer" and a "receiver", in which a secure envelope is first implemented and later opened. The committer has a bit in mind which he commits to by putting it in a "secure envelope". The receiver cannot guess what the value is until the opening stage, and the committer cannot change his mind once committed. In this paper, we investigate the feasibility of bit commitment when one of the participants (either committer or receiver) has an unfair computational advantage. That is, we consider commitment to a strong receiver with a ...
[To appear in Symposium on Theoretical Aspects of Computer Science (STACS) 92, February 13-15, Paris, France. MIT Laboratory for Computer Science, 545 Technology Square, Cambridge MA 02139, USA. Supported by IBM Graduate Fellowship. Part of this work done while at IBM T.J. W...]
Randomizing Reductions Of Search Problems
 SIAM Journal on Computing
, 1993
Abstract

Cited by 11 (1 self)
This paper closes a gap in the foundations of the theory of average case complexity. First, we clarify the notion of a feasible solution for a search problem and prove its robustness. Second, we give a general and usable notion of many-one randomizing reductions of search problems and prove that it has desirable properties. All reductions of search problems to search problems in the literature on average case complexity can be viewed as such many-one randomizing reductions; this includes those reductions in the literature that use iterations and therefore do not look many-one. As an illustration, we present a careful proof in our framework of a theorem of Impagliazzo and Levin. Key words: average case, search problems, reduction, randomization. 1. Introduction and results. Reduction theory for average case computational complexity was pioneered by Leonid Levin [?]. Recently, one of us wrote a survey on the subject [?], and we refer the reader there for a general background. However, ...
Questions and answers: a category arising in linear logic, complexity theory, and set theory
 In Girard et al
Abstract

Cited by 7 (2 self)
... analysis of cardinal characteristics of the continuum. Its morphisms have been used in describing reductions between search problems in complexity theory. We describe this category and how it arises in these various contexts. We also show how these contexts suggest certain new multiplicative connectives for linear logic. Perhaps the most interesting of these is a sequential composition suggested by the set-theoretic application.
Complete distributional problems, hard languages, and resource-bounded measure
 Theoretical Computer Science
, 2000
Abstract

Cited by 6 (2 self)
We say that a distribution µ is reasonable if there exists a constant s ≥ 0 such that µ({x : |x| ≥ n}) = Ω(1/n^s). We prove the following result, which suggests that all DistNP-complete problems have reasonable distributions: if NP contains a DTIME(2^n)-bi-immune set, then every DistNP-complete set has a reasonable distribution. It follows from work of Mayordomo [May94] that the consequent holds if the p-measure of NP is not zero. Cai and Selman [CS96] defined a modification and extension of Levin’s notion of average polynomial time to arbitrary time bounds and proved that if L is P-bi-immune, then L is distributionally hard, meaning that for every polynomial-time computable distribution µ, the distributional problem (L, µ) is not polynomial on the µ-average. We prove the following results, which suggest that distributional hardness is closely related to more traditional notions of hardness. 1. If NP contains a distributionally hard set, then NP contains a P-immune set. 2. There exists a language L that is distributionally hard but not P-bi-immune if and only if P contains a set that is immune to all P-printable sets. The following corollaries follow readily: 1. If the p-measure of NP is not zero, then there exists a language L that is distributionally hard but not P-bi-immune. 2. If the p_2-measure of NP is not zero, then there exists a language L in NP that is distributionally hard but not P-bi-immune.
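The "reasonable distribution" condition above can be checked numerically for a concrete example. The sketch below (not taken from the cited paper; the normalization and cutoff are illustrative assumptions) uses the standard "uniform" distribution on binary strings, which assigns total weight proportional to 1/m^2 to length m; its tail µ({x : |x| ≥ n}) then decays like 1/n, so the condition holds with s = 1.

```python
# Minimal numeric sketch (illustrative, not from the cited paper) of the
# "reasonable distribution" condition: mu({x : |x| >= n}) = Omega(1/n^s).
# The standard "uniform" distribution on binary strings puts total weight
# proportional to 1/m^2 on length m (split evenly over the 2^m strings of
# that length), so the tail over lengths >= n behaves like 1/n (s = 1).

from math import pi

Z = pi * pi / 6  # normalizer: sum over m >= 1 of 1/m^2

def tail(n, cutoff=10**6):
    """mu({x : |x| >= n}), approximated by truncating the series at `cutoff`."""
    return sum(1.0 / (Z * m * m) for m in range(n, cutoff))

# Omega(1/n^s) with s = 1: tail(n) * n stays bounded away from 0
# (in fact tail(n) ~ (1/Z) * (1/n), and 1/Z is about 0.608).
for n in (10, 100, 1000):
    assert tail(n) * n > 0.5
```

This is only a sanity check of the definition; a distribution failing the condition would have a tail shrinking faster than every inverse polynomial.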
The Challenger-Solver game: Variations On The Theme . . .
 BULLETIN OF EURO. ASSOC. FOR THEOR. COMPUTER SCIENCE
, 1989
All Natural NPC Problems Have Average-Case Complete Versions
 IN 35TH ACM SYMPOSIUM ON THE THEORY OF COMPUTING
, 2006
Abstract

Cited by 4 (0 self)
In 1984 Levin put forward a suggestion for a theory of average case complexity. In this theory a problem, called a distributional problem, is defined as a pair consisting of a decision problem and a probability distribution over the instances. Introducing adequate notions of “efficiency-on-average”, simple distributions, and efficiency-on-average preserving reductions, Levin developed a theory analogous to the theory of NP-completeness. In particular, he showed that there exists a simple distributional problem that is complete under these reductions. But since then very few distributional problems have been shown to be complete in this sense. In this paper we show a simple sufficient condition for an NP-complete decision problem to have a distributional version that is complete under these reductions (and thus to be “hard on the average” with respect to some simple probability distribution). Apparently all known NP-complete decision problems meet this condition.
Efficient Average-Case Algorithms for the Modular Group
 In the Proceedings of The 35th Annual Symposium on Foundations of Computer Science
, 1994
Abstract

Cited by 4 (1 self)
The modular group occupies a central position in many branches of the mathematical sciences. In this paper we give average polynomial-time algorithms for the unbounded and bounded membership problems for finitely generated subgroups of the modular group. The latter result affirms a conjecture of Gurevich [5]. 1 Introduction 1.1 The Modular Group The modular group Γ is a remarkable mathematical object. It has several equivalent characterizations: (i) SL_2(Z)/{±I}, the quotient of the group SL_2(Z) of 2×2 integer matrices with determinant 1 modulo its central subgroup {±I}; (ii) the group of complex fractional linear transformations z ↦ (az + b)/(cz + d) with integer coefficients satisfying ad − bc = 1; (iii) the free product of cyclic groups of order 2 and 3, i.e., the group presented by generators R, S and relations R² = S³ = 1; (iv) the group of automorphisms of a certain regular tessellation of the hyperbolic plane (Figure 1).
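Characterizations (i)-(iii) above can be illustrated concretely. The sketch below (an illustration of the standard facts, not code from the cited paper; the generator names S and T are the conventional choices, not the paper's R and S) represents elements of SL_2(Z) as integer 4-tuples, applies them as fractional linear transformations, and checks that the standard generators have orders 2 and 3 modulo {±I}.

```python
# Illustrative sketch of the modular group (standard facts, not from the
# cited paper): SL_2(Z) matrices as (a, b, c, d), acting on z by the
# fractional linear transformation z -> (a*z + b) / (c*z + d).

from fractions import Fraction

def mobius(m, z):
    """Apply m = (a, b, c, d) in SL_2(Z) to z as a Mobius transformation."""
    a, b, c, d = m
    assert a * d - b * c == 1, "matrix must have determinant 1"
    return (a * z + b) / (c * z + d)

def matmul(m1, m2):
    """2x2 integer matrix product, tuples in row-major order."""
    a, b, c, d = m1
    e, f, g, h = m2
    return (a * e + b * g, a * f + b * h, c * e + d * g, c * f + d * h)

S = (0, -1, 1, 0)  # z -> -1/z
T = (1, 1, 0, 1)   # z -> z + 1

# S has order 2 modulo {+-I}: S*S = -I ...
assert matmul(S, S) == (-1, 0, 0, -1)
# ... and S*T has order 3 modulo {+-I}: (S*T)^3 = -I,
# matching the free-product-of-cyclic-groups characterization (iii).
ST = matmul(S, T)
assert matmul(ST, matmul(ST, ST)) == (-1, 0, 0, -1)

# The action from characterization (ii), with exact rational arithmetic:
assert mobius(T, Fraction(1, 2)) == Fraction(3, 2)
```

Note that a matrix and its negative act identically on z, which is why the group in characterizations (ii)-(iv) is the quotient by {±I} rather than SL_2(Z) itself.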