Results 1-10 of 11
A Parallel Repetition Theorem
SIAM Journal on Computing, 1998
Cited by 318 (10 self)
We show that a parallel repetition of any two-prover one-round proof system (MIP(2, 1)) decreases the probability of error at an exponential rate. No constructive bound was previously known. The constant in the exponent (in our analysis) depends only on the original probability of error and on the total number of possible answers of the two provers. The dependency on the total number of possible answers is logarithmic, which was recently proved to be almost the best possible [U. Feige and O. Verbitsky, Proc. 11th Annual IEEE Conference on Computational Complexity, IEEE Computer Society Press, Los Alamitos, CA, 1996, pp. 70-76].
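The abstract's claim can be paraphrased symbolically as follows (a sketch, not the paper's exact statement; α denotes the unspecified constant of the analysis):

```latex
% err(G) = v < 1 : error probability of the original two-prover one-round game G
% s            : total number of possible answers of the two provers
% alpha(v) < 1 : a constant depending only on v
\mathrm{err}\bigl(G^{\otimes n}\bigr) \;\le\; \alpha(v)^{\,n/\log s}
```

The logarithmic dependence on s appears in the exponent's denominator, which is the dependency the abstract says is nearly optimal.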
Zero Knowledge and the Chromatic Number
Journal of Computer and System Sciences, 1996
Cited by 176 (8 self)
We present a new technique, inspired by zero-knowledge proof systems, for proving lower bounds on approximating the chromatic number of a graph. To illustrate this technique we present simple reductions from max-3-coloring and max-3-sat, showing that it is hard to approximate the chromatic number within Ω(N^δ), for some δ > 0. We then apply our technique in conjunction with the probabilistically checkable proofs of Håstad, and show that it is hard to approximate the chromatic number to within Ω(N^(1-ε)) for any ε > 0, assuming NP ⊄ ZPP. Here, ZPP denotes the class of languages decidable by a randomized expected polynomial-time algorithm that makes no errors. Our result matches (up to low-order terms) the known gap for approximating the size of the largest independent set. Previous O(N^δ) gaps for approximating the chromatic number (such as those by Lund and Yannakakis, and by Fürer) did not match the gap for independent set, and do not extend...
Proof Checking and Approximation: Towards Tight Results
SIGACT News, 1996
Cited by 16 (0 self)
Introduction: The last few years have seen much progress in proving "non-approximability results" for well-known NP-hard optimization problems. As we know, the breakthrough came by the application of results from probabilistic proof checking. It is an area that seems to continue to surprise: since the connection was discovered in 1991 (Feige et al. [21]), not only have non-approximability results emerged for a wide range of problems, but the factors shown hard steadily increase. Today, tight results are known for central problems like MaxClique and MinSetCover. (That is, the approximation algorithms we have for these problems can be shown to be the best possible.) Such results also seem to be in sight for ChromNum. These are remarkable things, especially in light of our knowledge of just five years ago. And meanwhile we continue to make progress on the MaxSNP front, where both the algor...
Automatic generation of sound zero-knowledge protocols (Extended Poster Abstract)
2008
Cited by 8 (5 self)
Efficient zero-knowledge proofs of knowledge (ZKPoK) are basic building blocks of many practical cryptographic applications such as identification schemes, group signatures, and secure multi-party computation. Currently, the first applications that critically rely on ZKPoKs are being deployed in the real world. The most prominent example is Direct Anonymous Attestation (DAA), which was adopted by the Trusted Computing Group (TCG) and implemented as one of the functionalities of the cryptographic chip Trusted Platform Module (TPM). Implementing systems using ZKPoK turns out to be challenging, since ZKPoK are, loosely speaking, significantly more complex than standard crypto primitives such as encryption and signature schemes. As a result, implementation cycles of ZKPoK are time-consuming and error-prone, in particular for developers with little or no cryptographic background. In this paper we report on our ongoing work and future research vision, whose goal is to bring ZKPoK to practice by automatically generating sound ZKPoK protocols and making them accessible to crypto and security engineers. To this end we are developing protocols and compilers that support and automate the design and generation of secure and efficient implementations of ZKPoK protocols.
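The prototypical protocol in this family is Schnorr's proof of knowledge of a discrete logarithm; the sketch below (our illustration, not taken from the paper, with toy and thoroughly insecure parameters) shows the three-move commit-challenge-response structure that such compilers target.

```python
# Minimal sketch of Schnorr's ZKPoK of a discrete logarithm.
# Toy parameters for illustration only -- NOT secure.
import secrets

p, q, g = 23, 11, 2          # g = 2 has order q = 11 modulo p = 23
x = 7                        # prover's secret exponent
y = pow(g, x, p)             # public value: y = g^x mod p

def prove_and_verify() -> bool:
    """One honest run of the commit-challenge-response protocol."""
    r = secrets.randbelow(q)          # prover: random exponent
    t = pow(g, r, p)                  # prover -> verifier: commitment t = g^r
    c = secrets.randbelow(q)          # verifier -> prover: random challenge
    s = (r + c * x) % q               # prover -> verifier: response
    # verifier accepts iff g^s == t * y^c (mod p)
    return pow(g, s, p) == (t * pow(y, c, p)) % p
```

Completeness holds because g^s = g^(r+cx) = t * y^c mod p; an honest run always verifies.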
Bringing zero-knowledge proofs of knowledge to practice
In 17th International Workshop on Security Protocols, 2009
Cited by 8 (3 self)
Efficient zero-knowledge proofs of knowledge (ZKPoK) are basic building blocks of many practical cryptographic applications such as identification schemes, group signatures, and secure multi-party computation. Currently, the first applications that critically rely on ZKPoKs are being deployed in the real world. The most prominent example is Direct Anonymous Attestation (DAA), which was adopted by the Trusted Computing Group (TCG) and implemented as one of the functionalities of the cryptographic Trusted Platform Module (TPM) chip. Implementing systems using ZKPoK turns out to be challenging, since ZKPoK are, loosely speaking, significantly more complex than standard crypto primitives such as encryption and signature schemes. As a result, implementation cycles of ZKPoK are time-consuming and error-prone, in particular for developers with little or no cryptographic background. In this paper we report on our ongoing work and future research vision, whose goal is to bring ZKPoK to practice by making them accessible to crypto and security engineers. To this end we are developing compilers and related tools that support and partially automate the design, verification, and secure implementation of ZKPoK protocols.
Probabilistic Proof Systems - A Survey
In Symposium on Theoretical Aspects of Computer Science, 1996
Cited by 5 (0 self)
Various types of probabilistic proof systems have played a central role in the development of computer science in the last decade. In this exposition, we concentrate on three such proof systems: interactive proofs, zero-knowledge proofs, and probabilistically checkable proofs, stressing the essential role of randomness in each of them.
On the Role of Shared Randomness in Two Prover Proof Systems
1995
Cited by 3 (2 self)
In this paper we consider which aspects of the two-prover model are necessary for its striking language recognition and zero-knowledge capabilities. We approach this question by looking at an alternative, more symmetric model which we call the double verifier model. We find that in this model the shared randomness of the verifiers is key to the language recognition power: if the verifiers don't share randomness the power is PSPACE; otherwise it is MIP = NEXPTIME. We find that the shared randomness of the provers is necessary for zero-knowledge: if the provers don't share randomness, statistical zero-knowledge is only possible for languages in BPP^NP; else it is possible for all of NEXPTIME. These results have immediate implications for the standard two-prover model. We see that correlation between the verifier's queries is crucial for the language recognition power of two-prover proofs. In particular, the natural analog of IP = AM does not hold in the two-prover model unless NEX...
On the Complexity of Statistical Reasoning
In Proceedings, Israeli Symposium on Theory of Computing and Systems, 1998
Cited by 3 (2 self)
We show that basic problems in reasoning about statistics are NP-hard to even approximately solve. We consider the problem of detecting internal inconsistencies in a set of statistics. We say that a set of statistics is ε-inconsistent if one of the probabilities must be off by at least ε. For some positive constant ε, we show that it is NP-hard to distinguish ε-inconsistent statistics from self-consistent statistics. This result holds even when restricted to complete sets of pairwise statistics over Boolean domains.
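As a toy illustration (ours, not the paper's) of what an inconsistency witness looks like: for any joint distribution over Boolean variables A, B, C, the union bound forces P(A≠C) ≤ P(A≠B) + P(B≠C), so pairwise agreement statistics violating this triangle inequality cannot come from any joint distribution. Checking this one necessary condition is easy; the paper's point is that detecting inconsistency in general, even approximately, is NP-hard.

```python
# Hypothetical illustration: one necessary consistency condition for
# pairwise agreement statistics over three Boolean variables A, B, C.
# p_xy = claimed P(X = Y). For any joint distribution the disagreement
# probabilities satisfy the triangle inequality, so a violation proves
# the statistics are inconsistent by at least violation/3.

def triangle_violation(p_ab: float, p_bc: float, p_ac: float) -> float:
    """Return how badly the disagreement triangle inequality is violated
    (0.0 if this necessary condition is satisfied)."""
    d_ab, d_bc, d_ac = 1 - p_ab, 1 - p_bc, 1 - p_ac
    return max(0.0,
               d_ac - (d_ab + d_bc),
               d_ab - (d_bc + d_ac),
               d_bc - (d_ab + d_ac))

# P(A=B) = P(B=C) = 0.9 forces P(A=C) >= 0.8 in any joint distribution,
# so also claiming P(A=C) = 0.0 is inconsistent.
assert triangle_violation(0.9, 0.9, 0.0) > 0
assert triangle_violation(0.9, 0.9, 0.85) == 0
```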
On the Complexity of Statistical Reasoning
We show that basic problems in reasoning about statistics are NP-hard to even approximately solve. We consider the problem of detecting internal inconsistencies in a set of statistics. We say that a set of statistics is ε-inconsistent if one of the probabilities must be off by at least ε. For some positive constant ε, we show that it is NP-hard to distinguish ε-inconsistent statistics from self-consistent statistics. This result holds even when restricted to complete sets of pairwise statistics over Boolean domains.
Probabilistically Checkable Proofs with Zero Knowledge
We construct PCPs with strong zero-knowledge properties. First, we construct polynomially bounded (in size) PCPs for NP which can be checked using polylogarithmically many queries, with polynomially low error, yet are statistical zero-knowledge against an adversary that makes U arbitrary queries, where U can be set to any polynomial. Second, we construct PCPs for NEXPTIME that can be checked using polynomially many queries, yet are statistically zero-knowledge against any polynomially bounded adversary. These PCPs are exponential in size and have exponentially low error. Previously, it was only known how to construct zero-knowledge PCPs with a constant error probability. In the course of constructing these PCPs we abstract a tool we call locking systems. We provide the definition and also a locking system with very efficient parameters. This mechanism may be useful in other settings as well.