Results 1–10 of 53
How to Go Beyond the Black-Box Simulation Barrier
 In 42nd FOCS
, 2001
Abstract

Cited by 243 (13 self)
The simulation paradigm is central to cryptography. A simulator is an algorithm that tries to simulate the interaction of the adversary with an honest party, without knowing the private input of this honest party. Almost all known simulators use the adversary’s algorithm as a black box. We present the first constructions of non-black-box simulators. Using these new non-black-box techniques we obtain several results that were previously proven to be impossible to obtain using black-box simulators. Specifically, assuming the existence of collision-resistant hash functions, we construct a new zero-knowledge argument system for NP that satisfies the following properties: 1. This system has a constant number of rounds with negligible soundness error. 2. It remains zero knowledge even when composed concurrently n times, where n is the security parameter. Simultaneously obtaining 1 and 2 has recently been proven to be impossible to achieve using black-box simulators. 3. It is an Arthur-Merlin (public-coin) protocol. Simultaneously obtaining 1 and 3 was known to be impossible to achieve with a black-box simulator. 4. It has a simulator that runs in strict polynomial time, rather than in expected polynomial time. All previously known constant-round, negligible-error zero-knowledge arguments utilized expected polynomial-time simulators.
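The simulation paradigm described above can be made concrete with a toy honest-verifier example (a Schnorr-style sigma protocol with illustrative small parameters; this is the classical black-box setting, not the paper's non-black-box construction): the simulator samples the challenge and response first and solves for the commitment, producing accepting transcripts without ever touching the witness.

```python
import random

# Toy Schnorr-style sigma protocol (illustrative small parameters;
# real systems use cryptographically large groups).
p = 1019                      # prime modulus
q = 509                       # prime order of the subgroup, q | p - 1
g = pow(2, (p - 1) // q, p)   # generator of the order-q subgroup

w = 123                       # prover's private witness
h = pow(g, w, p)              # public statement: h = g^w mod p

def real_transcript():
    """Honest prover: needs the witness w to respond."""
    r = random.randrange(q)
    a = pow(g, r, p)             # commitment
    e = random.randrange(q)      # verifier's challenge
    z = (r + e * w) % q          # response
    return a, e, z

def simulated_transcript():
    """Simulator: picks (e, z) first, then solves for the commitment
    a = g^z * h^(-e). It never uses w, yet the transcript verifies --
    the essence of simulating without the honest party's input."""
    e = random.randrange(q)
    z = random.randrange(q)
    a = (pow(g, z, p) * pow(h, -e, p)) % p
    return a, e, z

def verify(a, e, z):
    """Verifier's check: g^z == a * h^e (mod p)."""
    return pow(g, z, p) == (a * pow(h, e, p)) % p
```

Both `verify(*real_transcript())` and `verify(*simulated_transcript())` accept, and the two transcript distributions are identical; a black-box simulator for malicious verifiers additionally rewinds the verifier, which is exactly the technique the paper moves beyond.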
A Sanctuary for Mobile Agents
, 1997
Abstract

Cited by 129 (4 self)
The Sanctuary project at UCSD is building a secure infrastructure for mobile agents, and examining ...
Indifferentiability, impossibility results on reductions, and applications to the random oracle methodology
 In Theory of Cryptography (TCC 2004), Lecture Notes in Computer Science
, 2004
Abstract

Cited by 93 (2 self)
The goals of this paper are threefold. First, we introduce and motivate a generalization of the fundamental concept of the indistinguishability of two systems, called indifferentiability. This immediately leads to a generalization of the related notion of reducibility of one system to another. Second, we prove that indifferentiability is the necessary and sufficient condition on two systems S and T such that the security of any cryptosystem using T as a component is not affected when T is substituted by S. In contrast to indistinguishability, indifferentiability is applicable in settings where a possible adversary is assumed to have access to additional information about the internal state of the involved systems, for instance the public parameter selecting a member from a family of hash functions. Third, we state an easily verifiable criterion for a system U not to be reducible (according to our generalized definition) to another system V and, as an application, prove that a random oracle is not reducible to a weaker primitive, called an asynchronous beacon, and also that an asynchronous beacon is not reducible to a finite-length random string. Each of these irreducibility results alone implies the main theorem of Canetti, Goldreich and Halevi stating that there exist cryptosystems that are secure in the random-oracle model but for which replacing the random oracle by any implementation leads to an insecure cryptosystem. Key words: indistinguishability, reductions, indifferentiability, security proofs, random oracle methodology, hash functions.
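As a reference point for the notion being generalized, indistinguishability itself can be phrased as a game in a few lines (a standard toy example, unrelated to the paper's constructions): a distinguisher with oracle access separates a random function from a random permutation by hunting for output collisions.

```python
import secrets

def random_function_oracle(n_bytes=2):
    """Lazily sampled random function onto 2-byte strings."""
    table = {}
    def f(x):
        if x not in table:
            table[x] = secrets.token_bytes(n_bytes)
        return table[x]
    return f

def random_permutation_oracle(n_bytes=2):
    """Lazily sampled random injection (permutation-style oracle):
    distinct inputs always receive distinct outputs."""
    table, used = {}, set()
    def f(x):
        if x not in table:
            y = secrets.token_bytes(n_bytes)
            while y in used:
                y = secrets.token_bytes(n_bytes)
            table[x] = y
            used.add(y)
        return table[x]
    return f

def distinguisher(oracle, queries=2000):
    """Query distinct inputs and watch for a repeated output. With a
    65536-value range, 2000 queries make a collision overwhelmingly
    likely for a random function (birthday bound) and impossible for
    an injection, so the guess is almost always correct."""
    seen = set()
    for i in range(queries):
        y = oracle(i.to_bytes(2, 'big'))
        if y in seen:
            return 'function'
        seen.add(y)
    return 'permutation'
```

Indifferentiability strengthens this picture: the adversary may additionally see "inside" information, such as a public hash-function index, which plain indistinguishability cannot model.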
New and improved constructions of non-malleable cryptographic protocols
 In 37th Annual ACM Symposium on Theory of Computing
, 2005
Abstract

Cited by 52 (17 self)
We present a new constant-round protocol for non-malleable zero-knowledge. Using this protocol as a subroutine, we obtain a new constant-round protocol for non-malleable commitments. Our constructions rely on the existence of (standard) collision-resistant hash functions. Previous constructions either relied on the existence of trapdoor permutations and hash functions that are collision resistant against subexponential-sized circuits, or required a super-constant number of rounds. Additional results are the first construction of a non-malleable commitment scheme that is statistically hiding (with respect to opening), and the first non-malleable commitments that satisfy a strict polynomial-time simulation requirement. Our approach differs from the approaches taken in previous works in that we view non-malleable zero-knowledge as a building block rather than an end goal. This gives rise to a modular construction of non-malleable commitments and results in a somewhat simpler analysis.
Bounded-Concurrent Secure Two-Party Computation in a Constant Number of Rounds
 In 44th FOCS
, 2003
Abstract

Cited by 51 (15 self)
We consider the problem of constructing a general protocol for secure two-party computation in a way that preserves security under concurrent composition. In our treatment, we focus on the case where an a priori bound on the number of concurrent sessions is specified before the protocol is constructed (a.k.a. bounded concurrency). We make no setup assumptions. Lindell (STOC 2003) has shown that any protocol for bounded-concurrent secure two-party computation, whose security is established via black-box simulation, must have round complexity that is strictly larger than the bound on the number of concurrent sessions. In this paper, we construct a (non-black-box) protocol for realizing bounded-concurrent secure two-party computation in a constant number of rounds. The only previously known protocol for realizing the above task required more rounds than the pre-specified bound on the number of sessions (despite usage of non-black-box simulation techniques). Our constructions rely on the existence of enhanced trapdoor permutations, as well as on the existence of hash functions that are collision-resistant against subexponential-sized circuits.
Does Parallel Repetition Lower the Error in Computationally Sound Protocols?
 In Proceedings of 38th Annual Symposium on Foundations of Computer Science, IEEE
, 1997
Abstract

Cited by 40 (7 self)
Whether or not parallel repetition lowers the error has been a fundamental question in the theory of protocols, with applications in many different areas. It is well known that parallel repetition reduces the error at an exponential rate in interactive proofs and Arthur-Merlin games. It seems to have been taken for granted that the same is true in arguments, or other proofs where the soundness only holds with respect to computationally bounded parties. We show that this is not the case. Surprisingly, parallel repetition can actually fail in this setting. We present four-round protocols whose error does not decrease under parallel repetition. This holds for any (polynomial) number of repetitions. These protocols exploit non-malleable encryption and can be based on any trapdoor permutation. On the other hand, we show that for three-round protocols the error does go down exponentially fast. The question of parallel error reduction is particularly important when the protocol is used in cryptographic settings like identification, where the error represents the probability that an intruder succeeds.
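In the information-theoretic setting the abstract contrasts with, independence across copies is what drives the error down: a cheater facing k independent parallel copies with per-copy soundness error eps wins all of them with probability eps^k. A quick seeded Monte Carlo sketch (hypothetical parameters) confirms the rate; the paper's counterexamples show exactly this independence breaking down for computationally sound four-round protocols.

```python
import random

def cheat_rate(eps, k, trials=200_000, seed=0):
    """Empirical probability that a cheating prover wins all k
    independent parallel copies of a protocol whose single-copy
    soundness error is eps. Independence gives eps**k exactly;
    the surprise in the paper is that computationally sound
    arguments need not behave this way."""
    rng = random.Random(seed)
    wins = sum(
        all(rng.random() < eps for _ in range(k))
        for _ in range(trials)
    )
    return wins / trials
```

For example, `cheat_rate(0.5, 3)` lands near the analytic value `0.5**3 = 0.125`.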
Concurrent non-malleable commitments
 In FOCS
, 2005
Abstract

Cited by 40 (13 self)
We present a non-malleable commitment scheme that retains its security properties even when concurrently executed a polynomial number of times. That is, a man-in-the-middle adversary who is simultaneously participating in multiple concurrent commitment phases of our scheme, both as a sender and as a receiver, cannot make the values he commits to depend on the values he receives commitments to. Our result is achieved without assuming an a priori bound on the number of executions and without relying on any setup assumptions. Our construction relies on the existence of standard claw-free permutations and only requires a constant number of communication rounds.
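For readers who want the basic object pinned down, the commit/open interface can be sketched with a generic hash-based scheme. This is a textbook construction, hiding and binding under standard assumptions about SHA-256; it is NOT the claw-free-permutation scheme from the paper, and it carries no non-malleability guarantee on its own.

```python
import hashlib
import secrets

def commit(msg: bytes):
    """Commit phase: publish c = SHA-256(r || msg), keep (msg, r).
    The fresh 32-byte nonce r hides msg; changing the committed value
    later would require a SHA-256 collision. Generic textbook scheme,
    not the paper's construction, and not non-malleable by itself."""
    r = secrets.token_bytes(32)
    return hashlib.sha256(r + msg).digest(), r

def open_ok(c: bytes, msg: bytes, r: bytes) -> bool:
    """Opening phase: receiver recomputes the hash and compares."""
    return hashlib.sha256(r + msg).digest() == c
```

Non-malleability is the extra property on top of hiding and binding: even after seeing `c` in flight, a man-in-the-middle should be unable to produce a commitment to a related value, such as a bid one dollar higher.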
On the compressibility of NP instances and cryptographic applications
 In Electronic Colloquium on Computational Complexity (ECCC)
, 2006
Abstract

Cited by 34 (1 self)
We initiate the study of compression that preserves the solution to an instance of a problem rather than preserving the instance itself. Our focus is on the compressibility of NP decision problems. We consider NP problems that have long instances but relatively short witnesses. The question is, can one efficiently compress an instance and store a shorter representation that maintains the information of whether the original input is in the language or not? We want the length of the compressed instance to be polynomial in the length of the witness rather than the length of the original input. Such compression enables one to succinctly store instances until a future setting will allow solving them, either via a technological or algorithmic breakthrough or simply until enough time has elapsed. We give a new classification of NP with respect to compression. This classification forms a stratification of NP that we call the VC hierarchy. The hierarchy is based on a new type of reduction called W-reduction, and there are compression-complete problems for each class. Our motivation for studying this issue stems from the vast cryptographic implications compressibility has. For example, we say that SAT is compressible if there exists a polynomial p(·, ·) so that given a formula consisting of m clauses over n variables it is possible to come up with an equivalent (w.r.t. satisfiability) formula of size at most p(n, log m). Then, given a compression algorithm for SAT, we provide a construction of collision-resistant hash functions from any one-way function. This task was shown to be impossible via black-box reductions [57], and indeed the construction presented is inherently non-black-box. Another application of SAT compressibility is a cryptanalytic result concerning the limitation of everlasting security in the bounded storage model when mixed with (time-)complexity-based cryptography. In addition, we study an approach to constructing an Oblivious Transfer protocol from any one-way function. This approach is based on compression for SAT that also has a property that we call witness retrievability. However, we manage to prove severe limitations on the ability to achieve witness-retrievable compression of SAT.
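To see why the p(n, log m) target is demanding, note the one compression that is always available: a clause over n variables is one of at most 3^n - 1 possibilities, so deduplication alone bounds the formula size independently of m, but only exponentially in n. A minimal sketch (the clause encoding is an illustrative choice):

```python
def dedup_cnf(clauses):
    """Satisfiability-preserving shrink: drop repeated clauses.
    A clause is encoded as a collection of signed ints (3 = x3,
    -3 = NOT x3). Each variable appears positively, negatively, or
    not at all, so there are at most 3**n - 1 distinct non-empty
    clauses; the output therefore has at most min(m, 3**n - 1)
    clauses, which is exponential in n -- far short of the
    poly(n, log m) bound that genuine SAT compression requires."""
    return [sorted(c) for c in {frozenset(c) for c in clauses}]
```

This trivial shrink preserves the satisfiability answer exactly, which is all the definition asks; the open question is whether anything dramatically stronger is possible.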
How to play almost any mental game over the net: concurrent composition via super-polynomial simulation
 In Proceedings of the 46th Annual Symposium on Foundations of Computer Science (FOCS ’05)
, 2005
Abstract

Cited by 26 (2 self)
We construct a secure protocol for any multiparty functionality that remains secure (under a relaxed definition of security introduced by Prabhakaran and Sahai (STOC ’04)) when executed concurrently with multiple copies of itself and other protocols, without any assumptions on the existence of trusted parties, a common reference string, an honest majority, or synchronicity of the network. The relaxation of security is obtained by allowing the ideal-model simulator to run in quasi-polynomial (as opposed to polynomial) time. Quasi-polynomial simulation suffices to ensure security for most applications of multiparty computation. Furthermore, Lindell (FOCS ’03, TCC ’04) recently showed that such a protocol is impossible to obtain under the more standard definition of polynomial-time simulation by an ideal adversary.