Results 1 - 10 of 20
Magic Functions
, 1999
Abstract
Cited by 55 (0 self)
We consider three apparently unrelated fundamental problems in distributed computing, cryptography and complexity theory and prove that they are essentially the same problem.
The shortest vector in a lattice is hard to approximate to within some constant
 in Proc. 39th Symposium on Foundations of Computer Science
, 1998
Abstract
Cited by 51 (4 self)
Abstract. We show that approximating the shortest vector problem (in any ℓ_p norm) to within any constant factor less than 2^(1/p) is hard for NP under reverse unfaithful random reductions with inverse polynomial error probability. In particular, approximating the shortest vector problem is not in RP (random polynomial time), unless NP equals RP. We also prove a proper NP-hardness result (i.e., hardness under deterministic many-one reductions) under a reasonable number-theoretic conjecture on the distribution of square-free smooth numbers. As part of our proof, we give an alternative construction of Ajtai's constructive variant of Sauer's lemma that greatly simplifies Ajtai's original proof. Key words. NP-hardness, shortest vector problem, point lattices, geometry of numbers, sphere packing
On the Existence of 3-Round Zero-Knowledge Protocols
 In CRYPTO '98, Springer LNCS 1462
, 1999
Abstract
Cited by 51 (2 self)
In this paper, we construct a 3-round zero-knowledge protocol for any NP language. Our protocol achieves weaker notions of zero-knowledge than black-box simulation zero-knowledge. Therefore, our result does not contradict the triviality result of Goldreich and Krawczyk [GoKr96], which shows that 3-round black-box simulation zero-knowledge exists only for BPP languages. Our main contribution is to provide a non-black-box simulation technique. Whether there exists such a simulation technique was a major open problem in the theory of zero-knowledge. Our simulation technique is based on a non-standard computational assumption related to the Diffie-Hellman problem, which was originally proposed by Damgård [Da91]. This assumption, which we call DA1, says that, given a randomly chosen instance of the discrete logarithm problem (p, q, g, g^a), it is infeasible to compute (B, X) such that X = B^a mod p without knowing the value b satisfying B = g^b mod p. Our protocol achieves different no...
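The algebraic relation behind the DA1 assumption is easy to illustrate: a party who knows an exponent b can always produce a valid pair (B, X), because X = (g^a)^b = B^a is computable from the public value g^a without ever learning a. A minimal Python sketch (the tiny parameters p, g, a, b are illustrative assumptions, and the subgroup order q is omitted for brevity):

```python
# Toy discrete-log instance (illustrative only; real use needs a large prime).
p = 467                  # small prime modulus (assumption for illustration)
g = 2                    # generator (assumption for illustration)
a = 153                  # secret exponent of the instance
g_a = pow(g, a, p)       # public instance value g^a mod p

# A party who knows b can produce a valid (B, X) pair without knowing a:
b = 97
B = pow(g, b, p)         # B = g^b mod p
X = pow(g_a, b, p)       # X = (g^a)^b = B^a mod p, computed from public data

# The DA1 relation X = B^a mod p holds by commutativity of exponents.
assert X == pow(B, a, p)
```

DA1 asserts the converse direction is infeasible: producing such a pair (B, X) without knowledge of a suitable b.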
The (True) Complexity of Statistical Zero Knowledge (Extended Abstract)
 Proceedings of the 22nd Annual ACM Symposium on the Theory of Computing, ACM
, 1990
Abstract
Cited by 40 (17 self)
Mihir Bellare, Silvio Micali, Rafail Ostrovsky - MIT Laboratory for Computer Science, 545 Technology Square, Cambridge, MA 02139. Statistical zero-knowledge is a very strong privacy constraint which does not depend on computational limitations. In this paper we show that, given a complexity assumption, a much weaker condition suffices to attain statistical zero-knowledge. As a result we are able to simplify statistical zero-knowledge and to better characterize, on many counts, the class of languages that possess statistical zero-knowledge proofs.
1 Introduction
An interactive proof involves two parties, a prover and a verifier, who talk back and forth. The prover, who is computationally unbounded, tries to convince the probabilistic polynomial-time verifier that a given theorem is true. A zero-knowledge proof is an interactive proof with an additional privacy constraint: the verifier does not learn why the theorem is true [11]. That is, whatever the polynomial-time verif...
Everything in NP can be argued in perfect zero-knowledge in a bounded number of rounds
, 1989
Abstract
Cited by 34 (5 self)
A perfect zero-knowledge interactive protocol allows a prover to convince a verifier of the validity of a statement in a way that does not give the verifier any additional information [GMR, GMW]. Such protocols take place by the exchange of messages back and forth between the prover and the verifier. An important measure of efficiency for these protocols is the number of rounds in the interaction. In previously known perfect zero-knowledge protocols for statements concerning NP-complete problems [BCC], at least k rounds were necessary in order to prevent one party from having a probability of undetected cheating greater than 2^-k. In this paper, we give the first perfect zero-knowledge protocol that offers arbitrarily high security for any statement in NP with a constant number of rounds (under the assumption that it is possible to find a prime p with known factorization of p - 1 such that it is infeasible to compute discrete logarithms modulo p even for someone who knows the factors o...
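The 2^-k soundness bound behind the round count can be checked empirically: a cheating prover escapes detection only by guessing the verifier's binary challenge correctly in every one of the k rounds. A minimal Monte Carlo sketch (the round structure and random seed are illustrative assumptions, not the protocol from the paper):

```python
import random

def cheating_prover_escapes(k: int, rng: random.Random) -> bool:
    # A cheating prover survives only by guessing the verifier's
    # uniformly random binary challenge in all k rounds.
    return all(rng.getrandbits(1) == 0 for _ in range(k))

rng = random.Random(0)
k, trials = 5, 200_000
rate = sum(cheating_prover_escapes(k, rng) for _ in range(trials)) / trials
print(rate)  # close to 2**-k = 0.03125
```

The empirical escape rate concentrates around 2^-k, which is why k independent rounds were needed for security level 2^-k before constant-round protocols.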
Luby-Rackoff backwards: Increasing security by making block ciphers non-invertible
 Advances in Cryptology - EUROCRYPT '98 Proceedings
, 1998
Abstract
Cited by 22 (2 self)
We argue that the invertibility of a block cipher can reduce the security of schemes that use it, and a better starting point for scheme design is the non-invertible analog of a block cipher, that is, a pseudorandom function (PRF). Since a block cipher may be viewed as a pseudorandom permutation (PRP), we are led to investigate the reverse of the problem studied by Luby and Rackoff, and ask: "how can one transform a PRP into a PRF in as security-preserving a way as possible?" The solution we propose is data-dependent re-keying. As an illustrative special case, let E: {0,1}^n x {0,1}^n -> {0,1}^n be the block cipher. Then we can construct the PRF F from the PRP E by setting F(k, x) = E(E(k, x), x). We generalize this to allow for arbitrary block and key lengths, and to improve efficiency. We prove strong quantitative bounds on the value of data-dependent re-keying in the Shannon model of an ideal cipher, and take some initial steps towards an analysis in the standard model.
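The special case F(k, x) = E(E(k, x), x) is straightforward to sketch. Below, a toy 4-round Feistel network built from SHA-256 stands in for the block cipher E; this toy PRP is an assumption for illustration only, not a construction from the paper:

```python
import hashlib

MASK32 = (1 << 32) - 1  # mask for a 32-bit half-block

def round_fn(key: int, half: int, rnd: int) -> int:
    # Keyed Feistel round function derived from SHA-256 (illustrative only).
    data = key.to_bytes(8, "big") + half.to_bytes(4, "big") + bytes([rnd])
    return int.from_bytes(hashlib.sha256(data).digest()[:4], "big")

def E(key: int, x: int) -> int:
    # Toy 4-round Feistel permutation on 64-bit blocks, standing in for
    # a block cipher E: {0,1}^64 x {0,1}^64 -> {0,1}^64.
    left, right = x >> 32, x & MASK32
    for rnd in range(4):
        left, right = right, left ^ round_fn(key, right, rnd)
    return (left << 32) | right

def F(key: int, x: int) -> int:
    # Data-dependent re-keying as in the special case: the inner
    # ciphertext E(k, x) becomes the key for the outer call.
    return E(E(key, x), x)
```

Note that for a fixed k, F need not be a permutation of x even though E is, which is exactly the intended loss of invertibility.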
Monomial Representations for Gröbner Bases Computations
 Proceedings of ISSAC 1998, ACM Press
, 1998
Abstract
Cited by 16 (1 self)
Monomial representations and operations for Gröbner bases computations are investigated from an implementation point of view. The technique of vectorized monomial operations is introduced and it is shown how it expedites computations of Gröbner bases. Furthermore, a rank-based monomial representation and comparison technique is examined and it is concluded that this technique does not yield an additional speedup over vectorized comparisons. Extensive benchmark tests with the computer algebra system Singular are used to evaluate these concepts.
1 Introduction
The method of Gröbner bases (GB) (see, for example, [8] for an introduction) is undoubtedly one of the most important and prominent success stories of the field of computer algebra. Starting in the 1960s, an unsolved problem has developed into an essential computational tool with a great variety of applications and more and more powerful implementations. At the heart of the GB method are computations of Gröbner or standard bases with...
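The idea of vectorized monomial operations can be sketched as follows: pack a monomial's exponent vector into a single integer so that monomial multiplication becomes one addition and lexicographic comparison becomes one integer comparison. The 8-bit field width is an illustrative assumption; Singular's actual memory layout differs:

```python
BITS = 8  # bits reserved per exponent (illustrative assumption)

def pack(exps):
    # Pack an exponent vector into one integer, with the
    # highest-priority variable in the most significant field.
    v = 0
    for e in exps:
        assert 0 <= e < (1 << BITS)
        v = (v << BITS) | e
    return v

def monom_mul(a, b):
    # One addition adds all exponent fields simultaneously, valid as
    # long as no per-variable sum overflows its BITS-wide field.
    return a + b

def lex_greater(a, b):
    # Lexicographic order collapses to a single integer comparison.
    return a > b

x2y = pack([2, 1, 0])   # x^2 * y
xz3 = pack([1, 0, 3])   # x * z^3
assert monom_mul(x2y, xz3) == pack([3, 1, 3])  # product is x^3 * y * z^3
assert lex_greater(x2y, xz3)                   # x^2 y > x z^3 in lex order
```

The speedup comes from replacing a per-variable loop with one machine-word operation; the paper's benchmarks examine exactly this kind of representation inside Singular.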
Interactive Hashing Simplifies Zero-Knowledge Protocol Design (Extended Abstract)
 Proc. of EuroCrypt 93
, 1998
Abstract
Cited by 14 (5 self)
Often the core difficulty in designing zero-knowledge protocols arises from having to consider every possible cheating verifier trying to extract additional information.
Languages that are Easier than their Proofs
, 1991
Abstract
Cited by 13 (7 self)
A basic question about NP is whether or not search reduces in polynomial time to decision. We indicate that the answer is negative: under a complexity assumption (that deterministic and nondeterministic double-exponential time are unequal) we construct a language in NP for which search does not reduce to decision. These ideas extend in a natural way to interactive proofs and program checking. Under similar assumptions we present languages in NP for which it is harder to prove membership interactively than it is to decide this membership. Similarly we present languages where checking is harder than computing membership. Each of the following properties (checkability, random-self-reducibility, reduction from search to decision, and interactive proofs in which the prover's power is limited to deciding membership in the language itself) implies coherence, one of the weakest forms of self-reducibility. Under assumptions about triple-exponential time, we construct incoherent sets in NP....
Derandomizing Homomorphism Testing in General Groups
 In Proc. 36th STOC
, 2004
Abstract
Cited by 10 (2 self)
We show that for any groups G and Γ, and any expanding generating set S of G, the natural derandomized version of the BLR test, in which we pick an element x randomly from G and y randomly from S and test whether f(x) * f(y) = f(x * y), performs nearly as well (depending of course on the expansion) as the original test. Moreover we show that the underlying homomorphism can be found by the natural local "belief propagation decoding". We note that the original BLR test uses 2 log2 |G| random bits, whereas the derandomized test uses only (1 + o(1)) log2 |G| random bits. This factor-of-2 savings in the randomness complexity translates to a near quadratic savings in the length of the tables in the related locally testable codes (and possibly probabilistically checkable proofs which may use them). Our result is a significant generalization of recent results that either refer to the special case of the groups G = Z_p^m and Γ = Z_p or are non-constructive. We use simple combinatorial arguments and the transitivity of Cayley graphs (and this analysis gives optimal results up to constant factors). Previous techniques used the Fourier transform, a method which seems unextendable to general groups (and furthermore gives suboptimal bounds). Finally, we provide a polynomial-time (in |G|) construction of a (somewhat) small (|G|^ε) set of expanding generators for every group
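The derandomized test is simple to state in code: sample x uniformly from G but restrict y to the small generating set S. A minimal sketch over the toy group Z_p (the modulus, the set S, and the trial count are illustrative assumptions; a real instantiation needs S to be a genuinely expanding generating set):

```python
import random

p = 101                  # toy cyclic group Z_p (illustrative assumption)
S = [1, 3, 7, 11]        # stand-in small generating set of Z_p (assumption)

def blr_derandomized(f, trials=1000, rng=None):
    # Derandomized BLR check: x uniform in G, y restricted to S;
    # accept only if f(x) + f(y) == f(x + y) held in every trial.
    rng = rng or random.Random(0)
    for _ in range(trials):
        x = rng.randrange(p)
        y = rng.choice(S)
        if (f(x) + f(y)) % p != f((x + y) % p):
            return False
    return True

hom = lambda x: (5 * x) % p       # a genuine homomorphism of Z_p
not_hom = lambda x: (x * x) % p   # squaring is not additive-homomorphic

assert blr_derandomized(hom)
assert not blr_derandomized(not_hom)
```

The randomness saving is visible here: x costs log2 |G| bits, but y costs only log2 |S| bits instead of another log2 |G|.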