Results 11 - 20 of 242
Almost All Primes Can be Quickly Certified
Cited by 77 (4 self)
Abstract
This paper presents a new probabilistic primality test. Upon termination the test outputs "composite" or "prime", along with a short proof of correctness, which can be verified in deterministic polynomial time. The test is different from the tests of Miller [M], Solovay-Strassen [SS], and Rabin [R] in that its assertions of primality are certain, rather than being correct with high probability or dependent on an unproven assumption. The test terminates in expected polynomial time on all but at most an exponentially vanishing fraction of the inputs of length k, for every k. This result implies:
• There exists an infinite set of primes which can be recognized in expected polynomial time.
• Large certified primes can be generated in expected polynomial time.
Under a very plausible condition on the distribution of primes in "small" intervals, the proposed algorithm can be shown to run in expected polynomial time on every input. This ...
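The paper's certificates of primality use more machinery, but the core idea it builds on, a short proof checkable in deterministic polynomial time, is easy to illustrate for the dual notion of compositeness: a Miller-Rabin witness is exactly such a certificate. A minimal sketch (illustrative only, not the paper's algorithm):

```python
# A Miller-Rabin witness `a` is a short, deterministically verifiable proof
# that an odd n > 2 is composite: the verifier just reruns this check.

def is_witness(a: int, n: int) -> bool:
    """Return True iff `a` certifies that odd n > 2 is composite."""
    d, s = n - 1, 0
    while d % 2 == 0:          # write n - 1 = d * 2^s with d odd
        d //= 2
        s += 1
    x = pow(a, d, n)
    if x == 1 or x == n - 1:
        return False           # a proves nothing; n may be prime
    for _ in range(s - 1):
        x = (x * x) % n
        if x == n - 1:
            return False
    return True                # a is a certificate of compositeness

# 221 = 13 * 17 is composite, and a = 2 certifies it; 223 is prime,
# so no base can ever be a witness for it.
```

For a prime n no witness exists, which is why a witness is a proof rather than mere statistical evidence.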
Robust Efficient Distributed RSA-Key Generation
Proceedings of the 30th ACM Symposium on Theory of Computing, 1998
Extracting randomness from samplable distributions
In Proceedings of the 41st Annual IEEE Symposium on Foundations of Computer Science, 2000
Cited by 60 (6 self)
Abstract
The standard notion of a randomness extractor is a procedure which converts any weak source of randomness into an almost uniform distribution. The conversion necessarily uses a small amount of pure randomness, which can be eliminated by complete enumeration in some, but not all, applications. Here, we consider the problem of deterministically converting a weak source of randomness into an almost uniform distribution. Previously, deterministic extraction procedures were known only for sources satisfying strong independence requirements. In this paper, we look at sources which are samplable, i.e. can be generated by an efficient sampling algorithm. We seek an efficient deterministic procedure that, given a sample from any samplable distribution of sufficiently large min-entropy, gives an almost uniformly distributed output. We explore the conditions under which such deterministic extractors exist. We observe that no deterministic extractor exists if the sampler is allowed to use more computational resources than the extractor. On the other hand, if the extractor is allowed (polynomially) more resources than the sampler, we show that deterministic extraction becomes possible. This is true unconditionally in the nonuniform setting (i.e., when the extractor can be computed by a small circuit), and (necessarily) relies on complexity assumptions in the uniform setting. One of our uniform constructions is as follows: assuming that there are problems in E = DTIME(2^O(n)) that are not solvable by subexponential-size circuits with Σ gates, there is an efficient extractor that transforms any samplable distribution of length n and min-entropy (1 - γ)n into an output distribution of length Ω(n), where γ is any sufficiently small constant. The running time of the extractor is polynomial in n and the circuit complexity of the sampler. These extractors are based on a connection be ...
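The "strong independence requirements" of earlier deterministic extractors can be made concrete with von Neumann's classic trick, which extracts unbiased bits from independent coin flips of unknown but fixed bias, an assumption far stronger than samplability. A minimal sketch:

```python
def von_neumann_extract(bits):
    """Deterministically extract unbiased bits from i.i.d. biased coin flips.

    Correctness relies on independence: for a coin of fixed bias p, the pairs
    01 and 10 each occur with probability p*(1-p), so mapping 01 -> 0 and
    10 -> 1 while discarding 00/11 yields perfectly uniform output bits.
    """
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)   # 01 -> 0, 10 -> 1
    return out

# von_neumann_extract([0, 0, 0, 1, 1, 0, 1, 1]) -> [0, 1]
```

For a general samplable source no such pairing argument applies, which is why the paper needs complexity assumptions instead.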
Improved online/offline signature schemes
, 2001
Cited by 59 (0 self)
Abstract
The notion of online/offline signature schemes was introduced in 1990 by Even, Goldreich and Micali. They presented a general method for converting any signature scheme into an online/offline signature scheme, but their method is not very practical as it increases the length of each signature by a quadratic factor. In this paper we use the recently introduced notion of a trapdoor hash function to develop a new paradigm called hash-sign-switch, which can convert any signature scheme into a highly efficient online/offline signature scheme: In its recommended implementation, the online complexity is equivalent to about 0.1 modular multiplications, and the size of each signature increases only by a factor of two. In addition, the new paradigm enhances the security of the original signature scheme since it is only used to sign random strings chosen offline by the signer. This makes the converted scheme secure against adaptive chosen message attacks even if the original scheme is secure only against generic chosen message attacks or against random message attacks.
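The trapdoor ("chameleon") hash at the heart of hash-sign-switch can be sketched with the discrete-log construction of Krawczyk and Rabin: offline, the signer signs CH(m, r) for a random message; online, the trapdoor yields a collision opening that hash to the real message. A toy sketch with artificially small, insecure parameters chosen here for illustration:

```python
# Toy chameleon hash CH(m, r) = g^m * h^r mod p with h = g^x.
# The trapdoor x lets its holder re-open an offline hash to any message.
# Parameters are tiny and insecure; they only illustrate the mechanism.
p, q = 2039, 1019          # p = 2q + 1, both prime
g = 4                      # generator of the order-q subgroup of Z_p*
x = 123                    # trapdoor (secret key)
h = pow(g, x, p)           # public key

def ch(m, r):
    return (pow(g, m, p) * pow(h, r, p)) % p

def collide(m1, r1, m2):
    """Given the trapdoor, find r2 with ch(m2, r2) == ch(m1, r1)."""
    return (r1 + (m1 - m2) * pow(x, -1, q)) % q

# Offline: hash (and sign) a random message. Online: switch to the real one.
c = ch(111, 222)
r2 = collide(111, 222, 333)
```

The online step is a single collision computation in Z_q, which is what makes the converted scheme's online phase so cheap.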
New Public-Key Schemes Based on Elliptic Curves over the Ring Z_n
, 1991
Cited by 47 (0 self)
Abstract
Three new trapdoor one-way functions are proposed that are based on elliptic curves over the ring Z_n. The first class of functions is a naive construction, which can be used only in a digital signature scheme, and not in a public-key cryptosystem. The second, preferred class of functions, does not suffer from this problem and can be used for the same applications as the RSA trapdoor one-way function, including zero-knowledge identification protocols. The third class of functions has similar properties to the Rabin trapdoor one-way functions. Although the security of these proposed schemes is based on the difficulty of factoring n, like the RSA and Rabin schemes, these schemes seem to be more secure than those schemes from the viewpoint of attacks without factoring such as low multiplier attacks.
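The link to factoring can be seen directly in the arithmetic: point operations on a curve over Z_n require modular inverses, and an inverse that fails to exist exposes a nontrivial factor of n via a gcd, the same phenomenon exploited by Lenstra's ECM factoring method. A minimal sketch with toy numbers (curve and point chosen only for illustration):

```python
from math import gcd

def ec_double(P, a, n):
    """Double point P on y^2 = x^3 + a*x + b over Z_n.

    Returns ('point', (x3, y3)) on success, or ('factor', d) when the slope's
    denominator 2y is not invertible mod n, which reveals a factor d of n.
    """
    x, y = P
    d = gcd(2 * y % n, n)
    if 1 < d < n:
        return ('factor', d)
    s = (3 * x * x + a) * pow(2 * y, -1, n) % n
    x3 = (s * s - 2 * x) % n
    return ('point', (x3, (s * (x - x3) - y) % n))

# n = 91 = 7 * 13; P = (1, 7) lies on y^2 = x^3 + 5x + 43 (mod 91).
# Doubling needs the inverse of 2y = 14, and gcd(14, 91) = 7 leaks a factor.
```

This is why anyone who can do unrestricted curve arithmetic mod n in all cases can, in effect, factor n.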
A polynomial-time theory of black-box groups I
, 1998
Cited by 41 (6 self)
Abstract
We consider the asymptotic complexity of algorithms to manipulate matrix groups over finite fields. Groups are given by a list of generators. Some of the rudimentary tasks such as membership testing and computing the order are not expected to admit polynomial-time solutions due to number theoretic obstacles such as factoring integers and discrete logarithm. While these and other "abelian obstacles" persist, we demonstrate that the "non-abelian normal structure" of matrix groups over finite fields can be mapped out in great detail by polynomial-time randomized (Monte Carlo) algorithms. The methods are based on statistical results on finite simple groups. We indicate the elements of a project under way towards a more complete "recognition" of such groups in polynomial time. In particular, under a now plausible hypothesis, we are able to determine the names of all non-abelian composition factors of a matrix group over a finite field. Our context is actually far more general than matrix groups: most of the algorithms work for "black-box groups" under minimal assumptions. In a black-box group, the group elements are encoded by strings of uniform length, and the group operations are performed by a "black box."
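The black-box abstraction is easy to make concrete: an algorithm sees only opaque element encodings and may multiply, invert, and compare them through an oracle. A naive sketch (with permutations standing in for the opaque encodings), computing the closure of a generating set, exactly the brute-force approach that polynomial-time black-box algorithms must avoid, since the group can be exponentially larger than its generator list:

```python
def compose(p, q):
    """The 'black box' multiplication oracle; here it composes permutations,
    but a black-box algorithm is not allowed to look inside."""
    return tuple(p[i] for i in q)

def generated_subgroup(gens):
    """Brute-force closure of a generating set: exponential in general,
    which is why the paper relies on Monte Carlo methods instead."""
    identity = tuple(range(len(gens[0])))
    seen, frontier = {identity}, [identity]
    while frontier:
        g = frontier.pop()
        for s in gens:
            h = compose(g, s)
            if h not in seen:
                seen.add(h)
                frontier.append(h)
    return seen

# S_3 from a transposition and a 3-cycle: the closure has 6 elements.
```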
P-Selective Sets, and Reducing Search to Decision vs. Self-Reducibility
, 1993
Cited by 39 (9 self)
Abstract
We obtain several results that distinguish self-reducibility of a language L from the question of whether search reduces to decision for L. These include: (i) If NE ≠ E, then there exists a set L in NP \ P such that search reduces to decision for L, search does not nonadaptively reduce to decision for L, and L is not self-reducible. ...
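The search-reduces-to-decision phenomenon is most familiar for SAT, where self-reducibility lets one build a satisfying assignment by fixing variables one at a time and re-querying a decision oracle. A sketch with a brute-force stand-in for the oracle (the point is only how the search procedure uses it):

```python
from itertools import product

def decide(clauses, n, fixed):
    """Decision oracle: is some assignment extending `fixed` satisfying?
    Clauses are tuples of nonzero ints; -v means the negation of variable v.
    (Brute force here; a real oracle would just answer the query.)"""
    for bits in product([False, True], repeat=n):
        assign = {i + 1: bits[i] for i in range(n)}
        if all(assign[v] == val for v, val in fixed.items()):
            if all(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses):
                return True
    return False

def search(clauses, n):
    """Self-reduction: fix x1..xn one by one, consulting the oracle n times."""
    fixed = {}
    if not decide(clauses, n, fixed):
        return None
    for v in range(1, n + 1):
        fixed[v] = True
        if not decide(clauses, n, fixed):
            fixed[v] = False
    return fixed

# Example: (x1 or x2) and (not x1 or x3) and (not x3)
```

The paper's point is that this pattern is not universal: for some NP languages, search reduces to decision even though nothing like SAT's self-reduction is available.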
Traitor Tracing with Constant Transmission Rate
, 2002
Cited by 34 (4 self)
Abstract
An important open problem in the area of Traitor Tracing is designing a scheme with constant expansion of the size of keys (users' keys and the encryption key) and of the size of ciphertexts with respect to the size of the plaintext. This problem is known from the introduction of Traitor Tracing by Chor, Fiat and Naor. We refer to such schemes as traitor tracing with constant transmission rate. Here we present a general methodology and two protocol constructions that result in the first two public-key traitor tracing schemes with constant transmission rate in settings where plaintexts can be calibrated to be sufficiently large. Our starting point is the notion of "copyrighted function" which was presented by Naccache, Shamir and Stern. We first solve the open problem of discrete-log-based and public-key-based "copyrighted function." Then, we observe the simple yet crucial relation between (public-key) copyrighted encryption and (public-key) traitor tracing, which we exploit by introducing a generic design paradigm for designing constant ...