Results 1–10 of 106
Key Establishment in Large Dynamic Groups Using One-Way Function Trees
, 1998
Cited by 220 (3 self)
We present and analyze a new algorithm for establishing shared cryptographic keys in large, dynamically changing groups. Our algorithm is based on a novel application of one-way function trees. In comparison with previously published methods, our algorithm achieves a new minimum in the number of bits that need to be broadcast to members in order to rekey after a member is added or evicted. The number of keys stored by group members, the number of keys broadcast to the group when new members are added or evicted, and the computational efforts of group members are logarithmic in the number of group members. Our algorithm provides complete forward and backward security: newly admitted group members cannot read previous messages, and evicted members cannot read future messages, even with collusion by arbitrarily many evicted members. This algorithm offers a new scalable method for establishing group session keys for secure large-group applications such as electronic conferences, multica...
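The one-way function tree mechanism can be sketched in a few lines. The hash-based blinding function g, mixing function f, and the four-member tree below are illustrative stand-ins, not the paper's exact construction:

```python
import hashlib

def g(key: bytes) -> bytes:
    # One-way "blinding" function: others may learn g(k) without learning k.
    return hashlib.sha256(b"blind" + key).digest()

def f(left_blinded: bytes, right_blinded: bytes) -> bytes:
    # Mixing function that combines the blinded keys of a node's two children.
    return hashlib.sha256(left_blinded + right_blinded).digest()

def node_key(leaf_keys, lo, hi):
    # Key of the subtree covering leaf_keys[lo:hi]; the root key is the group key.
    if hi - lo == 1:
        return leaf_keys[lo]
    mid = (lo + hi) // 2
    return f(g(node_key(leaf_keys, lo, mid)), g(node_key(leaf_keys, mid, hi)))

# Four members, each holding one secret leaf key.
leaves = [hashlib.sha256(bytes([i])).digest() for i in range(4)]
group_key = node_key(leaves, 0, 4)
```

Each member stores its own leaf key plus the blinded keys of the siblings along its path, so it can recompute the root (group) key with O(log n) work; evicting a member replaces one leaf key, so only the O(log n) blinded values along that path change and need rebroadcasting.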
Moderately Hard, Memory-bound Functions
 In NDSS
, 2003
Cited by 110 (1 self)
A resource may be abused if its users incur little or no cost. For example, email abuse is rampant because sending an email has negligible cost for the sender. It has been suggested that such abuse may be discouraged by introducing an artificial cost in the form of a moderately expensive computation. Thus, the sender of an email might be required to pay by computing for a few seconds before the email is accepted. Unfortunately, because of sharp disparities across computer systems, this approach may be ineffective against malicious users with high-end systems, prohibitively slow for legitimate users with low-end systems, or both. Starting from this observation, we research moderately hard functions that most recent systems will evaluate at about the same speed. For this purpose, we rely on memory-bound computations. We describe and analyze a family of moderately hard, memory-bound functions, and we explain how to use them for protecting against abuses.
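The underlying idea, making the dominant cost a long chain of dependent reads into a table too large for the cache so that memory latency rather than CPU speed sets the pace, can be sketched as follows. The table size, walk length, and table construction here are illustrative choices, not the functions proposed in the paper:

```python
import hashlib

# Illustrative parameters: real deployments would use a table far larger
# than any cache, so each step of the walk is a cache miss.
TABLE_BITS = 16
TABLE_SIZE = 1 << TABLE_BITS
WALK_LEN = 1000

def make_table(seed: bytes):
    # A fixed pseudo-random successor table shared by prover and verifier.
    out = []
    for i in range(TABLE_SIZE):
        h = hashlib.sha256(seed + i.to_bytes(4, "big")).digest()
        out.append(int.from_bytes(h[:2], "big") % TABLE_SIZE)
    return out

def walk(table, start: int) -> int:
    # Cost is dominated by WALK_LEN *dependent* memory reads, which run at
    # roughly memory speed on high-end and low-end machines alike.
    pos = start
    for _ in range(WALK_LEN):
        pos = table[pos]
    return pos

table = make_table(b"shared-seed")
proof = walk(table, 12345 % TABLE_SIZE)
```

The sender would attach such a walk result (derived from the message) as proof of effort; the receiver recomputes it with the same table to verify.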
Boltzmann Samplers For The Random Generation Of Combinatorial Structures
 Combinatorics, Probability and Computing
, 2004
Cited by 108 (3 self)
This article proposes a surprisingly simple framework for the random generation of combinatorial configurations based on what we call Boltzmann models. The idea is to perform random generation of possibly complex structured objects by placing an appropriate measure spread over the whole of a combinatorial class: an object receives a probability essentially proportional to an exponential of its size. As demonstrated here, the resulting algorithms based on real-arithmetic operations often operate in linear time. They can be implemented easily, be analysed mathematically with great precision, and, when suitably tuned, tend to be very efficient in practice.
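As a concrete illustration (our own toy instance, not an example taken from the paper): for binary trees specified by B = 1 + Z·B², a Boltzmann sampler reduces to recursive biased coin flips whose bias comes from the generating-function value B(x), so that an object of size k appears with probability proportional to x^k:

```python
import math, random

def boltzmann_binary_tree(x: float, rng: random.Random):
    # Boltzmann sampler for binary trees B = 1 + Z*B^2 (size = internal nodes),
    # with B(x) = (1 - sqrt(1 - 4x)) / (2x) for 0 < x <= 1/4.
    B = (1 - math.sqrt(1 - 4 * x)) / (2 * x)
    p_internal = x * B  # P(internal) = x*B(x)^2 / B(x); P(leaf) = 1/B(x)

    def gen():
        if rng.random() < p_internal:
            return (gen(), gen())  # internal node: recurse on both subtrees
        return None                # leaf

    return gen()

def size(tree) -> int:
    # Number of internal nodes.
    return 0 if tree is None else 1 + size(tree[0]) + size(tree[1])

tree = boltzmann_binary_tree(0.24, random.Random(7))
```

At x = 0.24 the internal-node probability is 0.4 and the expected size works out to 0.4/(1 − 0.8) = 2; tuning x toward the singularity at 1/4 drives the expected size up, which is how the sampler is targeted at large objects.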
Large deviations of combinatorial distributions II: Local limit theorems
, 1997
Cited by 38 (5 self)
This paper is a sequel to our paper [17] where we derived a general central limit theorem for probabilities of large deviations applicable to many classes of combinatorial structures and arithmetic functions; we consider corresponding local limit theorems in this paper. More precisely, given a sequence of integral random variables Ω_n, n ≥ 1, each of maximal span 1 (see below for the definition), we are interested in the asymptotic behavior of the probabilities P{Ω_n = m} (m ∈ ℕ, m = μ_n + x_n σ_n, where μ_n := E Ω_n and σ_n² := Var Ω_n) as n → ∞, where x_n can tend to ∞ with n at a rate that is restricted to O(σ_n). Our interest here is not to derive an asymptotic expression for P{Ω_n = m} valid for the widest possible range of m, but to show that for m lying in the interval μ_n ± O(σ_n²), very precise asymptotic formulae can be obtained. These formulae are in close connection with our results in [17]. Although local limit theorems receive constant research interest [2, 3, 7, 14, 13, 24], our approach and results, especially Theorem 1, seem rarely discussed in a systematic manner. Recall that a lattice random variable X is said to be of maximal span h if X takes only values of the form b + hk, k ∈ ℤ, for some constants b and h > 0, and there do not exist b′ and h′ > h such that X takes only values of the form b′ + h′k.
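For orientation only (this is the classical baseline, not the paper's refined Theorem 1): writing μ_n and σ_n for the mean and standard deviation of Ω_n, the standard local limit theorem for span-1 lattice variables says the point probabilities track the Gaussian density uniformly:

```latex
% Classical local limit theorem for span-1 lattice variables
% (baseline statement; the paper refines this in the large-deviation range).
\[
  \sup_{m \in \mathbb{Z}}
  \left|\, \sigma_n \Pr\{\Omega_n = m\}
     - \frac{1}{\sqrt{2\pi}}\, e^{-(m-\mu_n)^2/(2\sigma_n^2)} \right|
  \;\longrightarrow\; 0
  \qquad (n \to \infty).
\]
```

The refinements discussed in the abstract sharpen this in the regime where m − μ_n grows like x_n σ_n with x_n → ∞.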
Speeding Up the Discrete Log Computation on Curves With Automorphisms
, 1999
Cited by 38 (2 self)
We show how to speed up the discrete log computations on curves having automorphisms of large order, thus generalizing the attacks on ABC elliptic curves. This includes the first known attack on CM (hyper)elliptic curves, as well as most of the hyperelliptic curves described in the literature.
Independent process approximations for random combinatorial structures
 Advances in Mathematics
Cited by 38 (8 self)
Many random combinatorial objects have a component structure whose joint distribution is equal to that of a process of mutually independent random variables, conditioned on the value of a weighted sum of the variables. It is interesting to compare the combinatorial structure directly to the independent discrete process, without renormalizing. The quality of approximation can often be conveniently quantified in terms of total variation distance, for functionals which observe part, but not all, of the combinatorial and independent processes. Among the examples are combinatorial assemblies (e.g., permutations, random mapping functions, and partitions of a set), multisets (e.g., polynomials over a finite field, mapping patterns and partitions of an integer), and selections (e.g., partitions of an integer into distinct parts, and square-free polynomials over finite fields). We consider issues common to all the above examples, including equalities and upper bounds for total variation distances, existence of limiting processes, heuristics for good approximations, the relation to standard generating functions, moment formulas and recursions for computing densities, refinement to the process which counts the number of parts of each possible type, the effect of further conditioning on events of moderate probability, large deviation theory and nonuniform measures on combinatorial objects, and the possibility of getting useful results by overpowering the conditioning. © 1994 Academic Press, Inc.
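The best-known instance of this conditioning relation, that the cycle counts (C_1, …, C_n) of a uniform random permutation are distributed as independent Z_j ~ Poisson(1/j) conditioned on Σ j·Z_j = n, can be checked directly in a few lines (a sketch; the rejection sampler is purely illustrative and impractical for large n):

```python
import math, random
from collections import Counter

def poisson(lam: float, rng: random.Random) -> int:
    # Knuth's product-of-uniforms Poisson sampler (fine for small lam).
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def cycle_counts(perm):
    # Counter: cycle length -> number of cycles of that length in perm.
    seen, counts = [False] * len(perm), Counter()
    for i in range(len(perm)):
        if not seen[i]:
            j, length = i, 0
            while not seen[j]:
                seen[j], j, length = True, perm[j], length + 1
            counts[length] += 1
    return counts

def conditioned_cycle_counts(n: int, rng: random.Random):
    # Independent Z_j ~ Poisson(1/j), rejected until sum(j*Z_j) == n;
    # the accepted vector has exactly the law of the cycle counts of a
    # uniform random permutation of n elements.
    while True:
        zs = [poisson(1.0 / j, rng) for j in range(1, n + 1)]
        if sum(j * z for j, z in enumerate(zs, start=1)) == n:
            return Counter({j: z for j, z in enumerate(zs, start=1) if z})

rng = random.Random(0)
direct = cycle_counts(rng.sample(range(8), 8))  # cycle counts of a random permutation
approx = conditioned_cycle_counts(8, rng)       # same law via conditioned Poissons
```

Dropping the conditioning, i.e., comparing the combinatorial object to the unconditioned independent Z_j directly, is exactly the approximation whose total variation cost the paper quantifies.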
On the Analysis of Linear Probing Hashing
, 1998
Cited by 26 (8 self)
This paper presents moment analyses and characterizations of limit distributions for the construction cost of hash tables under the linear probing strategy. Two models are considered, that of full tables and that of sparse tables with a fixed filling ratio strictly smaller than one. For full tables, the construction cost has expectation O(n^{3/2}), the standard deviation is of the same order, and a limit law of the Airy type holds. (The Airy distribution is a semiclassical distribution that is defined in terms of the usual Airy functions or equivalently in terms of Bessel functions of indices −1/3, 2/3.) For sparse tables, the construction cost has expectation O(n), standard deviation O(√n), and a limit law of the Gaussian type. Combinatorial relations with other problems leading to Airy phenomena (like graph connectivity, tree inversions, tree path length, or area under excursions) are also briefly discussed.
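The construction cost in question is simply the total number of probes past home slots made while building the table, which a short sketch makes concrete (table size and filling ratios here are illustrative):

```python
import random

def linear_probe_cost(hashes, m):
    # Insert each key at its hash slot, probing forward cyclically on
    # collision; the construction cost counts every probe past the home slot.
    table = [False] * m
    cost = 0
    for h in hashes:
        pos = h
        while table[pos]:
            pos = (pos + 1) % m
            cost += 1
        table[pos] = True
    return cost

rng = random.Random(1)
m = 1024
sparse = linear_probe_cost([rng.randrange(m) for _ in range(m // 2)], m)  # ratio 1/2
full = linear_probe_cost([rng.randrange(m) for _ in range(m)], m)        # full table
```

The two regimes in the abstract correspond to the two calls: the sparse cost grows linearly in n, while the full-table cost grows like n^{3/2}, so the gap between `sparse` and `full` is far more than the factor-of-two difference in key count.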
Limit theorems for combinatorial structures via discrete process approximations
 Random Structures and Algorithms
, 1992
Cited by 20 (1 self)
Discrete functional limit theorems, which give independent process approximations for the joint distribution of the component structure of combinatorial objects such as permutations and mappings, have recently become available. In this article, we demonstrate the power of these theorems to provide elementary proofs of a variety of new and old limit theorems, including results previously proved by complicated analytical methods. Among the examples we treat are Brownian motion limit theorems for the cycle counts of a random permutation or the component counts of a random mapping, a Poisson limit law for the core of a random mapping, a generalization of the Erdős-Turán Law for the log-order of a random permutation and the smallest component size of a random permutation, approximations to the joint laws of the smallest cycle sizes of a random mapping, and a limit distribution for the difference between the total number of cycles and the number of
Improving implementable meet-in-the-middle attacks by orders of magnitude
 of LNCS
, 1996
Cited by 19 (0 self)
Meet-in-the-middle attacks, where problems and the secrets being sought are decomposed into two pieces, have many applications in cryptanalysis. A well-known such attack on double-DES requires 2^56 time and memory; a naive key search would take 2^112 time. However, when the attacker is limited to a practical amount of memory, the time savings are much less dramatic. For n the cardinality of the space that each half of the secret is chosen from (n = 2^56 for double-DES), and w the number of words of memory available for an attack, a technique based on parallel collision search is described which requires O(√(n/w)) times fewer operations and O(n/w) times fewer memory accesses than previous approaches to meet-in-the-middle attacks. For the example of double-DES, an attacker with 16 Gbytes of memory could recover a pair of DES keys in a known-plaintext attack with 570 times fewer encryptions and 3.7×10^6 times fewer memory accesses compared to previous techniques using the same amount of memory. Key words. Meet-in-the-middle attack, parallel collision search, cryptanalysis, DES, low Hamming weight exponents.
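The shape of the attack is easy to demonstrate at toy scale. The 16-bit multiply-and-XOR "cipher" below is an invented invertible stand-in for DES (the key sizes, not the cipher, carry the point), and this table-based version is the memory-hungry baseline that the paper's parallel-collision-search technique improves on:

```python
M, BITS = 40503, 16
MASK = (1 << BITS) - 1
M_INV = pow(M, -1, 1 << BITS)  # modular inverse exists because M is odd

def enc(k: int, b: int) -> int:
    # Toy invertible 16-bit block "cipher"; illustration only, not DES.
    return ((b ^ k) * M) & MASK

def dec(k: int, c: int) -> int:
    return ((c * M_INV) & MASK) ^ k

def double_enc(k1: int, k2: int, b: int) -> int:
    return enc(k2, enc(k1, b))

def mitm(pairs):
    # Classic meet-in-the-middle: tabulate enc(k1, p0) for every k1 (2^16
    # work and memory), then for every k2 look up dec(k2, c0); a second
    # known plaintext/ciphertext pair filters out false alarms. Total work
    # is about 2 * 2^16 trial encryptions instead of the naive 2^32.
    (p0, c0), (p1, c1) = pairs
    forward = {}
    for k1 in range(1 << BITS):
        forward.setdefault(enc(k1, p0), []).append(k1)
    survivors = []
    for k2 in range(1 << BITS):
        for k1 in forward.get(dec(k2, c0), []):
            if double_enc(k1, k2, p1) == c1:
                survivors.append((k1, k2))
    return survivors

secret = (0x1234, 0xBEEF)
pairs = [(p, double_enc(*secret, p)) for p in (0x0001, 0x0002)]
candidates = mitm(pairs)
```

The `forward` table is exactly the memory bottleneck the abstract describes: with only w words available for it, the table must be rebuilt n/w times, and it is this penalty that parallel collision search reduces.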