Results 21–30 of 226
Cryptographic Hash Functions: A Survey, 1995
Cited by 35 (7 self)
This paper gives a survey of cryptographic hash functions. It gives an overview of all types of hash functions and reviews design principles and possible methods of attack. It also focuses on keyed hash functions, covering their applications, requirements, and constructions.
Randomness extraction and key derivation using the CBC, Cascade and HMAC modes. In Franklin [14]
Cited by 34 (5 self)
Abstract. We study the suitability of common pseudorandomness modes associated with cryptographic hash functions and block ciphers (CBC-MAC, Cascade and HMAC) for the task of “randomness extraction”, namely, the derivation of keying material from semi-secret and/or semi-random sources. Important applications for such extractors include the derivation of strong cryptographic keys from non-uniform sources of randomness (for example, to extract a seed for a pseudorandom generator from a weak source of physical or digital noise), and the derivation of pseudorandom keys from a Diffie-Hellman value. Extractors are closely related in their applications to pseudorandom functions, and thus it is attractive to (re)use the common pseudorandom modes as randomness extractors. Yet, the crucial difference between pseudorandom generation and randomness extraction is that the former uses random secret keys while the latter uses random but known keys. We show that under a variety of assumptions on the underlying primitives (block ciphers and compression functions), ranging from ideal randomness assumptions to realistic universal-hashing properties, these modes induce good extractors. Hence, these schemes represent a more practical alternative to combinatorial extractors (that are seldom used in practice), and a better-analyzed alternative to the common practice of using SHA-1 or MD5 (as a single unkeyed function) for randomness extraction. In particular, our results serve to validate the method of key extraction and key derivation from Diffie-Hellman values used in the IKE (IPsec’s Key Exchange) protocol.
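The HMAC extraction mode studied above is easy to sketch in code. A minimal illustration in Python, assuming only the standard hmac/hashlib modules; the function name and the simulated source below are ours, and a real deployment would follow a vetted specification in this lineage such as HKDF (RFC 5869).

```python
import hashlib
import hmac
import os

def hmac_extract(salt: bytes, source_material: bytes) -> bytes:
    """Randomness extraction with HMAC: the key (salt) is random but may
    be public; only the source material needs to carry enough entropy."""
    return hmac.new(salt, source_material, hashlib.sha256).digest()

# Hypothetical usage: condition a (simulated) non-uniform secret,
# e.g. a Diffie-Hellman value, into a uniform-looking 256-bit key.
salt = os.urandom(32)        # random-but-known extraction key
dh_value = os.urandom(192)   # stand-in for the semi-random source
prk = hmac_extract(salt, dh_value)
```

The point of the papers' analysis is exactly this keying discipline: the salt need not be secret, which is what separates extraction from ordinary pseudorandom generation.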
On the Need for Multipermutations: Cryptanalysis of MD4 and SAFER, 1994
Cited by 33 (2 self)
Cryptographic primitives are usually based on a network with some gates. In [SV94], it is claimed that all gates should be multipermutations. In this paper, we investigate a few combinatorial properties of multipermutations. We argue that gates which fail to be multipermutations can open the way to unsuspected attacks. We illustrate this statement with two examples. First, we show how to construct collisions for MD4 restricted to its first two rounds. This makes it possible to forge digests close to each other using the full compression function of MD4. Second, we show that some generalizations of SAFER are subject to an attack faster than exhaustive search in 6.1% of cases. This attack can be implemented if we decrease the number of rounds from 6 to 4. In [SV94], multipermutations are introduced as a formalization of perfect diffusion. The aim of this paper is to show that the concept of multipermutation is a basic tool in the design of dedicated cryptographic functions, as functions that do not rea...
Design principles for iterated hash functions. Cryptology ePrint Archive, 2004
Cited by 29 (1 self)
This paper deals with the security of iterated hash functions against generic attacks, such as Joux’ multicollision attacks from CRYPTO ’04 [6]. The core idea is to increase the size of the internal state of an n-bit hash function to w > n bits. Variations of this core idea allow the use of a compression function with n output bits, even if the compression function itself is based on a block cipher. In a formal model, it is shown that these modifications quantifiably improve the security of iterated hash functions against generic attacks.
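The wide-pipe idea sketches easily: carry a chaining state twice as wide as the output and truncate only at the end, so internal collisions cost as much as collisions on a 2n-bit hash. A toy Python illustration built from SHA-256 calls; the padding scheme, domain-separation bytes, and parameters are our own demo choices, not the paper's construction.

```python
import hashlib

BLOCK = 64  # bytes per message block (assumption for this toy)

def wide_pipe_hash(msg: bytes, out_len: int = 32) -> bytes:
    """Toy wide-pipe iterated hash: the chaining state (64 bytes) is twice
    as wide as the final output (32 bytes).  Each step feeds the wide state
    plus one message block through SHA-256 twice, with a domain-separation
    byte, to produce the next wide state; a final transform truncates."""
    # MD-style strengthening: pad with 0x80, zeros, then the bit length.
    bitlen = (8 * len(msg)).to_bytes(8, "big")
    padded = msg + b"\x80"
    padded += b"\x00" * ((-len(padded) - 8) % BLOCK) + bitlen

    state = b"\x00" * 64  # wide (2n-bit) internal state
    for i in range(0, len(padded), BLOCK):
        block = padded[i:i + BLOCK]
        left = hashlib.sha256(b"\x00" + state + block).digest()
        right = hashlib.sha256(b"\x01" + state + block).digest()
        state = left + right
    # final transform truncates the wide state to the n-bit output
    return hashlib.sha256(b"\x02" + state).digest()[:out_len]
```

With a 2n-bit pipe, Joux-style internal multicollisions require roughly 2^n work rather than 2^{n/2}, which is the quantifiable improvement the paper formalizes.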
Limits on the Efficiency of One-Way Permutation-Based Hash Functions. In Proceedings of the 40th Annual IEEE Symposium on Foundations of Computer Science, 1999
Cited by 28 (0 self)
Naor and Yung ([NY89]) show that a one-bit-compressing universal one-way hash function (UOWHF) can be constructed based on a one-way permutation. This construction can be iterated to build a UOWHF which compresses by εn bits, at the cost of εn invocations of the one-way permutation. We show that this construction is not far from optimal, in the following sense: there exists an oracle relative to which there exists a one-way permutation with inversion probability 2^{−p(n)} (for any p(n) ∈ ω(log n)), but any construction of an εn-bit-compressing UOWHF requires Ω(√(n/p(n))) invocations of the one-way permutation, on average. (For example, there exists in this relativized world a one-way permutation with inversion probability n^{−ω(1)}, but no UOWHF that invokes it fewer than Ω(√(n/log n)) times.) Thus any proof that a more efficient UOWHF can be derived from a one-way permutation is necessarily non-relativizing; in particular, no provable construction...
SWIFFT: A Modest Proposal for FFT Hashing
Cited by 28 (10 self)
We propose SWIFFT, a collection of compression functions that are highly parallelizable and admit very efficient implementations on modern microprocessors. The main technique underlying our functions is a novel use of the Fast Fourier Transform (FFT) to achieve “diffusion”, together with a linear combination to achieve compression and “confusion”. We provide a detailed security analysis of concrete instantiations, and give a high-performance software implementation that exploits the inherent parallelism of the FFT algorithm. The throughput of our implementation is competitive with that of SHA-256, with additional parallelism yet to be exploited. Our functions are set apart from prior proposals (having comparable efficiency) by a supporting asymptotic security proof: it can be formally proved that finding a collision in a randomly-chosen function from the family (with noticeable probability) is at least as hard as finding short vectors in cyclic/ideal lattices in the worst case.
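The algebraic skeleton of such a function can be sketched in a few lines: binary input polynomials are multiplied by fixed public random polynomials modulo x^n + 1 over Z_257 and summed. This is only a toy in the same spirit, not the real SWIFFT: the key is demo-chosen, and a naive O(n^2) negacyclic product stands in for the FFT/NTT that gives the scheme its speed.

```python
import random

P, N, M = 257, 64, 16  # modulus, polynomial degree, number of input polys
rng = random.Random(0)
# fixed public "key": M random polynomials over Z_P (demo-chosen here)
A = [[rng.randrange(P) for _ in range(N)] for _ in range(M)]

def negacyclic_mul(a, b):
    """Multiply polynomials a, b modulo x^N + 1 over Z_P (naive O(N^2))."""
    out = [0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            k = i + j
            if k < N:
                out[k] = (out[k] + ai * bj) % P
            else:  # x^N = -1, so overflow wraps with a sign flip
                out[k - N] = (out[k - N] - ai * bj) % P
    return out

def compress(bits):
    """Map M*N input bits to an N-coefficient digest over Z_P."""
    assert len(bits) == M * N and set(bits) <= {0, 1}
    digest = [0] * N
    for i in range(M):
        x = bits[i * N:(i + 1) * N]  # binary input polynomial
        prod = negacyclic_mul(A[i], x)
        digest = [(d + p) % P for d, p in zip(digest, prod)]
    return digest
```

The linearity visible here is deliberate in the real design too; collision resistance rests on the lattice assumption, not on nonlinearity.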
Eliminating Counterevidence with Applications to Accountable Certificate Management. Journal of Computer Security, 2002
Cited by 27 (3 self)
This paper presents a method to increase the accountability of certificate management by making it intractable for the certification authority (CA) to create contradictory statements about the validity of a certificate. The core of the method is a new primitive, the undeniable attester, which allows one to commit to a set S of bitstrings by publishing a short digest of S, and to give an attestation for any x stating that it is or is not a member of S. Such an attestation is verified by obtaining the published digest in an authenticated way and applying a verification algorithm to the triple of the bitstring, the attestation, and the digest. The most important feature of this primitive is the intractability of creating two contradictory proofs for the same candidate element x and digest. We give an efficient construction for undeniable attesters based on authenticated search trees. We show that the construction also applies to sets of more structured elements. We also show that undeniable attesters exist if and only if collision-resistant hash functions exist.
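The membership half of such an attester can be sketched with an ordinary Merkle tree over the sorted set; the paper's authenticated search trees additionally make non-membership provable (by authenticating the order between adjacent elements), which this toy omits. Function names and tree layout below are illustrative, assuming SHA-256.

```python
import hashlib

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def _next_level(level):
    nxt = [H(b"\x01" + level[i] + level[i + 1])
           for i in range(0, len(level) - 1, 2)]
    if len(level) % 2:  # odd node is promoted unchanged
        nxt.append(level[-1])
    return nxt

def set_digest(sorted_elems):
    """Short published digest committing to the whole (sorted) set."""
    level = [H(b"\x00" + x) for x in sorted_elems]
    while len(level) > 1:
        level = _next_level(level)
    return level[0]

def attest(sorted_elems, idx):
    """Membership attestation for element idx: its authentication path,
    as a list of (sibling_hash, sibling_is_right) pairs."""
    level = [H(b"\x00" + x) for x in sorted_elems]
    path = []
    while len(level) > 1:
        if idx % 2 == 0 and idx + 1 < len(level):
            path.append((level[idx + 1], True))
        elif idx % 2 == 1:
            path.append((level[idx - 1], False))
        level = _next_level(level)
        idx //= 2
    return path

def verify(digest, x, path):
    """Recompute the root from x and its path; compare to the digest."""
    node = H(b"\x00" + x)
    for sib, sib_is_right in path:
        node = H(b"\x01" + node + sib) if sib_is_right else H(b"\x01" + sib + node)
    return node == digest
```

Two contradictory attestations for the same x and digest would yield a hash collision somewhere along the paths, which is exactly the "iff collision-resistant hash functions exist" connection the paper proves.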
Hash function balance and its impact on birthday attacks. Advances in Cryptology – EUROCRYPT ’04, Lecture Notes in Computer Science, 2004
Cited by 27 (2 self)
Abstract. Textbooks tell us that a birthday attack on a hash function h with range size r requires r^{1/2} trials (hash computations) to find a collision. But this is quite misleading, being true only if h is regular, meaning all points in the range have the same number of preimages under h; if h is not regular, fewer trials may be required. But how much fewer? This paper addresses this question by introducing a measure of the “amount of regularity” of a hash function that we call its balance, and then providing estimates of the success rate of the birthday attack, and the expected number of trials to find a collision, as a function of the balance of the hash function being attacked. In particular, we will see that the number of trials can be significantly less than r^{1/2} for hash functions of low balance. This leads us to examine popular design principles, such as the MD (Merkle-Damgård) transform, from the point of view of balance preservation, and to mount experiments to determine the balance of popular hash functions.
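The effect is easy to observe empirically. A small Monte Carlo sketch with our own toy hashes (not from the paper): a regular hash with range r = 1024, and a skewed one that only ever hits r/4 of its range points.

```python
import random

def trials_to_collision(h, domain, rng):
    """Sample random inputs until two distinct inputs hash alike;
    return the number of hash computations used."""
    seen = {}
    t = 0
    while True:
        x = rng.randrange(domain)
        t += 1
        y = h(x)
        if y in seen and seen[y] != x:
            return t
        seen[y] = x

def regular(x):
    return x % 1024           # balanced: all 1024 range points equally likely

def skewed(x):
    return (x % 1024) & ~3    # low balance: only 256 range points ever occur

def avg_trials(h, reps=200, seed=1):
    rng = random.Random(seed)
    return sum(trials_to_collision(h, 10**6, rng) for _ in range(reps)) / reps
```

For a regular hash the expected trial count is about √(πr/2) ≈ 40 here, while the skewed hash behaves like one with range 256 and collides in roughly half as many trials, matching the paper's point that low balance buys the attacker a discount.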
Constructing VIL-MACs from FIL-MACs: Message authentication under weakened assumptions, 1999
"... ..."
On the impossibility of highly-efficient blockcipher-based hash functions. In Advances in Cryptology – EUROCRYPT 2005, 2005
Cited by 26 (3 self)
Abstract. Fix a small, nonempty set of blockcipher keys K. We say a blockcipher-based hash function is highly-efficient if it makes exactly one blockcipher call for each message block hashed, and all blockcipher calls use a key from K. Although a few highly-efficient constructions have been proposed, no one has been able to prove their security. In this paper we prove, in the ideal-cipher model, that it is impossible to construct a highly-efficient iterated blockcipher-based hash function that is provably secure. Our result implies, in particular, that the Tweakable Chain Hash (TCH) construction suggested by Liskov, Rivest, and Wagner [7] is not correct under an instantiation suggested for this construction, nor can TCH be correctly instantiated by any other efficient means.