Results 1–10 of 63
Strengthening Digital Signatures Via Randomized Hashing
In CRYPTO, 2006
Cited by 58 (2 self)
Abstract. We propose randomized hashing as a mode of operation for cryptographic hash functions intended for use with standard digital signatures and without necessitating any changes in the internals of the underlying hash function (e.g., the SHA family) or in the signature algorithms (e.g., RSA or DSA). The goal is to free practical digital signature schemes from their current reliance on strong collision resistance by basing the security of these schemes on significantly weaker properties of the underlying hash function, thus providing a safety net in case the (current or future) hash functions in use turn out to be less resilient to collision search than initially thought. We design a specific mode of operation that takes into account engineering considerations (such as simplicity, efficiency and compatibility with existing implementations) as well as analytical soundness. Specifically, the scheme consists of a regular use of the hash function with randomization applied only to the message before it is input to the hash function. We formally show the sufficiency of weaker-than-collision-resistance assumptions for proving the security of the scheme.
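The randomize-before-hash idea can be illustrated with a toy sketch, loosely modeled on the paper's RMX transform. The block size, padding, and salt handling below are illustrative assumptions, not the paper's exact specification: the salt is prepended to the message and XORed into every block before the unmodified hash function is applied, and the signer signs the (digest, salt) pair.

```python
import hashlib
import os

BLOCK = 64  # SHA-256 message-block size in bytes (illustrative choice)

def randomized_hash(message: bytes, salt: bytes) -> bytes:
    """Toy randomize-then-hash: prepend the salt and XOR it into every
    message block, then apply the unmodified hash function."""
    assert len(salt) == BLOCK
    padded = message + b"\x00" * (-len(message) % BLOCK)  # pad to blocks
    randomized = bytearray(salt)  # the salt itself forms the first block
    for i in range(0, len(padded), BLOCK):
        block = padded[i:i + BLOCK]
        randomized += bytes(a ^ b for a, b in zip(block, salt))
    return hashlib.sha256(bytes(randomized)).digest()

# A signer draws a fresh salt per signature and signs (digest, salt);
# a collision attacker must now target a salt chosen after the fact.
salt = os.urandom(BLOCK)
digest = randomized_hash(b"message to be signed", salt)
```

Because the randomization touches only the message, any deployed hash implementation can be reused unchanged, which is the engineering point the abstract stresses.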
A failure-friendly design principle for hash functions, 2005
Cited by 42 (5 self)
Abstract. This paper reconsiders the established Merkle-Damgård design principle for iterated hash functions. The internal state size w of an iterated n-bit hash function is treated as a security parameter in its own right. In a formal model, we show that increasing w quantifiably improves security against certain attacks, even if the compression function fails to be collision resistant. We propose the wide-pipe hash, internally using a w-bit compression function, and the double-pipe hash, with w = 2n and an n-bit compression function used twice in parallel.
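The wide-pipe idea is easy to sketch: keep a chaining value twice as wide as the output and truncate only once, at the end. The sketch below uses SHA-512 as a stand-in 512-bit compression function for a 256-bit wide-pipe hash; the padding scheme and parameter choices are illustrative assumptions, not the paper's concrete proposal.

```python
import hashlib

BLOCK = 64  # bytes of message consumed per compression call

def wide_pipe_hash(message: bytes, out_bytes: int = 32) -> bytes:
    """Toy wide-pipe hash: the internal chaining value is 512 bits
    (w = 2n for an n = 256-bit output); truncation happens only after
    the final compression call."""
    state = b"\x00" * 64  # 512-bit internal chaining value
    # Simple 0x80 00..00 padding to a whole number of blocks.
    padded = message + b"\x80" + b"\x00" * (-(len(message) + 1) % BLOCK)
    for i in range(0, len(padded), BLOCK):
        # Compression: 512-bit state + 512-bit block -> 512-bit state.
        state = hashlib.sha512(state + padded[i:i + BLOCK]).digest()
    return state[:out_bytes]  # narrow the pipe only at the very end
```

An internal collision now has to match all 512 state bits, which is why generic attacks that exploit an n-bit internal state (such as multicollision attacks) lose their advantage against the wide pipe.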
SWIFFT: A Modest Proposal for FFT Hashing
Cited by 28 (10 self)
We propose SWIFFT, a collection of compression functions that are highly parallelizable and admit very efficient implementations on modern microprocessors. The main technique underlying our functions is a novel use of the Fast Fourier Transform (FFT) to achieve “diffusion,” together with a linear combination to achieve compression and “confusion.” We provide a detailed security analysis of concrete instantiations, and give a high-performance software implementation that exploits the inherent parallelism of the FFT algorithm. The throughput of our implementation is competitive with that of SHA-256, with additional parallelism yet to be exploited. Our functions are set apart from prior proposals (having comparable efficiency) by a supporting asymptotic security proof: it can be formally proved that finding a collision in a randomly chosen function from the family (with noticeable probability) is at least as hard as finding short vectors in cyclic/ideal lattices in the worst case.
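A heavily simplified sketch of the SWIFFT structure follows. The parameters and multipliers here are toy assumptions: the naive transform stands in for a real FFT, and the negacyclic twist over Z_257[x]/(x^64 + 1) used by the actual design is omitted. What survives is the shape of the computation: transform each input chunk (diffusion), then take a fixed public linear combination (compression and confusion).

```python
import random

Q, N, W = 257, 64, 81   # 81 has multiplicative order 64 modulo 257

def ntt(x):
    """Naive 64-point number-theoretic transform mod 257 -- the
    'diffusion' step (a real implementation uses a fast FFT)."""
    return [sum(x[k] * pow(W, j * k, Q) for k in range(N)) % Q
            for j in range(N)]

rng = random.Random(0)  # fixed seed: the multipliers are public parameters
A = [[rng.randrange(Q) for _ in range(N)] for _ in range(16)]

def toy_swifft(bits):
    """Compress 16*64 input bits to 64 values mod 257: transform each
    64-bit chunk, multiply pointwise by a fixed public vector, and sum
    -- the linear-combination 'confusion/compression' step."""
    assert len(bits) == 16 * N and all(b in (0, 1) for b in bits)
    out = [0] * N
    for i in range(16):
        t = ntt(bits[i * N:(i + 1) * N])
        out = [(o + a * v) % Q for o, a, v in zip(out, A[i], t)]
    return out
```

Note that the whole map is linear over Z_257 in the input bits, which is exactly what connects collisions in the function to short vectors in the underlying lattices.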
Improved fast syndrome based cryptographic hash functions
in Proceedings of ECRYPT Hash Workshop 2007 (2007). URL: http://www-roc.inria.fr/secret/Matthieu.Finiasz
Cited by 24 (5 self)
Abstract. Recently, collisions have been exposed for a variety of cryptographic hash functions [19], including some of the most widely used today. Many other hash functions using similar constructions can, however, still be considered secure. Nevertheless, this has drawn attention to the need for new hash function designs. This article presents a family of secure hash functions whose security is directly related to the syndrome decoding problem from the theory of error-correcting codes. Taking into account the analysis by Coron and Joux [4] based on Wagner’s generalized birthday algorithm [18], we study the asymptotic security of our functions. We demonstrate that this attack is always exponential in the length of the hash value. We also study the work factor of this attack, along with other attacks from coding theory, in the non-asymptotic range, i.e., for practical values. Accordingly, we propose a few sets of parameters giving good security and either faster hashing or a shorter description for the function. Key Words: cryptographic hash functions, provable security, syndrome decoding, NP-completeness, Wagner’s generalized birthday problem.
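The core compression step of a syndrome-based hash can be sketched as follows. The matrix, its dimensions, and the byte-per-position encoding below are toy assumptions (the real design fixes concrete parameters and a "regular"-word encoding): each input byte selects one column from a per-position pool of a public random binary matrix, and the output is the XOR (syndrome) of the selected columns.

```python
import random

R, BLOCKS, PER_BLOCK = 128, 16, 256  # syndrome bits, positions, pool size

rng = random.Random(1)  # fixed seed: the matrix is a public parameter
# One pool of PER_BLOCK random R-bit columns per input position; picking
# exactly one column from each pool corresponds to a "regular" word.
H = [[rng.getrandbits(R) for _ in range(PER_BLOCK)] for _ in range(BLOCKS)]

def syndrome_compress(data: bytes) -> int:
    """XOR together one matrix column per input byte.  Producing two
    inputs with the same output amounts to solving an instance of the
    syndrome decoding problem for this matrix."""
    assert len(data) == BLOCKS
    syndrome = 0
    for i, b in enumerate(data):
        syndrome ^= H[i][b]  # column b from position i's pool
    return syndrome
```

The function is extremely cheap (a handful of XORs per byte), which is why the paper's analysis focuses on tuning the matrix parameters against Wagner-style generalized birthday attacks.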
Formalizing human ignorance: Collision-resistant hashing without the keys
In Proc. Vietcrypt ’06, 2006
Cited by 22 (0 self)
Abstract. There is a foundational problem involving collision-resistant hash functions: common constructions are keyless, but formal definitions are keyed. The discrepancy stems from the fact that a function H: {0,1}* → {0,1}^n always admits an efficient collision-finding algorithm; it’s just that we human beings might be unable to write the program down. We explain a simple way to sidestep this difficulty that avoids having to key our hash functions. The idea is to state theorems in a way that prescribes an explicitly given reduction, normally a black-box one. We illustrate this approach using well-known examples involving digital signatures, pseudorandom functions, and the Merkle-Damgård construction. Key words: collision-free hash function, collision-intractable hash function, collision-resistant hash function, cryptographic hash function, provable security.
A study of the MD5 attacks: insights and improvements
In Proceedings of the 13th international conference on Fast Software Encryption, FSE ’06, 2006
Cited by 15 (2 self)
Abstract. MD5 is a well-known and widely used cryptographic hash function. It has received renewed attention from researchers subsequent to the recent announcement of collisions found by Wang et al. [16]. To date, however, the method used by researchers in this work has been fairly difficult to grasp. In this paper we conduct a study of all attacks on MD5 starting from Wang’s. We explain the techniques used by her team, give insights on how to improve these techniques, and use these insights to produce an even faster attack on MD5. Additionally, we provide an “MD5 Toolkit” implementing these improvements, which we hope will serve as an open-source platform for further research. Our hope is that a better understanding of these attacks will lead to a better understanding of our current collection of hash functions, what their strengths and weaknesses are, and where we should direct future efforts in order to produce even stronger primitives.
VSH, an Efficient and Provable Collision-Resistant Hash Function
In Lecture Notes in Computer Science, 2006
Cited by 13 (1 self)
We introduce VSH, Very Smooth Hash, a new S-bit hash function that is provably collision-resistant assuming the hardness of finding nontrivial modular square roots of very smooth numbers modulo an S-bit composite. By very smooth, we mean that the smoothness bound is some fixed polynomial function of S. We argue that finding collisions for VSH has the same asymptotic complexity as factoring using the Number Field Sieve factoring algorithm, i.e., subexponential in S. VSH is theoretically pleasing because it requires just a single multiplication modulo the S-bit composite per Ω(S) message bits (as opposed to O(log S) message bits for previous provably secure hashes). It is relatively practical. A preliminary implementation on a 1 GHz Pentium III processor that achieves collision resistance at least equivalent to the difficulty of factoring a 1024-bit RSA modulus runs at 1.1 MB/s, with a moderate slowdown to 0.7 MB/s for 2048-bit RSA security. VSH can be used to build a fast, provably secure randomised trapdoor hash function, which can be applied to speed up provably secure signature schemes (such as Cramer-Shoup) and designated-verifier signatures.
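The compression loop behind VSH alternates squaring modulo the composite with multiplying in one small prime per set message bit. A toy sketch follows; the tiny hard-coded modulus (a product of two known small primes) and k = 8 primes are illustrative assumptions only, since a real VSH modulus must be an S-bit composite whose factorization is unknown.

```python
# Toy modulus: product of two known small primes -- insecure, for
# illustration only.  A real VSH modulus hides its factorization.
N = 104729 * 1299709
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19]  # k = 8 small primes

def vsh(bits):
    """Toy VSH compression: for each block of k message bits, square
    the accumulator and multiply in the primes whose bits are set."""
    k = len(PRIMES)
    bits = list(bits) + [0] * (-len(bits) % k)  # pad to a block boundary
    x = 1
    for j in range(0, len(bits), k):
        block_product = 1
        for p, b in zip(PRIMES, bits[j:j + k]):
            if b:
                block_product *= p
        x = (x * x * block_product) % N
    return x
```

A collision in this structure yields a nontrivial modular relation among very smooth numbers, which is the problem the paper reduces to; the single modular multiplication per k-bit block is the source of VSH's Ω(S)-bits-per-multiplication efficiency claim.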
Hash Functions in the Dedicated-Key Setting: Design Choices and MPP Transforms
In ICALP ’07, volume 4596 of LNCS, 2007
Cited by 13 (1 self)
In the dedicated-key setting, one starts with a compression function f: {0,1}^k × {0,1}^(n+d) → {0,1}^n and builds a family of hash functions H^f: K × M → {0,1}^n indexed by a key space K. This is different from the more traditional design approach used to build hash functions such as MD5 or SHA-1, in which compression functions and hash functions do not have dedicated key inputs. We explore the benefits and drawbacks of building hash functions in the dedicated-key setting (as compared to the more traditional approach), highlighting several unique features of the former. Should one choose to build hash functions in the dedicated-key setting, we suggest utilizing multi-property-preserving (MPP) domain extension transforms. We analyze seven existing dedicated-key transforms with regard to the MPP goal and propose two simple …
Verifying distributed erasure-coded data
In Proceedings of the 26th ACM Symposium on Principles of Distributed Computing, 2007
Cited by 13 (1 self)
Erasure coding can reduce the space and bandwidth overheads of redundancy in fault-tolerant data storage and delivery systems. But it introduces the fundamental difficulty of ensuring that all erasure-coded fragments correspond to the same block of data. Without such assurance, a different block may be reconstructed from different subsets of fragments. This paper develops a technique for providing this assurance without the bandwidth and computational overheads associated with current approaches. The core idea is to distribute with each fragment what we call homomorphic fingerprints. These fingerprints preserve the structure of the erasure code and allow each fragment to be independently verified as corresponding to a specific block. We demonstrate homomorphic fingerprinting functions that are secure, efficient, and compact.
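A minimal sketch of the homomorphic-fingerprint idea (the field size, fragment length, and coding coefficients below are illustrative assumptions): evaluate each fragment, viewed as polynomial coefficients over a prime field, at a random point. That map is linear, so it commutes with any linear erasure code, letting a verifier check a coded fragment against the fingerprints of the data fragments alone.

```python
import random

P = 2 ** 61 - 1  # a Mersenne prime; the fingerprint field (toy choice)

def fingerprint(fragment, r):
    """Evaluate the fragment, read as polynomial coefficients over
    GF(P), at the point r (Horner's rule).  Linear in the fragment."""
    acc = 0
    for sym in reversed(fragment):
        acc = (acc * r + sym) % P
    return acc

# Because the fingerprint is linear, it commutes with a linear erasure
# code: fp(c1*f1 + c2*f2) == c1*fp(f1) + c2*fp(f2)  (mod P).
rng = random.Random(42)
r = rng.randrange(1, P)
f1 = [rng.randrange(P) for _ in range(8)]
f2 = [rng.randrange(P) for _ in range(8)]
c1, c2 = 3, 5  # example erasure-coding coefficients
coded = [(c1 * a + c2 * b) % P for a, b in zip(f1, f2)]
assert fingerprint(coded, r) == (c1 * fingerprint(f1, r) +
                                 c2 * fingerprint(f2, r)) % P
```

In the paper's setting this lets each fragment be verified independently against a specific block: the verifier only needs the small fingerprints, not the fragments' full contents.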
How to Build a Hash Function from any Collision-Resistant Function, 2007
Cited by 11 (3 self)
Recent collision-finding attacks against hash functions such as MD5 and SHA-1 motivate the use of provably collision-resistant (CR) functions in their place. Finding a collision in a provably CR function implies the ability to solve some hard problem (e.g., factoring). Unfortunately, existing provably CR functions make poor replacements for hash functions, as they fail to deliver behaviors demanded by practical use. In particular, they are easily distinguished from a random oracle. We initiate an investigation into building hash functions from provably CR functions. As a method for achieving this, we present the Mix-Compress-Mix (MCM) construction; it envelops any provably CR function H (with suitable regularity properties) between two injective “mixing” stages. The MCM construction simultaneously enjoys (1) provable collision resistance in the standard model, and (2) indifferentiability from a monolithic random oracle when the mixing stages themselves are indifferentiable from a random oracle that observes injectivity. We instantiate our new design approach by specifying a block-cipher-based construction that …