Results 21–30 of 360
SWIFFT: A Modest Proposal for FFT Hashing
Abstract
Cited by 52 (17 self)
We propose SWIFFT, a collection of compression functions that are highly parallelizable and admit very efficient implementations on modern microprocessors. The main technique underlying our functions is a novel use of the Fast Fourier Transform (FFT) to achieve “diffusion,” together with a linear combination to achieve compression and “confusion.” We provide a detailed security analysis of concrete instantiations, and give a high-performance software implementation that exploits the inherent parallelism of the FFT algorithm. The throughput of our implementation is competitive with that of SHA-256, with additional parallelism yet to be exploited. Our functions are set apart from prior proposals (having comparable efficiency) by a supporting asymptotic security proof: it can be formally proved that finding a collision in a randomly chosen function from the family (with noticeable probability) is at least as hard as finding short vectors in cyclic/ideal lattices in the worst case.
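The "FFT for diffusion, linear combination for confusion" structure can be pictured in a few lines. The sketch below uses the modulus p = 257 and transform length n = 64 that the concrete instantiations are built around, but the naive O(n²) transform and the randomly chosen coefficient key are purely illustrative; this is a structural toy, not the authors' optimized implementation.

```python
# Toy sketch of a SWIFFT-style compression step (structural illustration
# only; parameter names and the naive transform are this sketch's choices).
P = 257                          # Fermat prime, so Z_257* is cyclic of order 256
N = 64                           # 64 divides 256, so a 64th root of unity exists
OMEGA = pow(3, (P - 1) // N, P)  # 3 is a primitive root mod 257

def ntt(block):
    """Naive number-theoretic transform over Z_257 (the "diffusion" step)."""
    assert len(block) == N
    return [sum(x * pow(OMEGA, i * j, P) for i, x in enumerate(block)) % P
            for j in range(N)]

def compress(blocks, key):
    """Random linear combination of transformed blocks ("confusion" and
    compression). `key` holds one coefficient vector per input block; the
    security argument is about a randomly chosen key from the family."""
    out = [0] * N
    for block, coeffs in zip(blocks, key):
        t = ntt(block)
        for j in range(N):
            out[j] = (out[j] + coeffs[j] * t[j]) % P
    return out
```

Note the linearity over Z_257: the whole function is a linear map of the input blocks once the key is fixed, which is exactly what makes the worst-case lattice reduction argument possible.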
Accountable Certificate Management Using Undeniable Attestations
 COMPUTER AND COMMUNICATIONS SECURITY
, 2000
Abstract
Cited by 48 (3 self)
This paper initiates a study of accountable certificate management methods, necessary to support long-term authenticity of digital documents. Our main contribution is a model for accountable certificate management, where clients receive attestations confirming inclusion/removal of their certificates in/from the database of valid certificates. We explain why accountability depends on the inability of third parties to create contradictory attestations. After that we define an undeniable attester as a primitive that provides efficient attestation creation, publishing, and verification, so that it is intractable to create contradictory attestations. We introduce authenticated search trees and build an efficient undeniable attester upon them. The proposed system is the first accountable long-term certificate management system. Moreover, authenticated search trees can be used in many security-critical applications instead of (sorted) hash trees to reduce trust in the authorities, without a decrease in efficiency. Therefore, the undeniable attester promises to be a very useful cryptographic primitive with a wide range of applications.
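The core idea of an authenticated search tree, each node committing to its own key plus both subtree digests so that a single search path proves either inclusion or absence, can be sketched as follows. The encoding, helper names, and SHA-256 choice here are hypothetical; a full attester would also verify key ordering along the path.

```python
import hashlib

def h(*parts):
    """Digest helper (SHA-256 and repr() encoding are illustrative choices)."""
    m = hashlib.sha256()
    for p in parts:
        m.update(repr(p).encode())
    return m.hexdigest()

EMPTY = h("empty")   # digest of an empty subtree

def build(keys):
    """Build over a sorted key list; returns (tree, root digest). Each node
    commits to its key AND its two subtree digests, unlike a plain hash tree."""
    if not keys:
        return None, EMPTY
    mid = len(keys) // 2
    left, lh = build(keys[:mid])
    right, rh = build(keys[mid + 1:])
    return (keys[mid], left, right, lh, rh), h(lh, keys[mid], rh)

def prove(node, key):
    """Walk the search path, recording (branch key, digest of untaken side)."""
    path = []
    while node is not None:
        k, left, right, lh, rh = node
        if key == k:
            return path, (lh, rh)        # present: return the node's digests
        if key < k:
            path.append((k, rh, "L")); node = left
        else:
            path.append((k, lh, "R")); node = right
    return path, None                    # absent: search ended at an empty slot

def verify(root, key, path, found):
    """Refold the path bottom-up and compare against the published root."""
    cur = h(found[0], key, found[1]) if found else EMPTY
    for k, sib, side in reversed(path):
        cur = h(cur, k, sib) if side == "L" else h(sib, k, cur)
    return cur == root
```

The same path format serves as both the inclusion attestation (search terminates at the key) and the removal/absence attestation (search terminates at an empty subtree), which is what makes contradictory attestations intractable to forge.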
Randomness Extraction and Key Derivation Using the CBC, Cascade and HMAC Modes
 In Franklin [14
Cited by 48 (5 self)
Abstract. We study the suitability of common pseudorandomness modes associated with cryptographic hash functions and block ciphers (CBC-MAC, Cascade and HMAC) for the task of “randomness extraction”, namely, the derivation of keying material from semi-secret and/or semi-random sources. Important applications for such extractors include the derivation of strong cryptographic keys from non-uniform sources of randomness (for example, to extract a seed for a pseudorandom generator from a weak source of physical or digital noise), and the derivation of pseudorandom keys from a Diffie-Hellman value. Extractors are closely related in their applications to pseudorandom functions, and thus it is attractive to (re)use the common pseudorandom modes as randomness extractors. Yet the crucial difference between pseudorandom generation and randomness extraction is that the former uses random secret keys while the latter uses random but known keys. We show that under a variety of assumptions on the underlying primitives (block ciphers and compression functions), ranging from ideal randomness assumptions to realistic universal-hashing properties, these modes induce good extractors. Hence, these schemes represent a more practical alternative to combinatorial extractors (which are seldom used in practice), and a better-analyzed alternative to the common practice of using SHA-1 or MD5 (as a single unkeyed function) for randomness extraction. In particular, our results serve to validate the method of key extraction and key derivation from Diffie-Hellman values used in the IKE (IPsec’s Key Exchange) protocol.
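The extract-then-expand flow the abstract describes (HMAC keyed with a random but public value as the extractor, followed by pseudorandom expansion) can be sketched with the standard library. This mirrors the pattern later standardized as HKDF; it is an illustration of the setting, not the paper's formal construction.

```python
import hmac
import hashlib

def extract(salt: bytes, source_material: bytes) -> bytes:
    """Randomness extraction via HMAC. The key (salt) is random but may be
    PUBLIC: this is the extractor setting, as opposed to pseudorandom
    generation, where the key must stay secret."""
    return hmac.new(salt, source_material, hashlib.sha256).digest()

def expand(prk: bytes, info: bytes, length: int) -> bytes:
    """Counter-mode expansion of the extracted pseudorandom key
    (HKDF-style chaining; details here are an illustrative choice)."""
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]
```

A Diffie-Hellman shared value would be fed in as `source_material`, with the salt exchanged in the clear, exactly the IKE-style key derivation the paper validates.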
The PHOTON Family of Lightweight Hash Functions
 CRYPTO, volume 6841 of LNCS
, 2011
Cited by 48 (8 self)
Abstract. RFID security is currently one of the major challenges cryptography has to face, often addressed by protocols that assume an on-tag hash function is available. In this article we present the PHOTON lightweight hash-function family, available in many different flavors and suitable for extremely constrained devices such as passive RFID tags. Our proposal uses a sponge-like construction as domain extension algorithm and an AES-like primitive as internal unkeyed permutation. This allows us to obtain the most compact hash function known so far (about 1120 GE for 64-bit collision resistance security), reaching areas very close to the theoretical optimum (derived from the minimal internal state memory size). Moreover, the speed achieved by PHOTON also compares quite favorably to its competitors. This is mostly due to the fact that, unlike previously proposed schemes, our proposal is very simple to analyze, and one can derive tight AES-like bounds on the number of active S-boxes. This kind of AES-like primitive is usually not well suited for ultra-constrained environments, but we describe in this paper a new method for generating the column mixing layer in a serial way, drastically lowering the area required. Finally, we slightly extend the sponge framework in order to offer interesting trade-offs between speed and preimage security for small messages, the classical use case in hardware.
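The sponge domain-extension wiring referred to above is easy to show generically. In the sketch below, the rate/capacity split, padding rule, and the stand-in permutation are all illustrative choices; truncated SHA-256 is not a permutation (and is nothing like PHOTON's AES-like primitive), it merely shows the absorb/squeeze structure.

```python
import hashlib

RATE, CAPACITY = 4, 28        # toy byte sizes; PHOTON's actual flavors differ

def perm(state: bytes) -> bytes:
    """Stand-in for the internal unkeyed permutation (illustration only)."""
    return hashlib.sha256(state).digest()[:RATE + CAPACITY]

def sponge_hash(msg: bytes, out_len: int) -> bytes:
    # simple 10*-style padding up to a multiple of the rate
    msg = msg + b"\x80"
    msg += b"\x00" * (-len(msg) % RATE)
    state = bytes(RATE + CAPACITY)
    # absorbing phase: XOR each block into the rate part, then permute
    for i in range(0, len(msg), RATE):
        block = msg[i:i + RATE] + bytes(CAPACITY)
        state = bytes(a ^ b for a, b in zip(state, block))
        state = perm(state)
    # squeezing phase: emit rate-sized chunks, permuting in between
    out = b""
    while len(out) < out_len:
        out += state[:RATE]
        state = perm(state)
    return out[:out_len]
```

The capacity bytes are never directly touched by message blocks or output, which is where a sponge's security margin lives; PHOTON's extension of this framework trades rate against capacity during squeezing.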
A composition theorem for universal oneway hash functions
 In Eurocrypt ’00
, 2000
Cited by 47 (5 self)
Abstract. In this paper we present a new scheme for constructing universal one-way hash functions that hash arbitrarily long messages out of universal one-way hash functions that hash fixed-length messages. The new construction is extremely simple and is also very efficient, yielding shorter keys than previously proposed composition constructions.
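One way to picture such a composition is the XOR-mask chaining associated with this line of work: iterate the fixed-length function over message blocks, masking the chaining value with one of only logarithmically many key masks chosen by the position's 2-adic valuation. The schedule and the stand-in primitive below (truncated SHA-256 for the keyed fixed-length UOWHF) are illustrative, not the paper's exact scheme.

```python
import hashlib

def nu2(i: int) -> int:
    """Exponent of the largest power of two dividing i (mask index)."""
    return (i & -i).bit_length() - 1

def compose(key: bytes, masks, iv: bytes, blocks) -> bytes:
    """Hash arbitrarily many blocks with a fixed-length primitive, XOR-masking
    the chaining value per block. Only floor(log2(#blocks)) + 1 masks are
    ever used, which is where the short keys come from. Truncated SHA-256
    stands in for a keyed fixed-length UOWHF (illustration only)."""
    def f(chain: bytes, block: bytes) -> bytes:
        return hashlib.sha256(key + chain + block).digest()[:16]
    c = iv
    for i, m in enumerate(blocks, start=1):
        mask = masks[nu2(i)]
        c = f(bytes(a ^ b for a, b in zip(c, mask)), m)
    return c
```

Contrast with a naive tree composition, whose key grows with the tree depth times the per-level key size: here the key is the primitive's key plus the small mask schedule.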
Cryptographic Hash Functions: A Survey
, 1995
Abstract
Cited by 46 (7 self)
This paper gives a survey of cryptographic hash functions. It gives an overview of all types of hash functions and reviews design principles and possible methods of attack. It also focuses on keyed hash functions and provides the applications, requirements, and constructions of keyed hash functions.
On the Need for Multipermutations: Cryptanalysis of MD4 and SAFER
, 1994
Abstract
Cited by 44 (2 self)
Cryptographic primitives are usually based on a network with some gates. In [SV94], it is claimed that all gates should be multipermutations. In this paper, we investigate a few combinatorial properties of multipermutations. We argue that gates which fail to be multipermutations can open the way to unsuspected attacks. We illustrate this statement with two examples. First, we show how to construct collisions for MD4 restricted to its first two rounds. This allows one to forge digests close to each other using the full compression function of MD4. Second, we show that some generalizations of SAFER are subject to an attack faster than exhaustive search in 6.1% of cases. This attack can be implemented if we decrease the number of rounds from 6 to 4. In [SV94], multipermutations are introduced as a formalization of perfect diffusion. The aim of this paper is to show that the concept of multipermutation is a basic tool in the design of dedicated cryptographic functions, as functions that do not rea...
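For small alphabets the multipermutation property can be checked exhaustively. Following Vaudenay's formulation, a (2,2)-multipermutation is exactly a function whose input/output 4-tuples form a code of minimum Hamming distance 3, i.e. two distinct inputs can agree with each other on at most one of the four coordinates. The example gates below are this sketch's own illustrations.

```python
from itertools import product

def is_multipermutation(f, alphabet):
    """Brute-force check of the (2,2)-multipermutation property: the words
    (x, y, u, v) with (u, v) = f(x, y) must pairwise differ in at least
    three of their four coordinates (minimum distance 3 over length 4)."""
    words = [(x, y, *f(x, y)) for x, y in product(alphabet, repeat=2)]
    for i in range(len(words)):
        for j in range(i + 1, len(words)):
            if sum(a != b for a, b in zip(words[i], words[j])) < 3:
                return False
    return True
```

For a 2×2 linear gate over a prime field this coincides with the matrix being MDS: all entries and the determinant nonzero. A gate like (x, y) ↦ (x XOR y, x), whose second output ignores y, fails the check, which is exactly the kind of non-multipermutation gate the paper exploits.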
Herding hash functions and the Nostradamus attack
 of Lecture Notes in Computer Science
, 2006
Cited by 43 (6 self)
Abstract. In this paper, we develop a new attack on Damgård-Merkle hash functions, called the herding attack, in which an attacker who can find many collisions on the hash function by brute force can first provide the hash of a message, and later “herd” any given starting part of a message to that hash value by the choice of an appropriate suffix. We focus on a property which hash functions should have, Chosen Target Forced Prefix (CTFP) preimage resistance, and show the distinction between Damgård-Merkle construction hashes and random oracles with respect to this property. We describe a number of ways that violation of this property can be used in arguably practical attacks on real-world applications of hash functions. An important lesson from these results is that hash functions susceptible to collision-finding attacks, especially brute-force collision-finding attacks, cannot in general be used to prove knowledge of a secret value.
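The attack structure (precompute a "diamond" of collisions converging to one committed digest, then link any later prefix into it) can be demonstrated end to end on a deliberately weak toy hash. Everything below is a toy: the 16-bit chaining state, block sizes, and search bounds are arbitrary illustrative choices, and Merkle-Damgård length padding is ignored; real costs scale as 2^(n/2).

```python
import hashlib

def f(state: bytes, block: bytes) -> bytes:
    """Toy 16-bit compression function, tiny on purpose so brute-force
    collision search below finishes instantly."""
    return hashlib.sha256(state + block).digest()[:2]

def collide(s1: bytes, s2: bytes):
    """Find blocks b1, b2 with f(s1, b1) == f(s2, b2) by brute force."""
    seen = {f(s1, i.to_bytes(2, "big")): i.to_bytes(2, "big")
            for i in range(4096)}
    for i in range(4096):
        b2 = i.to_bytes(2, "big")
        t = f(s2, b2)
        if t in seen:
            return seen[t], b2, t
    raise RuntimeError("unlucky: no collision in the search space")

def build_diamond(leaves):
    """Collapse the leaf states pairwise down to one committed target state,
    remembering for each state the block that moves it one level up."""
    step, level = {}, list(leaves)
    while len(level) > 1:
        nxt = []
        for a, b in zip(level[::2], level[1::2]):
            ba, bb, t = collide(a, b)
            step[a], step[b] = (ba, t), (bb, t)
            nxt.append(t)
        level = nxt
    return step, level[0]

def herd(prefix_state, leaves, step, target):
    """Find a linking block from an arbitrary prefix into some diamond leaf,
    then read off the precomputed path up to the committed target."""
    for i in range(1 << 18):
        link = i.to_bytes(3, "big")
        s = f(prefix_state, link)
        if s in leaves:
            suffix = [link]
            while s != target:
                blk, s = step[s]
                suffix.append(blk)
            return suffix
    raise RuntimeError("unlucky: no linking block found")
```

The commitment (`target`) is published before the prefix is known; the later work is only the linking search, which is what makes the "Nostradamus" framing possible.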
On the compressibility of NP instances and cryptographic applications
 In Electronic Colloquium on Computational Complexity (ECCC
, 2006
Abstract
Cited by 39 (1 self)
We initiate the study of compression that preserves the solution to an instance of a problem rather than preserving the instance itself. Our focus is on the compressibility of NP decision problems. We consider NP problems that have long instances but relatively short witnesses. The question is: can one efficiently compress an instance and store a shorter representation that maintains the information of whether the original input is in the language or not? We want the length of the compressed instance to be polynomial in the length of the witness rather than the length of the original input. Such compression enables one to succinctly store instances until a future setting allows solving them, either via a technological or algorithmic breakthrough or simply until enough time has elapsed. We give a new classification of NP with respect to compression. This classification forms a stratification of NP that we call the VC hierarchy. The hierarchy is based on a new type of reduction called W-reduction, and there are compression-complete problems for each class. Our motivation for studying this issue stems from the vast cryptographic implications compressibility has. For example, we say that SAT is compressible if there exists a polynomial p(·, ·) so that given a formula consisting of m clauses over n variables it is possible to come up with an equivalent (w.r.t. satisfiability) formula of size at most p(n, log m). Given a compression algorithm for SAT, we provide a construction of collision-resistant hash functions from any one-way function. This task was shown to be impossible via black-box reductions [57], and indeed the construction presented is inherently non-black-box. Another application of SAT compressibility is a cryptanalytic result concerning the limitation of everlasting security in the bounded storage model when mixed with (time) complexity based cryptography. In addition, we study an approach to constructing an Oblivious Transfer protocol from any one-way function. This approach is based on compression for SAT that also has a property that we call witness retrievability. However, we manage to prove severe limitations on the ability to achieve witness-retrievable compression of SAT.
Design principles for iterated hash functions
 Cryptology ePrint Archive
, 2004
Abstract
Cited by 34 (1 self)
This paper deals with the security of iterated hash functions against generic attacks, such as Joux’s multicollision attacks from Crypto ’04 [6]. The core idea is to increase the size of the internal state of an n-bit hash function to w > n bits. Variations of this core idea allow the use of a compression function with n output bits, even if the compression function itself is based on a block cipher. In a formal model, it is shown that these modifications quantifiably improve the security of iterated hash functions against generic attacks.
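The wide-internal-state idea can be sketched directly. Below, a 512-bit chaining value is iterated and only compressed to a 256-bit digest in a final output transform, so generic internal collisions cost on the order of 2^(w/2) = 2^256 rather than 2^128. The block size, padding, and the use of SHA-512 as a stand-in compression function are illustrative choices, not the paper's concrete design.

```python
import hashlib

BLOCK = 64   # bytes per message block (illustrative)

def wide_pipe_hash(msg: bytes) -> bytes:
    """Wide-pipe sketch: iterate with a w = 512-bit chaining value, then
    derive an n = 256-bit digest via a final output transformation, which
    is what blunts Joux-style generic multicollision attacks."""
    bitlen = (8 * len(msg)).to_bytes(8, "big")
    padded = msg + b"\x80"
    padded += b"\x00" * ((-len(padded) - 8) % BLOCK) + bitlen
    state = bytes(64)                                  # 512-bit internal state
    for i in range(0, len(padded), BLOCK):
        state = hashlib.sha512(state + padded[i:i + BLOCK]).digest()
    # output transformation: compress the wide state down to n bits
    return hashlib.sha512(b"finalize" + state).digest()[:32]
```

Joux's observation is that k successive internal collisions yield 2^k colliding messages for the cost of k collision searches; widening the pipe to w > 2n makes each internal collision search more expensive than a generic attack on the n-bit output, which is the quantifiable improvement the paper formalizes.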