Results 11-20 of 35
The Classification of Hash Functions
, 1993
Abstract

Cited by 24 (3 self)
When we ask what makes a hash function `good', we usually get an answer which includes collision freedom as the main (if not sole) desideratum. However, we show here that given any collision-free function, we can derive others which are also collision-free, but cryptographically useless. This explains why researchers have not managed to find many interesting consequences of this property. We also prove Okamoto's conjecture that correlation freedom is strictly stronger than collision freedom. We go on to show that there are actually rather many properties which hash functions may need. Hash functions for use with RSA must be multiplication free, in the sense that one cannot find X, Y and Z such that h(X)h(Y) = h(Z); and more complex requirements hold for other signature schemes. Universal principles can be proposed from which all the freedom properties follow, but like most theoretical principles, they do not seem to give much value to a designer; at the practical level, the main imp...
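The multiplication-freedom requirement can be made concrete with a toy example of my own (not from the paper): the "textbook RSA" map h(x) = x^e mod n is multiplicative, so the forbidden relation h(X)h(Y) = h(Z) is trivially satisfiable by taking Z = XY mod n.

```python
# Toy illustration (hypothetical parameters): h(x) = x^e mod n is NOT
# multiplication-free, because (X^e)(Y^e) = (XY)^e (mod n).
n = 3233          # toy RSA modulus, 61 * 53
e = 17            # toy public exponent

def h(x):
    return pow(x, e, n)

X, Y = 42, 99
Z = (X * Y) % n   # the attacker's forged third value
assert (h(X) * h(Y)) % n == h(Z)
print("multiplicative relation holds for Z =", Z)
```

This algebraic structure is exactly what a hash function used with RSA signatures must destroy.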
Liability and Computer Security: Nine Principles
 in Computer Security - ESORICS 94, Springer LNCS v 875, pp 231-245
Abstract

Cited by 24 (7 self)
The conventional wisdom is that security priorities should be set by risk analysis. However, reality is subtly different: many computer security systems are at least as much about shedding liability as about minimising risk. Banks use computer security mechanisms to transfer liability to their customers; companies use them to transfer liability to their insurers, or (via the public prosecutor) to the taxpayer; and they are also used to shift the blame to other departments ("we did everything that GCHQ/the internal auditors told us to"). We derive nine principles which might help designers avoid the most common pitfalls. Introduction In the conventional model of technology, there is a smooth progression from research through development and engineering to a product. After this is fielded, the experience gained from its use provides feedback to the research team, and helps drive the next generation of products: Research -> Development -> Engineering -> Product -> ... This cyc...
Paving The Road To Network Security Or The Value Of Small Cobblestones
 In Proceedings of the 1994 Internet Society Symposium on Network and Distributed System Security
, 1994
Abstract

Cited by 19 (7 self)
Software subsystems that implement cryptographic security features can be built from small modules using uniform interfaces. The methods demonstrated in this paper illustrate how configuration flexibility can be achieved and how complex services can be constructed, all using the same building block modules. These allow the configuration process to be independent of algorithm details, while the algorithms used in the subsystem are obvious. May 23, 1994. Department of Computer Science, The University of Arizona, Tucson, AZ 85721. This work supported in part by the National Computer Security Center Grant MDA904-92-C-5151. This paper appears in ISOC '94. Authors' email addresses are {ho, sean, rcs, dcs}@cs.arizona.edu. 1 INTRODUCTION Adding security to Internet protocols is a most worthy goal, one that is simultaneously easy and hard. It is often easy because there are many good ideas and existing approaches; it is hard because it is difficult to modify existing protocols or add new ones ac...
Optimal Tree-based One-time Digital Signature Schemes
 In STACS ’96: Proceedings of the 13th Annual Symposium on Theoretical Aspects of Computer Science
, 1996
Abstract

Cited by 19 (1 self)
A minimal cutset of a tree directed from the leaves to the root is a minimal set of vertices such that every path from a leaf to the root meets at least one of these vertices. An order relation on the set of minimal cutsets can be defined: U ≤ V if and only if every vertex of U is on the path from some vertex in V to the root. Motivated by the design of efficient cryptographic digital signature schemes, the problem of constructing trees with a large number of pairwise incomparable minimal cutsets or, equivalently, with a large antichain in the poset of minimal cutsets, is considered. Keywords: Cryptography, digital signature schemes, trees, partially ordered sets. 1 Introduction We consider trees directed from the leaves to the root where every vertex has at most two predecessors. In this paper, a cutset of such a tree T is defined as a set of vertices which contains at least one vertex of every path from a leaf to the root. A cutset is minimal when it contains exactly one vertex of...
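The cutset definitions above translate directly into a short checker. This sketch is my own (names and representation are not from the paper): the tree is given as a child-to-parent map, and a minimal cutset is a vertex set met exactly once by every leaf-to-root path.

```python
# Sketch: verify the minimal-cutset property for a tree given as a
# child -> parent map (edges directed from the leaves to the root).
def root_path(parent, leaf):
    """Vertices on the path from `leaf` up to the root, inclusive."""
    path = [leaf]
    while path[-1] in parent:
        path.append(parent[path[-1]])
    return path

def is_minimal_cutset(parent, leaves, cut):
    # Every leaf-to-root path must meet `cut` exactly once.
    return all(sum(v in cut for v in root_path(parent, leaf)) == 1
               for leaf in leaves)

# Example tree (edges point toward the root r):
#     a   b
#      \ /
#       d   c
#        \ /
#         r
parent = {"a": "d", "b": "d", "d": "r", "c": "r"}
leaves = ["a", "b", "c"]
assert is_minimal_cutset(parent, leaves, {"d", "c"})
assert not is_minimal_cutset(parent, leaves, {"d"})            # misses c's path
assert not is_minimal_cutset(parent, leaves, {"a", "d", "c"})  # hits a's path twice
```

Signature schemes built on such trees pick one cutset per message, which is why large antichains of pairwise incomparable minimal cutsets matter.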
How to Forge DES-Encrypted Messages in 2^28 Steps
, 1996
Abstract

Cited by 17 (0 self)
In this paper we suggest key-collision attacks, and show that the theoretic strength of a cipher cannot exceed the square root of the size of the key space. As a result, in some circumstances, some DES keys can be recovered while they are still in use, and these keys can then be used to forge messages: in particular, one key of DES can be recovered with complexity 2^28, and one key of (three-key) triple-DES can be recovered with complexity 2^84.
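The square-root bound comes from a birthday-style argument, which the following toy sketch (mine, not the paper's actual attack) illustrates on a hypothetical 16-bit cipher: tabulating encryptions of one fixed plaintext under a few times 2^(k/2) trial keys collides with the victims' keys long before exhaustive search over 2^k.

```python
# Toy key-collision sketch on a hypothetical 16-bit "cipher" built from
# truncated SHA-256 (a stand-in, not a real block cipher).
import hashlib, random

KEY_BITS = 16                        # toy key space of 2^16

def E(key, block):
    return hashlib.sha256(key.to_bytes(2, "big") + block).digest()[:8]

P0 = b"fixed-pt"                     # one fixed plaintext everyone encrypts
random.seed(1)

# Victims encrypt P0 under 2^8 unknown keys; attacker sees the ciphertexts.
victim_keys = [random.randrange(2 ** KEY_BITS) for _ in range(2 ** 8)]
observed = {E(k, P0) for k in victim_keys}

# Attacker tabulates E_k(P0) for 2^13 trial keys -- far below the 2^16
# cost of exhaustive search -- and any match recovers a live victim key.
trial_keys = random.sample(range(2 ** KEY_BITS), 2 ** 13)
recovered = [k for k in trial_keys if E(k, P0) in observed]
print("recovered", len(recovered), "victim key(s)")
```

With ~2^8 observed encryptions and ~2^13 trial encryptions, a collision is overwhelmingly likely, mirroring (at toy scale) how 2^28 steps can suffice against DES's 2^56 key space.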
Digital Payment Systems Enabling Security and Unobservability
, 1989
Abstract

Cited by 16 (0 self)
In present-day cashless payment systems, the banks and (by installing a Trojan Horse) even the manufacturers of the computer equipment used could easily observe who pays what amount to whom and when. With the increasing digitization of these systems, e.g. point-of-sale terminals and home banking, the amount of transaction data and their computerization drastically increases. Therefore these payment systems become completely unacceptable, since compiling dossiers on the lifestyle and whereabouts of all clients will become easy. We describe the digital payment systems enabling unobservability of clients and arrange them in a general model to compare their different degrees of unobservability and their different levels of security. Since no single system has all desired features, we propose a suitable synthesis. Keywords: Cashless payment, digital payment systems enabling unobservability of clients, linkability of actions, anonymous numbered accounts, tamper-resistant devices, blindly sig...
Cryptography and Evidence
, 1997
Abstract

Cited by 14 (0 self)
The invention of public-key cryptography led to the notion that cryptographically protected messages could be used as evidence to convince an impartial adjudicator that a disputed event had in fact occurred. Information stored in a computer is easily modified, and so records can be falsified or retrospectively modified. Cryptographic protection prevents modification, and it is hoped that this will make cryptographically protected data acceptable as evidence. This usage of cryptography to render an event undeniable has become known as non-repudiation. This dissertation is an enquiry into the fundamental limitations of this application of cryptography, and the disadvantages of the techniques which are currently in use. In the course of this investigation I consider the converse problem, of ensuring that an instance of communication between computer systems leaves behind no unequivocal evidence of its having taken place. Features of communications protocols that were seen as defects from the standpoint of non-repudiation can be seen as benefits from the standpoint of this converse problem, which I call "plausible deniability".
On-Line Ciphers and the Hash-CBC Constructions
 Advances in Cryptology - CRYPTO 2000, Lecture Notes in Computer Science
, 2001
Abstract

Cited by 14 (2 self)
We initiate a study of on-line ciphers. These are ciphers that can take input plaintexts of large and varying lengths and will output the i-th block of the ciphertext after having processed only the first i blocks of the plaintext. Such ciphers permit length-preserving encryption of a data stream with only a single pass through the data. We provide security definitions for this primitive and study its basic properties. We then provide attacks on some possible candidates, including CBC with fixed IV. We then provide two constructions, HCBC1 and HCBC2, based on a given block cipher E and a family of computationally AXU functions. HCBC1 is proven secure against chosen-plaintext attacks assuming that E is a PRP secure against chosen-plaintext attacks, while HCBC2 is proven secure against chosen-ciphertext attacks assuming that E is a PRP secure against chosen-ciphertext attacks.
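The on-line property itself can be sketched in a few lines. This is my own toy code, not the paper's HCBC constructions: a CBC-style chainer emits ciphertext block i after seeing only the first i plaintext blocks. Note the abstract's warning applies here too: CBC with a fixed IV is not a secure on-line cipher, so this illustrates only the streaming interface.

```python
# Minimal sketch of the on-line (single-pass, length-preserving) interface.
BLOCK = 8
M = 0x9E3779B97F4A7C15   # odd constant: multiplication mod 2^64 is invertible

def toy_perm(block):
    # Toy invertible 64-bit permutation standing in for a block cipher E.
    # It is NOT secure; it only makes the example length-preserving.
    n = int.from_bytes(block, "big")
    return ((n * M) % 2 ** 64).to_bytes(BLOCK, "big")

class OnlineCBC:
    """CBC-style chainer: block i of ciphertext depends only on blocks 1..i."""
    def __init__(self, iv=bytes(BLOCK)):
        self.prev = iv               # chaining value (fixed IV: insecure!)

    def next_block(self, pt_block):
        assert len(pt_block) == BLOCK
        ct = toy_perm(bytes(a ^ b for a, b in zip(pt_block, self.prev)))
        self.prev = ct
        return ct

enc = OnlineCBC()
for i, pb in enumerate([b"block-01", b"block-02", b"block-03"], 1):
    ct = enc.next_block(pb)          # available after only i input blocks
    print(f"ciphertext block {i} ready after {i} plaintext blocks: {ct.hex()}")
```

With a fixed IV, two streams with the same first block produce the same first ciphertext block, which is exactly the kind of leakage HCBC1/HCBC2 are designed to avoid.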
More Efficient Software Implementations of (Generalized) DES
, 1990
Abstract

Cited by 9 (1 self)
By preserving the macro structure of the Data Encryption Standard (DES), but by allowing the user to choose 1. 16 × 48 independent key bits instead of generating them all using only 56 key bits, 2. arbitrary substitutions S1, ..., S8, 3. arbitrary permutations IP and P, and 4. an arbitrary expanding permutation E, we obtain a very general and presumably much stronger cipher called generalized DES, or GDES for short. A cipher having the first three extensions is called GDES with non-arbitrary E. We choose, in an unorthodox way, from some well known equivalent representations of GDES and some well suited table combinations and implementations. Concatenations of substitutions and permutations are precomputed and tabulated. Since direct tabulation of e.g. a permutation of 32 bits requires 2^32 entries of 4 bytes each, which clearly exceeds the main memories of today, the big table is split into smaller ones that permute disjoint and compact parts of the input bits at the...
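The table-splitting trick at the end of the abstract can be sketched as follows (my own toy code, with a hypothetical permutation, not the paper's tables): instead of one infeasible 2^32-entry table, precompute four 256-entry tables, one per input byte, and OR their results together.

```python
# Split-table implementation of a 32-bit bit-permutation.
PERM = [(i * 7) % 32 for i in range(32)]   # toy permutation: bit i -> bit (7i mod 32)

def permute_slow(x):
    """Reference bit-by-bit permutation."""
    out = 0
    for i in range(32):
        if x >> i & 1:
            out |= 1 << PERM[i]
    return out

# Precompute: TABLES[b][v] = permutation of value v placed in input byte b.
# Four tables of 256 entries replace a single 2^32-entry table.
TABLES = [[permute_slow(v << (8 * b)) for v in range(256)] for b in range(4)]

def permute_fast(x):
    return (TABLES[0][x & 0xFF] | TABLES[1][x >> 8 & 0xFF]
            | TABLES[2][x >> 16 & 0xFF] | TABLES[3][x >> 24 & 0xFF])

x = 0xDEADBEEF
assert permute_fast(x) == permute_slow(x)
print(f"{x:08x} -> {permute_fast(x):08x}")
```

The OR works because each table handles a disjoint set of input bits, exactly the "disjoint and compact parts" the abstract describes.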
Data Encryption Standard
 In FIPS PUB 46, Federal Information Processing Standards Publication
, 1977
Abstract

Cited by 6 (0 self)
The Data Encryption Standard (DES) was developed by an IBM team around 1974 and adopted as a national standard in 1977. Since that time, many cryptanalysts have attempted to find shortcuts for breaking the system. In this paper, we examine one such attempt, the method of differential cryptanalysis, published by Biham and Shamir. We show some of the safeguards against differential cryptanalysis that were built into the system from the beginning, with the result that more than 10^15 bytes of chosen plaintext are required for this attack to succeed.