Results 1–10 of 35
A Proposal for a New Block Encryption Standard
, 1991
Abstract

Cited by 161 (3 self)
A new secret-key block cipher is proposed as a candidate for a new encryption standard. In the proposed cipher, the plaintext and the ciphertext are 64-bit blocks, while the secret key is 128 bits long. The cipher is based on the design concept of "mixing operations from different algebraic groups". The cipher structure was chosen to provide confusion and diffusion and to facilitate both hardware and software implementations.
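The "mixing operations from different algebraic groups" concept can be illustrated with the three incompatible 16-bit group operations this cipher (IDEA) combines. The sketch below shows only the primitive operations, not the full cipher or its key schedule:

```python
# The three 16-bit group operations IDEA mixes. 65537 is prime, so the
# nonzero residues (with the all-zero word encoding 2^16) form a
# multiplicative group.

MOD_ADD = 1 << 16          # order of the additive group Z_{2^16}
MOD_MUL = (1 << 16) + 1    # modulus of the multiplicative group, 65537

def xor16(a: int, b: int) -> int:
    """Bitwise XOR: the group (GF(2)^16, +)."""
    return a ^ b

def add16(a: int, b: int) -> int:
    """Addition modulo 2^16."""
    return (a + b) % MOD_ADD

def mul16(a: int, b: int) -> int:
    """Multiplication modulo 2^16 + 1, with 0 encoding 2^16."""
    a = a or MOD_ADD                       # decode 0 as 2^16
    b = b or MOD_ADD
    return (a * b) % MOD_MUL % MOD_ADD     # re-encode 2^16 as 0
```

No two of these operations satisfy a distributive or associative law with each other, which is what makes their combination cryptographically useful.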
A Universal Statistical Test for Random Bit Generators
 Journal of cryptology
, 1992
Abstract

Cited by 68 (0 self)
A new statistical test for random bit generators is presented which, in contrast to presently used statistical tests, is universal in the sense that it can detect any significant deviation of a device's output statistics from the statistics of a truly random bit source when the device can be modeled as an ergodic stationary source with finite memory but arbitrary (unknown) state transition probabilities. The test parameter is closely related to the device's per-bit entropy, which is shown to be the correct quality measure for a secret-key source in a cryptographic application. The test hence measures the cryptographic badness of a device's possible defect. The test is easy to implement and very fast and thus well-suited for practical applications. A sample program listing is provided. Keywords. Randomness, Random bit generator, Statistical test, Entropy, Ergodic stationary source, Exhaustive key search.
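The core idea of the test can be sketched as follows: the average log-distance between repeated occurrences of L-bit blocks estimates the source's per-bit entropy (times L). This is a simplified illustration, not the published program listing; the default L and Q and the handling of unseen blocks are ad hoc choices:

```python
import math

def maurer_statistic(bits, L=4, Q=160):
    """Average log2 of the gap since each L-bit block's previous
    occurrence, over the test blocks following Q initialization blocks
    (simplified sketch of the universal test's statistic)."""
    blocks = [int("".join(map(str, bits[i:i + L])), 2)
              for i in range(0, len(bits) - L + 1, L)]
    K = len(blocks) - Q                          # number of test blocks
    last = {}
    for i in range(Q):                           # initialization phase
        last[blocks[i]] = i
    total = 0.0
    for i in range(Q, Q + K):                    # test phase
        b = blocks[i]
        gap = i - last[b] if b in last else i + 1  # ad hoc for unseen blocks
        total += math.log2(gap)
        last[b] = i
    return total / K
```

A constant (zero-entropy) source scores 0, a period-2L source scores 1, and a truly random source scores close to L, so a low score flags exactly the entropy defect that matters for a secret-key source.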
Software performance of universal hash functions
 In Advances in Cryptology — EUROCRYPT ’99
, 1999
Abstract

Cited by 30 (0 self)
Abstract. This paper compares the parameter sizes and software performance of several recent constructions for universal hash functions: bucket hashing, polynomial hashing, Toeplitz hashing, division hashing, evaluation hashing, and MMH hashing. An objective comparison between these widely varying approaches is achieved by defining constructions that offer a comparable security level. It is also demonstrated how the security of these constructions compares favorably to existing MAC algorithms, the security of which is less understood.
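Polynomial (evaluation) hashing, one of the compared families, can be sketched as Horner evaluation of the message over a prime field at a secret point; the Mersenne-prime modulus below is an illustrative choice, not one taken from the paper:

```python
# Illustrative modulus: a Mersenne prime, chosen so the reduction is cheap.
P = (1 << 61) - 1

def poly_hash(words, key):
    """Treat the message words as polynomial coefficients and evaluate
    at the secret key point mod P, via Horner's rule:
    sum(words[i] * key^(n-1-i)) mod P."""
    acc = 0
    for w in words:
        acc = (acc * key + w) % P
    return acc
```

Two distinct n-word messages agree on at most n-1 keys (the roots of their difference polynomial), so the collision probability over a uniformly random key is at most (n-1)/P; this almost-universal property is what makes such families comparable at a fixed security level.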
The Shannon Cipher System with a Guessing Wiretapper
 IEEE Trans. Inform. Theory
, 1998
Abstract

Cited by 9 (1 self)
The Shannon theory of cipher systems is combined with recent work on guessing values of random variables. The security of encryption systems is measured in terms of moments of the number of guesses needed for the wiretapper to uncover the plaintext given the cryptogram. While the encrypter aims at maximizing the guessing effort, the wiretapper strives to minimize it, e.g., by ordering guesses according to descending order of posterior probabilities of plaintexts given the cryptogram. For a memoryless plaintext source and a given key rate, a single-letter characterization is given for the highest achievable guessing exponent function, that is, the exponential rate of the ρ-th moment of the number of guesses as a function of the plaintext message length. Moreover, we demonstrate asymptotically optimal strategies for both encryption and guessing, which are universal in the sense of being independent of the statistics of the source. The guessing exponent is then investigated as a functi...
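The wiretapper's optimal strategy (guessing in descending order of posterior probability) and the resulting ρ-th guessing moment can be computed directly for a finite distribution; function and variable names here are illustrative:

```python
def guessing_moment(probs, rho=1.0):
    """rho-th moment of the number of guesses, E[G^rho], when candidates
    are tried in descending order of probability (the optimal order):
    sum over rank i of i^rho * p_(i)."""
    ordered = sorted(probs, reverse=True)
    return sum((i + 1) ** rho * p for i, p in enumerate(ordered))
```

For a uniform pair the expected number of guesses is 1.5, and any skew in the posterior lowers it, which is why the encrypter wants the posterior of plaintexts given the cryptogram to be as flat as possible.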
Cryptographic Protocols over Open Distributed Systems: A Taxonomy of Flaws and related Protocol Analysis Tools
 In Proceedings of the 16th International Conference on Computer Safety, Reliability and Security
, 1997
Abstract

Cited by 4 (0 self)
When designing and implementing cryptographic protocols one must avoid a number of possible flaws. In this paper we divide possible flaws, based on the flaw pathology and the corresponding attack method, into elementary protocol flaws, password/key guessing flaws, stale message flaws, parallel session flaws, internal protocol flaws, and cryptosystem flaws. We then outline and comment on different attack-construction and inference-based formal methods, protocol analysis tools, and process integration techniques, and their effectiveness in aiding the cryptographic protocol design process by discovering protocol flaws with regard to the proposed taxonomy. * In Peter Daniel, editor, 16th International Conference on Computer Safety, Reliability and Security: SAFECOMP '97, pages 123–137, York, UK, September 1997. European Workshop on Industrial Computer Systems: TC7, Springer Verlag. + This is a machine-readable rendering of a working paper draft that led to a pub...
Automatic Secret Keys from Reciprocal MIMO Wireless Channels: Measurements and Analysis
 IEEE Transactions on Information Forensics and Security
Abstract

Cited by 4 (0 self)
Abstract—Information theoretic limits for random key generation in multiple-input multiple-output (MIMO) wireless systems exhibiting a reciprocal channel response are investigated experimentally with a new three-node MIMO measurement campaign. As background, simple expressions are presented for the number of available key bits, as well as the number of bits that are secure from a close eavesdropper. Two methods for generating secret keys are analyzed in the context of MIMO channels and their mismatch rate and efficiency are derived. A new wideband indoor MIMO measurement campaign in the 2.51 to 2.59 GHz band is presented, whose purpose is to study the number of available key bits in both line-of-sight and non-line-of-sight environments. Application of the key generation methods to measured propagation channels indicates key generation rates that can be obtained in practice for four-element arrays. Index Terms—Cryptography, encryption, measurement, MIMO, time-varying channels.
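The general idea of reciprocity-based key generation can be sketched minimally: each node quantizes its own channel-gain estimates into bits, and channel reciprocity makes the two bit strings nearly equal, with noise causing the mismatches whose rate the paper analyzes. This sketch does not reproduce the two specific methods studied there; real schemes add guard intervals and information reconciliation:

```python
def quantize(gains, threshold=0.0):
    """Map each channel-gain sample to a key bit by thresholding
    (illustrative; the threshold choice is an assumption)."""
    return [1 if g > threshold else 0 for g in gains]

def mismatch_rate(bits_a, bits_b):
    """Fraction of positions where the two nodes' key bits disagree."""
    return sum(a != b for a, b in zip(bits_a, bits_b)) / len(bits_a)
```

An eavesdropper more than a correlation distance away sees a nearly independent channel, which is what makes the resulting bits secret rather than merely shared.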
On the Shannon cipher system with a capacity-limited key-distribution channel
 IEEE Transactions on Information Theory
, 2006
Abstract

Cited by 4 (1 self)
We consider the Shannon cipher system in a setting where the secret key is delivered to the legitimate receiver via a channel with limited capacity. For this setting, we characterize the achievable region in the space of three figures of merit: the security (measured in terms of the equivocation), the compressibility of the cryptogram, and the distortion associated with the reconstruction of the plaintext source. Although lossy reconstruction of the plaintext does not rule out the option that the (noisy) decryption key would differ, to a certain extent, from the encryption key, we show, nevertheless, that the best strategy is to strive for perfect match between the two keys, by applying reliable channel coding to the key bits, and to control the distortion solely via rate–distortion coding of the plaintext source before the encryption. In this sense, our result has a flavor similar to that of the classical source–channel separation theorem. Some variations and extensions of this model are discussed as well. Index Terms: Shannon cipher system, key distribution, encryption, cryptography, source–channel separation.
A 177 Mbit/s VLSI Implementation of the International Data Encryption Algorithm
 IEEE JOURNAL OF SOLID-STATE CIRCUITS, SPECIAL ISSUE ON THE 1993 CUSTOM INTEGRATED CIRCUITS CO...
, 1993
Abstract

Cited by 3 (0 self)
A VLSI implementation of the International Data Encryption Algorithm is presented. Security considerations led to novel system concepts in chip design, including protection of sensitive information and online failure detection capabilities. BIST was instrumental for reconciling the conflicting requirements of VLSI testability and cryptographic security. The VLSI chip implements data encryption and decryption in a single hardware unit. All important standardized modes of operation of block ciphers, such as ECB, CBC, CFB, OFB and MAC, are supported. In addition, new modes are proposed and implemented to fully exploit the algorithm's inherent parallelism. With a system clock frequency of 25 MHz the device permits a data conversion rate of more than 177 Mbit/s. Therefore, the chip can be applied to online encryption in high-speed networking protocols such as ATM or FDDI.
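One of the supported standardized modes, CBC, can be sketched generically over any 64-bit block cipher; `encrypt_block` stands in for the IDEA primitive, and the toy cipher in the usage below is purely illustrative:

```python
BLOCK = 8  # IDEA's 64-bit block size, in bytes

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_encrypt(encrypt_block, iv: bytes, plaintext: bytes) -> bytes:
    """C_i = E(P_i XOR C_{i-1}), with C_0 = IV (padding not handled)."""
    prev, out = iv, []
    for i in range(0, len(plaintext), BLOCK):
        prev = encrypt_block(xor_bytes(plaintext[i:i + BLOCK], prev))
        out.append(prev)
    return b"".join(out)

def cbc_decrypt(decrypt_block, iv: bytes, ciphertext: bytes) -> bytes:
    """P_i = D(C_i) XOR C_{i-1}."""
    prev, out = iv, []
    for i in range(0, len(ciphertext), BLOCK):
        c = ciphertext[i:i + BLOCK]
        out.append(xor_bytes(decrypt_block(c), prev))
        prev = c
    return b"".join(out)
```

The chaining makes each ciphertext block depend on all preceding plaintext blocks; that serial dependence is precisely what limits parallel hardware, motivating the new modes the chip proposes to exploit the algorithm's parallelism.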
Some Research Issues in a Heterogeneous Terminal and Transport Environment for Multimedia Services
 in Workshop on Adaptive Systems, Intelligent Approaches, Massively Parallel Computing and Emergent Techniques in Signal Processing and Communications
, 1994
Abstract

Cited by 3 (0 self)
Continuous-media (CM) services like voice, audio, video, and animation utilize three primary signal-processing operations (compression, encryption, and error-correction coding) that have a substantial impact on the network architecture. Future networks will be heterogeneous, consisting of combinations of different types of subnets, such as wireless access, the public telephone network, the Internet, and broadband ATM. Even in the distant future, we expect wireless access to a broadband network to be common. We point out a number of other issues relating to the signal-processing aspects of CM services, with particular emphasis on the traffic efficiency of wireless access links, low delay, high subjective quality, and privacy by end-to-end encryption. We point out that popular but simplistic approaches involving the use of transcoding (conversion from one compression standard to another) have a number of undesirable characteristics, among them the realization of a network infrastructure relatively closed to change and inconsistent with privacy. We define a networking framework based on a “medley gateway” between heterogeneous subnets that is open to new services, allows privacy (end-to-end encryption under user control), and good traffic efficiency on all links of the network (through joint source/channel coding as appropriate). The medley gateway also opens up new possibilities for exploiting network characteristics in CM services such as video. A key feature is a substream structure that makes certain critical properties of the source visible within the network, even with encryption. We mention a number of open issues relating to resource allocation in session establishment and the design of medley source and transport elements.