Results 1-10 of 47
A Proposal for a New Block Encryption Standard
, 1991
Abstract

Cited by 189 (3 self)
A new secret-key block cipher is proposed as a candidate for a new encryption standard. In the proposed cipher, the plaintext and the ciphertext are 64-bit blocks, while the secret key is 128 bits long. The cipher is based on the design concept of "mixing operations from different algebraic groups". The cipher structure was chosen to provide confusion and diffusion and to facilitate both hardware and software implementations.
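As an illustration of "mixing operations from different algebraic groups": this design family combines three mutually incompatible group operations on 16-bit words: bitwise XOR, addition modulo 2^16, and multiplication modulo the prime 2^16 + 1 (with the word value 0 encoding 2^16). A minimal sketch of the three operations only, not the cipher itself:

```python
# Three group operations on 16-bit words, as mixed in this cipher family.
# Sketch for illustration; key schedule and round structure are omitted.

MOD_ADD = 1 << 16          # 65536
MOD_MUL = (1 << 16) + 1    # 65537 is prime, so nonzero residues form a group

def xor16(a: int, b: int) -> int:
    # Group 1: bitwise XOR on 16-bit words.
    return a ^ b

def add16(a: int, b: int) -> int:
    # Group 2: addition modulo 2^16.
    return (a + b) % MOD_ADD

def mul16(a: int, b: int) -> int:
    # Group 3: multiplication modulo 2^16 + 1, where the word 0 stands for
    # 2^16 so that all 2^16 word values lie in the multiplicative group.
    aa = a if a != 0 else MOD_ADD
    bb = b if b != 0 else MOD_ADD
    r = (aa * bb) % MOD_MUL
    return r if r != MOD_ADD else 0
```

Because no pair of these operations satisfies a distributive or associative law with the other, alternating them frustrates algebraic simplification of the round function.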
A Universal Statistical Test for Random Bit Generators
 Journal of cryptology
, 1992
Abstract

Cited by 84 (0 self)
A new statistical test for random bit generators is presented which, in contrast to presently used statistical tests, is universal in the sense that it can detect any significant deviation of a device's output statistics from the statistics of a truly random bit source when the device can be modeled as an ergodic stationary source with finite memory but arbitrary (unknown) state transition probabilities. The test parameter is closely related to the device's per-bit entropy, which is shown to be the correct quality measure for a secret-key source in a cryptographic application. The test hence measures the cryptographic badness of a device's possible defect. The test is easy to implement and very fast, and thus well-suited for practical applications. A sample program listing is provided. Keywords: Randomness, random bit generator, statistical test, entropy, ergodic stationary source, exhaustive key search.
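A hedged sketch of the idea behind such a test: the statistic averages the log-distance between repetitions of L-bit blocks, which for long sequences tracks the source's per-bit entropy. The block length, initialization length, and missing-block fallback below are simplifying assumptions for illustration, not the paper's recommended parameters.

```python
import math

def universal_test_statistic(bits: str, L: int = 4, Q: int = 160) -> float:
    """Average log2 distance between repetitions of L-bit blocks.

    The first Q blocks only initialize a last-occurrence table; each
    remaining block contributes log2 of the gap since its previous
    occurrence. For a truly random source this average approaches a
    tabulated constant (about 3.31 for L = 4); markedly lower values
    indicate redundancy, i.e., reduced per-bit entropy.
    """
    blocks = [int(bits[i:i + L], 2) for i in range(0, len(bits) - L + 1, L)]
    last = {}
    for n, b in enumerate(blocks[:Q]):
        last[b] = n
    total, count = 0.0, 0
    for n in range(Q, len(blocks)):
        b = blocks[n]
        if b in last:  # simplification: skip blocks never seen before
            total += math.log2(n - last[b])
            count += 1
        last[b] = n
    return total / count
```

With Q chosen large enough that every L-bit value has already appeared, the skip branch never triggers; the test's analysis prescribes suitable Q and sample sizes for each L.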
Software performance of universal hash functions
 In Advances in Cryptology — EUROCRYPT ’99
, 1999
Abstract

Cited by 32 (0 self)
Abstract. This paper compares the parameter sizes and software performance of several recent constructions for universal hash functions: bucket hashing, polynomial hashing, Toeplitz hashing, division hashing, evaluation hashing, and MMH hashing. An objective comparison between these widely varying approaches is achieved by defining constructions that offer a comparable security level. It is also demonstrated how the security of these constructions compares favorably to existing MAC algorithms, whose security is less well understood.
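For instance, the polynomial-evaluation family compared above fits in a few lines; the field size and word encoding below are illustrative assumptions, not a construction taken from the paper:

```python
# Minimal polynomial-evaluation universal hash over a prime field.
# Illustrative sketch: field size and message encoding are assumptions.

P = (1 << 61) - 1  # a Mersenne prime, convenient for fast reduction

def poly_hash(message_words, key):
    # Horner evaluation: h_k(m) = m_1*k^n + m_2*k^(n-1) + ... + m_n*k (mod P).
    # Two distinct n-word messages collide for at most n keys out of P,
    # so the collision probability is at most n/P (almost-universal).
    acc = 0
    for w in message_words:
        acc = (acc + w) * key % P
    return acc
```

As with all universal hash MACs, the output must still be masked or encrypted with a fresh key per message before it can serve as an authentication tag.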
The Shannon Cipher System with a Guessing Wiretapper
 IEEE Trans. Inform. Theory
, 1998
Abstract

Cited by 25 (1 self)
The Shannon theory of cipher systems is combined with recent work on guessing values of random variables. The security of encryption systems is measured in terms of moments of the number of guesses needed for the wiretapper to uncover the plaintext given the cryptogram. While the encrypter aims at maximizing the guessing effort, the wiretapper strives to minimize it, e.g., by ordering guesses in descending order of posterior probabilities of plaintexts given the cryptogram. For a memoryless plaintext source and a given key rate, a single-letter characterization is given for the highest achievable guessing exponent function, that is, the exponential rate of the ρth moment of the number of guesses as a function of the plaintext message length. Moreover, we demonstrate asymptotically optimal strategies for both encryption and guessing, which are universal in the sense of being independent of the statistics of the source. The guessing exponent is then investigated as a functi...
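The central quantity can be stated compactly (notation follows the standard guessing literature and is an assumption here: G(X^n | Y^n) is the number of guesses the wiretapper needs to uncover the plaintext block X^n given the cryptogram Y^n, and R is the key rate):

```latex
% Guessing exponent: exponential growth rate of the rho-th moment of the
% number of guesses, as a function of the plaintext block length n.
E(\rho, R) \;=\; \lim_{n \to \infty} \frac{1}{n}
  \log \mathbb{E}\!\left[ G\!\left(X^n \mid Y^n\right)^{\rho} \right]
```

The encrypter chooses the cipher to maximize this exponent subject to the key rate R, while the wiretapper's optimal guessing order minimizes it.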
Automatic Secret Keys from Reciprocal MIMO Wireless Channels: Measurements and Analysis
 IEEE Transactions on Information Forensics and Security
Abstract

Cited by 18 (0 self)
Information-theoretic limits for random key generation in multiple-input multiple-output (MIMO) wireless systems exhibiting a reciprocal channel response are investigated experimentally with a new three-node MIMO measurement campaign. As background, simple expressions are presented for the number of available key bits, as well as the number of bits that are secure from a close eavesdropper. Two methods for generating secret keys are analyzed in the context of MIMO channels, and their mismatch rate and efficiency are derived. A new wideband indoor MIMO measurement campaign in the 2.51 to 2.59 GHz band is presented, whose purpose is to study the number of available key bits in both line-of-sight and non-line-of-sight environments. Application of the key generation methods to measured propagation channels indicates key generation rates that can be obtained in practice for four-element arrays. Index Terms: Cryptography, encryption, measurement, MIMO, time-varying channels.
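A toy illustration of the general idea analyzed in such work: both terminals quantize noisy observations of the same reciprocal channel gains against a threshold to obtain candidate key bits, and the fraction of disagreements is the mismatch rate that reconciliation must then remove. The Gaussian fading model, noise level, and median threshold below are assumptions for illustration, not the paper's measured channels or exact methods.

```python
import random

random.seed(1)

def quantize(samples):
    # One bit per sample: above/below the empirical median of the samples.
    med = sorted(samples)[len(samples) // 2]
    return [1 if s > med else 0 for s in samples]

# A reciprocal channel observed at both ends, each with independent
# measurement noise (std 0.05, against unit-variance fading gains).
channel = [random.gauss(0.0, 1.0) for _ in range(64)]
alice_bits = quantize([h + random.gauss(0.0, 0.05) for h in channel])
bob_bits   = quantize([h + random.gauss(0.0, 0.05) for h in channel])

# Fraction of positions where the two candidate keys disagree.
mismatch = sum(a != b for a, b in zip(alice_bits, bob_bits)) / len(alice_bits)
```

Samples near the threshold dominate the mismatch; guard bands, multi-bit quantization, and information reconciliation are the usual remedies.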
Principles of Physical Layer Security in Multiuser Wireless Networks: A Survey
Abstract

Cited by 16 (1 self)
This paper provides a comprehensive review of the domain of physical layer security in multiuser wireless networks. The essential premise of physical layer security is to enable the exchange of confidential messages over a wireless medium in the presence of unauthorized eavesdroppers, without relying on higher-layer encryption. This can be achieved primarily in two ways: without the need for a secret key, by intelligently designing transmit coding strategies, or by exploiting the wireless communication medium to develop secret keys over public channels. The survey begins with an overview of the foundations dating back to the pioneering work of Shannon and Wyner on information-theoretic security. We then describe the evolution of secure transmission strategies from point-to-point channels to multiple-antenna systems, followed by generalizations to multiuser broadcast, multiple-access, interference, and relay networks. Secret-key generation and establishment protocols based on physical layer mechanisms are subsequently covered. Approaches for secrecy based on channel coding design are then examined, along with a description of interdisciplinary approaches based on game theory and stochastic geometry. The associated problem of physical layer message authentication is also briefly introduced. The survey concludes with observations on potential research directions in this area.
On the Shannon cipher system with a capacity-limited key-distribution channel
 IEEE Transactions on Information Theory
, 2006
Abstract

Cited by 9 (1 self)
We consider the Shannon cipher system in a setting where the secret key is delivered to the legitimate receiver via a channel with limited capacity. For this setting, we characterize the achievable region in the space of three figures of merit: the security (measured in terms of the equivocation), the compressibility of the cryptogram, and the distortion associated with the reconstruction of the plaintext source. Although lossy reconstruction of the plaintext does not rule out the option that the (noisy) decryption key would differ, to a certain extent, from the encryption key, we show, nevertheless, that the best strategy is to strive for a perfect match between the two keys, by applying reliable channel coding to the key bits, and to control the distortion solely via rate-distortion coding of the plaintext source before encryption. In this sense, our result has a flavor similar to that of the classical source-channel separation theorem. Some variations and extensions of this model are discussed as well. Index Terms: Shannon cipher system, key distribution, encryption, cryptography, source-channel separation.
Cryptographic Protocols over Open Distributed Systems: A Taxonomy of Flaws and related Protocol Analysis Tools
 In Proceedings of the 16th International Conference on Computer Safety, Reliability and Security
, 1997
Abstract

Cited by 4 (0 self)
When designing and implementing cryptographic protocols, one must avoid a number of possible flaws. In this paper we divide possible flaws, based on the flaw pathology and the corresponding attack method, into elementary protocol flaws, password/key guessing flaws, stale message flaws, parallel session flaws, internal protocol flaws, and cryptosystem flaws. We then outline and comment on different attack-construction and inference-based formal methods, protocol analysis tools, and process integration techniques, and their effectiveness in aiding the cryptographic protocol design process by discovering protocol flaws with regard to the proposed taxonomy. * In Peter Daniel, editor, 16th International Conference on Computer Safety, Reliability and Security: SAFECOMP '97, pages 123-137, York, UK, September 1997. European Workshop on Industrial Computer Systems: TC7, Springer-Verlag. + This is a machine-readable rendering of a working paper draft that led to a pub...