Results 1–10 of 46
A general construction of tweakable block ciphers and different modes of operations
 IEEE Transactions on Information Theory
Cited by 17 (7 self)
Abstract—This work builds on earlier work by Rogaway at Asiacrypt 2004 on tweakable block ciphers (TBCs) and modes of operations. Our first contribution is to generalize Rogaway's TBC construction by working over a ring and by the use of a masking sequence of functions. The ring can be instantiated as either GF(2^n) or Z_{2^n}. Further, over GF(2^n), efficient instantiations of the masking sequence of functions can be done using either a binary linear feedback shift register (LFSR); a powering construction; a cellular automata map; or a word-oriented LFSR. Rogaway's TBC construction was built from the powering construction over GF(2^n). Our second contribution is to use the general TBC construction to instantiate constructions of various modes of operations, including authenticated encryption (AE) and message authentication code (MAC). In particular, this gives rise to a family of efficient one-pass AE modes of operation. Out of these, the mode of operation obtained by the use of a word-oriented LFSR promises to provide a masking method which is more efficient than the one used in the well-known AE protocol called OCB1. Index Terms—Authenticated encryption with associated data, message authentication code, modes of operations, tweakable block cipher (TBC).
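The powering construction the abstract refers to (the one underlying Rogaway's XEX-style masking) can be sketched in a few lines. This is an illustrative sketch only: it assumes the common GF(2^128) representation with reduction polynomial x^128 + x^7 + x^2 + x + 1 and a mask sequence Δ_i = α^i · Δ_0, not the paper's generalized ring construction.

```python
# Sketch of "powering" masking as used in XEX-style TBC constructions.
# Assumes the standard GF(2^128) representation with reduction
# polynomial x^128 + x^7 + x^2 + x + 1 (as in OCB); illustrative only.

R = 0x87  # low 8 bits of the reduction polynomial

def gf128_double(x: int) -> int:
    """Multiply a 128-bit field element by alpha (i.e., by x)."""
    x <<= 1
    if x >> 128:                      # carry out of the top bit?
        x = (x & ((1 << 128) - 1)) ^ R
    return x

def masks(delta0: int, m: int):
    """Masking sequence Delta_i = alpha^i * Delta_0 for blocks 1..m."""
    d, out = delta0, []
    for _ in range(m):
        d = gf128_double(d)
        out.append(d)
    return out
```

Each successive mask costs only a shift and a conditional XOR, which is why the powering construction is attractive in practice.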
Zero Correlation Linear Cryptanalysis with Reduced Data Complexity, IACR Eprint Archive Report
, 2012
Cited by 16 (1 self)
Abstract. Zero-correlation linear cryptanalysis is a novel key-recovery technique for block ciphers proposed in [5]. It is based on linear approximations with probability of exactly 1/2 (which corresponds to correlation zero). Some block ciphers turn out to have multiple linear approximations with correlation zero for each key over a considerable number of rounds. Zero-correlation linear cryptanalysis is the counterpart of impossible differential cryptanalysis in the domain of linear cryptanalysis, though it has many technical distinctions and sometimes results in stronger attacks. In this paper, we propose a statistical technique to significantly reduce the data complexity using the high number of available zero-correlation linear approximations. We also identify zero-correlation linear approximations for 14 and 15 rounds of TEA and XTEA. These result in key-recovery attacks on 21-round TEA and 25-round XTEA, while requiring less data than the full codebook. In the single-secret-key setting, these are structural attacks breaking the highest number of rounds for both ciphers. The findings of this paper demonstrate that prohibitive data complexity requirements are not inherent to zero-correlation linear cryptanalysis and can be overcome. Moreover, our results suggest that zero-correlation linear cryptanalysis can actually break more rounds than the best known impossible differential cryptanalysis does for the relevant block ciphers. This might make a security re-evaluation of some ciphers necessary in view of the new attack.
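As a toy illustration of the central object, the sketch below finds mask pairs with correlation exactly zero over a single 4-bit S-box. The PRESENT S-box is used here only as a convenient example; actual zero-correlation attacks exhibit such approximations over many cipher rounds.

```python
# Correlation of a linear approximation a.x ^ b.S(x) = 0 over a 4-bit
# S-box (PRESENT's, as an illustrative example). Correlation 0 means the
# approximation holds with probability exactly 1/2.

SBOX = [0xC, 5, 6, 0xB, 9, 0, 0xA, 0xD, 3, 0xE, 0xF, 8, 4, 7, 1, 2]

def dot(a: int, x: int) -> int:
    """GF(2) inner product of the bit vectors a and x."""
    return bin(a & x).count("1") & 1

def correlation(a: int, b: int) -> float:
    """Correlation of the approximation a.x = b.S(x)."""
    agree = sum(dot(a, x) == dot(b, SBOX[x]) for x in range(16))
    return agree / 16 * 2 - 1

# Non-trivial mask pairs with correlation exactly zero:
zero_pairs = [(a, b) for a in range(1, 16) for b in range(1, 16)
              if correlation(a, b) == 0]
```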
Probability distributions of correlation and differentials in block ciphers. Cryptology ePrint Archive, Report 2005/212
, 2005
Cited by 16 (1 self)
In this paper, we derive the probability distributions of difference propagation probabilities and input-output correlations for random functions and block ciphers, for several of them for the first time. We show that these parameters have distributions that are well-studied in the field of probability, such as the normal, Poisson, Gamma and extreme value distributions. For Markov ciphers there exists a solid theory that expresses bounds on the complexity of differential and linear cryptanalysis in terms of average difference propagation probabilities and average correlations, where the average is taken over the keys. The propagation probabilities and correlations exploited in differential and linear cryptanalysis actually depend on the key, and hence so does the attack complexity. The theory of Markov ciphers does not make statements on the distributions of these fixed-key properties but rather makes the assumption that their values will be close to the average for the vast majority of keys. This assumption is made explicit in the form of the hypothesis of stochastic equivalence.
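As a small illustration of the fixed-key quantities involved, the sketch below tabulates the difference distribution table (DDT) of a random 4-bit permutation; the abstract's subject is how such counts are distributed over random functions and keys. The 4-bit size and the seed are arbitrary choices for the example.

```python
# Difference distribution table of a random 4-bit permutation:
# DDT[a][b] = #{x : S(x) ^ S(x ^ a) = b}, the difference propagation
# counts whose distributions (e.g., approximately Poisson for large
# block sizes) the paper studies. Illustrative toy sizes only.

import random

random.seed(1)
perm = list(range(16))
random.shuffle(perm)              # a random 4-bit permutation

def ddt(sbox):
    size = len(sbox)
    t = [[0] * size for _ in range(size)]
    for a in range(size):
        for x in range(size):
            t[a][sbox[x] ^ sbox[x ^ a]] += 1
    return t

table = ddt(perm)
```

Note that every entry is even (solutions come in pairs x, x ^ a) and every row sums to 2^n, which constrains the possible distributions.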
Leakage Resilient Cryptography in Practice
, 2009
Cited by 16 (1 self)
In this report, we are concerned with models to analyze the security of cryptographic algorithms against side-channel attacks. Our objectives are threefold. In the first part of the paper, we aim to survey a number of well-known intuitions related to physical security and to connect them with more formal results in this area. For this purpose, we study the definition of leakage function introduced by Micali and Reyzin in 2004 and its relation to practical power consumption traces. Then, we discuss the non-equivalence between the unpredictability and indistinguishability of pseudorandom generators in physically observable cryptography. Finally, we examine the assumption of bounded leakage per iteration that has been used recently to prove the security of different constructions against side-channel attacks. We show that approximated leakage bounds can be obtained using the framework for the analysis of side-channel key recovery attacks published at Eurocrypt 2009. In the second part of the paper, we aim to investigate two recent leakage
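A common way to make "leakage function" concrete against real power traces is the noisy Hamming-weight model. The sketch below is only that standard textbook approximation, not the Micali-Reyzin formalism the report analyzes; the noise level sigma is an arbitrary illustrative parameter.

```python
# Noisy Hamming-weight leakage model: the power consumed while handling
# an intermediate value is approximated by its Hamming weight plus
# Gaussian noise. Illustrative assumption, not a formal leakage model.

import random

def hamming_weight(v: int) -> int:
    """Number of set bits; a frequent proxy for per-cycle power draw."""
    return bin(v).count("1")

def leak(intermediate: int, sigma: float = 0.5) -> float:
    """One noisy leakage sample for one intermediate value."""
    return hamming_weight(intermediate) + random.gauss(0.0, sigma)
```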
Towards a Unifying View of Block Cipher Cryptanalysis
, 2004
Cited by 11 (0 self)
We introduce commutative diagram cryptanalysis, a framework for expressing certain kinds of attacks on product ciphers. We show that many familiar attacks, including linear cryptanalysis, differential cryptanalysis, differential-linear cryptanalysis, mod n attacks, truncated differential cryptanalysis, impossible differential cryptanalysis, higher-order differential cryptanalysis, and interpolation attacks can be expressed within this framework. Thus, we show that commutative diagram attacks provide a unifying view into the field of block cipher cryptanalysis.
PseudoRandom Functions and Parallelizable Modes of Operations of a Block Cipher
Cited by 9 (4 self)
Abstract. This paper considers the construction and analysis of pseudo-random functions (PRFs) with specific reference to modes of operations of a block cipher. In the context of message authentication codes (MACs), earlier independent work by Bernstein and Vaudenay shows how to reduce the analysis of relevant PRFs to some probability calculations. In the first part of the paper, we revisit this result and use it to prove a general result on constructions which use a PRF with a “small” domain to build a PRF with a “large” domain. This result is used to analyse two new parallelizable PRFs which are suitable for use as MAC schemes. The first scheme, called iPMAC, is based on a block cipher and improves upon the well-known PMAC algorithm. The improvements consist of faster masking operations and the removal of a design-stage discrete logarithm computation. The second scheme, called VPMAC, uses a keyed compression function rather than a block cipher. The only previously known compression-function-based parallelizable PRF is the protected counter sum (PCS), due to Bernstein. VPMAC improves upon PCS by requiring fewer calls to the compression function. The second part of the paper takes a new look at the construction and analysis of modes of operations for authenticated encryption (AE) and for authenticated encryption with associated data (AEAD). Usually, the most complicated part of the security analysis of such modes is the analysis of authentication
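The protected counter sum that VPMAC improves upon can be sketched as follows. HMAC-SHA256 stands in for the keyed compression function purely for illustration, and the 8-byte big-endian counter encoding is an arbitrary choice for the sketch.

```python
# Sketch of a PCS-style parallelizable PRF: apply a keyed function to
# (counter, block) pairs, XOR the outputs, then apply the keyed function
# once more to "protect" the sum. HMAC-SHA256 is a stand-in for the
# keyed compression function (illustrative choice).

import hashlib
import hmac
from functools import reduce

def f(key: bytes, counter: int, block: bytes) -> bytes:
    """Keyed function applied to one (counter, block) pair."""
    return hmac.new(key, counter.to_bytes(8, "big") + block,
                    hashlib.sha256).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def pcs_mac(key: bytes, blocks) -> bytes:
    """The inner calls are mutually independent, hence parallelizable;
    the outer call with counter 0 protects the XOR of their outputs."""
    sigma = reduce(xor, (f(key, i + 1, b) for i, b in enumerate(blocks)))
    return f(key, 0, sigma)
```

Because each block is bound to its position by the counter, reordering blocks changes the tag even though the combining step is a plain XOR.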
Composition does not imply adaptive security
 In Advances in Cryptology — CRYPTO ’05
, 2005
Cited by 8 (3 self)
Abstract. We study the question of whether the sequential or parallel composition of two functions, each indistinguishable from a random function by non-adaptive distinguishers, is secure against adaptive distinguishers. The sequential composition of F(.) and G(.) is the function G(F(.)); the parallel composition is F(.) ⋆ G(.) where ⋆ is some group operation. It has been shown that composition indeed gives adaptive security in the information-theoretic setting, but unfortunately the proof does not translate into the more interesting computational case. In this work we show that in the computational setting composition does not imply adaptive security: if there is a prime-order cyclic group where the decisional Diffie-Hellman assumption holds, then there are functions F and G which are indistinguishable by non-adaptive polynomial-time adversaries, but whose parallel composition can be completely broken (i.e., we recover the key) with only three adaptive queries. We give a similar result for sequential composition. Interestingly, we need
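The two composition operators in question are easy to state in code; here the group operation ⋆ is instantiated as XOR on integers, an arbitrary illustrative choice.

```python
# The two composition operators from the abstract, as higher-order
# functions. The group operation for parallel composition is taken to
# be XOR here (illustrative choice only).

def sequential(F, G):
    """Sequential composition G(F(.))."""
    return lambda x: G(F(x))

def parallel(F, G):
    """Parallel composition F(.) * G(.), with * = XOR."""
    return lambda x: F(x) ^ G(x)
```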
On the Data Complexity of Statistical Attacks Against Block Ciphers
 In Cryptology ePrint
, 2009
Cited by 6 (2 self)
Abstract. Many attacks on iterated block ciphers rely on statistical considerations, using plaintext/ciphertext pairs to distinguish some part of the cipher from a random permutation. We provide here a simple formula for estimating the number of plaintext/ciphertext pairs needed for such distinguishers, which applies to many different scenarios (linear cryptanalysis, differential-linear cryptanalysis, differential/truncated differential/impossible differential cryptanalysis). The asymptotic data complexities of all these attacks are then derived. Moreover, we give an efficient algorithm for computing the data complexity accurately.
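For orientation, the simplest instance of such an estimate is the classical rule of thumb for linear cryptanalysis: on the order of 1/c^2 pairs for an approximation of correlation c. This is a simplification with the success-probability-dependent constant omitted; the paper's formula is more general and precise.

```python
# Rule-of-thumb data complexity for a linear distinguisher with
# correlation c: roughly factor / c^2 plaintext/ciphertext pairs, where
# 'factor' absorbs the constant depending on the desired success
# probability (illustrative simplification, not the paper's formula).

def linear_data_complexity(c: float, factor: float = 1.0) -> float:
    return factor / (c * c)
```

For example, a correlation of 2^-10 already demands on the order of 2^20 pairs, which is why low-correlation approximations dominate the attack cost.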
Links Between Truncated Differential and Multidimensional Linear Properties of Block Ciphers and Underlying Attack Complexities
, 2014
Cited by 6 (1 self)
Abstract. The sheer number of apparently different statistical attacks on block ciphers has raised the question of their relationships, which would allow us to classify them and determine those that give essentially complementary information about the security of block ciphers. While mathematical links between some statistical attacks have been derived in the last couple of years, the important link between general truncated differential and multidimensional linear attacks has been missing. In this work we close this gap. The new link is then exploited to relate the complexities of chosen-plaintext and known-plaintext distinguishing attacks of differential and linear types, and further, to explore the relations between the key-recovery attacks. Our analysis shows that a statistical saturation attack is the same as a truncated differential attack, which allows us, for the first time, to provide a justifiable analysis of the complexity of the statistical saturation attack and discuss its validity on 24 rounds of the PRESENT block cipher. By studying the data, time and memory complexities of a multidimensional linear key-recovery attack and its relation with a truncated differential one, we also show that in most cases a known-plaintext attack can be transformed into a less costly chosen-plaintext attack. In particular, we show that there is a differential attack in the chosen-plaintext model on 26 rounds of PRESENT with less memory complexity than the best previous attack, which assumes known plaintext. The links between the statistical attacks discussed in this paper give further examples of attacks where the method used to sample the data required by the statistical test is more differentiating than the method used for finding the distinguishing property.