Results 1–9 of 9
Understanding brute force
URL: http://cr.yp.to/papers.html#bruteforce. ID 73e92f5b71793b498288efe81fe55dee, 2005
Abstract

Cited by 9 (2 self)
There is a widespread myth that parallelizing a computation cannot improve its price-performance ratio. The reality is that a parallel computer is often several orders of magnitude faster than a comparably priced serial computer. Consider multiplying two n-bit ...
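The parallel search the abstract has in mind can be sketched by partitioning a keyspace across independent workers. The following is a toy Python illustration with a hypothetical 18-bit key; all names and sizes are our own choices for the sketch, and real parallel attacks use dedicated hardware rather than threads:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

# Toy partitioned brute-force key search: each worker scans a disjoint slice
# of an 18-bit keyspace looking for the key behind a known hash.
KEY_BITS = 18
SECRET_KEY = 0xBEEF                      # hypothetical unknown key (< 2^18)
TARGET = hashlib.sha256(SECRET_KEY.to_bytes(4, "big")).digest()

def search(lo: int, hi: int):
    # scan keys in [lo, hi) for one whose hash matches the target
    for k in range(lo, hi):
        if hashlib.sha256(k.to_bytes(4, "big")).digest() == TARGET:
            return k
    return None

WORKERS = 4
STEP = (1 << KEY_BITS) // WORKERS
with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    results = pool.map(search,
                       range(0, 1 << KEY_BITS, STEP),
                       range(STEP, (1 << KEY_BITS) + 1, STEP))
found = next(r for r in results if r is not None)
assert found == SECRET_KEY
```

The slices are disjoint, so the workers need no coordination; this independence is what makes brute force embarrassingly parallel and cheap to scale in hardware.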
Weak Fields for ECC
, 2003
Abstract

Cited by 8 (0 self)
We demonstrate that some finite fields, including F_(2^210), are weak for elliptic curve cryptography in the sense that any instance of the elliptic curve discrete logarithm problem for any elliptic curve over these fields can be solved in significantly less time than it takes Pollard's rho method to solve the hardest instances. We discuss the implications of our observations to elliptic curve cryptography, and list some open problems.
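For context, the Pollard's rho baseline the abstract compares against can be sketched for discrete logarithms in a toy prime-order subgroup of F_p^* (the group, generator, and secret exponent below are our own illustrative choices, not the paper's elliptic-curve setting):

```python
# Pollard's rho for discrete logarithms in a toy prime-order subgroup of F_p^*.
p = 1019                  # small prime; real groups are ~2^256 in size
q = 509                   # prime order of the subgroup generated by g
g = 4                     # 4 = 2^2 has order 509 mod 1019
x_secret = 123
y = pow(g, x_secret, p)   # challenge: recover x from y = g^x

def step(z, a, b):
    # Pollard's pseudo-random walk on triples (z, a, b) with z = g^a * y^b,
    # branching on z mod 3
    if z % 3 == 0:
        return (z * z) % p, (2 * a) % q, (2 * b) % q
    if z % 3 == 1:
        return (z * g) % p, (a + 1) % q, b
    return (z * y) % p, a, (b + 1) % q

def rho_dlog():
    for a0 in range(1, q):                    # retry with a new start if needed
        t = h = (pow(g, a0, p), a0, 0)        # tortoise and hare
        while True:                           # Floyd cycle-finding
            t = step(*t)
            h = step(*step(*h))
            if t[0] == h[0]:
                break
        a1, b1, a2, b2 = t[1], t[2], h[1], h[2]
        if (b2 - b1) % q != 0:
            # g^(a1) y^(b1) = g^(a2) y^(b2)  =>  x = (a1 - a2)/(b2 - b1) mod q
            x = (a1 - a2) * pow(b2 - b1, -1, q) % q
            if pow(g, x, p) == y:
                return x
    return None

assert rho_dlog() == x_secret
```

The walk needs only constant memory, and the expected cost is about sqrt(q) group operations; the paper's point is that some fields admit attacks that beat this generic bound.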
Key Length
 CONTRIBUTION TO “THE HANDBOOK OF INFORMATION SECURITY”
, 2004
Abstract

Cited by 7 (0 self)
The key length used for a cryptographic protocol determines the highest security it can offer. If the key is found or ‘broken’, the security is undermined. Thus, key lengths must be chosen in accordance with the desired security. In practice, key lengths are mostly determined by standards, legacy-system compatibility issues, and vendors. From a theoretical point of view, selecting key lengths is more involved. Understanding the relation between security and key lengths, and the impact of anticipated and unexpected cryptanalytic progress, requires insight into the design of the cryptographic methods and the mathematics involved in the attempts at breaking them. In this chapter, practical and theoretical aspects of key size selection are discussed.
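As a back-of-the-envelope illustration of the security/key-length relation (each extra key bit doubles the exhaustive-search work), a small sketch with a hypothetical trial rate of our own choosing:

```python
# Rough brute-force cost for a b-bit symmetric key at a given trial rate.
# The 10^12 keys/second rate is a hypothetical machine, not a measured figure.
def years_to_exhaust(bits: int, keys_per_second: float) -> float:
    seconds = (2 ** bits) / keys_per_second
    return seconds / (365.25 * 24 * 3600)

RATE = 1e12
for b in (56, 80, 128):
    print(f"{b}-bit key: {years_to_exhaust(b, RATE):.3g} years")
```

At this rate a 56-bit key falls in under a day of aggregate work, while 128 bits is far beyond any conceivable exhaustive search; the chapter's point is that the margin must also absorb cryptanalytic progress, not just raw hardware speed.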
Better price-performance ratios for generalized birthday attacks
Abstract

Cited by 5 (2 self)
Abstract. Fix i and k with k = 2^(i−1). This paper presents a generalized-birthday attack that uses a machine of size 2^(2B/(2i+1)) for time 2^(B/(2i+1)) to find (m_1, ..., m_k) such that f_1(m_1) + · · · + f_k(m_k) mod 2^B = 0. The exponents 2/(2i+1) and 1/(2i+1) are smaller than the exponents for Wagner's original generalized-birthday attack. The improved attack also allows a linear tradeoff between time and success probability, and an i-th-power tradeoff between machine size and success probability.
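Wagner's original generalized-birthday algorithm, which this paper improves on, can be sketched for k = 4 lists: merge pairs of lists so the low bits of partial sums cancel, then merge the results on the remaining bits. A toy Python sketch; the hash f_i and all parameters are our own illustrative choices:

```python
import hashlib

B = 16          # want f1(m1) + f2(m2) + f3(m3) + f4(m4) ≡ 0 (mod 2^B)
LOW = 8         # Wagner's trick: first cancel only the low LOW bits
SIZE = 1 << LOW # list size

def f(i, m):
    # toy hash f_i (truncated SHA-256 of (i, m)), standing in for the f_i above
    h = hashlib.sha256(bytes([i]) + m.to_bytes(4, "big")).digest()
    return int.from_bytes(h[:2], "big") % (1 << B)

# four lists of (message, f_i(message)) pairs
lists = [[(m, f(i, m)) for m in range(SIZE)] for i in range(1, 5)]

def merge(xs, ys, bits):
    """Join pairs whose sums cancel on the low `bits` bits, keeping full sums."""
    mask = (1 << bits) - 1
    index = {}
    for my, sy in ys:
        index.setdefault((-sy) & mask, []).append((my, sy))
    out = []
    for mx, sx in xs:
        for my, sy in index.get(sx & mask, []):
            out.append(((mx, my), (sx + sy) % (1 << B)))
    return out

left = merge(lists[0], lists[1], LOW)    # f1 + f2 ≡ 0 on the low bits
right = merge(lists[2], lists[3], LOW)   # f3 + f4 ≡ 0 on the low bits
solutions = merge(left, right, B)        # full B-bit cancellation

(m1, m2), (m3, m4) = solutions[0][0]
assert (f(1, m1) + f(2, m2) + f(3, m3) + f(4, m4)) % (1 << B) == 0
```

Each merge keeps list sizes roughly constant while zeroing more bits of the sum, which is why the tree attack beats the naive 2^(B/2) birthday bound; the paper above tightens the machine-size and time exponents further.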
Jawaharlal Nehru Technological
Abstract
An information-theoretic approach to decipherment problems is a recent trend in cryptanalysis. The behavioral transformation of message units is addressed up to a certain extent in the encryption process. However, the amount of confusion and diffusion, in terms of statistical distribution parameters between message and cipher text, is a point of interest for the cryptanalyst. In the present work we address this issue with the help of an enhanced probability distribution function. The basic units of any message text are observed to be heuristic in nature, depending on the sample. An averaging function is adopted while evaluating the enhanced probabilities of message units. The retrieval efficiency of a ciphertext-only attack on samples of English, Hindi, Telugu, and Kannada is presented in this paper.
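The statistical fingerprint such a ciphertext-only attack compares between message and cipher text starts from a plain empirical unigram distribution and its entropy, as in this minimal sketch (the paper's "enhanced" probability distribution function is not reproduced here):

```python
import math
from collections import Counter

# Empirical unigram probability distribution and Shannon entropy of a sample.
def unigram_pdf(text: str) -> dict:
    counts = Counter(c for c in text.lower() if c.isalpha())
    total = sum(counts.values())
    return {c: n / total for c, n in counts.items()}

def entropy_bits(pdf: dict) -> float:
    # Shannon entropy in bits per symbol
    return -sum(p * math.log2(p) for p in pdf.values())

sample = "the quick brown fox jumps over the lazy dog"
pdf = unigram_pdf(sample)
h = entropy_bits(pdf)   # between 0 and log2(alphabet size)
```

A cipher that leaves this distribution close to the plaintext's provides little confusion, which is exactly the leakage a ciphertext-only attack exploits.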
unknown title
Abstract
Abstract. An r-collision for a function is a set of r distinct inputs with identical outputs. Actually finding r-collisions for a random map over a finite set of cardinality N requires at least about N^((r−1)/r) units of time on a sequential machine. For r = 2, memoryless and well-parallelisable algorithms are known. The current paper describes memory-efficient and parallelisable algorithms for r ≥ 3. The main results are: (1) A sequential algorithm for 3-collisions, roughly using memory N^α and time N^(1−α) for α ≤ 1/3. I.e., given N^(1/3) units of storage, one can find 3-collisions in time N^(2/3). Note that there is a time-memory tradeoff which allows one to reduce the memory consumption. (2) A parallelisation of this algorithm using N^(1/3) processors running in time N^(1/3). Each single processor only needs a constant amount of memory. (3) A generalisation of this second approach to r-collisions for r ≥ 3: given N^s parallel processors, one can generate r-collisions roughly in time N^((r−1)/r − s), using memory N^((r−2)/r − s) on every processor.
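A naive r-collision finder, the memory-hungry baseline these algorithms improve on, simply buckets outputs until some bucket holds r distinct inputs. A toy sketch (the hash and range size N are our own choices):

```python
import hashlib

N_BITS = 12                      # toy random map into a set of cardinality 2^12

def h(x: int) -> int:
    d = hashlib.sha256(x.to_bytes(8, "big")).digest()
    return int.from_bytes(d[:2], "big") % (1 << N_BITS)

def find_r_collision(r: int):
    preimages = {}               # output -> distinct inputs seen so far
    x = 0
    while True:
        bucket = preimages.setdefault(h(x), [])
        bucket.append(x)
        if len(bucket) == r:     # r distinct inputs, identical output
            return bucket
        x += 1

xs = find_r_collision(3)
assert len(set(xs)) == 3 and len({h(x) for x in xs}) == 1
```

This baseline stores every evaluation, i.e. memory grows with time at roughly N^((r−1)/r); the paper's contribution is cutting that storage and parallelising the search.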
A Comparison of Perfect Table Cryptanalytic Tradeoff Algorithms
, 2012
Abstract
Abstract Analyses of three major time-memory tradeoff algorithms were presented by a recent paper in a way that facilitates comparisons of the algorithm performances at arbitrary choices of the algorithm parameters. The algorithms considered there were the classical Hellman tradeoff and the non-perfect table versions of the distinguished point method and the rainbow table method. This paper adds the perfect table versions of the distinguished point method and the rainbow table method to the list, so that all the major tradeoff algorithms may now be compared against each other. The algorithm performance information provided by this and the preceding paper is aimed at making practical comparisons possible. Comparisons that take both the cost of precomputation and the efficiency of the online phase into account, at parameters that achieve a common success rate, can now be carried out with ease. Comparisons can be based on the expected execution complexities rather than the worst-case complexities, and details such as the effects of false alarms and various storage optimization techniques need no longer be ignored. A large portion of this paper is devoted to accurately analyzing the execution behavior of the perfect table distinguished point method. In particular, we obtain a closed-form formula for the average length of chains associated with a perfect distinguished point table.
Keywords: time-memory tradeoff · distinguished point · rainbow table · perfect table · algorithm complexity
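The distinguished point method analyzed here can be sketched in miniature: offline, build chains that stop at a "distinguished" point and store only (endpoint, start); online, iterate from the target to a distinguished point, look it up, and replay the chain. A toy Python sketch with parameters of our own choosing; no perfect-table filtering or false-alarm analysis is included:

```python
import hashlib

N = 1 << 16                        # search-space size
DP_BITS = 4                        # distinguished point = low 4 bits are zero
MAX_LEN = 1 << (DP_BITS + 3)       # cap on chain length (abandon looping chains)

def f(x: int) -> int:
    # one-way step function over the search space
    d = hashlib.sha256(x.to_bytes(4, "big")).digest()
    return int.from_bytes(d[:4], "big") % N

def is_dp(x: int) -> bool:
    return x & ((1 << DP_BITS) - 1) == 0

# offline phase: endpoint -> start (later chains may overwrite earlier ones)
table = {}
for start in range(0, N, 97):      # arbitrary spread of start points
    x = start
    for _ in range(MAX_LEN):
        x = f(x)
        if is_dp(x):
            table[x] = start
            break

def invert(y: int):
    """Try to find x with f(x) == y; may fail or hit a false alarm."""
    z = y
    for _ in range(MAX_LEN):
        if is_dp(z):
            start = table.get(z)
            if start is not None:
                x = start          # replay the stored chain, looking for y
                for _ in range(MAX_LEN):
                    if f(x) == y:
                        return x
                    x = f(x)
            return None            # false alarm or uncovered point
        z = f(z)
    return None

s = next(iter(table.values()))     # demo: invert a point known to be covered
assert invert(f(s)) is not None
```

Storage is one pair per chain rather than one per point, at the price of online iteration and chain replay; a "perfect" table as in this paper additionally discards merging chains so that stored chains are pairwise disjoint.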