## Using an innovative coding algorithm for data encryption

### Citations

9359 |
Elements of information theory
- Cover, Thomas
- 1991
Citation Context: ...his paper we consider using a variable-length prefix code for encryption. Although Huffman codes produce optimal compression, their encryption capabilities are not very good. Shannon-Fano-Elias codes [8] are a better candidate for encryption, but their compression capabilities are inferior compared to Huffman codes. In this paper we show that it is more difficult to cryptanalyze Shannon-Fano-Elias cod...

7283 |
A mathematical theory of communication
- Shannon
- 1948
Citation Context: ...timal expected length $H \le L_{Huffman} < H + 1$, where $H$ represents the entropy of the source. Shannon mentioned an approach using the cumulative distribution function when describing the Shannon-Fano code [9]. Elias later came up with a recursive implementation for this idea. It is now known as Shannon-Fano-Elias coding. However, Elias never published it. It was first introduced in a 1963 information theor...
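The bound $H \le L_{Huffman} < H + 1$ quoted above can be checked numerically. The sketch below is illustrative, not code from the cited paper; the function name `huffman_lengths` and the example PMF are made up, and only codeword lengths (not the codewords themselves) are computed, which is all the bound needs.

```python
import heapq
import math

def huffman_lengths(pmf):
    """Codeword lengths of a Huffman code for a {symbol: probability} map.
    Repeatedly merges the two least-probable nodes until one tree remains."""
    heap = [(p, i, [s]) for i, (s, p) in enumerate(pmf.items())]
    heapq.heapify(heap)
    lengths = dict.fromkeys(pmf, 0)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, i2, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:  # every symbol under the merge sinks one level deeper
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, i2, syms1 + syms2))
    return lengths

pmf = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
lengths = huffman_lengths(pmf)
L = sum(pmf[s] * lengths[s] for s in pmf)         # expected codeword length
H = -sum(p * math.log2(p) for p in pmf.values())  # source entropy
assert H <= L < H + 1  # the bound H <= L_Huffman < H + 1
```

For this dyadic PMF the Huffman code is exactly optimal, so $L = H = 1.75$ bits; for non-dyadic sources $L$ sits strictly inside the bound.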

1068 |
A method for the construction of minimum-redundancy codes
- Huffman
- 1952
Citation Context: ...he Department of Electrical and Computer Engineering, North Dakota State University, Fargo, ND 58105-5285, USA (e-mail: xiaoyu.ruan@ndsu.edu; rajendra.katti@ndsu.edu). 1 Introduction Huffman coding [1] is one of the best-known compression techniques that produces optimal compression for any given probability distribution. The use of Huffman codes for encryption has been considered in [2] and [3]. T...

167 |
Information Theory and Coding
- Abramson
- 1963
Citation Context: ...up with a recursive implementation for this idea. It is now known as Shannon-Fano-Elias coding. However, Elias never published it. It was first introduced in a 1963 information theory book by Abramson [10]. We now review the construction of Shannon-Fano-Elias codes. The modified cumulative distribution function is defined as $\bar{F}(x_i) = \sum_{k=1}^{i-1} p(x_k) + \frac{1}{2} p(x_i)$ (2). $\bar{F}(x_i)$ represents the sum of pr...

45 |
Two inequalities implied by unique decipherability
- McMillan
- 1956
Citation Context: ...$\bar{F}(x_i)$ is rounded off to $l_{SFE}(x_i) = \left\lceil \log \frac{1}{p(x_i)} \right\rceil + 1$ (3) bits, denoted by $\lfloor \bar{F}(x_i) \rfloor_{l_{SFE}(x_i)}$. It can be shown that the set of lengths $l_{SFE}(x_i)$ ($1 \le i \le n$) satisfies the Kraft inequality [11] and hence can be used to construct a uniquely decodable code. $\lfloor \bar{F}(x_i) \rfloor_{l_{SFE}(x_i)}$ is within the step corresponding to $x_i$. Thus we can use the first $l_{SFE}(x_i)$ bits of $\bar{F}(x_i)$ to describe $x_i$. S...
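The Shannon-Fano-Elias construction described across these contexts (midpoint CDF, ceiling-plus-one lengths, truncated binary expansion) can be sketched directly. This is an illustrative implementation of the textbook definitions, not code from the cited paper; the function name `sfe_code` and the example PMF are made up.

```python
import math

def sfe_code(pmf):
    """Shannon-Fano-Elias code for a PMF given as (symbol, probability) pairs.
    Returns a dict mapping each symbol to its binary codeword string."""
    code = {}
    cumulative = 0.0
    for symbol, p in pmf:
        f_bar = cumulative + p / 2                 # modified CDF, eq. (2)
        length = math.ceil(math.log2(1 / p)) + 1   # codeword length, eq. (3)
        # Codeword = first `length` bits of the binary expansion of f_bar.
        bits, frac = "", f_bar
        for _ in range(length):
            frac *= 2
            bit = int(frac)
            bits += str(bit)
            frac -= bit
        code[symbol] = bits
        cumulative += p
    return code

pmf = [("a", 0.25), ("b", 0.5), ("c", 0.125), ("d", 0.125)]
code = sfe_code(pmf)
# Kraft inequality: sum of 2^{-l(x)} <= 1, so the lengths admit a uniquely
# decodable code, as the context above notes via [11].
assert sum(2 ** -len(w) for w in code.values()) <= 1
# The extra bit in eq. (3) in fact makes the code prefix-free.
words = list(code.values())
assert all(not w2.startswith(w1) for w1 in words for w2 in words if w1 != w2)
```

On this PMF the code is {"a": "001", "b": "10", "c": "1101", "d": "1111"}: each codeword is one bit longer than the Huffman length, which is exactly the compression penalty the paper trades for better encryption properties.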

22 | Storing text retrieval systems on CD-ROM: compression and encryption considerations
- KLEIN, BOOKSTEIN, et al.
- 1989
Citation Context: ...ng [1] is one of the best-known compression techniques that produces optimal compression for any given probability distribution. The use of Huffman codes for encryption has been considered in [2] and [3]. These works were motivated by security requirements for storing a large textual database on a CD-ROM. The text of the database needed to be compressed for memory efficiency and encrypted to prevent ...

14 | On breaking a Huffman Code
- Gillman, Mohtashemi, et al.
- 1996
Citation Context: ...s a means to achieve security can improve the performance and memory requirements of the system. Increased security can be obtained by using simpler encryption methods in addition to using compression. In [5], it was shown that if the cryptanalyst knows whether the encoder is using an arbitrary Huffman code or a right-heavy Huffman code, but does not know the probability mass function (PMF) of the so...

14 |
On design of error-correcting reversible variable length codes
- Lakovic, Villasenor
- 2002
Citation Context: ...nt orders. This feature might be employed to increase the ambiguity of the codes and hence make the code difficult to break. For example, the probabilities of the English alphabet occurring in literature [12] are shown in the second column of Table 3 or Table 4. The corresponding codewords for the alphabet ordered from A to Z are listed in the third column of Table 3, and those for the order Z to A are list...

12 | Improving Memory Encryption Performance in Secure Processors
- Yang, Gao, et al.
- 2005
Citation Context: ...ethod is that adding random bits increases the average length of the code. A better method would be to XOR the compressed sequence with a sequence of bits obtained by encrypting a seed, as was done in [7]. This would maintain the compression properties and improve the security of the code. In this paper we consider using a variable-length prefix code for encryption. Although Huffman codes produce opti...
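The XOR scheme summarized above (encrypt a seed to obtain a keystream, then XOR it with the compressed sequence) can be sketched as follows. This is a minimal illustration, not the construction of [7]: the hash-counter keystream below is a stand-in for the block-cipher encryption of a seed, and all names and inputs are hypothetical.

```python
import hashlib
from itertools import count

def keystream(seed: bytes, nbytes: int) -> bytes:
    """Stand-in keystream: hash the seed with a running counter.
    In the scheme described for [7], a block cipher encrypting a seed
    would play this role instead."""
    out = bytearray()
    for i in count():
        out += hashlib.sha256(seed + i.to_bytes(8, "big")).digest()
        if len(out) >= nbytes:
            return bytes(out[:nbytes])

def xor_encrypt(compressed: bytes, seed: bytes) -> bytes:
    """XOR the compressed sequence with the keystream; decryption is the
    same operation, since XOR is its own inverse."""
    ks = keystream(seed, len(compressed))
    return bytes(c ^ k for c, k in zip(compressed, ks))

data = b"\x5a\x10\xff"  # stands in for a compressed bitstream
ct = xor_encrypt(data, b"secret seed")
assert xor_encrypt(ct, b"secret seed") == data  # round-trips
assert len(ct) == len(data)  # no length added: compression ratio is preserved
```

The final assertion is the point made in the context: unlike appending random bits, XOR-ing a keystream leaves the length, and hence the compression properties, unchanged.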

3 | Complexity aspects of guessing prefix codes
- Fraenkel, Klein
- 1994
Citation Context: ...alyst not only knows the construction procedure of the codes but also knows the PMF. Under this assumption, Huffman codes can be easily decoded and essentially provide no security. Fraenkel and Klein [6] have enhanced prefix codes by adding a short sequence of random bits to some of the codewords. They then show that decoding such a code is NP-complete. However, the drawback of this method is that add...

2 |
Cypress: Compression and encryption of data and code for embedded multimedia systems
- Lekatsas, Henkel, et al.
- 2004
Citation Context: ...ficult enough so that the cost of decryption exceeds any potential profit incurred by breaking the code. Another application where compression and encryption are needed is embedded multimedia systems [4]; such systems require both data and programs to be compressed, encrypted, and stored in main memory. Both data and programs are decrypted and decompressed after they enter the processor, which is co...

1 |
Cryptographic aspects of data compression codes
- Rubin
- 1979
(Show Context)
Citation Context ...man coding [1] is one of the best-known compression techniques that produces optimal compression for any given probability distribution. The use of Huffman codes for encryption has been considered in =-=[2]-=- and [3]. These works were motivated by security requirements for storing a large textual database on a CD-ROM. The text of the database needed to be compressed for memory efficiency and encrypted to ... |