Results 1–10 of 12
Parallel Huffman Decoding with Applications to JPEG Files
The Computer Journal, 2003
"... ..."
(Show Context)
Fast and Compact Prefix Codes
"... Abstract. It is wellknown that, given a probability distribution over n characters, in the worst case it takes Θ(n log n) bits to store a prefix code with minimum expected codeword length. However, in this paper we first show that, for any ɛ with 0 < ɛ < 1/2 and 1/ɛ = O(polylog(n)), it takes ..."
Abstract

Cited by 5 (5 self)
Abstract. It is well-known that, given a probability distribution over n characters, in the worst case it takes Θ(n log n) bits to store a prefix code with minimum expected codeword length. However, in this paper we first show that, for any ɛ with 0 < ɛ < 1/2 and 1/ɛ = O(polylog(n)), it takes O(n log log(1/ɛ)) bits to store a prefix code with expected codeword length within an additive ɛ of the minimum. We then show that, for any constant c > 1, it takes O(n^(1/c) log n) bits to store a prefix code with expected codeword length at most c times the minimum. In both cases, our data structures allow us to encode and decode any character in O(1) time.
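The result above concerns how compactly a prefix code can be stored. For background, here is a minimal sketch (not the paper's data structure) of the standard canonical-code trick, in which storing only the codeword lengths suffices to reconstruct a usable prefix code; the function name and layout are illustrative:

```python
def canonical_code(lengths):
    """Rebuild a canonical prefix code from codeword lengths alone.

    Storing lengths instead of explicit codewords is the classic way
    to shrink the representation of a prefix code: symbols are sorted
    by (length, symbol) and assigned consecutive codewords.
    """
    order = sorted(range(len(lengths)), key=lambda s: (lengths[s], s))
    code, prev_len, codes = 0, 0, {}
    for s in order:
        code <<= lengths[s] - prev_len  # widen the codeword as lengths grow
        codes[s] = format(code, "0{}b".format(lengths[s]))
        code += 1
        prev_len = lengths[s]
    return codes

# Lengths 1, 2, 2 yield the prefix-free codewords 0, 10, 11.
```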
Using Fibonacci Compression Codes as Alternatives to Dense Codes
"... Abstract Recent publications advocate the use of various variable length codes forwhich each codeword consists of an integral number of bytes in compression applications using large alphabets. This paper shows that another tradeoffwith similar properties can be obtained by Fibonacci codes. These are ..."
Abstract

Cited by 4 (2 self)
Abstract. Recent publications advocate the use of various variable-length codes for which each codeword consists of an integral number of bytes in compression applications using large alphabets. This paper shows that another tradeoff with similar properties can be obtained by Fibonacci codes. These are fixed codeword sets, using binary representations of integers based on Fibonacci numbers of order m ≥ 2. Fibonacci codes have been used before, and this paper extends previous work presenting several novel features. In particular, ...
On the Usefulness of Fibonacci Compression Codes
, 2004
"... Recent publications advocate the use of various variable length codes for which each codeword consists of an integral number of bytes in compression applications using large alphabets. This paper shows that another tradeoff with similar properties can be obtained by Fibonacci codes. These are fixed ..."
Abstract

Cited by 2 (1 self)
Abstract. Recent publications advocate the use of various variable-length codes for which each codeword consists of an integral number of bytes in compression applications using large alphabets. This paper shows that another tradeoff with similar properties can be obtained by Fibonacci codes. These are fixed codeword sets, using binary representations of integers based on Fibonacci numbers of order m ≥ 2. Fibonacci codes have been used before, and this paper extends previous work presenting several novel features. In particular, the compression efficiency is analyzed and compared to that of dense codes, and various table-driven decoding routines are suggested.
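For concreteness, here is a small sketch of the order-2 Fibonacci code the two abstracts above refer to (the Zeckendorf representation plus a terminating 1-bit); the function name is illustrative:

```python
def fib_encode(n):
    """Order-2 Fibonacci codeword for a positive integer n.

    Every n >= 1 is a sum of non-consecutive Fibonacci numbers;
    emitting the indicator bits lowest-first and appending a final 1
    makes every codeword end in '11', giving a prefix-free code.
    """
    fibs = [1, 2]
    while fibs[-1] <= n:
        fibs.append(fibs[-1] + fibs[-2])
    bits = []
    for f in reversed(fibs[:-1]):  # greedy, largest Fibonacci number first
        if f <= n:
            bits.append("1")
            n -= f
        else:
            bits.append("0")
    return "".join(reversed(bits)) + "1"

# fib_encode(1) -> '11', fib_encode(4) -> '1011', fib_encode(5) -> '00011'
```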
Fast Codes for Large Alphabets
"... Abstract. We address the problem of constructing a fast lossless code in the case when the source alphabet is large. The main idea of the new scheme may be described as follows. We group letters with small probabilities in subsets (acting as super letters) and use time consuming coding for these sub ..."
Abstract

Cited by 1 (1 self)
Abstract. We address the problem of constructing a fast lossless code in the case when the source alphabet is large. The main idea of the new scheme may be described as follows. We group letters with small probabilities into subsets (acting as super letters) and use time-consuming coding for these subsets only, whereas letters in the subsets have the same code length and can therefore be coded fast. The described scheme can be applied to sources with known and unknown statistics.
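A toy rendering of the grouping idea (not the paper's exact scheme): letters below a probability threshold share a single Huffman leaf and are told apart by a fixed-length index, so rare letters can be coded with cheap fixed-length arithmetic. The names and the threshold are illustrative:

```python
import heapq
import math

def grouped_code(probs, threshold=0.01):
    """Huffman code over frequent letters plus one shared 'super letter'
    leaf for rare letters, which are distinguished by a fixed-width index."""
    frequent = {s: p for s, p in probs.items() if p >= threshold}
    rare = sorted(s for s, p in probs.items() if p < threshold)
    leaves = dict(frequent)
    if rare:
        leaves["<RARE>"] = sum(probs[s] for s in rare)
    # Standard Huffman construction over the reduced alphabet.
    heap = [(p, [s]) for s, p in leaves.items()]
    heapq.heapify(heap)
    codes = {s: "" for s in leaves}
    while len(heap) > 1:
        p0, g0 = heapq.heappop(heap)
        p1, g1 = heapq.heappop(heap)
        for s in g0:
            codes[s] = "0" + codes[s]
        for s in g1:
            codes[s] = "1" + codes[s]
        heapq.heappush(heap, (p0 + p1, g0 + g1))
    if rare:
        width = max(1, math.ceil(math.log2(len(rare))))
        prefix = codes.pop("<RARE>")
        for i, s in enumerate(rare):
            codes[s] = prefix + format(i, "0{}b".format(width))
    return codes
```

Because all rare letters hang off one leaf with a fixed-width suffix, decoding them needs no per-letter tree walk beyond the shared prefix.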
Adapting the Knuth-Morris-Pratt algorithm for pattern matching in Huffman-encoded texts
Inf. Process. Manage.
"... ..."
(Show Context)
Towards Using Neural Networks to Perform Object-Oriented Function Approximation
"... Abstract — Many computational methods are based on the manipulation of entities with internal structure, such as objects, records, or data structures. Most conventional approaches based on neural networks have problems dealing with such structured entities. The algorithms presented in this paper rep ..."
Abstract
 Add to MetaCart
(Show Context)
Abstract — Many computational methods are based on the manipulation of entities with internal structure, such as objects, records, or data structures. Most conventional approaches based on neural networks have problems dealing with such structured entities. The algorithms presented in this paper represent a novel approach to neural-symbolic integration that allows symbolic data in the form of objects to be translated to a scalar representation that can then be used by connectionist systems. We present the implementation of two translation algorithms that aid in performing object-oriented function approximation. We argue that objects provide an abstract representation of data that is well suited for the input and output of neural networks, as well as other statistical learning techniques. By examining the results of a simple sorting example, we illustrate the efficacy of these techniques.
Comparative Study of Arithmetic and Huffman Compression Techniques for Enhancing Security and Effective Bandwidth Utilization in the Context of ECC for Text
"... In this paper, we proposed a model for text encryption using elliptic curve cryptography (ECC) for secure transmission of text and by incorporating the Arithmetic/Huffman data compression technique for effective utilization of channel bandwidth and enhancing the security. In this model, every charac ..."
Abstract
 Add to MetaCart
Abstract. In this paper, we propose a model for text encryption using elliptic curve cryptography (ECC) for secure transmission of text, incorporating Arithmetic/Huffman data compression for effective utilization of channel bandwidth and enhanced security. In this model, every character of the text message is transformed into an elliptic curve point (Xm, Ym), and these elliptic curve points are converted into cipher text. The resulting cipher text becomes four times the size of the original text. To minimize the channel bandwidth requirements, the encrypted text is compressed using the Arithmetic and Huffman compression techniques in two ways, considering (i) the x- and y-coordinates of the encrypted text and (ii) the x-coordinates only. The results of the two cases are compared in terms of overall bandwidth required and saved for Arithmetic and Huffman compression.
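As a back-of-the-envelope way to compare sending both coordinates against sending x-coordinates only, one can estimate the Shannon entropy of each byte stream, since Huffman and arithmetic coding approach the entropy bound. The data below is made up purely for illustration:

```python
import math
from collections import Counter

def entropy_bits(data):
    """Shannon entropy of a byte string in bits per symbol: a quick
    lower-bound estimate of what Huffman/arithmetic coding can achieve."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Hypothetical ciphertext laid out as alternating x, y bytes (made-up data).
xy_stream = bytes([10, 200, 10, 201, 11, 200, 10, 200])
x_stream = xy_stream[::2]  # keep x-coordinates only
# Estimated compressed size in bits: entropy per symbol times stream length.
xy_bits = entropy_bits(xy_stream) * len(xy_stream)
x_bits = entropy_bits(x_stream) * len(x_stream)
```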