Results 1–6 of 6
Optimal prefix codes for infinite alphabets with nonlinear costs
IEEE Trans. Inf. Theory, 2008
Cited by 4 (3 self)
Abstract — Let P = {p(i)} be a measure of strictly positive probabilities on the set of nonnegative integers. Although the countable number of inputs prevents usage of the Huffman algorithm, there are nontrivial P for which known methods find a source code that is optimal in the sense of minimizing expected codeword length. For some applications, however, a source code should instead minimize one of a family of nonlinear objective functions, β-exponential means, those of the form log_a ∑_i p(i)a^{n(i)}, where n(i) is the length of the ith codeword and a is a positive constant. Applications of such minimizations include a novel problem of maximizing the chance of message receipt in single-shot communications (a < 1) and a previously known problem of minimizing the chance of buffer overflow in a queueing system (a > 1). This paper introduces methods for finding codes optimal for such exponential means. One method applies to geometric distributions, while another applies to distributions with lighter tails. The latter algorithm is applied to Poisson distributions and both are extended to alphabetic codes, as well as to minimizing maximum pointwise redundancy. The aforementioned application of minimizing the chance of buffer overflow is also considered. Index Terms — Communication networks, generalized entropies, generalized means, Golomb codes, Huffman algorithm,
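The β-exponential mean objective from this abstract is easy to evaluate directly; the following minimal Python sketch (the function name is my own, not the paper's) computes log_a ∑_i p(i) a^{n(i)} for a given code, and reduces to expected codeword length in the limit a → 1:

```python
import math

def beta_exponential_mean(p, lengths, a):
    """Evaluate the beta-exponential mean log_a( sum_i p(i) * a**n(i) ),
    where lengths[i] = n(i) is the i-th codeword length and a > 0, a != 1.
    As a -> 1 this tends to the expected codeword length sum_i p(i)*n(i)."""
    return math.log(sum(pi * a**ni for pi, ni in zip(p, lengths)), a)
```

For a uniform source with all codewords of length 2, the value is exactly 2 for any base a, matching the expected length, and for a close to 1 the value approaches the expected length of any code.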
Reserved-length prefix coding
In Proceedings of the 2008 IEEE International Symposium on Information Theory, 2008
Cited by 2 (0 self)
Abstract — Huffman coding finds an optimal prefix code for a given probability mass function. Consider situations in which one wishes to find an optimal code with the restriction that all codewords have lengths that lie in a user-specified set of lengths (or, equivalently, no codewords have lengths that lie in a complementary set). This paper introduces a polynomial-time dynamic programming algorithm that finds optimal codes for this reserved-length prefix coding problem. This has applications to quickly encoding and decoding lossless codes. In addition, one modification of the approach solves any quasi-arithmetic prefix coding problem, while another finds optimal codes restricted to the set of codes with g codeword lengths for user-specified g (e.g., g = 2).
D-ary Bounded-Length Huffman Coding
2007
Cited by 1 (0 self)
Abstract — Efficient optimal prefix coding has long been accomplished via the Huffman algorithm. However, there is still room for improvement and exploration regarding variants of the Huffman problem. Length-limited Huffman coding, useful for many practical applications, is one such variant, in which codes are restricted to the set of codes in which none of the n codewords is longer than a given length, lmax. Binary length-limited coding can be done in O(n·lmax) time and O(n) space using the widely used Package-Merge algorithm. In this paper the Package-Merge approach is generalized in order to introduce a minimum codeword length, lmin, to allow for objective functions other than the minimization of expected codeword length, and to be applicable to both binary and nonbinary codes, the latter of which was previously addressed using a slower dynamic programming approach. These extensions have various applications — including faster decompression — and can be used to solve the problem of finding an optimal code with bounded fringe, that is, finding the best code among codes with a maximum difference between the longest and shortest codewords. The previously proposed method for solving this problem required non-polynomial time, whereas the novel algorithm requires only O(n(lmax − lmin)^2) time and O(n) space.
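The standard binary Package-Merge algorithm that this paper generalizes can be sketched compactly. This is an illustrative implementation of the classical version only (no lmin, fringe, or nonbinary extensions), assuming n ≥ 2 and 2^lmax ≥ n so a feasible code exists:

```python
def package_merge(weights, lmax):
    """Codeword lengths of an optimal binary length-limited prefix code:
    minimize sum(weights[i] * l[i]) subject to l[i] <= lmax and Kraft.
    Classical Package-Merge; assumes len(weights) >= 2 and 2**lmax >= n."""
    n = len(weights)
    leaves = sorted(((w, (i,)) for i, w in enumerate(weights)),
                    key=lambda item: item[0])
    prev = list(leaves)
    for _ in range(lmax - 1):
        # "Package": combine adjacent pairs (dropping a leftover odd item)...
        packages = [(prev[j][0] + prev[j + 1][0], prev[j][1] + prev[j + 1][1])
                    for j in range(0, len(prev) - 1, 2)]
        # ..."Merge": interleave with a fresh copy of the leaves.
        prev = sorted(leaves + packages, key=lambda item: item[0])
    # The 2n-2 cheapest items define the code: each occurrence of leaf i
    # lengthens its codeword by one bit.
    lengths = [0] * n
    for _, ids in prev[:2 * n - 2]:
        for i in ids:
            lengths[i] += 1
    return lengths
```

With weights (1, 1, 2, 4) and lmax = 3 this reproduces the unconstrained Huffman lengths (3, 3, 2, 1); tightening to lmax = 2 forces the fixed-length code (2, 2, 2, 2).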
Twenty (or so) Questions: D-ary Length-Bounded Prefix Coding
2007
Abstract — Efficient optimal prefix coding has long been accomplished via the Huffman algorithm. However, there is still room for improvement and exploration regarding variants of the Huffman problem. Length-limited Huffman coding, useful for many practical applications, is one such variant, for which codes are restricted to the set of codes in which none of the n codewords is longer than a given length, lmax. Binary length-limited coding can be done in O(n·lmax) time and O(n) space via the widely used Package-Merge algorithm and with even smaller asymptotic complexity using a lesser-known algorithm. In this paper these algorithms are generalized without increasing complexity in order to introduce a minimum codeword length constraint lmin, to allow for objective functions other than the minimization of expected codeword length, and to be applicable to both binary and nonbinary codes; nonbinary codes were previously addressed using a slower dynamic programming approach. These extensions have various applications — including fast decompression and a modified version of the game “Twenty Questions” — and can be used to solve the problem of finding an optimal code with limited fringe, that is, finding the best code among codes with a maximum difference between the longest and shortest codewords. The previously proposed method for solving this problem required non-polynomial time, whereas solving this using the novel linear-space algorithm requires only O(n(lmax − lmin)^2) time, or even less if lmax − lmin is not O(log n).
Infinite-Alphabet Prefix Codes Optimal for β-Exponential Penalties
2007
Abstract — Let P = {p(i)} be a measure of strictly positive probabilities on the set of nonnegative integers. Although the countable number of inputs prevents usage of the Huffman algorithm, there are nontrivial P for which known methods find a source code that is optimal in the sense of minimizing expected codeword length. For some applications, however, a source code should instead minimize one of a family of nonlinear objective functions, β-exponential means, those of the form log_a ∑_i p(i)a^{n(i)}, where n(i) is the length of the ith codeword and a is a positive constant. Applications of such minimizations include a problem of maximizing the chance of message receipt in single-shot communications (a < 1) and a problem of minimizing the chance of buffer overflow in a queueing system (a > 1). This paper introduces methods for finding codes optimal for such exponential means. One method applies to geometric distributions, while another applies to distributions with lighter tails. The latter algorithm is applied to Poisson distributions. Both are extended to minimizing maximum pointwise redundancy.
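For geometric distributions, the optimal infinite-alphabet codes in this line of work are Golomb-type codes. As a small illustration of that code family (a standard Golomb encoder, not the paper's parameter-selection rule, which for exponential means differs from the classical expected-length criterion), each integer is split into a unary-coded quotient and a truncated-binary remainder:

```python
def golomb_encode(n, m):
    """Golomb codeword (as a bit string) for nonnegative integer n with
    parameter m >= 1: unary quotient, then truncated-binary remainder."""
    if m == 1:                       # degenerate case: pure unary code
        return "1" * n + "0"
    q, r = divmod(n, m)
    bits = "1" * q + "0"             # unary part: q ones and a closing zero
    b = (m - 1).bit_length()         # b = ceil(log2 m)
    t = (1 << b) - m                 # the t smallest remainders use b-1 bits
    if r < t:
        bits += format(r, "b").zfill(b - 1)
    else:
        bits += format(r + t, "b").zfill(b)
    return bits
```

For m = 3, the remainders 0, 1, 2 encode as "0", "10", "11", so golomb_encode(5, 3) yields "1011" (quotient 1, remainder 2); when m is a power of two the code reduces to a Rice code with fixed-width remainders.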