Results 1 – 10 of 11
Optimal prefix codes for infinite alphabets with nonlinear costs
IEEE Trans. Inf. Theory, 2008
Abstract

Cited by 8 (4 self)
Abstract — Let P = {p(i)} be a measure of strictly positive probabilities on the set of nonnegative integers. Although the countable number of inputs prevents usage of the Huffman algorithm, there are nontrivial P for which known methods find a source code that is optimal in the sense of minimizing expected codeword length. For some applications, however, a source code should instead minimize one of a family of nonlinear objective functions, β-exponential means, those of the form log_a ∑_i p(i)a^{n(i)}, where n(i) is the length of the ith codeword and a is a positive constant. Applications of such minimizations include a novel problem of maximizing the chance of message receipt in single-shot communications (a < 1) and a previously known problem of minimizing the chance of buffer overflow in a queueing system (a > 1). This paper introduces methods for finding codes optimal for such exponential means. One method applies to geometric distributions, while another applies to distributions with lighter tails. The latter algorithm is applied to Poisson distributions and both are extended to alphabetic codes, as well as to minimizing maximum pointwise redundancy. The aforementioned application of minimizing the chance of buffer overflow is also considered. Index Terms — Communication networks, generalized entropies, generalized means, Golomb codes, Huffman algorithm,
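The generalized Huffman procedure the abstract alludes to can be sketched as follows: the two smallest weights w1, w2 are repeatedly merged into a·(w1 + w2) instead of w1 + w2. This is a minimal finite-alphabet sketch, not the paper's infinite-alphabet method, and the probabilities below are illustrative only.

```python
import heapq

def exponential_huffman(probs, a):
    """Generalized Huffman coding for the beta-exponential mean
    log_a(sum_i p(i) * a**n(i)): repeatedly merge the two smallest
    weights w1, w2 into a*(w1 + w2) instead of w1 + w2.
    Returns a dict mapping symbol index -> binary codeword."""
    heap = [(p, i, i) for i, p in enumerate(probs)]  # (weight, tiebreak, subtree)
    heapq.heapify(heap)
    uid = len(probs)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (a * (w1 + w2), uid, (t1, t2)))
        uid += 1
    code = {}
    def walk(t, prefix):
        if isinstance(t, tuple):       # internal node: recurse into children
            walk(t[0], prefix + "0")
            walk(t[1], prefix + "1")
        else:                          # leaf: t is a symbol index
            code[t] = prefix or "0"
    walk(heap[0][2], "")
    return code

# a > 1 penalizes long codewords more heavily (the buffer-overflow case):
code = exponential_huffman([0.5, 0.25, 0.125, 0.125], a=2.0)
```

With a close to 1 the merges approach ordinary Huffman coding; the same merge rule is reported to cover a < 1 (the single-shot message-receipt case) as well.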
Redundancy-Related Bounds for Generalized Huffman Codes
, 2009
Abstract

Cited by 1 (0 self)
This paper presents new lower and upper bounds for the compression rate of optimal binary prefix codes on memoryless sources according to various nonlinear codeword length objectives. Like the most well-known redundancy bounds for minimum (arithmetic) average redundancy coding — Huffman coding — these are in terms of a form of entropy and/or the probability of the most probable input symbol. The bounds here improve on known bounds of the form L ∈ [H, H + 1), where H is some form of entropy in bits (or, in the case of redundancy measurements, 0) and L is the length objective, also in bits. The objectives explored here include exponential-average length, maximum pointwise redundancy, and exponential-average pointwise redundancy (also called dth exponential redundancy). These relate to queueing and single-shot communications, Shannon coding and universal modeling (worst-case minimax redundancy), and bridging the maximum pointwise redundancy problem with Huffman coding, respectively. A generalized form of Huffman coding known to find optimal codes for these objectives helps yield these bounds, some of which are tight. Related properties to such bounds, also explored here, are the necessary and sufficient conditions for the shortest codeword being a specific length.
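As a point of reference for the L ∈ [H, H + 1) form mentioned above, the classical Shannon-entropy version of the bound for ordinary Huffman coding can be checked numerically; the distribution below is illustrative only, and the helper name is an assumption.

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of a standard (arithmetic-average) Huffman code."""
    lengths = [0] * len(probs)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]  # (weight, tiebreak, symbols)
    heapq.heapify(heap)
    uid = len(probs)
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:      # each symbol under the merge gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (w1 + w2, uid, s1 + s2))
        uid += 1
    return lengths

p = [0.4, 0.3, 0.2, 0.1]
L = sum(pi * li for pi, li in zip(p, huffman_lengths(p)))
H = -sum(pi * math.log2(pi) for pi in p)
assert H <= L < H + 1   # the classical [H, H + 1) window
```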
Lossless Image Compression based on Predictive Coding and Bit Plane Slicing
Abstract
In this paper, a simple lossless image compression method based on a combination of bit-plane slicing and adaptive predictive coding is adopted for compressing natural and medical images. The idea exploits the spatial domain efficiently after discarding the lowest-order bits: only the highest-order bit planes are used, with the most significant plane (layer 7) coded by adaptive predictive coding while the other layers use run-length coding. The test results show high system performance, achieving a higher compression ratio for a lossless system characterized by guaranteed full reconstruction. General Terms Bit-plane slicing along with adaptive predictive coding for lossless image compression.
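The slicing-plus-run-length pipeline described above can be sketched as follows. The pixel row, number of retained planes, and function names are illustrative assumptions, and the adaptive predictor for layer 7 is omitted.

```python
def bit_planes(pixels, keep=4):
    """Split 8-bit pixel values into bit planes, keeping only the
    `keep` highest-order planes (plane 7 is the most significant)."""
    return [[(p >> b) & 1 for p in pixels] for b in range(7, 7 - keep, -1)]

def run_length_encode(bits):
    """Run-length coding, as used for the layers below layer 7."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1][1] += 1       # extend the current run
        else:
            runs.append([bit, 1])  # start a new run
    return [tuple(r) for r in runs]

row = [200, 201, 199, 198, 120, 119]   # one illustrative row of 8-bit pixels
planes = bit_planes(row, keep=4)
# planes[0] (layer 7) would go to the adaptive predictor;
# the remaining retained planes are run-length coded:
encoded = [run_length_encode(plane) for plane in planes[1:]]
```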
unknown title
Abstract
Abstract — A novel lossless source coding paradigm applies to problems in which a vital message needs to be transmitted prior to termination of communications, as in Alfréd Rényi’s second-hand account of an ancient siege in which information was obtained to prevent the fall of a fortress. Rényi told this story with reference to traditional prefix coding, in which the objective is minimization of expected codeword length. The goal of maximizing probability of survival in the siege scenario is distinct from yet related to this traditional objective. Rather than finding a code minimizing ∑_{i=1}^n p(i)l(i), this variant involves maximizing ∑_{i=1}^n p(i)θ^{l(i)} for a given θ ∈ (0, 1). A known generalization of Huffman coding solves this, and, for nontrivial θ (θ ∈ (0.5, 1)), the optimal solution has coding bounds which are functions of Rényi’s α-entropy for α = 1/log_2(2θ) > 1. A new improvement on known bounds is derived here. When alphabetically constrained, as in search trees and in diagnostic testing of sequential systems, a dynamic programming algorithm finds the optimal solution in O(n^3) time and O(n^2) space, whereas two novel approximation algorithms can find a suboptimal solution in linear time (for one) or O(n log n) time (for the other). These approximation algorithms, along with simple associated coding bounds, apply to both the siege scenario and a complementary problem. Index Terms — Dynamic programming, Huffman codes, military communication, reliability, Rényi entropy, tree searching, wireless.
Infinite-Alphabet Prefix Codes Optimal for β-Exponential Penalties
, 2007
Abstract
Abstract — Let P = {p(i)} be a measure of strictly positive probabilities on the set of nonnegative integers. Although the countable number of inputs prevents usage of the Huffman algorithm, there are nontrivial P for which known methods find a source code that is optimal in the sense of minimizing expected codeword length. For some applications, however, a source code should instead minimize one of a family of nonlinear objective functions, β-exponential means, those of the form log_a ∑_i p(i)a^{n(i)}, where n(i) is the length of the ith codeword and a is a positive constant. Applications of such minimizations include a problem of maximizing the chance of message receipt in single-shot communications (a < 1) and a problem of minimizing the chance of buffer overflow in a queueing system (a > 1). This paper introduces methods for finding codes optimal for such exponential means. One method applies to geometric distributions, while another applies to distributions with lighter tails. The latter algorithm is applied to Poisson distributions. Both are extended to minimizing maximum pointwise redundancy.
Prefix Coding under Siege
, 2006
Abstract
A novel lossless source coding paradigm applies to problems of unreliable lossless channels with low bitrate, in which a vital message needs to be transmitted prior to termination of communications. This paradigm can be applied to Alfréd Rényi’s second-hand account of an ancient siege in which a spy was sent to scout the enemy but was captured. After escaping, the spy returned to his base in no condition to speak and unable to write. His commander asked him questions that he could answer by nodding or shaking his head, and the fortress was defended with this information. Rényi told this story with reference to traditional lossless prefix coding, in which the objective is minimization of expected codeword length. The goal of maximizing probability of survival in the siege scenario, however, is distinct from yet related to this traditional objective. Rather than finding a code minimizing expected codeword length ∑_{i=1}^n p(i)l(i), this variant involves maximizing ∑_{i=1}^n p(i)θ^{l(i)} for a known θ ∈ (0, 1). When there are no restrictions on codewords, this problem can be solved using a known generalization of Huffman coding. The optimal solution has coding bounds which are functions of Rényi entropy; in addition to known bounds, new bounds are derived here. The alphabetically constrained version of this problem has applications in search trees and diagnostic testing. A novel dynamic programming algorithm — based upon the oldest known algorithm for the traditional alphabetic problem — optimizes this problem in O(n^3) time and O(n^2) space, whereas two novel approximation algorithms can find a suboptimal solution faster: one in linear time, the other in O(n log n) time. Coding bounds for the alphabetic version of this problem are also presented.
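The objective ∑_{i=1}^n p(i)θ^{l(i)} can be evaluated directly to see how it differs from minimizing expected length; the probabilities, code lengths, and θ below are illustrative assumptions, not values from the paper.

```python
def receipt_probability(probs, lengths, theta):
    """The siege objective: sum_i p(i) * theta**l(i), i.e. the chance
    (for the given theta) that the transmitted codeword gets through."""
    return sum(p * theta ** l for p, l in zip(probs, lengths))

p = [0.5, 0.25, 0.125, 0.125]
huffman = [1, 2, 3, 3]   # lengths of a standard Huffman code for p
flat = [2, 2, 2, 2]      # fixed-length alternative
theta = 0.6
# Short codewords for likely symbols raise the chance of receipt:
assert receipt_probability(p, huffman, theta) > receipt_probability(p, flat, theta)
```

Here the variable-length code scores 0.444 against 0.36 for the fixed-length code, illustrating why the maximization favors short codewords for probable symbols.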