Results 1 – 4 of 4
Dynamic Shannon Coding
, 2005
Abstract

Cited by 9 (7 self)
We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
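For reference, a minimal sketch of (static) Shannon coding, the building block the dynamic algorithm adapts: each symbol is assigned the first ⌈lg(1/p)⌉ bits of the binary expansion of the cumulative probability of the more frequent symbols before it. This is an illustration of classical Shannon coding, not the paper's dynamic algorithm; all names are hypothetical.

```python
from math import ceil, log2

def shannon_code(probs):
    """Build a Shannon code: process symbols in order of decreasing
    probability; give each the first ceil(-lg p) bits of the binary
    expansion of the cumulative probability preceding it.
    The resulting code is prefix-free."""
    items = sorted(probs.items(), key=lambda kv: -kv[1])
    code, cum = {}, 0.0
    for sym, p in items:
        length = max(1, ceil(-log2(p)))
        # Take the first `length` bits of cum's binary expansion.
        bits, frac = [], cum
        for _ in range(length):
            frac *= 2
            bit = int(frac)
            bits.append(str(bit))
            frac -= bit
        code[sym] = "".join(bits)
        cum += p
    return code
```

For a dyadic distribution such as {a: 1/2, b: 1/4, c: 1/8, d: 1/8} this reproduces the optimal codeword lengths 1, 2, 3, 3.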
Worst-Case Optimal Adaptive Prefix Coding
 In: Proceedings of the Algorithms and Data Structures Symposium (WADS)
, 2009
Abstract

Cited by 4 (4 self)
A common complaint about adaptive prefix coding is that it is much slower than static prefix coding. Karpinski and Nekrich recently took an important step towards resolving this: they gave an adaptive Shannon coding algorithm that encodes each character in O(1) amortized time and decodes it in O(log H) amortized time, where H is the empirical entropy of the input string s. For comparison, Gagie’s adaptive Shannon coder and both Knuth’s and Vitter’s adaptive Huffman coders all use Θ(H) amortized time for each character. In this paper we give an adaptive Shannon coder that both encodes and decodes each character in O(1) worst-case time. As with both previous adaptive Shannon coders, we store s in at most (H + 1)|s| + o(|s|) bits. We also show that this encoding length is worst-case optimal up to the lower-order term.
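To illustrate the adaptive setting (not the O(1)-time algorithm of the paper, which relies on more careful data structures), a naive adaptive Shannon coder keeps symbol counts, rebuilds the code before each character, and depends on the encoder and decoder making identical updates so their codes stay synchronized. All names below are illustrative assumptions.

```python
from math import ceil, log2

def build_code(counts):
    """Shannon code from current counts: symbols in decreasing
    frequency (ties broken by symbol for determinism)."""
    total = sum(counts.values())
    items = sorted(counts.items(), key=lambda kv: (-kv[1], kv[0]))
    code, cum = {}, 0.0
    for sym, c in items:
        p = c / total
        length = max(1, ceil(-log2(p)))
        bits, frac = [], cum
        for _ in range(length):
            frac *= 2
            bit = int(frac)
            bits.append(str(bit))
            frac -= bit
        code[sym] = "".join(bits)
        cum += p
    return code

def adaptive_encode(s, alphabet):
    # One pseudo-count per symbol so every character is codable.
    counts = {a: 1 for a in alphabet}
    out = []
    for ch in s:
        out.append(build_code(counts)[ch])
        counts[ch] += 1          # decoder makes the same update
    return "".join(out)

def adaptive_decode(bits, alphabet):
    counts = {a: 1 for a in alphabet}
    out, i = [], 0
    while i < len(bits):
        inv = {v: k for k, v in build_code(counts).items()}
        j = i + 1
        # The code is prefix-free, so the shortest matching prefix
        # is the unique next codeword.
        while bits[i:j] not in inv:
            j += 1
        sym = inv[bits[i:j]]
        out.append(sym)
        counts[sym] += 1
        i = j
    return "".join(out)
```

Rebuilding the code from scratch costs far more than O(1) per character; the point of the paper is precisely to avoid this by maintaining the code incrementally.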
Dynamic Length-Restricted Coding
, 2003
Abstract

Cited by 3 (2 self)
Suppose that $S$ is a string of length $m$ drawn from an alphabet of $n$ characters, $d$ of which occur in $S$. Let $P$ be the relative frequency distribution of characters in $S$. We present a new algorithm for dynamic coding that uses at most $\lceil \lg n \rceil + 1$ bits to encode each character in $S$
Dynamic Asymmetric Communication (Student Paper)
, 2005
Abstract
In Adler and Maggs’ asymmetric communication problem, a server with high bandwidth tries to help clients with low bandwidth send it messages. We give four new asymmetric communication protocols and show they are robust with respect to changes in the messages’ distribution. Three of our protocols require only one round of communication for each message.