Results 1 - 2 of 2
Entropy Computations Via Analytic Depoissonization
IEEE Trans. Information Theory, 1998
Abstract

Cited by 29 (12 self)
We investigate the basic question of information theory, namely, evaluation of Shannon entropy, and the more general Rényi entropy, for some discrete distributions (e.g., binomial, negative binomial, etc.). We aim at establishing analytic methods (i.e., those in which complex analysis plays a pivotal role) for such computations, which often yield estimates of unparalleled precision. The main analytic tool used here is that of analytic poissonization and depoissonization. We illustrate our approach on the entropy evaluation of the binomial distribution; that is, we prove that for the Binomial(n, p) distribution the entropy h_n becomes h_n ~ (1/2) ln n + 1/2 + ln sqrt(2*pi*p*(1-p)) + sum_{k>=1} a_k n^{-k}, where the a_k are explicitly computable constants. Moreover, we shall argue that analytic methods (e.g., complex asymptotics such as Rice's method and singularity analysis, Mellin transforms, poissonization and depoissonization) can offer new tools for information theory, especially for studying ...
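The leading terms of the expansion above can be checked numerically: the sketch below, a hypothetical illustration (function names and the choice n = 1000, p = 0.3 are ours, not the paper's), compares the exact binomial entropy against the truncated asymptotic (1/2) ln n + 1/2 + ln sqrt(2*pi*p*(1-p)), dropping the a_k n^{-k} correction terms.

```python
import math

def binomial_entropy(n, p):
    # Exact Shannon entropy (in nats) of Binomial(n, p), summing
    # -pk * ln(pk) over the whole support; log-pmf via lgamma
    # avoids overflow in the binomial coefficient.
    h = 0.0
    for k in range(n + 1):
        log_pk = (math.lgamma(n + 1) - math.lgamma(k + 1)
                  - math.lgamma(n - k + 1)
                  + k * math.log(p) + (n - k) * math.log(1 - p))
        h -= math.exp(log_pk) * log_pk
    return h

def entropy_asymptotic(n, p):
    # Leading terms of the expansion
    #   h_n ~ (1/2) ln n + 1/2 + ln sqrt(2*pi*p*(1-p)),
    # omitting the O(1/n) terms a_k * n^{-k}.
    return 0.5 * math.log(n) + 0.5 + math.log(math.sqrt(2 * math.pi * p * (1 - p)))

n, p = 1000, 0.3
print(binomial_entropy(n, p), entropy_asymptotic(n, p))
```

For n = 1000 the two values already agree to roughly O(1/n), consistent with the claimed precision of the analytic expansion.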
On Asymptotics Of Certain Recurrences Arising In Universal Coding
Problems of Information Transmission, 1997
Abstract

Cited by 14 (4 self)
Ramanujan's Q-function and the so-called "tree function" T(z), defined implicitly by the equation T(z) = z e^{T(z)}, have found applications in hashing, the birthday paradox problem, random mappings, caching, memory conflicts, and so forth. Recently, several novel applications of these functions to information theory problems such as linear coding and universal portfolios were brought to light. In this paper, we study them in the context of another information theory problem, namely universal coding, which was recently investigated by Shtarkov et al. [Probl. Inf. Trans., 31, 1995]. We provide asymptotic expansions of certain recurrences studied there which describe the optimal redundancy of universal codes. Our methodology falls under the so-called analytical information theory that was recently applied successfully to a variety of information theory problems. Key Words: source coding, multi-alphabet universal coding, redundancy, minimum description length, analytical information theory, si...
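The implicit equation T(z) = z e^{T(z)} can be solved numerically by fixed-point iteration inside the tree function's disk of analyticity, |z| < 1/e. The sketch below is a minimal illustration (the function name, the evaluation point z = 0.2, and the series truncation are our choices, not the paper's); it cross-checks the iterate against the classical Cayley series T(z) = sum_{n>=1} n^{n-1} z^n / n!.

```python
import math

def tree_function(z, tol=1e-12, max_iter=1000):
    # Solve T = z * exp(T) by fixed-point iteration starting from T = z.
    # The iteration is a contraction for |z| < 1/e, where T(z) is analytic.
    t = z
    for _ in range(max_iter):
        t_next = z * math.exp(t)
        if abs(t_next - t) < tol:
            return t_next
        t = t_next
    return t

# Cross-check against the Taylor series of the tree function,
# T(z) = sum_{n>=1} n^{n-1} z^n / n!, truncated at n = 29.
z = 0.2
series = sum(n ** (n - 1) * z ** n / math.factorial(n) for n in range(1, 30))
print(tree_function(z), series)
```

Both computations converge to the same value for this z, which lies well inside the radius of convergence 1/e.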