## Inequalities between Entropy and Index of Coincidence derived from Information Diagrams (2001)

Venue: IEEE Trans. Inform. Theory

Citations: 19 (11 self)

### BibTeX

```bibtex
@ARTICLE{Harremoës01inequalitiesbetween,
  author  = {Peter Harremoës and Flemming Topsøe},
  title   = {Inequalities between Entropy and Index of Coincidence derived from Information Diagrams},
  journal = {IEEE Trans. Inform. Theory},
  year    = {2001},
  volume  = {47},
  pages   = {2944--2960}
}
```

### Abstract

To any discrete probability distribution P we can associate its entropy $H(P) = -\sum p_i \ln p_i$ and its index of coincidence $IC(P) = \sum p_i^2$. The main result of the paper is the determination of the precise range of the map $P \mapsto (IC(P), H(P))$. The range looks much like that of the map $P \mapsto (P_{\max}, H(P))$, where $P_{\max}$ is the maximal point probability, cf. research from 1965 (Kovalevskij [18]) to 1994 (Feder and Merhav [7]). The earlier results, which actually focus on the probability of error $1 - P_{\max}$ rather than $P_{\max}$, can be conceived as limiting cases of results obtained by the methods presented here. Ranges of maps such as those indicated are called Information Diagrams. The main result gives rise to precise lower as well as upper bounds for the entropy function. Some of these bounds are essential for the exact solution of certain problems of universal coding and prediction for Bernoulli sources. Other applications concern Shannon theory (relations between various measures of divergence), statistical decision theory and rate distortion theory. Two methods are developed. One is topological; the other involves convex analysis and is based on a "lemma of replacement" which is of independent interest in relation to problems of optimization of mixed type (concave/convex optimization).
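As a quick illustration of the two quantities the paper relates, the following sketch computes H(P) (with natural logarithm, as in the abstract) and IC(P) for a discrete distribution. The function names are our own choices, not from the paper; the uniform distribution is the standard extremal example (maximal entropy, minimal index of coincidence for a fixed support size).

```python
import math

def entropy(p):
    """Entropy H(P) = -sum p_i ln p_i, with the convention 0 ln 0 = 0."""
    return -sum(x * math.log(x) for x in p if x > 0)

def index_of_coincidence(p):
    """Index of coincidence IC(P) = sum p_i^2."""
    return sum(x * x for x in p)

# Uniform distribution on n points: H = ln n, IC = 1/n.
n = 4
uniform = [1 / n] * n
print(entropy(uniform))               # ln 4 ≈ 1.3863
print(index_of_coincidence(uniform))  # 1/4 = 0.25
```

For a fixed value of IC, the paper's Information Diagram pins down exactly which entropy values are attainable, sharpening naive bounds such as $H(P) \ge -\ln IC(P)$ (Rényi entropy of order 2).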