Results 1 – 4 of 4
Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study
On the Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts
2008
Abstract

Cited by 4 (3 self)
The article presents a new interpretation of Zipf’s law in natural language which relies on two areas of information theory. We reformulate the problem of grammar-based compression and investigate properties of strongly non-ergodic stationary processes. The motivation for the joint discussion is to prove a proposition with a simple informal statement: if an n-letter-long text describes n^β independent facts in a random but consistent way, then the text contains at least n^β / log n different words. In the formal statement, two specific postulates are adopted. Firstly, the words are understood as the non-terminal symbols of the shortest grammar-based encoding of the text. Secondly, the texts are assumed to be emitted by a non-ergodic source, with the described facts being binary IID variables that are asymptotically predictable in a shift-invariant way. The proof of the formal proposition applies several new tools. These ...
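As a rough numerical illustration (not taken from the paper), the lower bound in the informal statement, n^β / log n distinct words for an n-letter text describing n^β facts, can be evaluated for a few text lengths; the function name and the choice β = 0.5 are illustrative assumptions:

```python
import math

def vocabulary_lower_bound(n: int, beta: float) -> float:
    """Evaluate n**beta / log(n), the claimed minimum number of
    distinct words in an n-letter text describing n**beta facts.
    (Illustrative helper; the name is not from the paper.)"""
    return n ** beta / math.log(n)

# The bound grows with text length n for a fixed exponent beta.
for n in (10_000, 100_000, 1_000_000):
    print(n, round(vocabulary_lower_bound(n, beta=0.5), 1))
```

The loop only shows how quickly the bound grows; the paper's actual result concerns the shortest grammar-based encoding, which this sketch does not construct.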
Information width
2007
Abstract
Kolmogorov argued that the concept of information also exists in problems with no underlying stochastic model (unlike Shannon’s representation of information), for instance the information contained in an algorithm or in the genome. He introduced a combinatorial notion of entropy and of the information I(x : y) conveyed by a binary string x about the unknown value of a variable y. The current paper poses the following questions: What is the relationship between the information conveyed by x about y and the description complexity of x? Is there a notion of cost of information? Are there limits on how efficiently x conveys information? To answer these questions, Kolmogorov’s definition is extended and a new concept termed information width, which is similar to n-widths in approximation theory, is introduced. The information of any input source, e.g., sample-based, general side-information, or a hybrid of both, can be evaluated by a single common formula. An application to the space of binary functions is considered. Keywords: Binary functions, Combinatorics, n-widths, VC-dimension
Entropy Rate Constancy in Text
 In Proceedings of ACL–2002
2002
Abstract
We present a constancy rate principle governing language generation. We show that this principle implies that local measures of entropy (ignoring context) should increase with the sentence number. We demonstrate that this is indeed the case by measuring entropy in three different ways. We also show that this effect has both lexical (which words are used) and non-lexical (how the words are used) causes.
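The abstract's "local measure of entropy (ignoring context)" can be sketched as the Shannon entropy of the empirical unigram distribution within each sentence; the toy corpus and helper name below are illustrative assumptions, not the paper's three actual measures:

```python
import math
from collections import Counter

def unigram_entropy(words):
    """Shannon entropy (bits) of the empirical word distribution
    of one sentence, computed without any surrounding context."""
    counts = Counter(words)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Toy document: one local entropy value per sentence number.
doc = [
    "the cat sat on the mat".split(),
    "the quick brown fox jumps over the lazy dog".split(),
]
for i, sentence in enumerate(doc, start=1):
    print(i, round(unigram_entropy(sentence), 3))
```

Under the constancy rate principle, averaging such per-sentence values over a large corpus should show the local entropy rising with sentence number; this sketch only defines the per-sentence measure.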