## A Formal Theory Of Information: I. Statics (1998)

Citations: 1 (0 self)

### BibTeX

```
@MISC{Hillman98aformal,
  author = {Chris Hillman},
  title = {A Formal Theory Of Information: I. Statics},
  year = {1998}
}
```

### Abstract

In this paper, I study "entropy" as a quantity defined axiomatically by certain formal properties, independently of any particular mathematical context. In particular, unlike all previous attempts to axiomatize information theory known to me, the concept of probability will play no role in the formal theory. I will define the notion of a joinset (essentially a lattice without the meet operation) and the notion of an entropy valuation on a joinset, which is a function assigning a non-negative real number to every element of the joinset, subject to three simple axioms to be given below. I will explain how, given an entropy valuation, we immediately obtain conditional and interaction entropies defined on pairs of elements, and I will show that this suite of entropies has all the formal properties of Shannon's probabilistic entropy (see [3][16]). In particular, we have notions of…

### Citations

9231 | Elements of Information Theory - Cover, Thomas - 1990 |
Citation Context: …self. Setting H(A) = −Σ_j p(A_j) log p(A_j) defines an entropy valuation on (Ω, ∨). Positivity is obvious; Monotonicity and Contractivity can be proven with the aid of Jensen's inequality [3][16]. This valuation is the usual entropy of α as it is defined in ergodic theory [16], and was introduced in this context by Kolmogorov. The conditional entropy is given by H(A/B) = −Σ_j Σ_k p… |
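The probabilistic entropy valuation described in this excerpt is easy to illustrate concretely. The following is a minimal Python sketch, not code from the paper; the four-point uniform space, the partition representation, and the helper names (`entropy`, `join`) are all illustrative choices.

```python
import math

def entropy(partition, p):
    """H(A) = -sum_j p(A_j) log p(A_j), where each block A_j is a
    frozenset of outcomes and p maps outcomes to probabilities."""
    H = 0.0
    for block in partition:
        pj = sum(p[x] for x in block)
        if pj > 0:
            H -= pj * math.log(pj)
    return H

def join(A, B):
    """Join A ∨ B of two partitions: their common refinement,
    i.e. all nonempty pairwise intersections of blocks."""
    return {a & b for a in A for b in B} - {frozenset()}

# Illustrative four-outcome space with uniform probabilities.
p = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}
A = {frozenset({1, 2}), frozenset({3, 4})}
B = {frozenset({1, 3}), frozenset({2, 4})}

H_A = entropy(A, p)
H_AB = entropy(join(A, B), p)
H_A_given_B = H_AB - entropy(B, p)   # conditional entropy H(A/B)
print(H_A, H_AB, H_A_given_B)
```

Here the conditional entropy follows the Quotient Rule quoted later on this page, H(A/B) = H(A ∨ B) − H(B); for these two independent partitions it equals H(A) = log 2.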

1622 | Real and Complex Analysis - Rudin - 1974 |
Citation Context: …c on the space of integrable functions. However, if we pass to equivalence classes of functions "taken up to values on sets of measure zero", then we obtain a true metric. (See, for instance, [12] or [14] for details.) Just as in the theory of L^p spaces, usually no harm will come if one blurs over the distinction between elements and codependency classes, but there are occasions when this distinction… |

1176 | Introduction to Lattices and Order - Davey, Priestley - 1990 |
Citation Context: …unds of A, B: {C : A ≤ C, B ≤ C}. Thus A ∨ B is always the least upper bound of {A, B}. Also, the zero Z is the minimal element of the joinset with respect to the partial order ≤. (See the early chapters of [4] for a thorough discussion of partial orders and posets.) Proof: This is certainly reflexive (by the idempotent law). If A ≤ B and B ≤ C, we can compute as follows: A ∨ C = A ∨ (B ∨ C) = (A ∨ B) ∨ C = B ∨ C = C, whence… |

615 | An Introduction to Symbolic Dynamics and Coding - Lind, Marcus - 1995 |
Citation Context: …left diamonds. that the concepts discussed in this paper are natural, general, and easy to apply. In particular, I show how to recover the usual notions of measure-theoretic [16] and topological [16][10], in addition to combinatorial and algebraic entropies. Example 9.1. Let (X, M, μ) be a measure space and let Ω be the set of finite measure sets in M. Then Ω becomes a joinset when equipped w… |

437 | An Introduction to Ergodic Theory - Walters - 1982 |
Citation Context: …iately obtain conditional and interaction entropies defined on pairs of elements, and I will show that this suite of entropies has all the formal properties of Shannon's probabilistic entropy (see [3][16]). In particular, we have notions of dependence, codependence, and independence, as well as an entropy distance which defines the Rokhlin metric on the class joinset, which consists of equivalence cla… |

414 | An Introduction to Differentiable Manifolds and Riemannian Geometry - Boothby - 1986 |
Citation Context: …denoted in this paper by calligraphic letters such as A, together with a binary operation called join and written A ∨ B, such that the following properties hold: (1) Associative Law: (A ∨ B) ∨ C = A ∨ (B ∨ C). (2) Commutative Law: A ∨ B = B ∨ A. (3) Idempotent Law: A ∨ A = A. (4) Zero Element: There is an element Z such that for all A, A ∨ Z = A. We will interpret the elements of Ω to be sets of alternatives of so… |
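The four joinset axioms quoted in this excerpt can be checked mechanically for the simplest motivating example, finite sets under union with Z the empty set. The Python sketch below is my illustration, not the paper's code; the sample elements are arbitrary.

```python
from itertools import product

Z = frozenset()
elements = [Z, frozenset({1}), frozenset({2}),
            frozenset({1, 2}), frozenset({1, 2, 3})]

def join(A, B):
    """Join is set union in this example joinset."""
    return A | B

def leq(A, B):
    """Induced partial order: A <= B iff A ∨ B == B (here, inclusion)."""
    return join(A, B) == B

# (1) Associative, (2) Commutative, (3) Idempotent, (4) Zero element.
for A, B, C in product(elements, repeat=3):
    assert join(join(A, B), C) == join(A, join(B, C))
for A, B in product(elements, repeat=2):
    assert join(A, B) == join(B, A)
for A in elements:
    assert join(A, A) == A
    assert join(A, Z) == A

assert leq(frozenset({1}), frozenset({1, 2}))
print("all four joinset axioms hold")
```

Note how the zero element Z comes out as the minimum of the induced partial order, exactly as the adjacent excerpt from [4] describes.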

208 | Ergodic Theory - Petersen - 1983 |

141 | Topics in Algebra - Herstein - 1975 |
Citation Context: …itivity and Monotonicity are obvious. Contractivity follows from Noether's Theorem, which states that (A + C)/A is isomorphic as a vector space to C/(A ∩ C). (See [1] for quotient vector spaces; Herstein [5] attributes the analogous result for ideals to Emmy Noether.) Then, since H(C/A) = dim C/(A ∩ C) and H(C/B) = dim C/(B ∩ C), we see that if A ≤ B, we have A ∩ C ⊆ B ∩ C, whence H(C/A) ≥ H(C/B). The entropy di… |
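The dimension bookkeeping in this excerpt can be verified concretely for subspaces of F_2^3, where a subspace is just a finite set of vectors and dim = log2 of its size. The sketch below is illustrative (the particular subspaces A ≤ B and C are arbitrary choices, not from the paper):

```python
import math
from itertools import product

def span(gens, n):
    """All GF(2) linear combinations of the generators in F_2^n."""
    return {tuple(sum(c * g[i] for c, g in zip(coeffs, gens)) % 2
                  for i in range(n))
            for coeffs in product([0, 1], repeat=len(gens))}

def dim(S):
    return int(math.log2(len(S)))

def plus(U, V):
    """Sum of subspaces U + V (their join in this joinset)."""
    return {tuple((x + y) % 2 for x, y in zip(u, v)) for u in U for v in V}

n = 3
A = span([(1, 0, 0)], n)
B = span([(1, 0, 0), (0, 1, 0)], n)   # A is a subspace of B, so A <= B
C = span([(1, 1, 0), (0, 0, 1)], n)

# H(C/A) = dim C/(A ∩ C) = dim C - dim(A ∩ C); Noether's isomorphism
# says this equals dim (A + C)/A = dim(A + C) - dim A.
HCA = dim(C) - dim(A & C)
HCB = dim(C) - dim(B & C)
assert dim(plus(A, C)) - dim(A) == HCA
assert dim(plus(B, C)) - dim(B) == HCB
assert HCA >= HCB   # Contractivity: A <= B implies H(C/A) >= H(C/B)
print(HCA, HCB)     # → 2 1
```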

80 | Real Analysis, Third Edition - Royden - 1988 |
Citation Context: …φ(ps + (1 − p)t) ≥ pφ(s) + (1 − p)φ(t). That is, no point on the secant over (s, t), where s < t, can lie above the graph of φ. We will need the following useful fact [12] concerning such functions: Lemma 3.1 (Secant Lemma). Suppose φ is concave down on (a, b). Let x, y, x′, y′ ∈ (a, b) with x ≤ x′ < y′ and x < y ≤ y′. Then (φ(y) − φ(x))/(y − x) ≥ (φ(y′) − φ(x′))/(y′ − x′)… |
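A quick numerical sanity check of the concavity inequality and the Secant Lemma as reconstructed above, using φ = log, which is concave down on (0, ∞); the sample points here are arbitrary, not from the paper.

```python
import math

def slope(phi, x, y):
    """Slope of the secant of phi over [x, y]."""
    return (phi(y) - phi(x)) / (y - x)

phi = math.log

# Secant Lemma hypotheses: x <= x' < y' and x < y <= y'.
x, xp, y, yp = 1.0, 1.5, 2.0, 3.0
assert x <= xp < yp and x < y <= yp
# Moving the chord to the right can only decrease its slope.
assert slope(phi, x, y) >= slope(phi, xp, yp)

# Concavity itself: the secant over (s, t) lies on or below the graph.
s, t, p = 1.0, 4.0, 0.3
assert phi(p * s + (1 - p) * t) >= p * phi(s) + (1 - p) * phi(t)
print("secant lemma holds for log at these points")
```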

67 | A Mathematical Theory of Communication, Bell System Technical Journal - Shannon - 1948 |
Citation Context: …by many other entropies of interest for applications. Incidentally, this explains why in much of the classical information theory literature, interaction entropy is called mutual information (following [15]). Proof: By the Quotient Rule, we have H(B) − H(B/A) = H(A) + H(B) − H(A ∨ B) = H(A) − H(A/B). These are non-negative by the Second Order Property, and vanish iff A, B are independent, by… |
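The three equivalent expressions for interaction entropy quoted in this excerpt can be checked numerically for partitions of a finite probability space. A sketch under illustrative assumptions (the six-point space, the weights, and the partitions A and B are my choices, not the paper's):

```python
import math
import random

def H(partition, p):
    """Partition entropy: -sum over blocks of p(block) log p(block)."""
    return -sum(pb * math.log(pb)
                for block in partition
                if (pb := sum(p[x] for x in block)) > 0)

def join(A, B):
    """Common refinement A ∨ B of two partitions."""
    return {a & b for a in A for b in B} - {frozenset()}

random.seed(0)
w = [random.random() for _ in range(6)]
p = {i: wi / sum(w) for i, wi in enumerate(w)}
A = {frozenset({0, 1, 2}), frozenset({3, 4, 5})}
B = {frozenset({0, 3}), frozenset({1, 4}), frozenset({2, 5})}

H_A, H_B, H_AB = H(A, p), H(B, p), H(join(A, B), p)
H_B_given_A = H_AB - H_A   # Quotient Rule: H(B/A) = H(A ∨ B) - H(A)
H_A_given_B = H_AB - H_B

# Interaction entropy (mutual information) three ways, as in the excerpt:
lhs = H_B - H_B_given_A
mid = H_A + H_B - H_AB
rhs = H_A - H_A_given_B
assert abs(lhs - mid) < 1e-12 and abs(mid - rhs) < 1e-12
assert mid >= -1e-12       # non-negative (Second Order Property)
print(round(mid, 6))
```

The three quantities agree up to rounding, and vanish exactly when A and B are independent.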

47 | The Theory of Transformation Groups - Kawakubo - 1991 |

3 | A Survey of Modern Algebra, Fourth Edition - Birkhoff, Mac Lane - 1977 |
Citation Context: …an entropy valuation on (Ω, +). Positivity and Monotonicity are obvious. Contractivity follows from Noether's Theorem, which states that (A + C)/A is isomorphic as a vector space to C/(A ∩ C). (See [1] for quotient vector spaces; Herstein [5] attributes the analogous result for ideals to Emmy Noether.) Then, since H(C/A) = dim C/(A ∩ C) and H(C/B) = dim C/(B ∩ C), we see that if A ≤ B, we have A ∩ C ⊆ B… |

3 | What are the 29 Dimensions - 2006 |
Citation Context: …etting H(A) = dim A (where dim is the Hausdorff dimension) defines an entropy valuation on (Ω, ∪). Positivity is obvious but Monotonicity and Contractivity take a little work to prove. I refer to [8] for the definition and computation of Hausdorff dimension. Lemma 9.2 (Monotonicity of Hausdorff Dimension). Let A, B be closed subsets of a metric space (X, d). If A ⊆ B, then dim A ≤ dim B. Proof: Let… |

1 | Symmetry and information - Hillman |
Citation Context: …(X/A, X/B) = codim(A + B) and A ⊥ B iff X = A + B. The next two examples employ some notions from the theory of complexions; for motivation of the definitions and proofs of the assertions I make below see [6]. Example 9.5. Let G be a locally compact Lie group and let G act smoothly on the smooth manifold X. Then the power set of X, 2^X, becomes a joinset when equipped with join A ∨ B = A ∪ B. The natural p… |