## A Theory of Program Size Formally Identical to Information Theory (1975)

### Download Links

- [www.umcs.maine.edu]
- [minerva.ufpel.edu.br]
- [www.cs.auckland.ac.nz]
- [www.dna.caltech.edu]
- DBLP

### Other Repositories/Bibliography

Citations: 332 (16 self)

### BibTeX

```bibtex
@MISC{Chaitin75atheory,
  author = {Gregory J. Chaitin},
  title  = {A Theory of Program Size Formally Identical to Information Theory},
  year   = {1975}
}
```

### Abstract

A new definition of program-size complexity is made. H(A,B/C,D) is defined to be the size in bits of the shortest self-delimiting program for calculating strings A and B if one is given a minimal-size self-delimiting program for calculating strings C and D. This differs from previous definitions: (1) programs are required to be self-delimiting, i.e. no program is a prefix of another, and (2) instead of being given C and D directly, one is given a program for calculating them that is minimal in size. Unlike previous definitions, this one has precisely the formal properties of the entropy concept of information theory. For example, H(A,B) = H(A) + H(B/A) + O(1). Also, if a program of length k is assigned measure 2^(-k), then H(A) = -log2 (the probability that the standard universal computer will calculate A) + O(1). Key Words and Phrases: computational complexity, entropy, information theory, instantaneous code, Kraft inequality, minimal program, probab...
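The abstract's two key ingredients, prefix-free (self-delimiting) codes and the measure 2^(-k) assigned to a length-k program, can be illustrated concretely. The sketch below is an assumption for illustration only: it uses Elias gamma codes as an example of a self-delimiting encoding (the paper's universal computer is not specified here) and checks that the codeword lengths satisfy the Kraft inequality, sum over codewords of 2^(-length) <= 1, which is what makes the 2^(-k) assignment a (sub)probability measure.

```python
def elias_gamma(n: int) -> str:
    """Encode a positive integer as a self-delimiting bit string.

    A codeword of length 2L-1 for an L-bit number: L-1 zeros,
    then the number's binary form. No codeword is a prefix of another.
    """
    if n < 1:
        raise ValueError("Elias gamma encodes positive integers only")
    binary = bin(n)[2:]                       # e.g. 5 -> '101'
    return "0" * (len(binary) - 1) + binary   # e.g. 5 -> '00101'


def is_prefix_free(codes) -> bool:
    """In lexicographic order, a prefix would sit immediately before
    a string it prefixes, so checking adjacent pairs suffices."""
    codes = sorted(codes)
    return all(not b.startswith(a) for a, b in zip(codes, codes[1:]))


codes = [elias_gamma(n) for n in range(1, 17)]
assert is_prefix_free(codes)

# Kraft inequality: the 2^(-k) measures of a prefix-free code sum to <= 1,
# so "pick each program bit by a fair coin flip" defines a probability.
kraft_sum = sum(2.0 ** -len(c) for c in codes)
print(f"Kraft sum for 16 codewords: {kraft_sum:.4f} (<= 1)")
```

This is only a toy encoding of integers, not the paper's optimal universal computer, but it shows why self-delimitation matters: it is exactly the property that lets program lengths double as a probability distribution, giving H(A) = -log2 P(A) + O(1).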

### Citations

533 | Three approaches to the quantitative definition of information - Kolmogorov |
Citation Context: ...rsuasive analogy between the entropy concept of information theory and the size of programs. This was realized by the first workers in the field of program-size complexity, Solomonoff [1], Kolmogorov [2], and Chaitin [3,4], and it accounts for the large measure of success of subsequent work in this area. However, it is often the case that results are cumbersome and have unpleasant error terms. These ...

517 | Perceptrons: An introduction to computational geometry - Minsky, Papert - 1969 |

462 | Logical Reversibility of Computation - Bennett - 1973 |

459 | Computation: finite and infinite machines - Minsky - 1967 |

406 | A formal theory of inductive inference - Solomonoff - 1964 |
Citation Context: ...on There is a persuasive analogy between the entropy concept of information theory and the size of programs. This was realized by the first workers in the field of program-size complexity, Solomonoff [1], Kolmogorov [2], and Chaitin [3,4], and it accounts for the large measure of success of subsequent work in this area. However, it is often the case that results are cumbersome and have unpleasant err...

347 | Universal Codeword Sets and Representations of the Integers - Elias - 1975 |

230 | On the length of programs for computing finite binary sequences - Chaitin - 1966 |
Citation Context: ...between the entropy concept of information theory and the size of programs. This was realized by the first workers in the field of program-size complexity, Solomonoff [1], Kolmogorov [2], and Chaitin [3,4], and it accounts for the large measure of success of subsequent work in this area. However, it is often the case that results are cumbersome and have unpleasant error terms. These ideas cannot be a t...

185 | The Complexity of Finite Objects and the Development of the Concepts of Information and Randomness by Means of the Theory of Algorithms - Zvonkin, Levin - 1970 |
Citation Context: ...To compare the properties of our entropy function H with those it has in information theory, see [9–12]; to contrast its properties with those of previous definitions of program-size complexity, see [14]. Cover [15] and Gewirtz [16] use our new definition. See [17–32] for other applications of information/entropy concepts. 2. Definitions X = {Λ, 0, 1, 00, 01, 10, 11, 000, ...} is the s...

150 | Information Theory and Coding - Abramson - 1963 |

105 | Transmission of information - Fano - 1949 |

85 | Process complexity and effective random tests - Schnorr - 1973 |
Citation Context: ...Rivadavia 3580, Dpto. 10A, Buenos Aires, Argentina. ...definitions of program-size complexity, for example, Loveland's uniform complexity [6] and Schnorr's process complexity [7]. In this paper we present a new concept of program-size complexity. What train of thought led us to it? Following [8, Sec. VI, p. 7], think of a computer as decoding equipment at the receiving end of ...

61 | Information-Theoretic Characterizations of Recursive Infinite Strings - Chaitin - 1976 |

48 | Theories of probability: an examination of foundations - Fine - 1973 |

47 | Information-Theoretic Limitations of Formal Systems - Chaitin - 1974 |

44 | A variant of the Kolmogorov concept of complexity - Loveland - 1969 |
Citation Context: ...tober 1974. Author's present address: Rivadavia 3580, Dpto. 10A, Buenos Aires, Argentina. ...definitions of program-size complexity, for example, Loveland's uniform complexity [6] and Schnorr's process complexity [7]. In this paper we present a new concept of program-size complexity. What train of thought led us to it? Following [8, Sec. VI, p. 7], think of a computer as decodi...

35 | Information-Theoretic Computational Complexity - Chaitin - 1974 |

25 | Universal gambling schemes and the complexity measures of Kolmogorov and Chaitin - Cover - 1974 |
Citation Context: ...the properties of our entropy function H with those it has in information theory, see [9–12]; to contrast its properties with those of previous definitions of program-size complexity, see [14]. Cover [15] and Gewirtz [16] use our new definition. See [17–32] for other applications of information/entropy concepts. 2. Definitions X = {Λ, 0, 1, 00, 01, 10, 11, 000, ...} is the set of finite...

22 | Foundation of Information Theory - FEINSTEIN - 1958 |

22 | Computational complexity and probability constructions - Willis |
Citation Context: ...bounded. In Sections 2–4 we define this new concept formally, establish the basic identities, and briefly consider the resulting concept of randomness or maximal entropy. We recommend reading Willis [13]. In retrospect it is clear that he was aware of some of the basic ideas of this paper, though he developed them in a different direction. Chaitin's study [3,4] of the state complexity of Turing machi...

20 | On the Difficulty of Computations - Chaitin - 1970 |

18 | On programming, an interim report on the setl project - Schwartz - 1973 |

13 | Investigations in the theory of descriptive complexity - Gewirtz - 1974 |
Citation Context: ...of our entropy function H with those it has in information theory, see [9–12]; to contrast its properties with those of previous definitions of program-size complexity, see [14]. Cover [15] and Gewirtz [16] use our new definition. See [17–32] for other applications of information/entropy concepts. 2. Definitions X = {Λ, 0, 1, 00, 01, 10, 11, 000, ...} is the set of finite binary strings, ...

12 | Foundations of Probability (Holden-Day) - Renyi - 1970 |

11 | On the determination of the irrationality of the mean of a random variable - Cover - 1973 |

10 | The extent and density of sequences within the minimal-program complexity hierarchies - Daley - 1974 |

8 | The isomorphism problem in ergodic theory - Weiss - 1972 |

4 | On the Logical Foundations of Information Theory and Probability Theory - Kolmogorov - 1969 |
Citation Context: ...eas cannot be a tool for general use until they are clothed in a powerful formalism like that of information theory. This opinion is apparently not shared by all workers in this field (see Kolmogorov [5]), but it has led others to formulate alternative 1 Copyright © 1975, Association for Computing Machinery, Inc. General permission to republish, but not for profit, all or part of this material is gr...

2 | Information Theory. Wiley-Interscience - Ash - 1965 |

2 | Minimum times and memories needed to compute the values of a function - Elias - 1974 |

2 | The Information Theoretic Approach to Cryptography - Hellman - 1974 |

1 | Mathematical Logic for Computer Scientists. Rep - Levin - 1974 |
