## Chaitin Ω numbers and halting problems (2009)

### BibTeX

```bibtex
@misc{Tadaki09chaitinω,
  author = {Kohtaro Tadaki},
  title  = {Chaitin Ω numbers and halting problems},
  year   = {2009}
}
```

### Abstract

... 1975] introduced the Ω number as a concrete example of a random real. The real Ω is defined as the probability that an optimal computer halts, where an optimal computer is a universal decoding algorithm used to define the notion of program-size complexity. Chaitin showed Ω to be random by discovering the property that the first n bits of the base-two expansion of Ω solve the halting problem of the optimal computer for all binary inputs of length at most n. In the present paper we investigate this property from various aspects. We consider the relative computational power between the base-two expansion of Ω and the halting problem by imposing a finite-size restriction on both problems. It is known that the base-two expansion of Ω and the halting problem are Turing equivalent. We thus consider an elaboration of this Turing equivalence.
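The halting-probability argument in the abstract can be made concrete on a toy machine. The sketch below uses a hypothetical finite lookup-table "computer" (not an optimal one; for a real optimal computer Ω is uncomputable) to show how the first n bits of Ω decide halting for every program of length at most n via dovetailed simulation:

```python
from fractions import Fraction
from math import floor

# Toy prefix-free "computer": program -> steps until it halts,
# or None if it runs forever. Entirely hypothetical; a real
# optimal computer is universal and its Omega is uncomputable.
TOY = {"0": 3, "10": None, "110": 7, "111": 2}

def omega(machine):
    """Halting probability: sum of 2^-|p| over all halting programs p."""
    return sum((Fraction(1, 2 ** len(p))
                for p, steps in machine.items() if steps is not None),
               Fraction(0))

def halting_leq_n(machine, n):
    """Decide halting for every program of length <= n, given only the
    first n bits of Omega, i.e. the lower approximation floor(Omega*2^n)/2^n.
    Dovetail all programs until the halting mass observed so far reaches
    that approximation: any program of length <= n still running at that
    point can never halt, since halting later would add at least 2^-n and
    push Omega past its n-bit truncation plus 2^-n."""
    target = Fraction(floor(omega(machine) * 2 ** n), 2 ** n)
    mass, halted, t = Fraction(0), set(), 0
    while mass < target:
        t += 1
        for p, steps in machine.items():
            if steps is not None and steps <= t and p not in halted:
                halted.add(p)
                mass += Fraction(1, 2 ** len(p))
    return {p: (p in halted) for p in machine if len(p) <= n}
```

With n = 3 this classifies all four toy programs, flagging only "10" as non-halting; the dovetailing stops as soon as the observed halting mass reaches the 3-bit truncation of Ω.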

### Citations

343 | A theory of program size formally identical to information theory
- Chaitin
- 1975
Citation context: "...e definition, H(s) can be thought of as the information content of the individual finite binary string s. In fact, AIT has precisely the formal properties of classical information theory (see Chaitin [2]). In particular, the notion of program-size complexity plays a crucial role in characterizing the randomness of an infinite binary string, or equivalently, a real. In [2] Chaitin introduced the halti..." |

342 | The definition of random sequences - Martin-Löf - 1966 |

340 | Algorithmic Information Theory
- Chaitin
- 1987
Citation context: "...r.e. real number is also called a left-computable real number. [§2.2 Algorithmic information theory] In the following we concisely review some definitions and results of algorithmic information theory [2, 4]. A computer is a partial recursive function C : {0,1}* → {0,1}* such that dom C is a prefix-free set. For each computer C and each s ∈ {0,1}*, H_C(s) is defined by H_C(s) = min{ |p| : p ∈ {0,1}..." |
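The definition quoted above, H_C(s) = min{ |p| : C(p) = s }, is easy to illustrate on a finite example. A minimal sketch, assuming a hypothetical lookup-table machine C in place of a genuine partial recursive function:

```python
# Hypothetical toy computer with a prefix-free domain:
# program (binary string) -> output string.
C = {"0": "a", "10": "b", "110": "a", "1110": "ab"}

def is_prefix_free(dom):
    """Check that no program is a proper prefix of another."""
    return not any(p != q and q.startswith(p) for p in dom for q in dom)

def H_C(s):
    """Program-size complexity of s relative to C:
    H_C(s) = min{ |p| : C(p) = s }; None if C never outputs s."""
    lengths = [len(p) for p, out in C.items() if out == s]
    return min(lengths) if lengths else None
```

The string "a" has two programs ("0" and "110"), so H_C("a") picks the shorter one and equals 1; strings C never outputs have no complexity relative to C.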

174 | Algorithmic randomness and complexity
- Downey, Hirschfeldt
- 2010
Citation context: "... Theorem 5.1, we need Theorem 5.2 below. For the purpose of understanding the statement of Theorem 5.2, we concisely review some definitions and results of the theory of relative randomness. See e.g. [12, 6] for the details of the theory. An oracle computer is an oracle deterministic Turing machine M with the input and output alphabet {0,1} such that, for every subset A of {0,1}*, the domain of defin..." |

115 | Computability and Randomness
- Nies
Citation context: "... Theorem 5.1, we need Theorem 5.2 below. For the purpose of understanding the statement of Theorem 5.2, we concisely review some definitions and results of the theory of relative randomness. See e.g. [12, 6] for the details of the theory. An oracle computer is an oracle deterministic Turing machine M with the input and output alphabet {0,1} such that, for every subset A of {0,1}*, the domain of defin..." |

102 | Laws of information conservation (non-growth) and aspects of the foundations of probability theory, Problems of Inform
- Levin
- 1974
Citation context: "...lar optimal computer U as the standard one for use, and define H(s) as H_U(s), which is referred to as the program-size complexity of s, the information content of s, or the Kolmogorov complexity of s [7, 9, 2]. It follows that for every computer C there exists d ∈ N such that, for every s ∈ {0,1}*, H(s) ≤ H_C(s) + d. (3) Based on this we can show that there exists c ∈ N such that, for every s ∈ {0,1}*, ..." |

94 | On the symmetry of algorithmic information
- Gács
- 1974
Citation context: "...lar optimal computer U as the standard one for use, and define H(s) as H_U(s), which is referred to as the program-size complexity of s, the information content of s, or the Kolmogorov complexity of s [7, 9, 2]. It follows that for every computer C there exists d ∈ N such that, for every s ∈ {0,1}*, H(s) ≤ H_C(s) + d. (3) Based on this we can show that there exists c ∈ N such that, for every s ∈ {0,1}*, ..." |
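Inequality (3) quoted in these contexts, the invariance theorem H(s) ≤ H_C(s) + d, can be seen concretely on toy machines: if a machine U simulates C by prepending a one-bit flag to C's programs, the constant d is 1. A hypothetical sketch (lookup-table machines standing in for real computers):

```python
def H(machine, s):
    """Minimum program length producing s on a toy lookup-table machine."""
    lengths = [len(p) for p, out in machine.items() if out == s]
    return min(lengths) if lengths else None

# Hypothetical computer C, and a machine U that simulates it:
# U runs "1" + p as C on p (and has a native program of its own),
# so H_U(s) <= H_C(s) + 1 for every s that C produces.
C = {"0": "x", "10": "y"}
U = {"0": "z", **{"1" + p: out for p, out in C.items()}}

for s in set(C.values()):
    assert H(U, s) <= H(C, s) + 1  # invariance with d = 1
```

The same one-symbol-overhead trick is how a universal machine absorbs any particular computer, which is why H is well defined up to an additive constant.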

89 | Process complexity and effective random tests - Schnorr - 1973 |

68 | Randomness and recursive enumerability
- Kučera, Slaman
Citation context: "...orem 2.3. For every α ∈ R, α is weakly Chaitin random if and only if α is Chaitin random. The following is an important result on random r.e. reals. Theorem 2.4 (Calude, et al. [1], Kučera and Slaman [8]). For every α ∈ (0,1), α is r.e. and weakly Chaitin random if and only if there exists an optimal computer V such that α = Ω_V. [§3 Elaboration I of the Turing reduction Ω_U ≤_T dom U] Theorem 3.1 (main..." |

51 | Draft of a paper (or series of papers) on Chaitin's work
- Solovay
- 1975
Citation context: "...h at most n. Theorem 3.9. Let V be an optimal computer, and let M be a deterministic Turing machine which computes V. Then n = H(T^M_n, n) + O(1) = H(T^M_n) + O(1) for all n ≥ L_M. Note that Solovay [14] showed a result similar to Theorem 3.9 for h_n = #{p ∈ dom V : |p| ≤ n} in place of T^M_n. On the other hand, Chaitin showed a result similar to Theorem 3.9 for p ∈ dom V such that |p| ≤ n and the ru..." |

37 | On initial segment complexity and degrees of randomness
- Miller, Yu
Citation context: "...nd the running time of M on the input p equals T^M_n, in place of T^M_n (see Note in Section 8.1 of Chaitin [4]). We include the proof of Theorem 3.9 in Appendix A for completeness. Miller and Yu [11] recently strengthened Theorem 2.3 to the following form. Theorem 3.10 (Ample Excess Lemma, Miller and Yu [11]). For every α ∈ R, α is weakly Chaitin random if and only if ∑_{n=1}^∞ 2^{n−H(α↾n)} < ∞. Then ..." |

36 | Recursively enumerable reals and Chaitin Ω numbers, Theoret
- Calude, Hertling, et al.
Citation context: "...historical detail). Theorem 2.3. For every α ∈ R, α is weakly Chaitin random if and only if α is Chaitin random. The following is an important result on random r.e. reals. Theorem 2.4 (Calude, et al. [1], Kučera and Slaman [8]). For every α ∈ (0,1), α is r.e. and weakly Chaitin random if and only if there exists an optimal computer V such that α = Ω_V. [§3 Elaboration I of the Turing reduction Ω_U ≤_T dom U] Theorem 3.1 (main..." |

6 | Incompleteness theorems for random reals, Adv - Chaitin - 1987 |

5 | Program-size complexity computes the halting problem, Bulletin of the European Association for Theoretical - Chaitin - 1995 |