Results 1 - 7 of 7
Recursively Enumerable Reals and Chaitin Ω Numbers
Abstract

Cited by 34 (3 self)
A real is called recursively enumerable (r.e.) if it is the limit of a recursive, increasing, converging sequence of rationals. Following Solovay [23] and Chaitin [10] we say that an r.e. real α dominates an r.e. real β if from a good approximation of α from below one can compute a good approximation of β from below. We shall study this relation and characterize it in terms of relations between r.e. sets. Solovay's [23] Ω-like numbers are the maximal r.e. real numbers with respect to this order. They are random r.e. real numbers. The halting probability of a universal self-delimiting Turing machine (Chaitin's Ω number, [9]) is also a random r.e. real. Solovay showed that any Chaitin Ω number is Ω-like. In this paper we show that the converse implication is true as well: any Ω-like real in the unit interval is the halting probability of a universal self-delimiting Turing machine.
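As a toy illustration of the definition (not drawn from the paper): ln 2 is a recursively enumerable real, since the partial sums of Σ_{k≥1} 1/(k·2^k) form a recursive, increasing sequence of rationals converging to it. A minimal sketch:

```python
from fractions import Fraction

def q(n):
    """n-th term of a recursive, increasing sequence of rationals
    converging to ln(2) = sum_{k>=1} 1/(k * 2**k) from below."""
    return sum(Fraction(1, k * 2**k) for k in range(1, n + 1))

# Each q(n) is a rational computable from n, q(n) < q(n+1),
# and the sequence approaches ln(2) ~ 0.6931 from below.
approximations = [float(q(n)) for n in (1, 5, 20)]
```

The tail of the series below 1/(21 · 2^20) shows q(20) already agrees with ln 2 to better than 10^-6.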
Recursive computational depth
 Information and Computation
, 1999
Abstract

Cited by 18 (2 self)
In the 1980's, Bennett introduced computational depth as a formal measure of the amount of computational history that is evident in an object's structure. In particular, Bennett identified the classes of weakly deep and strongly deep sequences, and showed that the halting problem is strongly deep. Juedes, Lathrop, and Lutz subsequently extended this result by defining the class of weakly useful sequences, and proving that every weakly useful sequence is strongly deep. The present paper investigates refinements of Bennett's notions of weak and strong depth, called recursively weak depth (introduced by Fenner, Lutz and Mayordomo) and recursively strong depth (introduced here). It is argued that these refinements naturally capture Bennett's idea that deep objects are those which "contain internal evidence of a nontrivial causal history." The fundamental properties of recursive computational depth are developed, and it is shown that the recursively weakly (respectively, strongly) deep sequences form a proper subclass of the class of weakly (respectively, strongly) deep sequences. The above-mentioned theorem of Juedes, Lathrop, and Lutz is then strengthened by proving that every weakly useful sequence is recursively strongly deep. It follows from these results that not every strongly deep sequence is weakly useful, thereby answering a question posed by Juedes.
Computational depth and reducibility
 Theoretical Computer Science
, 1994
Abstract

Cited by 12 (2 self)
This paper reviews and investigates Bennett's notions of strong and weak computational depth (also called logical depth) for infinite binary sequences. Roughly, an infinite binary sequence x is defined to be weakly useful if every element of a nonnegligible set of decidable sequences is reducible to x in recursively bounded time. It is shown that every weakly useful sequence is strongly deep. This result (which generalizes Bennett's observation that the halting problem is strongly deep) implies that every high Turing degree contains strongly deep sequences. It is also shown that, in the sense of Baire category, almost ...
A Glimpse into Algorithmic Information Theory
 Logic, Language and Computation, Volume 3, CSLI Series
, 1999
Abstract

Cited by 6 (6 self)
This paper is a subjective, short overview of algorithmic information theory. We critically discuss various equivalent algorithmic models of randomness motivating a "randomness hypothesis". Finally some recent results on computably enumerable random reals are reviewed. 1 Randomness: An Informal Discussion In which we discuss some difficulties arising in defining randomness. Suppose that one is watching a simple pendulum swing back and forth, recording 0 if it swings clockwise at a given instant and 1 if it swings counterclockwise. Suppose further that after some time the record looks as follows: 10101010101010101010101010101010. At this point one would like to deduce a "theory" from the experiment. The "theory" should account for the data presently available and make "predictions" about future observations. How should one proceed? It is obvious that there are many "theories" that one could write down for the given data, for example: 10101010101010101010101010101010000000000...
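The intuition behind the pendulum example — that a highly regular record like 1010... admits a short description while a random one does not — can be made tangible with an ordinary compressor. This is only a crude, computable stand-in for algorithmic complexity (zlib is not a universal machine), but the contrast is already stark:

```python
import random
import zlib

# A pendulum-style perfectly periodic record, 10 000 symbols long.
regular = ("10" * 5000).encode()

# A pseudo-random record of the same length (seeded for reproducibility).
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(10000))

# Compressed length is a computable upper bound on description length:
# the periodic record shrinks drastically, the noisy one barely at all.
len_regular = len(zlib.compress(regular, 9))
len_noisy = len(zlib.compress(noisy, 9))
```

On a typical run the periodic record compresses to a few dozen bytes while the pseudo-random one stays close to its original 10 000 bytes, mirroring the informal point that regular data supports a short "theory" and random data does not.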
Compression Depth and the Behavior of Cellular Automata
 Complex Systems
, 1997
Abstract

Cited by 2 (1 self)
A computable complexity measure analogous to computational depth is developed using the Lempel-Ziv compression algorithm. This complexity measure, which we call compression depth, is then applied to the computational output of cellular automata. We find that compression depth captures the complexity found in Wolfram Class III cellular automata, and is in good agreement with his classification scheme. We further investigate the rule space of cellular automata using Langton's λ parameter. This research was supported in part by National Science Foundation Grant CCR-9157382, with matching funds from Rockwell, Microware Systems Corporation, and Amoco Foundation. 1 Introduction Measures of the complexities of objects are widely used in both theory and applications in order to model, predict, and classify objects. Information theory gives us several methods for measuring the information content of objects. The most widely used of these information measures, entropy and algorithmic informa...
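A loose sketch of the idea, under stated assumptions: the paper's measure is built on Lempel-Ziv specifically, whereas here zlib's DEFLATE (which contains an LZ77 stage) stands in, and compressed length is used only as a rough complexity proxy. Compressing the space-time diagrams of two elementary cellular automata shows the chaotic Class III rule 30 yielding far less compressible output than the regular rule 250:

```python
import zlib

def ca_run(rule, width=256, steps=256):
    """Evolve a one-dimensional elementary cellular automaton (Wolfram
    rule numbering, periodic boundary) from a single-1 seed and return
    the space-time diagram as one byte per cell."""
    table = [(rule >> i) & 1 for i in range(8)]
    row = [0] * width
    row[width // 2] = 1
    out = bytearray()
    for _ in range(steps):
        out.extend(row)
        row = [table[(row[(i - 1) % width] << 2)
                     | (row[i] << 1)
                     | row[(i + 1) % width]]
               for i in range(width)]
    return bytes(out)

# Chaotic rule 30 resists compression; the nested, regular pattern of
# rule 250 compresses to almost nothing.
c30 = len(zlib.compress(ca_run(30), 9))
c250 = len(zlib.compress(ca_run(250), 9))
```

The gap between the two compressed sizes is the kind of signal a compression-based complexity measure exploits when sorting rules into Wolfram's classes.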