Results 1–10 of 10
Recursively Enumerable Reals and Chaitin Ω Numbers
Abstract

Cited by 34 (3 self)
A real is called recursively enumerable if it is the limit of a recursive, increasing, converging sequence of rationals. Following Solovay [23] and Chaitin [10] we say that an r.e. real α dominates an r.e. real β if from a good approximation of α from below one can compute a good approximation of β from below. We shall study this relation and characterize it in terms of relations between r.e. sets. Solovay's [23] Ω-like numbers are the maximal r.e. real numbers with respect to this order. They are random r.e. real numbers. The halting probability of a universal self-delimiting Turing machine (Chaitin's Ω number, [9]) is also a random r.e. real. Solovay showed that any Chaitin Ω number is Ω-like. In this paper we show that the converse implication is true as well: any Ω-like real in the unit interval is the halting probability of a universal self-delimiting Turing machine.
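The opening definition can be made concrete with a small sketch. The Python snippet below uses a toy prefix-free program set {0^k 1 : k ≥ 0} with an arbitrary, hypothetical halting rule (program 0^k 1 halts iff k is even) in place of an actual universal self-delimiting machine, and produces a recursive, increasing sequence of rationals converging to that toy machine's halting probability from below:

```python
from fractions import Fraction

# Toy prefix-free "machine": programs are bit strings 0^k 1 (a prefix-free set).
# We declare, purely for illustration, that program 0^k 1 halts iff k is even.
def halts(k: int) -> bool:
    return k % 2 == 0

def omega_approximation(stage: int) -> Fraction:
    """Lower approximation of the halting probability at the given stage:
    sum of 2^-|p| over all programs p = 0^k 1 with k < stage that halt."""
    total = Fraction(0)
    for k in range(stage):
        if halts(k):
            total += Fraction(1, 2 ** (k + 1))  # |0^k 1| = k + 1
    return total

# The approximations form a recursive, increasing, converging sequence of
# rationals; for this toy machine the limit is 2/3.
approx = [omega_approximation(n) for n in range(8)]
assert all(a <= b for a, b in zip(approx, approx[1:]))  # monotone from below
print(float(approx[-1]))
```

For this toy machine the limit is the geometric sum Σ_j 2^−(2j+1) = 2/3; for a genuine universal machine the same enumeration scheme yields Chaitin's Ω, but the limit is then random and not computable.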
Recursive computational depth
 Information and Computation
, 1999
Abstract

Cited by 18 (2 self)
In the 1980's, Bennett introduced computational depth as a formal measure of the amount of computational history that is evident in an object's structure. In particular, Bennett identified the classes of weakly deep and strongly deep sequences, and showed that the halting problem is strongly deep. Juedes, Lathrop, and Lutz subsequently extended this result by defining the class of weakly useful sequences, and proving that every weakly useful sequence is strongly deep. The present paper investigates refinements of Bennett's notions of weak and strong depth, called recursively weak depth (introduced by Fenner, Lutz and Mayordomo) and recursively strong depth (introduced here). It is argued that these refinements naturally capture Bennett's idea that deep objects are those which "contain internal evidence of a nontrivial causal history." The fundamental properties of recursive computational depth are developed, and it is shown that the recursively weakly (respectively, strongly) deep sequences form a proper subclass of the class of weakly (respectively, strongly) deep sequences. The above-mentioned theorem of Juedes, Lathrop, and Lutz is then strengthened by proving that every weakly useful sequence is recursively strongly deep. It follows from these results that not every strongly deep sequence is weakly useful, thereby answering a question posed by Juedes.
Time-Bounded Kolmogorov Complexity and Solovay Functions
Abstract

Cited by 5 (0 self)
Abstract. A Solovay function is a computable upper bound g for prefix-free Kolmogorov complexity K that is nontrivial in the sense that g agrees with K, up to some additive constant, on infinitely many places n. We obtain natural examples of Solovay functions by showing that for some constant c0 and all computable functions t such that c0·n ≤ t(n), the time-bounded version K^t of K is a Solovay function. By unifying results of Bienvenu and Downey and of Miller, we show that a right-computable upper bound g of K is a Solovay function if and only if Ωg is Martin-Löf random. Letting Ωg = ∑n 2^−g(n), we obtain as a corollary that the Martin-Löf randomness of the various variants of Chaitin's Ω extends to the time-bounded case in so far as Ω_{K^t} is Martin-Löf random for any t as above. As a step in the direction of a characterization of K-triviality in terms of jump-traceability, we demonstrate that a set A is K-trivial if and only if A is O(g(n) − K(n))-jump traceable for all Solovay functions g, where the equivalence remains true when we restrict attention to functions g of the form K^t, either for a single or all functions t as above. Finally, we investigate the plain Kolmogorov complexity C and its time-bounded variant C^t of initial segments of computably enumerable sets. Our main theorem here is a dichotomy similar to Kummer's gap theorem and asserts that every high c.e. Turing degree contains a c.e. set B such that for any computable function t there is a constant c_t > 0 such that for all m it holds that C^t(B ↾ m) ≥ c_t · m, whereas for any nonhigh c.e. set A there is a computable time bound t and a constant c such that for infinitely many m it holds that C^t(A ↾ m) ≤ log m + c. By similar methods it can be shown that any high degree contains a set B such that C^t(B ↾ m) ≥ m/4 up to an additive constant.
The constructed sets B have low unbounded but high time-bounded Kolmogorov complexity, and accordingly we obtain an alternative proof of the result due to Juedes, Lathrop, and Lutz [JLL] that every high degree contains a strongly deep set.
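The definition Ωg = ∑n 2^−g(n) in the abstract above can be illustrated directly. The sketch below uses a hypothetical computable function g, chosen only so that the sum satisfies a Kraft-style convergence condition (standing in for an upper bound on K, which is itself uncomputable), and computes increasing rational partial sums — the sense in which Ωg is approximable from below:

```python
import math
from fractions import Fraction

# Hypothetical computable bound: g(n) = 2*ceil(log2(n+2)), so that
# 2^-g(n) <= 1/(n+2)^2 and the series sum_n 2^-g(n) converges.
def g(n: int) -> int:
    return 2 * math.ceil(math.log2(n + 2))

def omega_g(terms: int) -> Fraction:
    """Partial sum of Omega_g = sum_n 2^-g(n): an increasing rational
    approximation of Omega_g from below."""
    return sum((Fraction(1, 2 ** g(n)) for n in range(terms)), Fraction(0))

# Partial sums increase monotonically toward Omega_g.
print(float(omega_g(1000)))
```

Whether the resulting real Ωg is Martin-Löf random is exactly what distinguishes Solovay functions from arbitrary right-computable upper bounds; the simple g used here is only a stand-in to show the arithmetic of the partial sums.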
Compression Depth and the Behavior of Cellular Automata
 Complex Systems
, 1997
Abstract

Cited by 2 (1 self)
A computable complexity measure analogous to computational depth is developed using the Lempel-Ziv compression algorithm. This complexity measure, which we call compression depth, is then applied to the computational output of cellular automata. We find that compression depth captures the complexity found in Wolfram Class III cellular automata, and is in good agreement with his classification scheme. We further investigate the rule space of cellular automata using Langton's λ parameter.
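The paper's exact construction is not reproduced here, but the idea of applying a compressor to cellular-automaton output as a computable complexity proxy can be sketched as follows. In this sketch zlib's LZ77-based compressor stands in for the Lempel-Ziv scheme, and only compressed size is measured rather than full compression depth; rule numbers follow Wolfram's convention:

```python
import zlib

def step(row, rule):
    """One step of an elementary cellular automaton with wraparound edges."""
    n = len(row)
    return [
        (rule >> (row[(i - 1) % n] * 4 + row[i] * 2 + row[(i + 1) % n])) & 1
        for i in range(n)
    ]

def spacetime(rule, width=64, steps=64):
    """Run the CA from a single-cell seed; return the space-time diagram
    flattened to one byte per cell."""
    row = [0] * width
    row[width // 2] = 1
    rows = [row]
    for _ in range(steps - 1):
        row = step(row, rule)
        rows.append(row)
    return bytes(c for r in rows for c in r)

def compressed_size(data: bytes) -> int:
    # Compressed length serves as a computable proxy for information content.
    return len(zlib.compress(data, 9))

# Chaotic rule 30 (Class III) compresses far worse than the highly regular
# rule 250, whose output from a single seed is a simple expanding triangle.
print(compressed_size(spacetime(30)), compressed_size(spacetime(250)))
```

This contrast (regular rules yield short compressed descriptions, chaotic Class III rules do not) is the behavior the compression-depth measure is designed to quantify.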
RANDOMNESS – BEYOND LEBESGUE MEASURE
Abstract

Cited by 1 (1 self)
Much of the recent research on algorithmic randomness has focused on randomness for Lebesgue measure. While, from a computability-theoretic point of view, the picture remains unchanged if one passes to arbitrary computable measures, interesting phenomena occur if one studies the set of reals which are random for an arbitrary (continuous) probability measure or a generalized Hausdorff measure on Cantor space. This paper tries to give a survey of some of the research that has been done on randomness for non-Lebesgue measures.
Depth as Randomness Deficiency
DOI 10.1007/s00224-009-9171-0
, 2009
Abstract
Abstract Depth of an object concerns a tradeoff between computation time and the excess of program length over the shortest program length required to obtain the object. It gives an unconditional lower bound on the computation time from a given program in absence of auxiliary information. Variants known as logical depth and computational depth are expressed in Kolmogorov complexity theory. We derive a quantitative relation between logical depth and computational depth and unify the different depth notions by relating them to A. Kolmogorov and L. Levin's fruitful notion of randomness deficiency. Subsequently, we revisit the computational ...
A General Notion of Useful Information
Abstract
In this paper we introduce a general framework for defining the depth of a sequence with respect to a class of observers. We show that our general framework captures all depth notions introduced in complexity theory so far. We review most such notions, show how they are particular cases of our general depth framework, and review some classical results about the different depth notions. Key words: Bennett's logical depth, computable depth