## Computational depth and reducibility (1994)


Venue: Theoretical Computer Science

Citations: 12 (2 self)

### BibTeX

```bibtex
@ARTICLE{Juedes94computationaldepth,
  author  = {David W. Juedes and James I. Lathrop and Jack H. Lutz},
  title   = {Computational depth and reducibility},
  journal = {Theoretical Computer Science},
  year    = {1994},
  volume  = {132},
  pages   = {37--70}
}
```


### Abstract

This paper reviews and investigates Bennett's notions of strong and weak computational depth (also called logical depth) for infinite binary sequences. Roughly, an infinite binary sequence x is defined to be weakly useful if every element of a non-negligible set of decidable sequences is reducible to x in recursively bounded time. It is shown that every weakly useful sequence is strongly deep. This result (which generalizes Bennett's observation that the halting problem is strongly deep) implies that every high Turing degree contains strongly deep sequences. It is also shown that, in the sense of Baire category, almost …

### Citations

8563 | Elements of Information Theory - Cover, Thomas - 1991

Citation Context: …i.e., must be a set of nonempty strings, no one of which is a prefix of another. (It is this feature of the model that the adjective "self-delimiting" describes.) It follows by Kraft's inequality (see [13], for example) that, for all Turing machines M, Σ_{π∈PROG_M} 2^(−|π|) ≤ 1. It is well known that there are Turing machines U that are universal, in the sense that, for every Turing machine M, there …
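Kraft's inequality quoted in the context above can be checked directly for any finite prefix-free code; a minimal sketch (the example codewords are illustrative, not from the paper):

```python
from fractions import Fraction

def kraft_sum(codewords):
    """Sum 2^(-|w|) over a set of binary codewords, exactly."""
    return sum(Fraction(1, 2 ** len(w)) for w in codewords)

# A prefix-free ("self-delimiting") code: no codeword is a prefix of another.
code = ["0", "10", "110", "111"]
assert not any(a != b and b.startswith(a) for a in code for b in code)

print(kraft_sum(code))  # a complete prefix code sums to exactly 1
```

Dropping a codeword leaves the sum strictly below 1, which is the general form of the inequality.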

3836 | Introduction to Automata Theory, Languages, and Computation - Hopcroft, Motwani, et al.

Citation Context: …that the reader is already familiar with the general ideas of Turing machine computation, including computation by oracle Turing machines. (Discussion of such machines may be found in many texts, e.g., [2, 19, 44, 50].) Given a recursive time bound s : N → N, we say that an oracle Turing machine M is s-time-bounded if, given any input n ∈ N and oracle y ∈ {0,1}^∞, M outputs a bit M^y(n) ∈ {0,1} in at most s(l)…

1682 | An Introduction to Kolmogorov Complexity and its Applications - Li, Vitányi - 1997

Citation Context: …of algorithmic information theory that are used in this paper. We are especially concerned with self-delimiting Kolmogorov complexity and algorithmic randomness. The interested reader is referred to [33, 35] for more details, discussion, and proofs. Kolmogorov complexity, also called program-size complexity, was discovered independently by Solomonoff [51], Kolmogorov [21], and Chaitin [9]. Self-delimitin…

837 | Theory of Recursive Functions and Effective Computability - Rogers - 1967

521 | Three approaches to the quantitative definition of information - Kolmogorov - 1965

Citation Context: …Foundation Grant CCR9157382, with matching funds from Rockwell International and Microware Systems Corporation. 1 Introduction. Algorithmic information theory, as developed by Solomonoff [51], Kolmogorov [21, 22, 23], Chaitin [9, 10, 11, 12], Martin-Löf [39, 40], Levin [26, 27, 28, 29, 30, 31, 55], Schnorr [47], Gács [15], Shen′ [48, 49], and others, gives a satisfactory, quantitative account of the information…

473 | Recursively Enumerable Sets and Degrees - Soare - 1987

404 | A formal theory of inductive inference - Solomonoff - 1964

332 | The definition of random sequences - Martin-Löf - 1966

330 | A theory of program size formally identical to information theory - Chaitin

309 | Measure Theory - Halmos - 1974

Citation Context: …This section reviews the basic ideas from Lebesgue measure, resource-bounded measure, and Baire category that are involved in our use of these three notions of "smallness." The interested reader may consult [6, 18, 36, 37, 43, 45] for further discussion of these notions, but the material in the present section is sufficient for following the arguments of this paper. Resource-bounded measure [36, 37] is a generalization of clas…

227 | On the length of programs for computing finite binary sequences - Chaitin

222 | Computational complexity of probabilistic Turing machines - Gill - 1977

Citation Context: …reducible to χ_K in polynomial time, a recursive sequence y ∈ {0,1}^∞ is Turing reducible to z in polynomial time if and only if y is in the complexity class BPP [5, 8]. (The class BPP, defined by Gill [17], consists of those sequences y ∈ {0,1}^∞ such that there is a randomized algorithm that decides y[n], the nth bit of y, with error probability less than 1/n, using time that is at most polynomial…
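The error bound in the BPP-style definition above can always be driven down by majority voting over independent runs; a minimal sketch of this standard amplification idea (the noisy decision procedure here is a made-up stand-in, not from the paper):

```python
import random

def noisy_decide(true_bit, err=0.3):
    # Hypothetical randomized decision procedure: returns the correct
    # bit with probability 1 - err.
    return true_bit if random.random() >= err else 1 - true_bit

def amplified_decide(true_bit, trials=1001, err=0.3):
    # Majority vote over independent runs; by a Chernoff bound the
    # error probability drops exponentially in the number of trials.
    votes = sum(noisy_decide(true_bit, err) for _ in range(trials))
    return 1 if 2 * votes > trials else 0

random.seed(0)
assert amplified_decide(1) == 1 and amplified_decide(0) == 0
```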

191 | Descriptive Set Theory - Moschovakis - 1980

Citation Context: …0 in REC in our sense. We now turn to the fundamentals of Baire category. Baire category gives a topological notion of smallness, usually defined in terms of "countable unions of nowhere dense sets" [42, 43, 45]. Here it is more convenient to define Baire category in terms of certain two-person, infinite games of perfect information, called Banach–Mazur games. Informally, a Banach–Mazur game is an infinite g…
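In a Banach–Mazur game on {0,1}^∞, two players alternately extend a common finite prefix, and player I wins iff the limit sequence lands in a target set X; X is meager exactly when player II has a winning strategy. A finite-round toy sketch (the strategies and target set are illustrative choices, not from the paper):

```python
def banach_mazur(strategy_I, strategy_II, rounds=8):
    # Players alternately append nonempty binary strings to a shared
    # prefix; the infinite play is the limit of these prefixes.
    prefix = ""
    for _ in range(rounds):
        prefix += strategy_I(prefix)
        prefix += strategy_II(prefix)
    return prefix

# Target set X = sequences with only finitely many 1s (a meager set).
# Player II wins by always appending a 1, forcing infinitely many 1s.
play = banach_mazur(lambda p: "0", lambda p: "1")
assert play == "01" * 8 and play.count("1") == 8
```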

184 | The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms - Zvonkin, Levin - 1970

172 | Almost everywhere high nonuniform complexity - Lutz - 1992

Citation Context: …sequences y ∈ {0,1}^∞ such that y ≤_T^{DTIME(s)} x. We are interested in the size of DTIME^x(s) ∩ REC as a subset of REC. To quantify this, we use a special case of the resource-bounded measure theory of Lutz [37, 36]. (A detailed description of the relevant special case appears in section 3 below.) Intuitively, this theory, a generalization of classical Lebesgue measure theory, defines a set X of infinite binary…

171 | Structural Complexity I - Balcázar, Díaz, et al. - 1988

162 | Convergence of Probability Measures, Second Edition - Billingsley - 1999

107 | On the notion of a random sequence - Levin - 1973

101 | Randomness conservation inequalities: Information and independence in mathematical theories - Levin - 1984

95 | Laws of information conservation (nongrowth) and aspects of the foundation of probability theory - Levin

90 | On the symmetry of algorithmic information - Gács - 1974

89 | Logical Basis for Information Theory and Probability Theory - Kolmogorov - 1968

85 | Process complexity and effective random tests - Schnorr - 1973

80 | Measure and Category - Oxtoby - 1980

Citation Context: …∩ REC is meager, but has measure 1 and measure 1 in REC. (b) REC^c has measure 0 in REC but is comeager and has measure 1. (c) RAND^c has measure 0, but is comeager and has measure 1 in REC. As Oxtoby [43] has noted, "There is of course nothing paradoxical in the fact that a set that is small in one sense may be large in some other sense." 4 Algorithmic Information and Randomness. In this section we rev…

66 | Real Analysis, Third Edition - Royden - 1967

62 | Degrees of Unsolvability - Sacks - 1963

Citation Context: …{0,1}^∞ is defined by y_k[n] = y[⟨k, n⟩] for all n ∈ N. (Here we are using the standard pairing function ⟨k, n⟩ = C(k+n+1, 2) + n.) We use the following two known facts. Theorem 5.13 (Sacks [46]). There exist r.e. sequences that are high and not Turing equivalent to χ_K. Theorem 5.14 (Martin [38]). A sequence y ∈ {0,1}^∞ satisfies jump(χ_K) ≤_T jump(y) if and only if there exists x ≡_T y suc…
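The pairing function in the context above, ⟨k, n⟩ = C(k+n+1, 2) + n, is the standard diagonal (Cantor-style) pairing; a quick sketch checking its first values and its injectivity on a finite grid:

```python
def pair(k, n):
    # <k, n> = C(k+n+1, 2) + n, i.e. walk the diagonals of N x N.
    return (k + n + 1) * (k + n) // 2 + n

# First few values along the diagonals: <0,0>=0, <1,0>=1, <0,1>=2, <2,0>=3.
assert [pair(0, 0), pair(1, 0), pair(0, 1), pair(2, 0)] == [0, 1, 2, 3]

# Injectivity on a finite grid (it is in fact a bijection N x N -> N).
values = {pair(k, n) for k in range(50) for n in range(50)}
assert len(values) == 50 * 50
```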

56 | Learning simple concepts under simple distributions - Li, Vitányi - 1991

Citation Context: …−log m(x) ≤ K(x) < −log m(x) + c̃. A straightforward modification of the proof of Theorem 4.2 yields the following time-bounded version. (This result also follows immediately from Lemma 3 of [34].) Theorem 4.3. Let t : N → N be recursive. 1. For all x ∈ {0,1}*, −log m^t(x) ≤ K^t(x). 2. There exist a recursive function t₁ : N → N and a constant c₁ ∈ N such that, for all x ∈ {0,1}*, K…

53 | Classes of recursively enumerable sets and degrees of unsolvability - Martin - 1966

48 | Logical Depth and Physical Complexity - Bennett - 1988

Citation Context: …ring it more or less useful for various computational purposes. In order to quantify the degree to which the information in a computational, physical, or biological object has been organized, Bennett [4, 5] has extended algorithmic information theory by defining and investigating the computational depth of binary strings and binary sequences. Roughly speaking, the computational depth (called "logical de…

42 | Every sequence is reducible to a random one - Gács - 1986

Citation Context: …t suffices to show that y ∉ RAND. But this follows immediately from Theorems 5.2 and 5.6. □ In particular, Theorems 5.11 and 6.1 imply that weakly deep sequences exist. It should be noted that Gács [16] has proven that, for every sequence x ∈ {0,1}^∞, there exists a sequence z ∈ RAND such that x ≤_T z. Thus ≤_T-reducibility cannot be used in place of ≤_tt-reducibility in the definition of wkDEEP. We h…

36 | Theory of Recursive Functions and Effective Computability - Rogers - 1967

33 | Resource-bounded measure - Lutz - 1998

32 | Incompleteness theorems for random reals - Chaitin - 1987

25 | Dissipation, information, computational complexity and the definition of organization - Bennett - 1987

25 | Various measures of complexity for finite objects (axiomatic description) - Levin - 1976

23 | Complexity oscillations in infinite binary sequences, Zeitschrift für Wahrscheinlichkeitstheorie und Verwandte Gebiete - Martin-Löf

21 | Uniform tests of randomness - Levin - 1976

21 | On the length of programs for computing finite binary sequences - Chaitin - 1966

20 | An observation on probability versus randomness with applications to complexity classes, Mathematical Systems Theory 27 - Book, Lutz, et al. - 1994

18 | On relations between different algorithmic definitions of randomness - Shen - 1989

17 | The definition of random sequences - Martin-Löf - 1966

15 | Three approaches to the quantitative definition of information, Problems of Information Transmission 1:1-7 - Kolmogorov - 1965

12 | Complexity, depth and sophistication - Koppel - 1987

Citation Context: …work" that has been "added" to this information and "stored in the organization" of the object. (Depth is closely related to Adleman's notion of "potential" [1] and Koppel's notion of "sophistication" [24, 25].) One way to investigate the computational usefulness of an object is to investigate the class of computational problems that can be solved efficiently, given access to the object. When the object is…

10 | On languages reducible to algorithmically random languages - Book - 1994

Citation Context: …the following two facts. (i) For every recursive time bound s : N → N there exists a recursive time bound ŝ : N → N such that, for all algorithmically random sequences z, DTIME^z(s) ∩ REC ⊆ DTIME(ŝ) [5, 8, 7]. (ii) For every recursive time bound ŝ : N → N, DTIME(ŝ) has measure 0 in REC [37]. Our main result, Theorem 5.11 below, establishes that every weakly useful sequence is strongly deep. This impli…

8 | Structure, in The Universal Turing Machine: A Half-Century Survey - Koppel - 1988

8 | A formal theory of inductive inference, parts 1 and 2 - Solomonoff - 1964

7 | The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms - Zvonkin, Levin - 1970

5 | The "almost all" theory of subrecursive degrees is decidable - Mehlhorn - 1974

Citation Context: …then X ∩ REC is a negligibly small subset of REC. Further discussion of this intuition may be found in [37, 43]. Other formulations of measure in REC have been investigated by Freidzon [14], Mehlhorn [41], and others. The advantage of the formulation here is that it uniformly yields Lebesgue measure, measure in REC, and measure in various complexity classes [37]. It is easy to show that, if X has "mea…

4 | Complexity of programs to determine whether natural numbers not greater than n belong to a recursively enumerable set - Barzdin′ - 1968

Citation Context: …sting feature of this example is that χ_K has relatively low information content. In fact, an n-bit prefix of χ_K, denoted χ_K[0..n−1], contains only O(log n) bits of algorithmic information [3]. Intuitively, this is because χ_K[0..n−1] is completely specified by the number of indices i ∈ {0, …, n−1} such that the ith Turing machine M_i halts on input i. Once this O(log…
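The counting argument quoted above can be made concrete: if you know only how many of the first n semi-decidable events occur, you can dovetail until exactly that many witnesses appear and thereby recover the whole prefix. A toy sketch with a decidable stand-in for halting (real halting is of course undecidable; `halts_within` is invented purely for illustration):

```python
def halts_within(i, steps):
    # Stand-in for "machine i halts within `steps` steps": here, "machine" i
    # "halts" iff i is even, and it does so after i steps. This is a
    # decidable toy predicate, NOT the real halting problem.
    return i % 2 == 0 and steps >= i

def reconstruct_prefix(n, m):
    # Given only the count m of indices i < n that eventually "halt",
    # dovetail (increase the step bound) until exactly m witnesses appear;
    # at that point the full n-bit prefix is determined.
    t = 0
    while True:
        found = {i for i in range(n) if halts_within(i, t)}
        if len(found) == m:
            return "".join("1" if i in found else "0" for i in range(n))
        t += 1

n = 10
true_prefix = "".join("1" if i % 2 == 0 else "0" for i in range(n))
assert reconstruct_prefix(n, true_prefix.count("1")) == true_prefix
```

Since m ≤ n, the count itself needs only O(log n) bits, which is the point of the bound attributed to [3].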

4 |
Families of recursive predicates of measure zero. translated in
- Freidzon
- 1972
(Show Context)
Citation Context ...sure 0 in REC, then X " REC is a negligibly small subset of REC. Further discussion of this intuition may be found in [37, 43]. Other formulations of measure in REC have been investigated by Frei=-=dzon [14]-=-, Mehlhorn [41], and others. The advantage of the formulation here is that it uniformly yields Lebesgue measure, measure in REC, and measure in various complexity classes [37]. It is easy to show that... |