Results 1–10 of 44
Almost Everywhere High Nonuniform Complexity
, 1992
Abstract

Cited by 170 (34 self)
We investigate the distribution of nonuniform complexities in uniform complexity classes. We prove that almost every problem decidable in exponential space has essentially maximum circuit-size and space-bounded Kolmogorov complexity almost everywhere. (The circuit-size lower bound actually exceeds, and thereby strengthens, the Shannon 2^n/n lower bound for almost every problem, with no computability constraint.) In exponential time complexity classes, we prove that the strongest relativizable lower bounds hold almost everywhere for almost all problems. Finally, we show that infinite pseudorandom sequences have high nonuniform complexity almost everywhere. The results are unified by a new, more powerful formulation of the underlying measure theory, based on uniform systems of density functions, and by the introduction of a new nonuniform complexity measure, the selective Kolmogorov complexity. This research was supported in part by NSF Grants CCR-8809238 and CCR-9157382 and in ...
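The 2^n/n threshold mentioned in the abstract is Shannon's classical counting bound; a standard sketch of the argument (not taken from the paper itself):

```latex
% A circuit of size $s$ can be described in $O(s \log s)$ bits, so there are
% at most $2^{O(s \log s)}$ circuits of size at most $s$, while there are
% $2^{2^n}$ Boolean functions of $n$ variables. Hence for $s = o(2^n / n)$,
\[
  \#\{\text{circuits of size} \le s\} \le 2^{O(s \log s)} \ll 2^{2^n},
\]
% so almost every Boolean function on $n$ inputs requires circuits of size
% about $2^n/n$; the paper's lower bound exceeds this threshold.
```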
The Dimensions of Individual Strings and Sequences
 Information and Computation
, 2003
Abstract

Cited by 93 (10 self)
A constructive version of Hausdorff dimension is developed using constructive supergales, which are betting strategies that generalize the constructive supermartingales used in the theory of individual random sequences. This constructive dimension is used to assign every individual (infinite, binary) sequence S a dimension, which is a real number dim(S) in the interval [0, 1]. Sequences that
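For orientation, the standard definitions from Lutz's framework can be stated compactly (these are the usual formulations, supplied here rather than quoted from the paper):

```latex
% An $s$-supergale is a function $d : \{0,1\}^* \to [0,\infty)$ satisfying
\[
  d(w) \ge 2^{-s}\,\bigl[\,d(w0) + d(w1)\,\bigr],
\]
% it succeeds on a sequence $S$ if $\limsup_{n\to\infty} d(S \upharpoonright n) = \infty$,
% and the constructive dimension of $S$ is
\[
  \dim(S) = \inf\{\, s \ge 0 : \text{some constructive $s$-supergale succeeds on } S \,\}.
\]
```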
Effective strong dimension in algorithmic information and computational complexity
 SIAM Journal on Computing
, 2004
Abstract

Cited by 79 (29 self)
The two most important notions of fractal dimension are Hausdorff dimension, developed by Hausdorff (1919), and packing dimension, developed independently by Tricot (1982) and Sullivan (1984). Both dimensions have the mathematical advantage of being defined from measures, and both have yielded extensive applications in fractal geometry and dynamical systems. Lutz (2000) has recently proven a simple characterization of Hausdorff dimension in terms of gales, which are betting strategies that generalize martingales. Imposing various computability and complexity constraints on these gales produces a spectrum of effective versions of Hausdorff dimension, including constructive, computable, polynomial-space, polynomial-time, and finite-state dimensions. Work by several investigators has already used these effective dimensions to shed significant new light on a variety of topics in theoretical computer science. In this paper we show that packing dimension can also be characterized in terms of gales. Moreover, even though the usual definition of packing dimension is considerably more complex than that of Hausdorff dimension, our gale characterization of packing dimension is an exact dual ...
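The duality the abstract alludes to is usually stated as follows (standard formulation: Hausdorff dimension corresponds to success of gales, packing dimension to strong success):

```latex
% An $s$-gale satisfies $d(w) = 2^{-s}[d(w0) + d(w1)]$. Then
\[
  \dim_{\mathrm{H}}(X) = \inf\{\, s : \exists\, s\text{-gale } d\ \forall S \in X,\ \limsup_{n} d(S \upharpoonright n) = \infty \,\},
\]
\[
  \mathrm{Dim}_{\mathrm{P}}(X) = \inf\{\, s : \exists\, s\text{-gale } d\ \forall S \in X,\ \liminf_{n} d(S \upharpoonright n) = \infty \,\}.
\]
% Replacing $\limsup$ by $\liminf$ is exactly what makes packing dimension
% the dual of Hausdorff dimension in this setting.
```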
Equivalence of Measures of Complexity Classes
Abstract

Cited by 70 (19 self)
The resource-bounded measures of complexity classes are shown to be robust with respect to certain changes in the underlying probability measure. Specifically, for any real number δ > 0, any uniformly polynomial-time computable sequence β = (β_0, β_1, β_2, ...) of real numbers (biases) with β_i ∈ [δ, 1 − δ], and any complexity class C (such as P, NP, BPP, P/Poly, PH, PSPACE, etc.) that is closed under positive, polynomial-time, truth-table reductions with queries of at most linear length, it is shown that the following two conditions are equivalent. (1) C has p-measure 0 (respectively, measure 0 in E, measure 0 in E_2) relative to the coin-toss probability measure given by the sequence β. (2) C has p-measure 0 (respectively, measure 0 in E, measure 0 in E_2) relative to the uniform probability measure. The proof introduces three techniques that may be useful in other contexts, namely, (i) the transformation of an efficient martingale for one probability measure ...
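A minimal sketch of how a change of measure shows up as a betting strategy (illustrative only, not the paper's construction): the likelihood ratio μ_β(w)/μ(w) between a biased coin-toss measure and the uniform measure is itself a martingale with respect to the uniform measure, since its value at w is the average of its values at w0 and w1.

```python
# Likelihood-ratio martingale sketch (illustrative; not the paper's transformation).
# beta[i] is the probability that bit i equals 1 under the biased measure;
# the uniform measure assigns every bit probability 1/2.

def likelihood_ratio(bits, beta):
    """Capital of the likelihood-ratio martingale after reading `bits`."""
    capital = 1.0
    for i, b in enumerate(bits):
        p = beta[i] if b == 1 else 1.0 - beta[i]  # biased probability of bit b
        capital *= p / 0.5                        # divide by the uniform probability
    return capital

# With constant bias 3/4, each 1-bit multiplies the capital by 0.75 / 0.5 = 1.5.
print(likelihood_ratio([1, 1, 1, 1], [0.75] * 4))  # 1.5**4 = 5.0625
```

On sequences that are typical for the biased measure, this capital tends to infinity, which is one concrete way a martingale "for one probability measure" certifies atypicality under another.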
Degrees of random sets
, 1991
Abstract

Cited by 46 (4 self)
An explicit recursion-theoretic definition of a random sequence or random set of natural numbers was given by Martin-Löf in 1966. Other approaches leading to the notions of n-randomness and weak n-randomness have been presented by Solovay, Chaitin, and Kurtz. We investigate the properties of n-random and weakly n-random sequences with an emphasis on the structure of their Turing degrees. After an introduction and summary, in Chapter II we present several equivalent definitions of n-randomness and weak n-randomness, including a new definition in terms of a forcing relation analogous to the characterization of n-generic sequences in terms of Cohen forcing. We also prove that, as conjectured by Kurtz, weak n-randomness is indeed strictly weaker than n-randomness. Chapter III is concerned with intrinsic properties of n-random sequences. The main results are that an (n + 1)-random sequence A satisfies the condition A^(n) ≡_T A ⊕ 0^(n) (strengthening a result due originally to Sacks) and that n-random sequences satisfy a number of strong independence properties, e.g., if A ⊕ B is n-random then A is n-random relative to B. It follows that any countable distributive lattice can be embedded ...
Using random sets as oracles
Abstract

Cited by 34 (15 self)
Let R be a notion of algorithmic randomness for individual subsets of N. We say B is a base for R-randomness if there is a Z ≥_T B such that Z is R-random relative to B. We show that the bases for 1-randomness are exactly the K-trivial sets and discuss several consequences of this result. We also show that the bases for computable randomness include every Δ^0_2 set that is not diagonally noncomputable, but no set of PA degree. As a consequence, we conclude that an n-c.e. set is a base for computable randomness iff it is Turing incomplete.
Randomness in Computability Theory
, 2000
Abstract

Cited by 28 (0 self)
We discuss some aspects of algorithmic randomness and state some open problems in this area. The first part is devoted to the question "What is a computably random sequence?" Here we survey some of the approaches to algorithmic randomness and address some questions on these concepts. In the second part we look at the Turing degrees of Martin-Löf random sets. Finally, in the third part we deal with relativized randomness. Here we look at oracles which do not change randomness. 1980 Mathematics Subject Classification. Primary 03D80; Secondary 03D28. Formalizations of the intuitive notions of computability and randomness are among the major achievements in the foundations of mathematics in the 20th century. It is commonly accepted that the various equivalent formal computability notions, like Turing computability or recursiveness, which were introduced in the 1930s and 1940s, adequately capture computability in the intuitive sense. This belief is expressed in the w...
Relative to a random oracle, NP is not small
 In Proc. 9th Structures
, 1994
Abstract

Cited by 18 (1 self)
Resource-bounded measure as originated by Lutz is an extension of classical measure theory which provides a probabilistic means of describing the relative sizes of complexity classes. Lutz has proposed the hypothesis that NP does not have p-measure zero, meaning loosely that NP contains a non-negligible subset of exponential time. This hypothesis implies a strong separation of P from NP and is supported by a growing body of plausible consequences which are not known to follow from the weaker assertion P ≠ NP. It is shown in this paper that relative to a random oracle, NP does not have p-measure zero. The proof exploits the following independence property of algorithmically random sequences: if A is an algorithmically random sequence and a subsequence A_0 is chosen by means of a bounded Kolmogorov-Loveland ...
Recursive computational depth
 Information and Computation
, 1999
Abstract

Cited by 18 (2 self)
In the 1980's, Bennett introduced computational depth as a formal measure of the amount of computational history that is evident in an object's structure. In particular, Bennett identified the classes of weakly deep and strongly deep sequences, and showed that the halting problem is strongly deep. Juedes, Lathrop, and Lutz subsequently extended this result by defining the class of weakly useful sequences, and proving that every weakly useful sequence is strongly deep. The present paper investigates refinements of Bennett's notions of weak and strong depth, called recursively weak depth (introduced by Fenner, Lutz, and Mayordomo) and recursively strong depth (introduced here). It is argued that these refinements naturally capture Bennett's idea that deep objects are those which "contain internal evidence of a nontrivial causal history." The fundamental properties of recursive computational depth are developed, and it is shown that the recursively weakly (respectively, strongly) deep sequences form a proper subclass of the class of weakly (respectively, strongly) deep sequences. The above-mentioned theorem of Juedes, Lathrop, and Lutz is then strengthened by proving that every weakly useful sequence is recursively strongly deep. It follows from these results that not every strongly deep sequence is weakly useful, thereby answering a question posed by Juedes.
Almost everywhere domination and superhighness
 Mathematical Logic Quarterly
Abstract

Cited by 17 (9 self)
 Add to MetaCart
Let ω denote the set of natural numbers. For functions f, g: ω → ω, we say that f is dominated by g if f(n) < g(n) for all but finitely many n ∈ ω. We consider the standard "fair coin" probability measure on the space 2^ω of infinite sequences of 0's and 1's. A Turing oracle B is said to be almost everywhere dominating if, for measure-one many X ∈ 2^ω, each function which is Turing computable from X is dominated by some function which is Turing computable from B. Dobrinen and Simpson have shown that the almost everywhere domination property and some of its variant properties are closely related to the reverse mathematics of measure theory. In this paper we exposit some recent results of Kjos-Hanssen, Kjos-Hanssen/Miller/Solomon, and others concerning LR-reducibility and almost everywhere domination. We also prove the following new result: if B is almost everywhere dominating, then B is superhigh, i.e., 0′′ is ...
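The domination property defined in the abstract can be written symbolically (a restatement of the definition just given, with f ≤* g abbreviating "f(n) < g(n) for all but finitely many n"):

```latex
% Almost everywhere domination, stated for a Turing oracle $B$ and the
% fair-coin measure $\mu$ on $2^\omega$:
\[
  B \text{ is a.e.\ dominating} \iff
  \mu\{\, X \in 2^\omega : (\forall f \le_T X)(\exists g \le_T B)\; f \le^* g \,\} = 1.
\]
```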