Results 1–10 of 70
Almost Everywhere High Nonuniform Complexity
, 1992
Abstract

Cited by 173 (36 self)
We investigate the distribution of nonuniform complexities in uniform complexity classes. We prove that almost every problem decidable in exponential space has essentially maximum circuit-size and space-bounded Kolmogorov complexity almost everywhere. (The circuit-size lower bound actually exceeds, and thereby strengthens, the Shannon 2^n/n lower bound for almost every problem, with no computability constraint.) In exponential time complexity classes, we prove that the strongest relativizable lower bounds hold almost everywhere for almost all problems. Finally, we show that infinite pseudorandom sequences have high nonuniform complexity almost everywhere. The results are unified by a new, more powerful formulation of the underlying measure theory, based on uniform systems of density functions, and by the introduction of a new nonuniform complexity measure, the selective Kolmogorov complexity. This research was supported in part by NSF Grants CCR-8809238 and CCR-9157382 and in ...
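For reference, the Shannon lower bound that this abstract strengthens is the classical counting bound (restated here for convenience; it is not part of the original listing):

```latex
% Shannon (1949): almost every Boolean function is maximally hard for circuits.
% For all but a vanishing fraction of functions f : \{0,1\}^n \to \{0,1\},
\[
  \mathrm{CS}(f) \;\ge\; (1 - o(1))\,\frac{2^{n}}{n},
\]
% because there are $2^{2^n}$ functions on $n$ inputs, but only
% $2^{O(s \log s)}$ circuits of size at most $s$.
```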
The quantitative structure of exponential time
 Complexity Theory Retrospective II
, 1997
Effective strong dimension in algorithmic information and computational complexity
 SIAM Journal on Computing
, 2004
Abstract

Cited by 78 (29 self)
The two most important notions of fractal dimension are Hausdorff dimension, developed by Hausdorff (1919), and packing dimension, developed independently by Tricot (1982) and Sullivan (1984). Both dimensions have the mathematical advantage of being defined from measures, and both have yielded extensive applications in fractal geometry and dynamical systems. Lutz (2000) has recently proven a simple characterization of Hausdorff dimension in terms of gales, which are betting strategies that generalize martingales. Imposing various computability and complexity constraints on these gales produces a spectrum of effective versions of Hausdorff dimension, including constructive, computable, polynomial-space, polynomial-time, and finite-state dimensions. Work by several investigators has already used these effective dimensions to shed significant new light on a variety of topics in theoretical computer science. In this paper we show that packing dimension can also be characterized in terms of gales. Moreover, even though the usual definition of packing dimension is considerably more complex than that of Hausdorff dimension, our gale characterization of packing dimension is an exact dual ...
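The gale characterization mentioned above can be sketched as follows (a restatement of Lutz's definitions, not quoted from this abstract; the exact normalization should be checked against the paper):

```latex
% An s-gale (s >= 0) is a function d : \{0,1\}^* \to [0,\infty) satisfying
\[
  d(w) \;=\; 2^{-s}\bigl[\, d(w0) + d(w1) \,\bigr]
  \qquad \text{for all } w \in \{0,1\}^{*};
\]
% a martingale is the case s = 1.  d succeeds on a sequence S if
% \limsup_{n} d(S \restriction n) = \infty, and Lutz's characterization is
\[
  \dim_{\mathrm H}(X) \;=\;
  \inf \{\, s \mid \text{some } s\text{-gale succeeds on every } S \in X \,\}.
\]
% The packing-dimension dual replaces limsup with liminf (strong success).
```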
Almost Every Set in Exponential Time is P-Bi-Immune
 Theoretical Computer Science
, 1994
Abstract

Cited by 54 (5 self)
A set A is P-bi-immune if neither A nor its complement has an infinite subset in P. We investigate here the abundance of P-bi-immune languages in linear-exponential time (E). We prove that the class of P-bi-immune sets has measure 1 in E. This implies that `almost' every language in E is P-bi-immune; that is to say, for almost every set recognizable in linear-exponential time, no algorithm that recognizes it works in polynomial time on an infinite number of instances. Furthermore, we show that every p-random (pseudorandom) language is E-bi-immune. Regarding the existence of P-bi-immune sets in NP, we show that if NP does not have measure 0 in E, then NP contains a P-bi-immune set. Another consequence is that the class of p-m-complete languages for E has measure 0 in E. In contrast, it is shown that in E, and even in REC, the class of P-bi-immune languages lacks the property of Baire (the Baire category analogue of Lebesgue measurability). * This work was supported by a Spani...
Measure on small complexity classes, with applications for BPP
 In Proceedings of the 35th Symposium on Foundations of Computer Science
, 1994
Abstract

Cited by 48 (7 self)
We present a notion of resource-bounded measure for P and other subexponential-time classes. This generalization is based on Lutz's notion of measure, but overcomes the limitations that cause Lutz's definitions to apply only to classes at least as large as E. We present many of the basic properties of this measure, and use it to explore the class of sets that are hard for BPP. Bennett and Gill showed that almost all sets are hard for BPP; Lutz improved this from Lebesgue measure to measure on ESPACE. We use our measure to improve this still further, showing that for all ε > 0, almost every set in E_ε is hard for BPP, where E_ε = ∪_{δ<ε} DTIME(2^{n^δ}), which is the best that can be achieved without showing that BPP is properly contained in E. A number of related results are also obtained in this way.
Measure, stochasticity, and the density of hard languages
 Proceedings of the Tenth Symposium on Theoretical Aspects of Computer Science
, 1993
Abstract

Cited by 43 (13 self)
The main theorem of this paper is that, for every real number α < 1 (e.g., α = 0.99), only a measure-0 subset of the languages decidable in exponential time are n^α-tt-reducible to languages that are not exponentially dense. Thus every n^α-tt-hard language for E is exponentially dense. This strengthens Watanabe's 1987 result that every O(log n)-tt-hard language for E is exponentially dense. The combinatorial technique used here, the sequentially most frequent query selection, also gives a new, simpler proof of Watanabe's result. The main theorem also has implications for the structure of NP under strong hypotheses. Ogiwara and Watanabe (1991) have shown that the hypothesis P ≠ NP implies that every btt-hard language for NP is nonsparse (i.e., not polynomially sparse). Their technique does not appear to allow significant relaxation of either the query bound or the sparseness criterion. It is shown here that a stronger hypothesis, namely that NP does not have measure 0 in exponential time, implies the stronger conclusion that, for every real α < 1, every n^α-tt-hard language for NP is exponentially dense. Evidence is presented that this stronger hypothesis is reasonable. The proof of the main theorem uses a new, very general weak stochasticity theorem, ensuring that almost every language in E is statistically unpredictable by feasible deterministic algorithms, even ... How dense must a language A ⊆ {0,1}* be in order to be hard for a complexity class C? The ongoing investigation of this question, especially important ...
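The density notions here are the usual ones from this line of work; restated below for convenience (the exact quantifiers should be checked against the paper itself):

```latex
% A language A is (exponentially) dense if, for some real \varepsilon > 0,
\[
  \bigl|\, A_{\le n} \,\bigr| \;>\; 2^{\,n^{\varepsilon}}
  \qquad \text{for all sufficiently large } n,
\]
% where A_{\le n} = A \cap \{0,1\}^{\le n}.  A sparse language, by contrast,
% satisfies |A_{\le n}| \le p(n) for some polynomial p.
```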
Resource Bounded Randomness and Weakly Complete Problems
 Theoretical Computer Science
, 1994
Abstract

Cited by 37 (6 self)
We introduce and study resource-bounded random sets based on Lutz's concept of resource-bounded measure ([7, 8]). We concentrate on n^c-randomness (c ≥ 2), which corresponds to the polynomial-time-bounded (p-) measure of Lutz, and which is adequate for studying the internal and quantitative structure of E = DTIME(2^lin). However, we will also comment on E_2 = DTIME(2^pol) and its corresponding (p_2-) measure. First we show that the class of n^c-random sets has p-measure 1. This provides a new, simplified approach to p-measure-1 results. Next we compare randomness with genericity (in the sense of [2, 3]) and we show that n^{c+1}-random sets are n^c-generic, whereas the converse fails. From the former we conclude that n^c-random sets are not p-btt-complete for E. Our main technical results describe the distribution of the n^c-random sets under p-m-reducibility. We show that every n^c-random set in E has n^k-random predecessors in E for any k ≥ 1, whereas the amou...
ResourceBounded Balanced Genericity, Stochasticity and Weak Randomness
 In Complexity, Logic, and Recursion Theory
, 1996
Abstract

Cited by 21 (8 self)
We introduce balanced t(n)-genericity, which is a refinement of the genericity concept of Ambos-Spies, Fleischhack and Huwig [2] and which in addition controls the frequency with which a condition is met. We show that this concept coincides with the resource-bounded version of Church's stochasticity [6]. By uniformly describing these concepts and weaker notions of stochasticity introduced by Wilber [19] and Ko [11] in terms of prediction functions, we clarify the relations among these resource-bounded stochasticity concepts. Moreover, we give descriptions of these concepts in the framework of Lutz's resource-bounded measure theory [13] based on martingales: we show that t(n)-stochasticity coincides with a weak notion of t(n)-randomness based on so-called simple martingales, but that it is strictly weaker than t(n)-randomness in the sense of Lutz.

1 Introduction

Over the last years, resource-bounded versions of Baire category and Lebesgue measure have been introduced in complexity theor...
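The martingale framework underlying these randomness notions can be summarized as follows (standard definitions, restated; the precise notion of "simple martingale" is defined in the paper and only alluded to here):

```latex
% A martingale is a function d : \{0,1\}^* \to [0,\infty) with
\[
  d(w) \;=\; \frac{d(w0) + d(w1)}{2},
\]
% i.e., a fair betting strategy on the successive bits of a language's
% characteristic sequence.  d succeeds on a language A if
\[
  \limsup_{n \to \infty} d(\chi_A \restriction n) \;=\; \infty,
\]
% and A is t(n)-random if no t(n)-computable martingale succeeds on it.
```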
On ResourceBounded Instance Complexity
 Theoretical Computer Science A
, 1995
Abstract

Cited by 20 (9 self)
The instance complexity of a string x with respect to a set A and time bound t, ic^t(x : A), is the length of the shortest program for A that runs in time t, decides x correctly, and makes no mistakes on other strings (where "don't know" answers are permitted). The Instance Complexity Conjecture of Ko, Orponen, Schöning, and Watanabe states that for every recursive set A not in P and every polynomial t there is a polynomial t' and a constant c such that for infinitely many x, ic^t(x : A) ≥ C^{t'}(x) − c, where C^{t'}(x) is the t'-time-bounded Kolmogorov complexity of x. In this paper the conjecture is proved for all recursive tally sets and for all recursive sets which are NP-hard under honest reductions; in particular, it holds for all natural NP-hard problems. The method of proof also yields the polynomial-space-bounded and the exponential-time-bounded versions of the conjecture in full generality. On the other hand, the conjecture itself turns out to be oracl...
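Unpacking the notation, the conjecture compares two program-size measures as follows (a restatement of the abstract's statement in display form; the quantifier order follows the text above):

```latex
% For every recursive A \notin \mathrm{P} and every polynomial t, there are a
% polynomial t' and a constant c such that, for infinitely many x,
\[
  \mathrm{ic}^{t}(x : A) \;\ge\; C^{t'}(x) \;-\; c,
\]
% where ic^t(x : A) is the length of the shortest t-time-bounded program that
% is consistent with A and decides x, and C^{t'}(x) is the t'-time-bounded
% Kolmogorov complexity of x.
```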