Results 11–20 of 34
Relative to a random oracle, NP is not small
 In Proc. 9th Structures
, 1994
Abstract

Cited by 18 (1 self)
Resource-bounded measure as originated by Lutz is an extension of classical measure theory which provides a probabilistic means of describing the relative sizes of complexity classes. Lutz has proposed the hypothesis that NP does not have p-measure zero, meaning loosely that NP contains a non-negligible subset of exponential time. This hypothesis implies a strong separation of P from NP and is supported by a growing body of plausible consequences which are not known to follow from the weaker assertion P ≠ NP. It is shown in this paper that relative to a random oracle, NP does not have p-measure zero. The proof exploits the following independence property of algorithmically random sequences: if A is an algorithmically random sequence and a subsequence A′ is chosen by means of a bounded Kolmogorov–Loveland ...
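Lutz's hypothesis and the implication the abstract describes can be restated symbolically (a sketch using standard resource-bounded-measure notation, not quoted from the paper):

```latex
% Lutz's hypothesis: NP does not have p-measure zero,
% and the separation it implies:
\mu_p(\mathrm{NP}) \neq 0 \;\Longrightarrow\; \mathrm{P} \neq \mathrm{NP}
% The paper's result is the relativized form: for a random oracle R,
% \mu_p(\mathrm{NP}^R) \neq 0.
```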
Weakly Hard Problems
, 1994
Abstract

Cited by 14 (6 self)
A weak completeness phenomenon is investigated in the complexity class E = DTIME(2^linear). According to standard terminology, a language H is ≤ᴾₘ-hard for E if the set Pₘ(H), consisting of all languages A ≤ᴾₘ H, contains the entire class E. A language C is ≤ᴾₘ-complete for E if it is ≤ᴾₘ-hard for E and is also an element of E. Generalizing this, a language H is weakly ≤ᴾₘ-hard for E if the set Pₘ(H) does not have measure 0 in E. A language C is weakly ≤ᴾₘ-complete for E if it is weakly ≤ᴾₘ-hard for E and is also an element of E. The main result of this paper is the construction of a language that is weakly ≤ᴾₘ-complete, but not ≤ᴾₘ-complete, for E. The existence of such languages implies that previously known strong lower bounds on the complexity of weakly ≤ᴾₘ-hard problems for E (given by work of Lutz, Mayordomo, and Juedes) are indeed more general than the corresponding bounds for ≤ᴾₘ-hard problems for E. The proof of this result in...
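The four definitions in the abstract above can be set down compactly (a sketch in standard notation; each clause comes from the abstract's own wording):

```latex
% Hardness and weak hardness for E = DTIME(2^{\mathrm{linear}}):
P_m(H) = \{\, A : A \le^{\mathrm{P}}_{m} H \,\}
% H is \le^P_m-hard for E          iff  E \subseteq P_m(H)
% H is weakly \le^P_m-hard for E   iff  \mu(P_m(H) \mid E) \neq 0
% C is (weakly) complete for E     iff  C is (weakly) hard and C \in E
```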
Completeness and Weak Completeness under Polynomial-Size Circuits
 Information and Computation
, 1996
Abstract

Cited by 9 (4 self)
This paper investigates the distribution and nonuniform complexity of problems that are complete or weakly complete for ESPACE under nonuniform reductions that are computed by polynomial-size circuits (P/Poly-Turing reductions and P/Poly-many-one reductions). A tight, exponential lower bound on the space-bounded Kolmogorov complexities of weakly P/Poly-Turing-complete problems is established. A Small Span Theorem for P/Poly-Turing reductions in ESPACE is proven and used to show that every P/Poly-Turing degree, including the complete degree, has measure 0 in ESPACE. (In contrast, it is known that almost every element of ESPACE is weakly ≤ᴾₘ-complete.) Every weakly P/Poly-many-one-complete problem is shown to have a dense, exponential, nonuniform complexity core. More importantly, the P/Poly-many-one-complete problems are shown to be unusually simple elements of ESPACE, in the sense that they obey nontrivial upper bounds on nonuniform complexity (size of nonuniform complexit...
The Density of Weakly Complete Problems under Adaptive Reductions
 SIAM Journal on Computing
, 2000
Abstract

Cited by 8 (1 self)
Given a real number α < 1, every language that is weakly ≤ᴾ_{n^{α/2}-T}-hard for E or weakly ≤ᴾ_{n^α-T}-hard for E₂ is shown to be exponentially dense. This simultaneously strengthens results of Lutz and Mayordomo (1994) and Fu (1995).
1 Introduction In the mid-1970s, Meyer [15] proved that every ≤ᴾₘ-complete language for exponential time, in fact every ≤ᴾₘ-hard language for exponential time, is dense. That is, E ⊄ Pₘ(DENSEᶜ), (1) where E = DTIME(2^linear), DENSE is the class of all dense languages, DENSEᶜ is the complement of DENSE, and Pₘ(DENSEᶜ) is the class of all languages that are ≤ᴾₘ-reducible to nondense languages. (A language A ⊆ {0,1}* is dense if there is a real number ε > 0 such that |Aₙ| > 2^{n^ε} for all sufficiently large n, where Aₙ = A ∩ {0,1}^{≤n}.) Since that time, a major objective of computational complexity theory has been to extend Meyer's result from ≤ᴾₘ-reductions to ≤ᴾ_T-reductions, i.e., to prove that ...
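The density definition quoted in the excerpt, written out cleanly (reconstructed from the abstract's own wording):

```latex
% A language A \subseteq \{0,1\}^* is dense iff
\exists\, \varepsilon > 0 \;\; \forall^{\infty} n \;\;
  |A_n| > 2^{\,n^{\varepsilon}},
\qquad A_n = A \cap \{0,1\}^{\le n}
% Meyer's theorem (1), restated:
% E \not\subseteq P_m(\mathrm{DENSE}^c)
```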
The zero-one law holds for BPP
Abstract

Cited by 8 (0 self)
We show that BPP has p-measure zero if and only if BPP differs from EXP. The same holds when we replace BPP by any complexity class C that is contained in BPP and is closed under tt-reductions. The zero-one law for each of these classes C follows: within EXP, C has either measure zero or else measure one.
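The zero-one law claimed above, in measure notation (a sketch; C ranges over classes contained in BPP and closed under tt-reductions):

```latex
\mu_p(\mathrm{BPP}) = 0 \iff \mathrm{BPP} \neq \mathrm{EXP}
% consequently, for each such class C:
\mu(\mathcal{C} \mid \mathrm{EXP}) = 0
  \quad\text{or}\quad
\mu(\mathcal{C} \mid \mathrm{EXP}) = 1
```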
Online learning and resource-bounded dimension: Winnow yields new lower bounds for hard sets
 SIAM Journal on Computing
, 2007
Abstract

Cited by 8 (4 self)
We establish a relationship between the online mistake-bound model of learning and resource-bounded dimension. This connection is combined with the Winnow algorithm to obtain new results about the density of hard sets under adaptive reductions. This improves previous work ...
Weakly Complete Problems are Not Rare
 COMPUTATIONAL COMPLEXITY
, 1995
Abstract

Cited by 7 (2 self)
Certain natural decision problems are known to be intractable because they are complete for E, the class of all problems decidable in exponential time. Lutz recently conjectured that many other seemingly intractable problems are not complete for E, but are intractable nonetheless because they are weakly complete for E. The main result of this paper shows that Lutz's intuition is at least partially correct: many more problems are weakly complete for E than are complete for E. Specifically, weakly complete problems are not rare in the sense that they form a non-measure 0 subset of E. This extends a recent result of Lutz that establishes the existence of problems that are weakly complete, but not complete, for E. The proof of Lutz's original result employs a sophisticated martingale diagonalization argument. Here we simplify and extend Lutz's argument to prove the main result. This simplified martingale diagonalization argument may be applicable to other quest...
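The martingale diagonalization mentioned above builds on the basic notion of a martingale: a betting function d on binary strings with d(λ) = 1 that satisfies the fairness condition d(w) = (d(w0) + d(w1))/2. A toy illustration of this condition follows (not the paper's construction; the function name and the all-in betting strategy are invented for the example):

```python
def bet_all_on_ones(w: str) -> float:
    """A toy martingale: starts with capital 1 and, at each step,
    bets all current capital that the next bit of w is 1.
    Fairness condition: d(w) == (d(w + "0") + d(w + "1")) / 2.
    """
    capital = 1.0
    for bit in w:
        # winning a fair all-in bet doubles the capital; losing zeroes it
        capital = 2.0 * capital if bit == "1" else 0.0
    return capital

# This martingale "succeeds" only on the sequence 111..., where its
# capital grows without bound; elsewhere it is eventually bankrupt.
print(bet_all_on_ones("1111"))  # capital after four correct bets: 16.0
```

A resource-bounded measure argument constrains such betting functions to be computable within a resource bound (e.g. polynomial time for p-measure) and declares a class negligible when a single such martingale succeeds on every one of its members.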
Constant Depth Circuits and the Lutz Hypothesis
Abstract

Cited by 7 (2 self)
Resource-bounded measure theory [7] is a study of complexity classes via an adaptation of the probabilistic method. The central hypothesis in this theory is the assertion that NP does not have measure zero in Exponential Time. This is a quantitative strengthening of NP ≠ P.
NP-hard sets are superterse unless NP is small
 Information Processing Letters
, 1997
Abstract

Cited by 5 (0 self)
Introduction One of the important questions in computational complexity theory is whether every NP problem is solvable by polynomial-time circuits, i.e., NP ⊆? P/poly. Furthermore, it has been asked what the deterministic time complexity of NP is if NP ⊆ P/poly. That is, if NP is easy in the nonuniform complexity measure, how easy is NP in the uniform complexity measure? Let P_T(SPARSE) be the class of languages that are polynomial-time Turing reducible to some sparse set. Then it is well known that P_T(SPARSE) = P/poly. Hence the above question is equivalent to the following question: NP ⊆? P_T(SPARSE). It has been shown by Wilson [18] that thi...
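The well-known equivalence the excerpt relies on, stated explicitly (a standard fact recalled here, not a claim of the paper beyond what the abstract says):

```latex
\mathrm{P}_T(\mathrm{SPARSE}) = \mathrm{P/poly}
% hence the uniform and nonuniform questions coincide:
\mathrm{NP} \subseteq \mathrm{P/poly}
  \iff
\mathrm{NP} \subseteq \mathrm{P}_T(\mathrm{SPARSE})
```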