Results 1–10 of 20
Average-case computational complexity theory
Complexity Theory Retrospective II, 1997
Abstract

Cited by 31 (2 self)
Being NP-complete has been widely interpreted as being computationally intractable. But NP-completeness is a worst-case concept. Some NP-complete problems are "easy on average", but some may not be. How is one to know whether an NP-complete problem is "difficult on average"? The theory of average-case computational complexity, initiated by Levin about ten years ago, is devoted to studying this problem. This paper is an attempt to provide an overview of the main ideas and results in this important new subarea of complexity theory.
Asymptotic density and computably enumerable sets
Abstract

Cited by 9 (6 self)
We study connections between classical asymptotic density, computability and computable enumerability. In an earlier paper, the second two authors proved that there is a computably enumerable set A of density 1 with no computable subset of density 1. In the current paper, we extend this result in three different ways: (i) The degrees of such sets A are precisely the nonlow c.e. degrees. (ii) There is a c.e. set A of density 1 with no computable subset of nonzero density. (iii) There is a c.e. set A of density 1 such that every subset of A of density 1 is of high degree. We also study the extent to which c.e. sets A can be approximated by their computable subsets B in the sense that A \ B has small density. There is a very close connection between the computational complexity of a set and the arithmetical complexity of its density, and we characterize the lower densities, upper densities and densities of both computable and computably enumerable sets. We also study the notion of “computable at density r” where r is a real in the unit interval. Finally, we study connections between density and classical smallness notions such as immunity, hyperimmunity, and cohesiveness.
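The density notion used throughout this abstract can be made concrete with a finite approximation. The sketch below is illustrative only: the set of even numbers is a hypothetical example, and true asymptotic density is a limit over all n, not something one can read off from a single finite cutoff.

```python
# Finite-approximation sketch of asymptotic density (illustrative only).
# The density of A is lim_{n -> infinity} |A ∩ [0, n)| / n, when it exists.

def partial_density(A, n):
    """Density of A within {0, 1, ..., n-1}: |A ∩ [0, n)| / n."""
    return sum(1 for k in range(n) if k in A) / n

# Hypothetical example: the even numbers have density 1/2 at any even cutoff.
evens = set(range(0, 1000, 2))
print(partial_density(evens, 1000))  # 0.5
```

A computable set always has computable partial densities, but the limit may fail to exist or be noncomputable, which is why the paper's density-1 results are nontrivial.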
Public-key cryptography and invariant theory
2002
Abstract

Cited by 6 (6 self)
Public-key cryptosystems based on invariants of groups are suggested. We also give an overview of known cryptosystems that involve groups.
Complete distributional problems, hard languages, and resource-bounded measure
Theoretical Computer Science, 2000
Abstract

Cited by 6 (2 self)
We say that a distribution µ is reasonable if there exists a constant s ≥ 0 such that µ({x : |x| ≥ n}) = Ω(1/n^s). We prove the following result, which suggests that all DistNP-complete problems have reasonable distributions. If NP contains a DTIME(2^n)-bi-immune set, then every DistNP-complete set has a reasonable distribution. It follows from work of Mayordomo [May94] that the consequent holds if the p-measure of NP is not zero. Cai and Selman [CS96] defined a modification and extension of Levin's notion of average polynomial time to arbitrary time bounds and proved that if L is P-bi-immune, then L is distributionally hard, meaning that for every polynomial-time computable distribution µ, the distributional problem (L, µ) is not polynomial on the µ-average. We prove the following results, which suggest that distributional hardness is closely related to more traditional notions of hardness. 1. If NP contains a distributionally hard set, then NP contains a P-immune set. 2. There exists a language L that is distributionally hard but not P-bi-immune if and only if P contains a set that is immune to all P-printable sets. The following corollaries follow readily: 1. If the p-measure of NP is not zero, then there exists a language L that is distributionally hard but not P-bi-immune. 2. If the p2-measure of NP is not zero, then there exists a language L in NP that is distributionally hard but not P-bi-immune.
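The tail condition µ({x : |x| ≥ n}) = Ω(1/n^s) can be illustrated with a concrete length distribution. The choice p(n) = 1/(n(n+1)) below is a hypothetical example of ours, not one from the paper; its tail mass telescopes to exactly 1/n, so it satisfies the "reasonable" condition with s = 1.

```python
from fractions import Fraction

# Hypothetical distribution on input lengths: p(n) = 1/(n(n+1)), n >= 1.
# Since 1/(m(m+1)) = 1/m - 1/(m+1), the tail sum_{m >= n} p(m) telescopes
# to exactly 1/n, witnessing mu({x : |x| >= n}) = Omega(1/n^s) with s = 1.

def p(n):
    return Fraction(1, n * (n + 1))

def tail(n, horizon=10_000):
    """Exact partial tail: sum_{m = n}^{horizon - 1} p(m) = 1/n - 1/horizon."""
    return sum(p(m) for m in range(n, horizon))

for n in (1, 2, 10):
    assert tail(n) == Fraction(1, n) - Fraction(1, 10_000)
```

Exact rational arithmetic makes the telescoping identity checkable without floating-point error; a distribution with, say, exponentially thin tails would fail the condition for every s.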
New Combinatorial Complete One-Way Functions
In Proc. 25th Sympos. on Theoretical Aspects of Computer Science, 2008
Abstract

Cited by 3 (3 self)
In 2003, Leonid A. Levin presented the idea of a combinatorial complete one-way function and a sketch of the proof that Tiling represents such a function. In this paper, we present two new one-way functions based on semi-Thue string rewriting systems and a version of the Post Correspondence Problem, and prove their completeness. In addition, we present an alternative proof of Levin's result. We also discuss the properties a combinatorial problem should have in order to yield a complete one-way function.
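A semi-Thue system is just a finite set of string-rewriting rules u → v, each applicable to any occurrence of u inside a word. The rule set below is a made-up illustration of the mechanics only; it is not one of the systems constructed in the paper, and real complete one-way functions require carefully chosen rules.

```python
# Minimal semi-Thue rewriting sketch (hypothetical rules, not the paper's).
# Each step applies the first rule whose left-hand side occurs in the word,
# replacing its leftmost occurrence.

RULES = [("ab", "ba"), ("ba", "c")]

def step(word, rules=RULES):
    for lhs, rhs in rules:
        i = word.find(lhs)
        if i != -1:
            return word[:i] + rhs + word[i + len(lhs):]
    return None  # no rule applies: the word is irreducible

def normalize(word, limit=100):
    """Rewrite until irreducible. Semi-Thue systems need not terminate,
    so we cap the number of steps."""
    for _ in range(limit):
        nxt = step(word)
        if nxt is None:
            return word
        word = nxt
    return word

print(normalize("aab"))  # "aab" -> "aba" -> "baa" -> "ca"
```

Forward rewriting is easy; recovering a starting word from an end word is a search problem, which is the intuition behind building one-way functions from such systems.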
Reductions Do Not Preserve Fast Convergence Rates in Average Time
Algorithmica, 1996
Abstract

Cited by 3 (1 self)
Cai and Selman [CS96] proposed a general definition of average computation time that, when applied to polynomials, results in a modification of Levin's [Lev86] notion of average-polynomial-time. The effect of the modification is to control the rate of convergence of the expressions that define average computation time. With this modification, they proved a hierarchy theorem for average-time complexity that is as tight as the Hartmanis–Stearns [HS65] hierarchy theorem for worst-case deterministic time. They also proved that under a fairly reasonable condition on distributions, called condition W, a distributional problem is solvable in average-polynomial-time under the modification exactly when it is solvable in average-polynomial-time under Levin's definition. Various notions of reductions, as defined by Levin [Lev86] and others, play a central role in the study of average-case complexity. However, the class of distributional problems that are solvable in average-polynomial-time under the modification is not closed under the standard reductions. In particular, we prove that there is a distributional problem that is not solvable in average-polynomial-time under the modification but is reducible, by the identity function, t...
Invariant-based Cryptosystems and Their Security Against Provable Worst-Case Break
Abstract

Cited by 2 (1 self)
Cryptography based on noncommutative algebra still suffers from a lack of schemes and a lack of interest. In this work, we show new constructions of cryptosystems based on group invariants and suggest methods to make such cryptosystems secure in practice. Cryptographers still cannot prove security in the cryptographic sense, or even reduce it to some statement about standard complexity classes. In this paper we introduce a new notion of cryptographic security, a provable break, and prove that cryptosystems based on matrix group invariants, and also a variation of the Anshel–Anshel–Goldfeld key agreement protocol for modular groups, are secure against provable worst-case break unless NP ⊆ RP.
Membership Problem for the Modular Group
2007
Abstract

Cited by 2 (0 self)
The modular group plays an important role in many branches of mathematics. We show that the membership problem for the modular group is decidable in polynomial time. To this end, we develop a new syllable-based version of the known subgroup-graph approach. The new approach can be used to prove additional results. We demonstrate this by using it to prove that the membership problem for a free group remains decidable in polynomial time when elements are written in a normal form with exponents.
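Algorithms on free-group words rest on free reduction: cancelling adjacent inverse pairs x x⁻¹. The encoding below (lowercase letter = generator, uppercase = its inverse) is an assumption of ours for illustration, not the representation used in the paper, and it handles plain words rather than the exponent normal form the paper discusses.

```python
# Free reduction sketch for words in a free group.
# Encoding assumption (ours, not the paper's): 'a' is a generator and
# 'A' is its inverse; likewise 'b'/'B'. A single stack pass suffices
# because cancellations can only expose new adjacent inverse pairs.

def free_reduce(word):
    stack = []
    for ch in word:
        if stack and stack[-1] == ch.swapcase():
            stack.pop()  # cancel x followed by x^{-1}, or x^{-1} by x
        else:
            stack.append(ch)
    return "".join(stack)

print(free_reduce("aAb"))   # "b"
print(free_reduce("abBA"))  # "" (the identity element)
```

The single pass runs in linear time, which is the kind of low-level efficiency that polynomial-time membership algorithms build on.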
Average-Case Complexity Theory and Polynomial-Time Reductions
2001
Abstract

Cited by 2 (0 self)
This thesis studies average-case complexity theory and polynomial-time reducibilities. The issues in average-case complexity arise primarily from Cai and Selman's extension of Levin's definition of average polynomial time. We study polynomial-time reductions between distributional problems. Under strong but reasonable hypotheses we separate ordinary NP-completeness notions.
GENERIC COMPUTABILITY, TURING DEGREES, AND ASYMPTOTIC DENSITY
Abstract

Cited by 2 (1 self)
Generic decidability has been extensively studied in group theory, and we now study it in the context of classical computability theory. A set A of natural numbers is called generically computable if there is a partial computable function which agrees with the characteristic function of A on its domain D, and furthermore D has density 1, i.e. lim_{n→∞} |{k < n : k ∈ D}|/n = 1. A set A is called coarsely computable if there is a computable set R such that the symmetric difference of A and R has density 0. We prove that there is a c.e. set which is generically computable but not coarsely computable, and vice versa. We show that every nonzero Turing degree contains a set which is not generically computable and also a set which is not coarsely computable. We prove that there is a c.e. set of density 1 which has no computable subset of density 1. Finally, we define and study generic reducibility.
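Coarse computability asks that A agree with some computable set R outside a density-0 set of exceptions. A finite cutoff can only hint at this, but the symmetric-difference density is easy to approximate. Both sets below are hypothetical illustrations chosen so that their disagreement set (the odd squares) visibly has small density.

```python
# Finite check of the coarse-computability idea: the density, up to n, of
# the symmetric difference of a set A and a candidate computable set R.
# Both sets are hypothetical illustrations, not sets from the paper.

def symdiff_density(A, R, n):
    """|(A Δ R) ∩ [0, n)| / n."""
    return sum(1 for k in range(n) if (k in A) != (k in R)) / n

squares = {k * k for k in range(100)}  # the squares have density 0
A = {k for k in range(10_000) if k % 2 == 0} | squares
R = set(range(0, 10_000, 2))  # the evens: a computable approximation to A

# A and R disagree exactly on the 50 odd squares below 10_000.
print(symdiff_density(A, R, 10_000))  # 0.005
```

Since the squares have density 0, the disagreement density tends to 0 as the cutoff grows, which is precisely what makes R a coarse description of A.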