Results 1–10 of 22
On approximating optimal weighted lobbying, and frequency of correctness versus average-case polynomial time
, 2007
"... We investigate issues related to two hard problems related to voting, the optimal weighted lobbying problem and the winner problem for Dodgson elections. Regarding the former, Christian et al. [CFRS06] showed that optimal lobbying is intractable in the sense of parameterized complexity. We provide a ..."
Abstract

Cited by 14 (6 self)
 Add to MetaCart
We investigate two hard problems related to voting: the optimal weighted lobbying problem and the winner problem for Dodgson elections. Regarding the former, Christian et al. [CFRS06] showed that optimal lobbying is intractable in the sense of parameterized complexity. We provide an efficient greedy algorithm that achieves a logarithmic approximation ratio for this problem and even for a more general variant, optimal weighted lobbying. We prove that essentially no better approximation ratio than ours can be proven for this greedy algorithm. The problem of determining Dodgson winners is known to be complete for parallel access to NP [HHR97]. Homan and Hemaspaandra [HH06] proposed an efficient greedy heuristic for finding Dodgson winners with a guaranteed frequency of success, and their heuristic is a "frequently self-knowingly correct" algorithm. We prove that every distributional problem solvable in polynomial time on the average with respect to the uniform distribution has a frequently self-knowingly correct polynomial-time algorithm. Furthermore, we study some features of probability weight of correctness with respect to Procaccia and Rosenschein's junta distributions [PR07]. Key words: approximation, Dodgson elections, election systems, frequently self-knowingly correct algorithms, greedy algorithms, optimal lobbying, preference aggregation.
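The logarithmic guarantee quoted above follows the classic greedy covering pattern. As a hedged illustration only (this is the textbook greedy weighted set cover, not the authors' actual lobbying algorithm), the greedy rule "pick the cheapest set per newly covered element" achieves an H(n) ≈ ln n approximation:

```python
def greedy_weighted_cover(universe, sets, weights):
    """Greedy weighted set cover: repeatedly pick the set minimizing
    weight per newly covered element.  Achieves an H(|universe|) ~
    ln|universe| approximation ratio.  Assumes every element of the
    universe is covered by at least one set."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        # most cost-effective set: minimum weight / (# newly covered elements)
        best = min(
            (i for i in range(len(sets)) if sets[i] & uncovered),
            key=lambda i: weights[i] / len(sets[i] & uncovered),
        )
        chosen.append(best)
        uncovered -= sets[best]
    return chosen
```

The lobbying problem has extra structure (budgets per voter, majority thresholds per issue), so the paper's algorithm refines this rule; the sketch only shows where the logarithmic ratio comes from.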
Computational Tractability: The View From Mars
 Bulletin of the European Association of Theoretical Computer Science
"... We describe a point of view about the parameterized computational complexity framework in the broad context of one of the central issues of theoretical computer science as a field: the problem of systematically coping with computational intractability. Those already familiar with the basic ideas of ..."
Abstract

Cited by 9 (1 self)
 Add to MetaCart
We describe a point of view about the parameterized computational complexity framework in the broad context of one of the central issues of theoretical computer science as a field: the problem of systematically coping with computational intractability. Those already familiar with the basic ideas of parameterized complexity will nevertheless find here something new: the emerging systematic connections between fixed-parameter tractability techniques and the design of useful heuristic algorithms, and also perhaps the philosophical maturation of the parameterized complexity program.
An Efficient Local Search Method for Random 3-Satisfiability
, 2003
"... We report on some exceptionally good results in the solution of randomly generated 3satisfiability instances using the "recordtorecord travel (RRT)" local search method. When this simple, but lessstudied algorithm is applied to random onemillion variable instances from the problem's satisfiable ..."
Abstract

Cited by 7 (4 self)
 Add to MetaCart
We report on some exceptionally good results in the solution of randomly generated 3-satisfiability instances using the "record-to-record travel (RRT)" local search method. When this simple but less-studied algorithm is applied to random one-million-variable instances from the problem's satisfiable phase, it seems to find satisfying truth assignments almost always in linear time, with the coefficient of linearity depending on the ratio α of clauses to variables in the generated instances. RRT has a parameter for tuning "greediness". By lessening greediness, the linear-time phase can be extended very close to the satisfiability threshold α_c. Such linear time complexity is typical for random-walk-based local search methods for small values of α. Previously, however, it has been suspected that these methods necessarily lose their time linearity far below the satisfiability threshold. The only previously introduced algorithm reported to have nearly linear time complexity close to the satisfiability threshold is the survey propagation (SP) algorithm. However, SP is not a local search method and is more complicated to implement than RRT. Comparative experiments with the WalkSAT local search algorithm show behavior somewhat similar to RRT, but with the linear-time phase not extending quite as close to the satisfiability threshold.
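The RRT acceptance rule is simple enough to sketch. The following is a minimal, naive illustration of record-to-record travel applied to SAT, assuming DIMACS-style clauses (tuples of signed integers); the deviation parameter `d` is the "greediness" knob mentioned in the abstract, and real implementations track clause states incrementally rather than recomputing the cost after every flip:

```python
import random

def rrt_sat(clauses, n_vars, d=2, max_flips=100000, seed=0):
    """Record-to-record travel for SAT: flip a variable from a random
    unsatisfied clause; accept the flip only if the resulting number of
    unsatisfied clauses stays within d of RECORD, the best (lowest)
    count seen so far.  Smaller d means greedier search."""
    rng = random.Random(seed)
    assign = [rng.choice([False, True]) for _ in range(n_vars)]

    def unsat(a):
        # naive O(m) recomputation; incremental bookkeeping in practice
        return [c for c in clauses
                if not any((a[l - 1] if l > 0 else not a[-l - 1]) for l in c)]

    record = len(unsat(assign))
    for _ in range(max_flips):
        bad = unsat(assign)
        if not bad:
            return assign                       # satisfying assignment found
        v = abs(rng.choice(rng.choice(bad))) - 1  # variable from an unsat clause
        assign[v] = not assign[v]
        cost = len(unsat(assign))
        if cost <= record + d:
            record = min(record, cost)          # accept move; update record
        else:
            assign[v] = not assign[v]           # reject move; flip back
    return None
```

With d = 0 this degenerates to a greedy descent on the record; the abstract's point is that raising d (lowering greediness) keeps the runtime linear closer to α_c.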
Complete distributional problems, hard languages, and resource-bounded measure
 Theoretical Computer Science
, 2000
"... We say that a distribution µ is reasonable if there exists a constant s ≥ 0 such that µ({x  x  ≥ n}) = Ω ( 1 ns). We prove the following result, which suggests that all DistNPcomplete problems have reasonable distributions. If NP contains a DTIME(2 n)biimmune set, then every DistNPcomplete ..."
Abstract

Cited by 6 (2 self)
 Add to MetaCart
We say that a distribution µ is reasonable if there exists a constant s ≥ 0 such that µ({x : |x| ≥ n}) = Ω(1/n^s). We prove the following result, which suggests that all DistNP-complete problems have reasonable distributions: if NP contains a DTIME(2^n)-bi-immune set, then every DistNP-complete set has a reasonable distribution. It follows from work of Mayordomo [May94] that the consequent holds if the p-measure of NP is not zero. Cai and Selman [CS96] defined a modification and extension of Levin's notion of average polynomial time to arbitrary time bounds and proved that if L is P-bi-immune, then L is distributionally hard, meaning that for every polynomial-time computable distribution µ, the distributional problem (L, µ) is not polynomial on the µ-average. We prove the following results, which suggest that distributional hardness is closely related to more traditional notions of hardness.
1. If NP contains a distributionally hard set, then NP contains a P-immune set.
2. There exists a language L that is distributionally hard but not P-bi-immune if and only if P contains a set that is immune to all P-printable sets.
The following corollaries follow readily.
1. If the p-measure of NP is not zero, then there exists a language L that is distributionally hard but not P-bi-immune.
2. If the p_2-measure of NP is not zero, then there exists a language L in NP that is distributionally hard but not P-bi-immune.
Truth-table closure and Turing closure of average polynomial time have different measures in EXP
 In Proceedings of the Eleventh Annual IEEE Conference on Computational Complexity
, 1996
"... Let PPcomp denote the sets that are solvable in polynomial time on average under every polynomialtime computable distribution on the instances. In this paper we show that the truthtable closure of PPcomp has measure 0 in EXP. Since, as we show, EXP is Turing reducible to PPcomp , the Turing clo ..."
Abstract

Cited by 5 (2 self)
 Add to MetaCart
Let PPcomp denote the class of sets that are solvable in polynomial time on average under every polynomial-time computable distribution on the instances. In this paper we show that the truth-table closure of PPcomp has measure 0 in EXP. Since, as we show, EXP is Turing reducible to PPcomp, the Turing closure has measure 1 in EXP; thus, PPcomp is an example of a subclass of E such that the closure under truth-table reduction and the closure under Turing reduction have different measures in EXP. Furthermore, it is shown that there exists a set A in PPcomp such that for every k, the class of sets L such that A is k-truth-table reducible to L has measure 0 in EXP.
1 Introduction
A randomized problem (or distributional problem) is a pair consisting of a decision problem and a density function. A randomized decision problem (A, µ) is solvable in average polynomial time ((A, µ) is in AP) if there exists a deterministic Turing machine M such that A = L(M) and Time_M, the running time of M ...
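The condition the snippet breaks off at is Levin's standard notion of average polynomial time; as a sketch of that standard formulation (notation assumed: µ'(x) is the density of µ at x, Time_M the running time of M):

```latex
% Levin's average polynomial time (standard formulation):
% (A, \mu) \in \mathrm{AP} iff some DTM M with L(M) = A and some
% \varepsilon > 0 satisfy
\sum_{x} \mu'(x) \, \frac{\mathrm{Time}_M(x)^{\varepsilon}}{|x|} \;<\; \infty .
```

The ε-th root built into the sum is what makes the notion robust under polynomial scaling while still bounding the expected work relative to input length.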
Using Depth to Capture Average-Case Complexity
, 2003
"... We give the rst characterization of Turing machines that run in polynomialtime on average. We show that a Turing machine M runs in average polynomialtime if for all inputs x the Turing machine uses time exponential in the computational depth of x, where the computational depth is a measure of ..."
Abstract

Cited by 3 (3 self)
 Add to MetaCart
We give the first characterization of Turing machines that run in polynomial time on average. We show that a Turing machine M runs in average polynomial time if for all inputs x the Turing machine uses time exponential in the computational depth of x, where the computational depth is a measure of the amount of "useful" information in x.
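Schematically, and hedging on the constants (the exact statement, and the precise notion of depth used, are in the paper; the log |x| slack is our assumption to absorb polynomial factors), the time bound in the characterization reads:

```latex
% M runs in average polynomial time when, for some constant c and all x,
\mathrm{Time}_M(x) \;\le\; 2^{\,c\,(\mathrm{depth}(x) + \log |x|)} .
```

Intuitively, M may spend a long time only on inputs carrying a lot of "useful" (deep) information, and such inputs are rare under the distributions considered.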
Reductions Do Not Preserve Fast Convergence Rates in Average Time
 ALGORITHMICA
, 1996
"... Cai and Selman [CS96] proposed a general definition of average computation time that, when applied to polynomials, results in a modification of Levin's [Lev86] notion of averagepolynomialtime. The effect of the modification is to control the rate of convergence of the expressions that define ave ..."
Abstract

Cited by 3 (1 self)
 Add to MetaCart
Cai and Selman [CS96] proposed a general definition of average computation time that, when applied to polynomials, results in a modification of Levin's [Lev86] notion of average polynomial time. The effect of the modification is to control the rate of convergence of the expressions that define average computation time. With this modification, they proved a hierarchy theorem for average-time complexity that is as tight as the Hartmanis-Stearns [HS65] hierarchy theorem for worst-case deterministic time. They also proved that under a fairly reasonable condition on distributions, called condition W, a distributional problem is solvable in average polynomial time under the modification exactly when it is solvable in average polynomial time under Levin's definition. Various notions of reductions, as defined by Levin [Lev86] and others, play a central role in the study of average-case complexity. However, the class of distributional problems that are solvable in average polynomial time under the modification is not closed under the standard reductions. In particular, we prove that there is a distributional problem that is not solvable in average polynomial time under the modification but is reducible, by the identity function, t...
Efficient kernels for sentence pair classification
"... In this paper, we propose a novel class of graphs, the tripartite directed acyclic graphs (tDAGs), to model firstorder rule feature spaces for sentence pair classification. We introduce a novel algorithm for computing the similarity in firstorder rewrite rule feature spaces. Our algorithm is extre ..."
Abstract

Cited by 3 (1 self)
 Add to MetaCart
In this paper, we propose a novel class of graphs, the tripartite directed acyclic graphs (tDAGs), to model first-order rule feature spaces for sentence pair classification. We introduce a novel algorithm for computing the similarity in first-order rewrite rule feature spaces. Our algorithm is extremely efficient and, as it computes the similarity of instances that can be represented in explicit feature spaces, it is a valid kernel function.
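The validity argument in the last sentence is generic: any similarity that equals an inner product in an explicit feature space is automatically a positive semidefinite kernel. A minimal sketch of that principle (this is not the tDAG algorithm itself, which computes the same quantity without materializing the feature space):

```python
from collections import Counter

def explicit_kernel(features_x, features_y):
    """Similarity computed as a dot product over an explicit (sparse)
    feature space.  Any function of this form is a valid kernel, since
    it is an inner product by construction; feature multiplicities act
    as the coordinates of each instance."""
    fx, fy = Counter(features_x), Counter(features_y)
    # sum over shared features only; all other coordinates contribute 0
    return sum(fx[f] * fy[f] for f in fx.keys() & fy.keys())
```

For rewrite-rule feature spaces the features would be (fragments of) first-order rules extracted from each sentence pair; the paper's contribution is computing this inner product efficiently when enumerating the features explicitly is infeasible.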
Average-Case Complexity Theory and Polynomial-Time Reductions
, 2001
"... This thesis studies averagecase complexity theory and polynomialtime reducibilities. The issues in averagecase complexity arise primarily from Cai and Selman's extension of Levin's denition of average polynomial time. We study polynomialtime reductions between distributional problems. Under stro ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
This thesis studies average-case complexity theory and polynomial-time reducibilities. The issues in average-case complexity arise primarily from Cai and Selman's extension of Levin's definition of average polynomial time. We study polynomial-time reductions between distributional problems. Under strong but reasonable hypotheses we separate ordinary NP-completeness notions.