Results 1–10 of 71
Polynomial time approximation schemes for Euclidean TSP and other geometric problems
In Proceedings of the 37th IEEE Symposium on Foundations of Computer Science (FOCS’96), 1996
"... Abstract. We present a polynomial time approximation scheme for Euclidean TSP in fixed dimensions. For every fixed c � 1 and given any n nodes in � 2, a randomized version of the scheme finds a (1 � 1/c)approximation to the optimum traveling salesman tour in O(n(log n) O(c) ) time. When the nodes a ..."
Abstract

Cited by 320 (3 self)
Abstract. We present a polynomial time approximation scheme for Euclidean TSP in fixed dimensions. For every fixed c > 1 and given any n nodes in ℝ^2, a randomized version of the scheme finds a (1 + 1/c)-approximation to the optimum traveling salesman tour in O(n (log n)^O(c)) time. When the nodes are in ℝ^d, the running time increases to O(n (log n)^((O(√d c))^(d-1))). For every fixed c, d the running time is n · poly(log n), that is, nearly linear in n. The algorithm can be derandomized, but this increases the running time by a factor O(n^d). The previous best approximation algorithm for the problem (due to Christofides) achieves a 3/2-approximation in polynomial time. We also give similar approximation schemes for some other NP-hard Euclidean problems: Minimum Steiner Tree, k-TSP, and k-MST. (The running times of the algorithm for k-TSP and k-MST involve an additional multiplicative factor k.) The previous best approximation algorithms for all these problems achieved a constant-factor approximation. We also give efficient approximation schemes for Euclidean Min-Cost Matching, a problem that can be solved exactly in polynomial time. All our algorithms also work, with almost no modification, when distance is measured using any geometric norm (such as ℓ_p for p ≥ 1 or other Minkowski norms). They also have simple parallel (i.e., NC) implementations.
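To make the stated bounds easier to read, they can be restated in display form (this merely rewrites the abstract's claims; c = 5 below is an arbitrary illustrative value, not from the paper):

\[
\text{length of tour found} \le \Bigl(1 + \tfrac{1}{c}\Bigr)\cdot \mathrm{OPT},
\qquad
T_{\mathbb{R}^2}(n) = O\!\bigl(n\,(\log n)^{O(c)}\bigr),
\qquad
T_{\mathbb{R}^d}(n) = O\!\bigl(n\,(\log n)^{(O(\sqrt{d}\,c))^{d-1}}\bigr).
\]

For instance, fixing c = 5 gives a 1.2-approximation in n (log n)^O(1) time in the plane, i.e. nearly linear time.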
The NP-completeness column: an ongoing guide
Journal of Algorithms, 1985
"... This is the nineteenth edition of a (usually) quarterly column that covers new developments in the theory of NPcompleteness. The presentation is modeled on that used by M. R. Garey and myself in our book ‘‘Computers and Intractability: A Guide to the Theory of NPCompleteness,’ ’ W. H. Freeman & Co ..."
Abstract

Cited by 188 (0 self)
This is the nineteenth edition of a (usually) quarterly column that covers new developments in the theory of NP-completeness. The presentation is modeled on that used by M. R. Garey and myself in our book "Computers and Intractability: A Guide to the Theory of NP-Completeness," W. H. Freeman & Co., New York, 1979 (hereinafter referred to as "[G&J]"; previous columns will be referred to by their dates). A background equivalent to that provided by [G&J] is assumed, and, when appropriate, cross-references will be given to that book and the list of problems (NP-complete and harder) presented there. Readers who have results they would like mentioned (NP-hardness, PSPACE-hardness, polynomial-time solvability, etc.) or open problems they would like publicized, should ...
On the Complexity of Propositional Knowledge Base Revision, Updates, and Counterfactuals
Artificial Intelligence, 1992
"... We study the complexity of several recently proposed methods for updating or revising propositional knowledge bases. In particular, we derive complexity results for the following problem: given a knowledge base T , an update p, and a formula q, decide whether q is derivable from T p, the updated (or ..."
Abstract

Cited by 186 (12 self)
We study the complexity of several recently proposed methods for updating or revising propositional knowledge bases. In particular, we derive complexity results for the following problem: given a knowledge base T, an update p, and a formula q, decide whether q is derivable from the updated (or revised) knowledge base obtained by incorporating p into T. This problem amounts to evaluating the counterfactual p > q over T. Besides the general case, subcases are also considered, in particular where T is a conjunction of Horn clauses, or where the size of p is bounded by a constant.
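As a concrete, purely illustrative instance of the decision problem described above, the sketch below brute-forces one update semantics studied in this literature, Winslett's possible-models approach (PMA), over a toy propositional language. The atoms, knowledge base T, update p, and query q are hypothetical; the paper's algorithms and exact operators are not reproduced here.

from itertools import product

ATOMS = ["a", "b", "c"]

def models(formula):
    """All assignments (dicts atom -> bool) satisfying `formula`."""
    return [dict(zip(ATOMS, vals))
            for vals in product([False, True], repeat=len(ATOMS))
            if formula(dict(zip(ATOMS, vals)))]

def diff(m1, m2):
    """Set of atoms on which two assignments disagree."""
    return {x for x in ATOMS if m1[x] != m2[x]}

def pma_update(T, p):
    """Models of T updated with p: for each model m of T, keep the models of p
    whose change set w.r.t. m is minimal under set inclusion."""
    result = []
    p_models = models(p)
    for m in models(T):
        candidates = [(diff(m, mp), mp) for mp in p_models]
        minimal = [mp for d, mp in candidates
                   if not any(d2 < d for d2, _ in candidates)]
        result.extend(mp for mp in minimal if mp not in result)
    return result

def derivable(T, p, q):
    """q is derivable from the updated base iff q holds in every updated model;
    this is the counterfactual p > q evaluated over T."""
    return all(q(m) for m in pma_update(T, p))

# Hypothetical example: T = a & b, update p = ~a, query q = b.
T = lambda m: m["a"] and m["b"]
p = lambda m: not m["a"]
q = lambda m: m["b"]
print(derivable(T, p, q))   # True: under PMA, b survives the update ~a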
Data Exchange: Getting to the Core
2003
"... Data exchange is the problem of taking data structured under a source schema and creating an instance of a target schema that reflects the source data as accurately as possible. Given a source instance, there may be many solutions to the data exchange problem, that is, many target instances that sat ..."
Abstract

Cited by 136 (16 self)
Data exchange is the problem of taking data structured under a source schema and creating an instance of a target schema that reflects the source data as accurately as possible. Given a source instance, there may be many solutions to the data exchange problem, that is, many target instances that satisfy the constraints of the data exchange problem. In an earlier paper, we identified a special class of solutions that we call universal. A universal solution has homomorphisms into every possible solution, and hence is a "most general possible" solution. Nonetheless, given a source instance, there may be many universal solutions. This naturally raises the question of whether there is a "best" universal solution, and hence a best solution for data exchange. We answer this question by considering the well-known notion of the core of a structure, a notion that was first studied in graph theory, but has also played a role in conjunctive-query processing. The core of a structure is the smallest substructure that is also a homomorphic image of the structure. All universal solutions have the same core (up to isomorphism); we show that this core is also a universal solution, and hence the smallest universal solution. The uniqueness of the core of a universal solution together with its minimality make the core an ideal solution for data exchange. Furthermore, we show that the core is the best among all universal solutions for answering unions of conjunctive queries with inequalities. After this, we investigate the computational complexity of producing the core. Well-known results by Chandra and Merlin imply that, unless P = NP, there is no polynomial-time algorithm that, given a structure as input, returns the core of that structure as output. In contrast, in the context of data e...
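The notion of a core used above can be made concrete on tiny examples. The sketch below is a hypothetical illustration, specialised to directed graphs and deliberately exponential: it finds the smallest induced substructure into which the whole structure maps homomorphically, which is what "core" means here (the paper works with general relational structures and does not use this brute-force procedure).

from itertools import combinations, product

def is_hom(edges, target_edges, h):
    # A homomorphism must send every edge to an edge of the target substructure.
    return all((h[u], h[v]) in target_edges for (u, v) in edges)

def core(nodes, edges):
    # Try candidate node sets S in increasing size; the first S such that the
    # whole graph maps homomorphically into the substructure induced by S
    # carries (an isomorphic copy of) the core.
    nodes = list(nodes)
    for size in range(1, len(nodes) + 1):
        for S in combinations(nodes, size):
            induced = {(u, v) for (u, v) in edges if u in S and v in S}
            for image in product(S, repeat=len(nodes)):
                if is_hom(edges, induced, dict(zip(nodes, image))):
                    return set(S), induced
    return set(nodes), edges

# Hypothetical example: two edges pointing into the same node.
nodes = {"a", "b", "c"}
edges = {("a", "b"), ("c", "b")}
print(core(nodes, edges))   # a single edge, e.g. ({'a', 'b'}, {('a', 'b')})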
Propositional Circumscription and Extended Closed World Reasoning are $\Pi^P_2$-complete
Theoretical Computer Science, 1993
"... Circumscription and the closed world assumption with its variants are wellknown nonmonotonic techniques for reasoning with incomplete knowledge. Their complexity in the propositional case has been studied in detail for fragments of propositional logic. One open problem is whether the deduction prob ..."
Abstract

Cited by 99 (22 self)
Circumscription and the closed world assumption with its variants are well-known nonmonotonic techniques for reasoning with incomplete knowledge. Their complexity in the propositional case has been studied in detail for fragments of propositional logic. One open problem is whether the deduction problem for arbitrary propositional theories under the extended closed world assumption or under circumscription is $\Pi^P_2$-complete, i.e., complete for a class at the second level of the polynomial hierarchy. We answer this question by proving these problems $\Pi^P_2$-complete, and we show how this result applies to other variants of closed world reasoning.
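For readers less familiar with the deduction problem in question, the sketch below is a tiny brute-force illustration (not the paper's construction) of deduction under propositional circumscription when all atoms are minimised. The atoms and formulas are hypothetical, and formulas are represented as Python predicates over truth assignments.

from itertools import product

ATOMS = ["p", "q", "r"]

def models(T):
    """All assignments satisfying T."""
    return [dict(zip(ATOMS, v))
            for v in product([False, True], repeat=len(ATOMS))
            if T(dict(zip(ATOMS, v)))]

def true_atoms(m):
    return {x for x in ATOMS if m[x]}

def minimal_models(T):
    """Models of T whose set of true atoms is minimal under set inclusion."""
    ms = models(T)
    return [m for m in ms
            if not any(true_atoms(m2) < true_atoms(m) for m2 in ms)]

def circ_entails(T, query):
    """Deduction under circumscription: query holds in every minimal model of T."""
    return all(query(m) for m in minimal_models(T))

# Hypothetical example: T = p v q.  Its minimal models are {p} and {q},
# so CIRC(T) entails ~(p & q) but entails neither p nor q.
T = lambda m: m["p"] or m["q"]
print(circ_entails(T, lambda m: not (m["p"] and m["q"])))   # True
print(circ_entails(T, lambda m: m["p"]))                    # False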
The Complexity of Global Constraints
2004
"... We study the computational complexity of reasoning with global constraints. We show that reasoning with such constraints is intractable in general. We then demonstrate how the same tools of computational complexity can be used in the design and analysis of specific global constraints. In particular ..."
Abstract

Cited by 67 (26 self)
We study the computational complexity of reasoning with global constraints. We show that reasoning with such constraints is intractable in general. We then demonstrate how the same tools of computational complexity can be used in the design and analysis of specific global constraints. In particular, we illustrate how computational complexity can be used to determine when a lesser level of local consistency should be enforced, when decomposing constraints will lose pruning, and when combining constraints is tractable. We also show how the same tools can be used to study symmetry breaking, meta-constraints like the cardinality constraint, and learning nogoods.
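To see why such reasoning is intractable in general, consider propagating a global constraint that is available only as a black-box test on complete assignments. The generic sketch below (illustrative, not taken from the paper) enforces generalised arc consistency by enumerating every tuple; the AllDifferent example shows the kind of pruning that a dedicated propagator achieves far more cheaply.

from itertools import product

def enforce_gac(domains, constraint):
    """Remove every value that appears in no satisfying tuple (no 'support').
    domains: dict var -> set of values.
    constraint: function from a complete assignment (dict var -> value) to bool."""
    variables = list(domains)
    supported = {v: set() for v in variables}
    for tup in product(*(domains[v] for v in variables)):
        assignment = dict(zip(variables, tup))
        if constraint(assignment):
            for v in variables:
                supported[v].add(assignment[v])
    return supported   # GAC-reduced domains; an empty domain signals inconsistency

# Hypothetical example with the AllDifferent constraint on three variables:
all_different = lambda a: len(set(a.values())) == len(a)
domains = {"x": {1, 2}, "y": {1, 2}, "z": {1, 2, 3}}
print(enforce_gac(domains, all_different))
# -> {'x': {1, 2}, 'y': {1, 2}, 'z': {3}}   (values 1 and 2 are pruned from z)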
Bounded Queries to SAT and the Boolean Hierarchy
Theoretical Computer Science, 1991
"... We study the complexity of decision problems that can be solved by a polynomialtime Turing machine that makes a bounded number of queries to an NP oracle. Depending on whether we allow some queries to depend on the results of other queries, we obtain two (probably) different hierarchies. We present ..."
Abstract

Cited by 64 (12 self)
We study the complexity of decision problems that can be solved by a polynomial-time Turing machine that makes a bounded number of queries to an NP oracle. Depending on whether we allow some queries to depend on the results of other queries, we obtain two (probably) different hierarchies. We present several results relating the bounded NP query hierarchies to each other and to the Boolean hierarchy. We also consider the similarly defined hierarchies of functions that can be computed by a polynomial-time Turing machine that makes a bounded number of queries to an NP oracle. We present relations among these two hierarchies and the Boolean hierarchy. In particular we show for all k that there are functions computable with 2^k parallel queries to an NP set that are not computable in polynomial time with k serial queries to any oracle, unless P = NP. As a corollary k + 1 parallel queries to an NP set allow us to compute more functions than are computable with only k parallel queries to a...
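The distinction between serial (adaptive) and parallel queries can be illustrated by the standard simulation in the easy direction, sketched below under hypothetical names: a computation that asks k queries one after another can instead ask, in a single parallel round, the at most 2^k - 1 queries that could occur on any answer path, and then retrace its path afterwards. (The abstract's results concern the converse, harder direction; the toy oracle below merely stands in for an NP oracle such as SAT.)

def adaptive_algorithm(x, answers):
    """Toy deterministic computation: given the answers so far, it either
    emits the next query or the final result."""
    if len(answers) == 0:
        return ("query", f"is {x} composite?")
    if len(answers) == 1:
        return ("query", f"is {x} even?" if answers[0] else f"is {x} > 100?")
    return ("result", (x, answers))

def toy_oracle(q):                      # hypothetical stand-in for an NP oracle
    return "composite" in q             # fixed toy yes/no answers for the demo

def run_serial(x, k):
    """Ask at most k queries one after another (adaptively)."""
    answers = []
    for _ in range(k):
        kind, payload = adaptive_algorithm(x, answers)
        if kind == "result":
            return payload
        answers.append(toy_oracle(payload))
    return adaptive_algorithm(x, answers)[1]

def run_parallel(x, k):
    """Simulate the same computation with one round of parallel queries."""
    # 1. Collect the queries at all (at most 2^k - 1) nodes of the answer tree.
    tree = {}
    def build(answers):
        if len(answers) == k:
            return
        kind, payload = adaptive_algorithm(x, answers)
        if kind == "query":
            tree[tuple(answers)] = payload
            build(answers + [True])
            build(answers + [False])
    build([])
    # 2. One parallel round of oracle calls.
    replies = {path: toy_oracle(q) for path, q in tree.items()}
    # 3. Retrace the single path the serial computation would have taken.
    answers = []
    while tuple(answers) in tree:
        answers.append(replies[tuple(answers)])
    return adaptive_algorithm(x, answers)[1]

assert run_serial(91, 2) == run_parallel(91, 2)   # both give (91, [True, False])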
Constraint propagation
Handbook of Constraint Programming, 2006
"... Constraint propagation is a form of inference, not search, and as such is more ”satisfying”, both technically and aesthetically. —E.C. Freuder, 2005. Constraint reasoning involves various types of techniques to tackle the inherent ..."
Abstract

Cited by 51 (3 self)
Constraint propagation is a form of inference, not search, and as such is more "satisfying", both technically and aesthetically. —E.C. Freuder, 2005. Constraint reasoning involves various types of techniques to tackle the inherent ...
On the Computational Complexity of Qualitative Coalitional Games
Artificial Intelligence, 2004
"... We study coalitional games in which agents are each assumed to have a goal to be achieved, and where the characteristic property of a coalition is a set of choices, with each choice denoting a set of goals that would be achieved if the choice was made. Such qualitative coalitional games (QCGs) are a ..."
Abstract

Cited by 46 (15 self)
We study coalitional games in which agents are each assumed to have a goal to be achieved, and where the characteristic property of a coalition is a set of choices, with each choice denoting a set of goals that would be achieved if the choice was made. Such qualitative coalitional games (QCGs) are a natural tool for modelling goal-oriented multiagent systems. After introducing and formally defining QCGs, we systematically formulate fourteen natural decision problems associated with them, and determine the computational complexity of these problems. For example, we formulate a notion of coalitional stability inspired by that of the core from conventional coalitional games, and prove that the problem of showing that the core of a QCG is nonempty is $D^p_1$-complete. (As an aside, we present what we believe is the first "natural" problem that is proven to be complete for $D^p_2$.) We conclude by discussing the relationship of our work to other research on coalitional reasoning in multiagent systems, and present some avenues for future research.
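One plausible, purely illustrative reading of the model described above is sketched below: each agent is satisfied when at least one of its goals is achieved, and a coalition counts as successful if some available choice satisfies every member. The agent names, goals, and choices are hypothetical, and the paper's formal definitions may differ in detail.

from itertools import chain, combinations

AGENT_GOALS = {"ann": {"g1"}, "bob": {"g2", "g3"}}           # hypothetical data
CHOICES = {                                                   # coalition -> its choices
    frozenset({"ann"}): [{"g1"}],
    frozenset({"bob"}): [set()],
    frozenset({"ann", "bob"}): [{"g1", "g3"}, {"g2"}],
}

def satisfied(agent, achieved_goals):
    """An agent is satisfied if at least one of its goals is achieved."""
    return bool(AGENT_GOALS[agent] & achieved_goals)

def successful(coalition):
    """A coalition is successful if some choice satisfies every member."""
    return any(all(satisfied(a, choice) for a in coalition)
               for choice in CHOICES.get(frozenset(coalition), []))

def all_coalitions(agents):
    agents = list(agents)
    return chain.from_iterable(combinations(agents, r)
                               for r in range(1, len(agents) + 1))

for c in all_coalitions(AGENT_GOALS):
    print(set(c), "successful" if successful(c) else "not successful")
# {'ann'} successful; {'bob'} not successful; {'ann', 'bob'} successful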