Results 1 - 10 of 36
The Constrainedness of Search
In Proceedings of AAAI-96, 1996
Cited by 116 (26 self)
Abstract:
We propose a definition of `constrainedness' that unifies two of the most common but informal uses of the term. These are that branching heuristics in search algorithms often try to make the most "constrained" choice, and that hard search problems tend to be "critically constrained". Our definition of constrainedness generalizes a number of parameters used to study phase transition behaviour in a wide variety of problem domains. As well as predicting the location of phase transitions in solubility, constrainedness provides insight into why problems at phase transitions tend to be hard to solve. Such problems are on a constrainedness "knife-edge", and we must search deep into the problem before they look more or less soluble. Heuristics that try to get off this knife-edge as quickly as possible by, for example, minimizing the constrainedness are often very effective. We show that heuristics from a wide variety of problem domains can be seen as minimizing the constrainedness (or proxies ...
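Concretely, the paper's definition is kappa = 1 - log2(⟨Sol⟩)/N, where ⟨Sol⟩ is the expected number of solutions and N is the base-2 logarithm of the state-space size. For the standard random k-SAT model (n variables, m clauses of k distinct literals) this has a closed form; a minimal sketch, with the closed form derived under that model's independence assumptions:

```python
import math

def kappa_ksat(n_vars: int, n_clauses: int, k: int = 3) -> float:
    """kappa = 1 - log2(<Sol>)/N for random k-SAT, where
    <Sol> = 2^n * (1 - 2^-k)^m and N = n (state space of size 2^n)."""
    log2_sol = n_vars + n_clauses * math.log2(1.0 - 2.0 ** -k)
    return 1.0 - log2_sol / n_vars

# kappa < 1: under-constrained (almost surely soluble);
# kappa near 1: critically constrained.
print(kappa_ksat(100, 200))  # low clause density: well under 1
print(kappa_ksat(100, 519))  # kappa reaches 1 near m/n ~ 5.19 for 3-SAT
```

Note that kappa = 1 slightly overestimates the observed 3-SAT threshold density of about 4.26, which is consistent with the annealed (independence) assumption behind ⟨Sol⟩.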
Finding Hard Instances of the Satisfiability Problem: A Survey
1997
Cited by 113 (1 self)
Abstract:
Finding sets of hard instances of propositional satisfiability is of interest for understanding the complexity of SAT, and for experimentally evaluating SAT algorithms. In discussing this we consider the performance of the most popular SAT algorithms on random problems, the theory of average-case complexity, the threshold phenomenon, known lower bounds for certain classes of algorithms, and the problem of generating hard instances with solutions.
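The random fixed-clause-length model such surveys focus on is easy to reproduce; a minimal generator using DIMACS-style signed-integer literals (the 4.26 density is the well-known empirical threshold for random 3-SAT):

```python
import random

def random_3sat(n_vars: int, n_clauses: int, seed: int = 0):
    """Fixed-clause-length model: each clause draws 3 distinct variables
    uniformly and negates each independently with probability 1/2."""
    rng = random.Random(seed)
    return [tuple(v if rng.random() < 0.5 else -v
                  for v in rng.sample(range(1, n_vars + 1), 3))
            for _ in range(n_clauses)]

# Hard instances cluster near the ~50%-satisfiable point, m/n ~ 4.26:
n = 50
formula = random_3sat(n, round(4.26 * n))
print(len(formula))  # 213
```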
The Constrainedness of Arc Consistency
In Proceedings of CP-97, 1997
Cited by 45 (9 self)
Abstract:
We show that the same methodology used to study phase transition behaviour in NP-complete problems works with a polynomial problem class: establishing arc consistency. A general measure of the constrainedness of an ensemble of problems, used to locate phase transitions in random NP-complete problems, predicts the location of a phase transition in establishing arc consistency. A complexity peak for the AC-3 algorithm is associated with this transition. Finite-size scaling models both the scaling of this transition and the computational cost. On problems at the phase transition, this model of computational cost agrees with the theoretical worst case. As with NP-complete problems, constrainedness (and proxies for it which are cheaper to compute) can be used as a heuristic for reducing the number of checks needed to establish arc consistency in AC-3.
1 Introduction
Following [4] there has been considerable research into phase transition behaviour in NP-complete problems. Problems from...
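AC-3 itself is short; a sketch over binary constraints expressed as predicates (the toy variables and constraints below are illustrative, not from the paper):

```python
from collections import deque

def ac3(domains, constraints):
    """AC-3: revise arcs until every arc (xi, xj) is consistent.
    domains: var -> set of values; constraints: (xi, xj) -> allowed(a, b)."""
    def revise(xi, xj):
        allowed = constraints[(xi, xj)]
        dropped = False
        for a in list(domains[xi]):
            if not any(allowed(a, b) for b in domains[xj]):  # no support in xj
                domains[xi].discard(a)
                dropped = True
        return dropped

    queue = deque(constraints)
    while queue:
        xi, xj = queue.popleft()
        if revise(xi, xj):
            if not domains[xi]:
                return False  # domain wipe-out: no solution
            # re-examine arcs pointing at xi
            queue.extend((xk, xm) for (xk, xm) in constraints
                         if xm == xi and xk != xj)
    return True

# Toy problem X < Y < Z over {1, 2, 3}, with arcs in both directions:
domains = {"X": {1, 2, 3}, "Y": {1, 2, 3}, "Z": {1, 2, 3}}
lt, gt = (lambda a, b: a < b), (lambda a, b: a > b)
constraints = {("X", "Y"): lt, ("Y", "X"): gt, ("Y", "Z"): lt, ("Z", "Y"): gt}
print(ac3(domains, constraints), domains)  # True {'X': {1}, 'Y': {2}, 'Z': {3}}
```

Each call to `revise` performs the consistency checks whose count the abstract's constrainedness heuristic aims to reduce.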
Phase Transitions and Annealed Theories: Number Partitioning as a Case Study
In Proceedings of ECAI-96, 1996
Cited by 30 (9 self)
Abstract:
We outline a technique for studying phase transition behaviour in computational problems using number partitioning as a case study. We first build an "annealed" theory that assumes independence between parts of the number partition problem. Using this theory, we identify a parameter which represents the "constrainedness" of a problem. We determine experimentally the critical value of this parameter at which a rapid transition between soluble and insoluble problems occurs. Finite-size scaling methods developed in statistical mechanics describe the behaviour around the critical value. We identify phase transition behaviour in both the decision and optimization versions of number partitioning, in the size of the optimal partition, and in the quality of heuristic solutions. This case study demonstrates how annealed theories and finite-size scaling allow us to compare algorithms and heuristics in a precise and quantitative manner.
1 Introduction
Phase transition behaviour has recently r...
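Assuming the annealed parameter takes the form kappa = log2(l)/n for n numbers drawn uniformly from 1..l (the form used in the authors' related work on number partitioning; an assumption here, since the abstract does not restate it), the soluble/insoluble transition is directly visible with a subset-sum bitset. This sweep is a sketch, not the paper's experiment:

```python
import random

def has_perfect_partition(nums):
    """Subset-sum via a bitset in a Python big int: can nums be split
    into two halves whose sums differ by at most 1?"""
    total = sum(nums)
    bits = 1
    for x in nums:
        bits |= bits << x       # every reachable subset sum sets a bit
    return bool(bits >> (total // 2) & 1)

def soluble_fraction(n, kappa, trials=20, seed=0):
    """Fraction of random instances with a (near-)perfect partition,
    with l chosen so that kappa = log2(l)/n."""
    rng = random.Random(seed)
    l = max(2, round(2 ** (kappa * n)))
    hits = sum(has_perfect_partition([rng.randint(1, l) for _ in range(n)])
               for _ in range(trials))
    return hits / trials

# Under-constrained ensembles are almost all soluble, over-constrained
# ensembles almost none:
print(soluble_fraction(16, 0.4), soluble_fraction(16, 1.5))
```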
Summarizing CSP hardness with continuous probability distributions
In Proceedings of the 14th National Conference on AI, 1997
Cited by 29 (2 self)
Abstract:
We present empirical evidence that the distribution of effort required to solve CSPs randomly generated at the 50% satisfiable point, when using a backtracking algorithm, can be approximated by two standard families of continuous probability distribution functions. Solvable problems can be modelled by the Weibull distribution, and unsolvable problems by the lognormal distribution. These distributions fit equally well over a variety of backtracking-based algorithms.
1. Introduction
Several key developments in the 1990s have contributed to the advancement of empirical research on CSP algorithms, to the extent that the field may even be called an experimental science. Striking increases in computer power and decreases in cost, coupled with the general adoption of C as the programming language of choice, have made it possible for the developer of a new algorithm or heuristic to test it on large numbers of random instances. Another important advance was the recognition of the "50% satisfi...
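Both families are available in Python's standard library, so the shape of the modelling claim is easy to play with; the parameters below are purely illustrative, not fitted values from the paper:

```python
import random
import statistics

rng = random.Random(0)

# Hypothetical cost samples (parameters are illustrative only):
solvable = [rng.weibullvariate(100.0, 1.2) for _ in range(10_000)]    # scale, shape
unsolvable = [rng.lognormvariate(5.0, 0.8) for _ in range(10_000)]    # mu, sigma

# The lognormal's heavy right tail pulls its mean well above its median,
# matching the long-tailed cost of proving unsolvability:
for name, costs in (("Weibull (solvable)", solvable),
                    ("lognormal (unsolvable)", unsolvable)):
    print(name, round(statistics.median(costs), 1),
          round(statistics.mean(costs), 1))
```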
Random 3-SAT: The Plot Thickens
In Principles and Practice of Constraint Programming, 2000
Cited by 28 (2 self)
Abstract:
This paper presents an experimental investigation of the following questions: how does the average-case complexity of random 3-SAT, understood as a function of the order (number of variables) for fixed-density (ratio of number of clauses to order) instances, depend on the density? Is there a phase transition in which the complexity shifts from polynomial to exponential in the order? Is the transition dependent on or independent of the solver? Our experiment design uses three complete SAT solvers embodying different algorithms: GRASP, CPLEX, and CUDD. We observe new phase transitions for all three solvers, where the median running time shifts from polynomial in the order to exponential. The location of the phase transition appears to be solver-dependent. While GRASP and CPLEX shift from polynomial to exponential complexity at a density of about 3.8, CUDD exhibits this transition between densities of 0.1 and 0.5. This experimental result underscores the dependence between the solver and the complexity phase transition, and challenges the widely held belief that random 3-SAT exhibits a phase transition in computational complexity very close to the crossover point.
The scaling window of the 2-SAT transition
1999
Cited by 24 (1 self)
Abstract:
We consider the random 2-satisfiability problem, in which each instance is a formula that is the conjunction of m clauses of the form x ∨ y, chosen uniformly at random from among all 2-clauses on n Boolean variables and their negations. As m and n tend to infinity in the ratio m/n → α, the problem is known to have a phase transition at α_c = 1, below which the probability that the formula is satisfiable tends to one and above which it tends to zero. We determine the finite-size scaling about this transition, namely the scaling of the maximal window W(n, δ) = (α_−(n, δ), α_+(n, δ)) such that the probability of satisfiability is greater than 1 − δ for α < α_− and is less than δ for α > α_+. We show that W(n, δ) = (1 − Θ(n^{−1/3}), 1 + Θ(n^{−1/3})), where the constants implicit in Θ depend on δ. We also determine the rates at which the probability of satisfiability approaches one and zero at the boundaries of the window. Namely, for m = (1 + ε)n, where ε may depend on n as long as |ε| is sufficiently small and |ε| n^{1/3} is sufficiently large, we show that the probability of satisfiability decays like exp(−Θ(n ε^3)) above the window, and goes to one like 1 − Θ(n^{−1} |ε|^{−3}) below the window. We prove these results by defining an order parameter for the transition and establishing its scaling behavior in n both inside and outside the window. Using this order parameter, we prove that the 2-SAT phase transition is continuous with an order parameter critical exponent of 1. We also determine the values of two other critical exponents, showing that the exponents of 2-SAT are identical to those of the random graph.
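The 2-SAT decision problem is polynomial (strongly connected components of the implication graph), so the transition at α = 1 is easy to observe empirically. A self-contained sketch, not drawn from the paper:

```python
import random
from collections import defaultdict

def two_sat_satisfiable(n, clauses):
    """2-SAT via Kosaraju SCC on the implication graph: (a or b) adds
    edges -a -> b and -b -> a; satisfiable iff no variable shares a
    component with its negation.  Literals are +v/-v for v in 1..n."""
    node = lambda l: 2 * (abs(l) - 1) + (l < 0)
    g, gr = defaultdict(list), defaultdict(list)
    for a, b in clauses:
        for u, v in ((node(-a), node(b)), (node(-b), node(a))):
            g[u].append(v)
            gr[v].append(u)
    order, seen = [], [False] * (2 * n)
    for s in range(2 * n):                 # pass 1: record finish order
        if seen[s]:
            continue
        seen[s] = True
        stack = [(s, 0)]
        while stack:
            u, i = stack.pop()
            if i < len(g[u]):
                stack.append((u, i + 1))
                v = g[u][i]
                if not seen[v]:
                    seen[v] = True
                    stack.append((v, 0))
            else:
                order.append(u)
    comp, c = [-1] * (2 * n), 0
    for s in reversed(order):              # pass 2: flood-fill reverse graph
        if comp[s] != -1:
            continue
        comp[s] = c
        stack = [s]
        while stack:
            u = stack.pop()
            for v in gr[u]:
                if comp[v] == -1:
                    comp[v] = c
                    stack.append(v)
        c += 1
    return all(comp[2 * v] != comp[2 * v + 1] for v in range(n))

def prob_sat(n, alpha, trials=40, seed=0):
    """Monte Carlo estimate of P(satisfiable) at clause density alpha."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        clauses = [tuple(rng.choice((v, -v))
                         for v in rng.sample(range(1, n + 1), 2))
                   for _ in range(round(alpha * n))]
        hits += two_sat_satisfiable(n, clauses)
    return hits / trials

# Below alpha = 1 almost all formulas are satisfiable; above it almost none:
print(prob_sat(100, 0.5), prob_sat(100, 2.0))
```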
Analysis of heuristics for number partitioning
Computational Intelligence, 1998
Cited by 24 (10 self)
Abstract:
We illustrate the use of phase transition behavior in the study of heuristics. Using an “annealed” theory, we define a parameter that measures the “constrainedness” of an ensemble of number partitioning problems. We identify a phase transition at a critical value of constrainedness. We then show that constrainedness can be used to analyze and compare algorithms and heuristics for number partitioning in a precise and quantitative manner. For example, we demonstrate that on uniform random problems both the Karmarkar–Karp and greedy heuristics minimize the constrainedness, but that the decisions made by the Karmarkar–Karp heuristic are superior at reducing constrainedness. This supports the better performance observed experimentally for the Karmarkar–Karp heuristic. Our results refute a conjecture of Fu that phase transition behavior does not occur in number partitioning. Additionally, they demonstrate that phase transition behavior is useful for more than just simple benchmarking. It can, for instance, be used to analyze heuristics, and to compare the quality of heuristic solutions.
Key words: heuristics, number partitioning, phase transitions.
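The two heuristics being compared are a few lines each; a sketch, with greedy in its set-differencing form (the next-largest number always goes on the lighter side):

```python
import heapq
import random

def greedy_diff(nums):
    """Greedy: take numbers largest-first, always adding to the lighter
    half, so the running difference becomes |diff - x|."""
    diff = 0
    for x in sorted(nums, reverse=True):
        diff = abs(diff - x)
    return diff

def karmarkar_karp_diff(nums):
    """Karmarkar-Karp set differencing: repeatedly commit the two largest
    numbers to opposite halves by replacing them with their difference."""
    heap = [-x for x in nums]          # max-heap via negation
    heapq.heapify(heap)
    while len(heap) > 1:
        a, b = -heapq.heappop(heap), -heapq.heappop(heap)
        heapq.heappush(heap, -(a - b))
    return -heap[0] if heap else 0

rng = random.Random(0)
nums = [rng.randint(1, 10**6) for _ in range(64)]
print(greedy_diff(nums), karmarkar_karp_diff(nums))
```

On uniform random instances the Karmarkar–Karp final difference is typically orders of magnitude smaller than the greedy one, consistent with the comparison in the abstract.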
The scaling of search cost
In Proceedings of the 14th National Conference on Artificial Intelligence (AAAI-97), 1997
Cited by 22 (8 self)
Abstract:
We show that a rescaled constrainedness parameter provides the basis for accurate numerical models of search cost for both backtracking and local search algorithms. In the past, the scaling of performance has been restricted to critically constrained problems at the phase transition. Here, we show how to extend models of search cost to the full width of the phase transition. This enables the direct comparison of algorithms on both underconstrained and overconstrained problems. We illustrate the generality of the approach using three different problem domains (satisfiability, constraint satisfaction, and travelling salesperson problems) with both backtracking algorithms like the Davis-Putnam procedure and local search algorithms like GSAT. As well as modelling data from experiments, we give accurate predictions for results beyond the range of the experiments.
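The abstract does not restate the parameter, but a common finite-size-scaling form (an assumption here, not a quote from the paper) rescales the constrainedness kappa around its critical value kappa_c so that cost curves for different problem sizes n collapse onto a single curve:

```python
def rescaled(kappa, kappa_c, n, nu):
    """Assumed finite-size rescaling:
    gamma = n^(1/nu) * (kappa - kappa_c) / kappa_c."""
    return n ** (1.0 / nu) * (kappa - kappa_c) / kappa_c

# At the critical point gamma is 0 for every size; away from it,
# larger n magnifies the distance from criticality:
print(rescaled(0.8, 0.8, 100, 2.0),
      rescaled(0.9, 0.8, 100, 2.0),
      rescaled(0.9, 0.8, 400, 2.0))
```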
Breadth first search 3-SAT algorithms for DNA computers
1996
Cited by 16 (5 self)
Abstract:
This paper demonstrates that some practical 3-SAT algorithms on conventional computers can be implemented on a DNA computer as a polynomial-time breadth-first search procedure based only on the fundamental chemical operations identified by Adleman and Lipton's method. In particular, the Monien-Speckenmeyer algorithm, when implemented on DNA, runs with a significant increase in time and a significant decrease in space. This paper also proposes a fast breadth-first search method with fixed split points. The running time is at most twice that of Lipton's method. Although theoretical analysis of the algorithm is yet to be done, simulations on a conventional computer suggest that the algorithm could significantly reduce the search space for 3-SAT in most cases. If the observation is correct, the algorithm would allow DNA computers to handle 3-SAT formulas of more than 120 variables, thereby doubling the limit given by Lipton.
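The chemistry aside, the algorithmic core is a breadth-first sweep over partial assignments in which any branch that already falsifies a clause is discarded, much as strands are filtered out of the test tube. A conventional-computer sketch of that idea (an illustration, not the paper's DNA implementation):

```python
def bfs_3sat(n, clauses):
    """Breadth-first search over partial assignments: level v fixes
    variable v; a branch dies as soon as it falsifies some clause
    whose variables are all assigned.  Clauses are tuples of +v/-v."""
    def falsified(assign):
        return any(all(abs(l) in assign and assign[abs(l)] != (l > 0)
                       for l in clause)
                   for clause in clauses)

    frontier = [{}]
    for v in range(1, n + 1):
        frontier = [{**a, v: val}
                    for a in frontier for val in (False, True)]
        frontier = [a for a in frontier if not falsified(a)]
        if not frontier:
            return None          # every branch falsified: unsatisfiable
    return frontier[0]           # any survivor is a satisfying assignment

print(bfs_3sat(3, [(1, 2, 3), (-1, -2, -3)]))
```

The frontier plays the role of the strand population: it can grow exponentially in the worst case, which is why pruning (and, on DNA, massive parallelism) matters.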