## A Discrete Lagrangian-Based Global-Search Method for Solving Satisfiability Problems (1998)

Venue: Journal of Global Optimization

Citations: 59 (7 self)

### BibTeX

@ARTICLE{Wah98adiscrete,
  author  = {Benjamin W. Wah and Yi Shang},
  title   = {A Discrete Lagrangian-Based Global-Search Method for Solving Satisfiability Problems},
  journal = {Journal of Global Optimization},
  year    = {1998},
  volume  = {12},
  pages   = {61--99}
}


### Abstract

Satisfiability is a class of NP-complete problems that model a wide range of real-world applications. These problems are difficult to solve because they have many local minima in their search space, often trapping greedy search methods that utilize some form of descent. In this paper, we propose a new discrete Lagrange-multiplier-based global-search method for solving satisfiability problems. We derive new approaches for applying Lagrangian methods in discrete space, show that equilibrium is reached when a feasible assignment to the original problem is found, and present heuristic algorithms to look for equilibrium points. Instead of restarting from a new starting point when a search reaches a local trap, the Lagrange multipliers in our method provide a force to lead the search out of a local minimum and move it in the direction provided by the Lagrange multipliers. One of the major advantages of our method is that it has very few algorithmic parameters to be tuned by users, and the se...
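The escape mechanism described in the abstract can be sketched concretely. Assuming the constrained formulation used by the paper (minimize the number of unsatisfied clauses, with one constraint per clause), a minimal illustrative version of a discrete Lagrangian search is below. The function name `dlm_sat` and all parameter choices (unit multiplier increments, full greedy flip scan) are this sketch's assumptions, not the authors' tuned algorithms A1-A3; for simplicity the objective is folded into the multiplier sum by starting every multiplier at 1.

```python
import random

def dlm_sat(clauses, n_vars, max_iters=10000, seed=0):
    """Illustrative discrete Lagrangian search for SAT (not the paper's
    exact algorithm). Clauses use DIMACS-style literals: v means variable
    v-1 is true, -v means it is false."""
    rng = random.Random(seed)
    x = [rng.random() < 0.5 for _ in range(n_vars)]
    lam = [1.0] * len(clauses)          # one Lagrange multiplier per clause

    def unsat(c):
        # A clause is unsatisfied when no literal in it is satisfied.
        return not any(x[abs(l) - 1] == (l > 0) for l in c)

    def lagrangian():
        # L(x, lam) = sum of lam_i over unsatisfied clauses.
        return sum(lam[i] for i, c in enumerate(clauses) if unsat(c))

    for _ in range(max_iters):
        cur = lagrangian()
        if cur == 0:                    # equilibrium: feasible assignment
            return x
        # Greedy descent in x: flip the variable that lowers L the most.
        best_v, best_val = None, cur
        for v in range(n_vars):
            x[v] = not x[v]
            val = lagrangian()
            x[v] = not x[v]
            if val < best_val:
                best_v, best_val = v, val
        if best_v is not None:
            x[best_v] = not x[best_v]
        else:
            # Local minimum: ascend in lam by raising the multipliers of
            # unsatisfied clauses, reshaping L to force the search out of
            # the trap instead of restarting.
            for i, c in enumerate(clauses):
                if unsat(c):
                    lam[i] += 1.0
    return None
```

Because the multiplier updates only ever grow the weights of violated clauses, the search never needs a restart: any assignment that traps the descent eventually becomes more expensive than its neighbors, which mirrors the "force" described in the abstract.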

### Citations

3697 | Artificial Intelligence: A Modern Approach - Russell, Norvig - 1995 |

3529 | Optimization by simulated annealing - Kirkpatrick, Gelatt, et al. - 1983 |

Citation Context: ...hods, relying on ad hoc heuristics to find random solutions quickly. Those that have been applied include multi-start (restart) of descent methods, stochastic methods such as simulated annealing (SA) [31, 2], and genetic algorithms (GA) [27, 34]. They are discussed briefly as follows. A pure descent method using multi-starts descends in the space of the objective function from an initial point, and gener...

1966 | Genetic Algorithms + Data Structures = Evolution Programs - Michalewicz - 1992 |

1075 | A computing procedure for quantification theory - Davis, Putnam - 1960 |

Citation Context: ...ility. Complete methods for solving (1.1) include resolution [Rob65, GN87], backtracking [Pur83] and consistency testing [Gu89, GW92, Guar]. An important resolution method is Davis-Putnam's algorithm [DP60]. These methods enumerate the search space systematically, and may rely on incomplete methods to find feasible solutions. Their disadvantage is that they are computationally expensive. For instance,...

1040 | Linear and Nonlinear Programming - Luenberger - 1984 |

945 | A machine-oriented logic based on the resolution principle - Robinson - 1965 |

680 | A New Method for Solving Hard Satisfiability Problems - Selman, Levesque, et al. - 1992 |

601 | Tabu search - Glover, Laguna - 1993 |

480 | Greedy randomized adaptive search procedures - Feo, Resende - 1995 |

Citation Context: ...s of Grasp on the DIMACS benchmark problems [39]. Grasp is a greedy randomized adaptive search procedure that can find good-quality solutions for a wide variety of combinatorial optimization problems [7, 32, 8, 40]. In [39], four implementations of Grasp were applied to solve five classes of DIMACS SAT problems, "aim," "ii," "jnh," "ssa7552," and "par." Compared to GSAT, Grasp did better on the "aim," "ssa7552...

397 | Minimizing conflicts: A heuristic repair method for constraint satisfaction and scheduling problems - Minton, Johnston, et al. - 1992 |

361 | Noise strategies for improving local search - Selman, Kautz, et al. - 1994 |

Citation Context: ...he execution statistics of A2, including the average execution times and the average number of iterations. We also show the published average execution times of WSAT, GSAT and Davis-Putnam's method [SKC94]. We did not attempt to reproduce the reported results of GSAT and WSAT, since the results may depend on initial conditions, such as the seeds of the random number generator and other program paramete...

313 | A thermodynamical approach to the traveling salesman problem: An efficient simulation algorithm - Černý - 1985 |

283 | Hard and easy distributions of SAT problems - Mitchell, Selman, et al. - 1992 |

Citation Context: ...in the space of the objective function from an initial point, and generates a new starting point when no further improvement can be found locally. Examples include hill-climbing and steepest descent [14, 51, 36, 44, 45, 35, 19, 54]. For large SAT problems, hill-climbing methods are much faster than steepest descent because they descend in the first direction that leads to improvement, whereas steepest-descent methods find the b...

270 | Local search strategies for satisfiability testing, in Cliques, Coloring, and Satisfiability: Second DIMACS Implementation Challenge - Selman, Kautz, et al. - 1993 |

Citation Context: ...rch to a completely different search space. Stochastic methods, such as GA and SA, have more mechanisms to bring a search out of a local minimum, but are more computationally expensive. Selman et al. [SKC93] reported that annealing is not effective for solving SAT problems. To the best of our knowledge, there is no successful application of genetic algorithms to solve SAT problems. In general, stochastic...

221 | Logical foundations of artificial intelligence - Genesereth, Nilsson - 1987 |

218 | Hard and easy distributions of SAT problems - Mitchell, Selman, et al. - 1992 |

215 | Domain-independent extensions to GSAT: Solving large structured satisfiability problems - Selman, Kautz - 1993 |

Citation Context: ...enumerate the search space systematically, and may rely on incomplete methods to find feasible solutions. Their disadvantage is that they are computationally expensive. For instance, Selman et al. [SK93a] and Gu [GG91, Gu93, Gu94] have reported that Davis-Putnam's algorithm cannot handle SAT problems with more than 150 variables, and better algorithms today have difficulty in solving SAT problems with...

194 | The breakout method for escaping from local minima - Morris - 1993 |

Citation Context: ...random restarts to get out of local minima, whereas Gu's local-minimum handler uses stochastic mechanisms to escape from local minima. Although our strategy is similar to Morris' "break-out" strategy [Mor93] and Selman and Kautz's GSAT [SK93a, SKC93] that applies adaptive penalties to escape from local minima, DLM described in this paper provides a theoretical framework for better understanding of these...

171 | The Traveling Salesman Problem and Minimum Spanning Trees - Held, Karp - 1971 |

146 | A probabilistic heuristic for a computationally difficult set covering problem - Feo, Resende |

Citation Context: ...s of Grasp on the DIMACS benchmark problems [39]. Grasp is a greedy randomized adaptive search procedure that can find good-quality solutions for a wide variety of combinatorial optimization problems [7, 32, 8, 40]. In [39], four implementations of Grasp were applied to solve five classes of DIMACS SAT problems, "aim," "ii," "jnh," "ssa7552," and "par." Compared to GSAT, Grasp did better on the "aim," "ssa7552...

136 | Towards an understanding of hill-climbing procedures for SAT - Gent, Walsh - 1993 |

106 | Algorithms for the maximum satisfiability problem - Hansen, Jaumard - 1990 |

Citation Context: ...f A3 on some "g" problems. Recall that A3 was developed to cope with large flat plateaus in the search space that confuse A2, which failed to find any solution within 5 million iterations. Hansen [HJ90] and later Selman [SKC93] addressed this problem by using the tabu search strategy. In a similar way, we have adopted this strategy in A3 by keeping a tabu list to prevent flipping the same variable...

103 | Efficient local search for very large-scale satisfiability problems - Gu - 1992 |

99 | An empirical study of greedy local search for satisfiability testing - Selman, Kautz - 1993 |

94 | GENET: A connectionist architecture for solving constraint satisfaction problems by iterative improvement - Davenport, Tsang, et al. - 1994 |

80 | Search rearrangement backtracking and polynomial average time - Purdom - 1983 |

Citation Context: ...n (1.1). Methods to solve it can be either complete or incomplete, depending on their ability to prove infeasibility. Complete methods for solving (1.1) include resolution [Rob65, GN87], backtracking [Pur83] and consistency testing [Gu89, GW92, Guar]. An important resolution method is Davis-Putnam's algorithm [DP60]. These methods enumerate the search space systematically, and may rely on incomplete me...

51 | Resolution vs. cutting plane solution of inference problems: Some computational experience - Hooker - 1988 |

42 | Computational experience with an interior point algorithm on the satisfiability problem - Kamath, Karmarkar, et al. - 1990 |

38 | A Continuous Approach to Inductive Inference - Kamath, Karmarkar, et al. - 1992 |

Citation Context: ...s are very close to 1 and, therefore, overlap with the curve showing the minimum Lagrange-multiplier values. See Figure 5 for further explanation. • Circuit synthesis problems (ii) by Kamath et al. [30]: a set of SAT encodings of Boolean circuit-synthesis problems; • Circuit diagnosis problems (ssa): a set of SAT formulas based on circuit fault analysis; • Parity learning problems (par)...

36 | Lagrangean relaxation and its uses in integer programming - Geoffrion - 1974 |

34 | Methods of Optimization - Walsh - 1975 |

33 | Local search for Satisfiability (SAT) problems - Gu - 1993 |

32 | A polynomial time algorithm for the n-queens problem - Sosic, Gu - 1990 |

32 | Approximate solution of weighted MAX-SAT problems using GRASP, tech - Resende, Pitsoulis, et al. - 1996 |

31 | Discrete Lagrangian-based search for solving MAX-SAT problems - Wah, Shang - 1997 |

Citation Context: ...ue may be large. For instance, in solving MAX-SAT problems, the objective representing the weighted sum of the number of unsatisfied clauses can be large and can provide better guidance in the search [49]. This part of the search is similar to what is done in many local search methods, such as Gu's local-search methods [18, 19, 20] and GSAT [48, 44, 45, 47, 42, 46], which descend into local minima in...

30 | A GRASP for satisfiability - Resende, Feo - 1996 |

Citation Context: ...). Pure descent methods are not suitable when there are constraints in the search space as formulated in (2). Recently, some local search methods were proposed and applied to solve large SAT problems [37, 11, 5, 39]. The most notable ones are those developed independently by Gu and Selman. Gu developed a group of local search methods for solving SAT and CSP problems. In his Ph.D. thesis [14], he first formulated...

27 | Efficient local search with conflict minimization: A case study of the N-queen problem - Sosic, Gu - 1994 |

25 | Parallel Algorithms and Architectures for Very Fast AI Search - Gu - 1989 |

Citation Context: ...lems [Mor93, GW93, DTWZ94]. The most notable ones are those developed independently by Gu and Selman. Gu developed a group of local search methods for solving SAT and CSP problems. In his Ph.D. thesis [Gu89], he first formulated conflicts in the objective function and proposed a discrete relaxation algorithm (a class of deterministic local search) to minimize the number of conflicts in these problems. Th...

23 | Avoiding local optima in the p-hub location problem using tabu search and GRASP - Klincewicz - 1992 |

Citation Context: ...s of Grasp on the DIMACS benchmark problems [39]. Grasp is a greedy randomized adaptive search procedure that can find good-quality solutions for a wide variety of combinatorial optimization problems [7, 32, 8, 40]. In [39], four implementations of Grasp were applied to solve five classes of DIMACS SAT problems, "aim," "ii," "jnh," "ssa7552," and "par." Compared to GSAT, Grasp did better on the "aim," "ssa7552...

20 | Global Optimization for Satisfiability (SAT) Problem - Gu - 1994 |

Citation Context: ...vercome the inefficiency of continuous unconstrained optimization methods, Gu developed discrete bit-parallel optimization algorithms (SAT 14.5 and SAT 14.6) to evaluate the continuous objective function [Gu94] and found significant performance improvements. (b) Continuous Constrained Formulation. This generally involves a heuristic objective function that indicates the quality...

19 | Applying GSAT to non-clausal formulas - Sebastiani - 1994 |

18 | 3,000,000 queens in less than one minute - Sosic, Gu - 1991 |

17 | Optimization by simulated annealing, Science 220 - Kirkpatrick, Gelatt, et al. - 1983 |

15 | Linear Programming for Operations Research - Simmons - 1972 |

15 | Discrete Lagrangian methods for optimizing the design of multiplierless QMF banks - Wah, Shang, et al. - 1999 |

Citation Context: ...ave also applied DLM to design multiplierless QMF filter banks, which involves solving highly nonlinear discrete constrained optimization problems whose objectives and constraints are real functions [55]. To summarize, DLM is a generalization of local search schemes that optimize the objective alone and clause-weight schemes that optimize the constraints alone. When the search reaches a local minimum...

14 | Lagrange programming neural networks - Zhang, Constantinides - 1992 |

13 | Lagrangian techniques for solving a class of zero-one integer linear programs - Chang, Wah - 1995 |

Citation Context: ...a Lagrangian transformation does not reduce the number of local minima, and continuous Lagrangian methods are an order of magnitude more expensive to apply than the corresponding discrete algorithms [CW95]. 3. Discrete Lagrangian Methods for Solving SAT Problems. As discussed in the last section, we formulate SAT problems as constrained optimization problems (1.2) and solve them using Lagrangian methods...

13 | An algorithm for optimal route selection in SNA networks - Gavish, Huntler - 1983 |

12 | Gradient method for concave programming I: Local results - Arrow, Hurwicz - 1958 |

12 | The UniSAT problem models (appendix) - Gu - 1992 |