Results 1–10 of 11
Algorithms for the Satisfiability (SAT) Problem: A Survey
DIMACS Series in Discrete Mathematics and Theoretical Computer Science, 1996
Abstract

Cited by 145 (3 self)
The satisfiability (SAT) problem is a core problem in mathematical logic and computing theory. In practice, SAT is fundamental in solving many problems in automated reasoning, computer-aided design, computer-aided manufacturing, machine vision, databases, robotics, integrated circuit design, computer architecture design, and computer network design. Traditional methods treat SAT as a discrete, constrained decision problem. In recent years, many optimization methods, parallel algorithms, and practical techniques have been developed for solving SAT. In this survey, we present a general framework (an algorithm space) that integrates existing SAT algorithms into a unified perspective. We describe sequential and parallel SAT algorithms including variable splitting, resolution, local search, global optimization, mathematical programming, and practical SAT algorithms. We give performance evaluation of some existing SAT algorithms. Finally, we provide a set of practical applications of the sat...
A Discrete Lagrangian-Based Global-Search Method for Solving Satisfiability Problems
 Journal of Global Optimization
, 1998
Abstract

Cited by 66 (7 self)
Satisfiability is a class of NP-complete problems that model a wide range of real-world applications. These problems are difficult to solve because they have many local minima in their search space, often trapping greedy search methods that utilize some form of descent. In this paper, we propose a new discrete Lagrange-multiplier-based global-search method for solving satisfiability problems. We derive new approaches for applying Lagrangian methods in discrete space, show that equilibrium is reached when a feasible assignment to the original problem is found, and present heuristic algorithms to look for equilibrium points. Instead of restarting from a new starting point when a search reaches a local trap, the Lagrange multipliers in our method provide a force to lead the search out of a local minimum and move it in the direction provided by the Lagrange multipliers. One of the major advantages of our method is that it has very few algorithmic parameters to be tuned by users, and the se...
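The clause-weighting idea described in this abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' DLM implementation; all names (`dlm_sat`, `lam`, the descent and update rules) are our own assumptions:

```python
import random

def dlm_sat(clauses, n_vars, max_steps=100000):
    """Minimal discrete Lagrangian search sketch for CNF-SAT.

    clauses: list of clauses, each a list of DIMACS-style signed ints
    (+v is variable v, -v is its negation)."""
    x = [random.choice([False, True]) for _ in range(n_vars + 1)]  # x[0] unused
    lam = [1.0] * len(clauses)  # one Lagrange multiplier per clause

    def unsat(i):  # 1 if clause i is violated under x, else 0
        return 0 if any((lit > 0) == x[abs(lit)] for lit in clauses[i]) else 1

    def lagrangian():
        return sum(lam[i] * unsat(i) for i in range(len(clauses)))

    for _ in range(max_steps):
        cur = lagrangian()
        if cur == 0:
            return x  # every clause satisfied: feasible assignment found
        # greedy descent: flip the variable giving the largest decrease
        best_v, best_val = None, cur
        for v in range(1, n_vars + 1):
            x[v] = not x[v]
            val = lagrangian()
            x[v] = not x[v]
            if val < best_val:
                best_v, best_val = v, val
        if best_v is not None:
            x[best_v] = not x[best_v]
        else:
            # local minimum: raise multipliers of violated clauses to
            # reshape the surface instead of restarting from scratch
            for i in range(len(clauses)):
                lam[i] += unsat(i)
    return None
```

At a local trap the multipliers of the still-violated clauses grow, which is the "force" the abstract refers to: the search is pushed out of the minimum rather than restarted.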
Global Optimization for Satisfiability (SAT) Problem
1994
Abstract

Cited by 22 (3 self)
The satisfiability (SAT) problem is a fundamental problem in mathematical logic, inference, automated reasoning, VLSI engineering, and computing theory. In this paper, following CNF and DNF local search methods, we introduce the Universal SAT problem model, UniSAT, that transforms the discrete SAT problem on Boolean space {0,1}^m into an unconstrained global optimization problem on real space E^m. A direct correspondence between the solution of the SAT problem and the global minimum point of the UniSAT objective function is established. Many existing global optimization algorithms can be used to solve the UniSAT problems. Combined with backtracking/resolution procedures, a global optimization algorithm is able to verify satisfiability as well as unsatisfiability. This approach achieves significant performance improvements for certain classes of conjunctive normal form (CNF) formulas. It offers a complementary approach to the existing SAT algorithms.
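The discrete-to-continuous transformation can be sketched as follows. This assumes one common UniSAT-style encoding (True ↔ +1, False ↔ −1) and is only an illustration of the idea, not the exact objective used in the paper:

```python
def unisat_objective(clauses, y):
    """UniSAT-style continuous objective over real-valued y (y[0] unused).

    A positive literal v contributes (y[v] - 1)^2, a negative literal
    (y[v] + 1)^2; a clause is the product of its literal terms, and the
    objective sums the clause terms. For a +/-1 vector y, the objective
    is 0 exactly when the corresponding Boolean assignment satisfies
    every clause, so SAT becomes unconstrained global minimization."""
    total = 0.0
    for clause in clauses:
        term = 1.0
        for lit in clause:
            v = abs(lit)
            term *= (y[v] - 1.0) ** 2 if lit > 0 else (y[v] + 1.0) ** 2
        total += term
    return total
```

For example, y = (1, -1, 1) (that is, x1 = True, x2 = False, x3 = True) drives the objective of the formula (x1 ∨ x2) ∧ (¬x1 ∨ x3) ∧ (¬x2 ∨ ¬x3) to its global minimum of zero.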
Trace-Based Methods for Solving Nonlinear Global Optimization and Satisfiability Problems
Journal of Global Optimization, 1996
Abstract

Cited by 19 (5 self)
In this paper we present a method called NOVEL (Nonlinear Optimization via External Lead) for solving continuous and discrete global optimization problems. NOVEL addresses the balance between global search and local search, using a trace to aid in identifying promising regions before committing to local searches. We discuss NOVEL for solving continuous constrained optimization problems and show how it can be extended to solve constrained satisfaction and discrete satisfiability problems. We first transform the problem using Lagrange multipliers into an unconstrained version. Since a stable solution in a Lagrangian formulation only guarantees a local optimum satisfying the constraints, we propose a global search phase in which an aperiodic and bounded trace function is added to the search to first identify promising regions for local search. The trace generates an information-bearing trajectory from which good starting points are identified for further local searches. Taking only a sm...
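The two-phase idea, an aperiodic bounded trace that identifies promising regions followed by local descent from the best points found, can be sketched in one dimension. The trace shape and every parameter below are illustrative assumptions, not the NOVEL trajectory:

```python
import math

def trace_search(f, df, span=(-10.0, 10.0), n_trace=2000, n_starts=5,
                 lr=0.01, n_descent=500):
    """Trace-led global search sketch for a 1-D objective f with gradient df."""
    a, b = span
    samples = []
    for k in range(n_trace):
        t = k / n_trace
        # aperiodic bounded trace: two incommensurate frequencies keep the
        # trajectory from repeating while it stays inside [a, b]
        u = (math.sin(29.0 * t) + math.sin(29.0 * math.sqrt(2.0) * t)) / 4.0 + 0.5
        x = a + (b - a) * u
        samples.append((f(x), x))
    samples.sort()  # phase 1: keep the lowest points seen along the trace
    best = None
    for _, x in samples[:n_starts]:
        for _ in range(n_descent):  # phase 2: plain gradient descent refinement
            x -= lr * df(x)
        if best is None or f(x) < f(best):
            best = x
    return best
```

The point of the sketch is the division of labor: the trace supplies informed starting points, so the local searches are not restarted blindly from random positions.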
Global Search Methods For Solving Nonlinear Optimization Problems
1997
Abstract

Cited by 18 (1 self)
... these new methods, we develop a prototype, called Novel (Nonlinear Optimization Via External Lead), that solves nonlinear constrained and unconstrained problems in a unified framework. We show experimental results in applying Novel to solve nonlinear optimization problems, including (a) the learning of feedforward neural networks, (b) the design of quadrature-mirror-filter digital filter banks, (c) the satisfiability problem, (d) the maximum satisfiability problem, and (e) the design of multiplierless quadrature-mirror-filter digital filter banks. Our method achieves better solutions than existing methods, or achieves solutions of the same quality but at a lower cost.
The Theory And Applications Of Discrete Constrained Optimization Using Lagrange Multipliers
2000
Abstract

Cited by 4 (0 self)
In this thesis, we present a new theory of discrete constrained optimization using Lagrange multipliers and an associated first-order search procedure (DLM) to solve general constrained optimization problems in discrete, continuous and mixed-integer space. The constrained problems are general in the sense that they do not assume the differentiability or convexity of functions. Our proposed theory and methods are targeted at discrete problems and can be extended to continuous and mixed-integer problems by coding continuous variables using a floating-point representation (discretization). We have characterized the errors incurred due to such discretization and have proved that there exist upper bounds on the errors. Hence, continuous and mixed-integer constrained problems, as well as discrete ones, can be handled by DLM in a unified way with bounded errors.
Convergence Properties of Optimization Algorithms for the Satisfiability (SAT) Problem
IEEE Transactions on Computers, 1996
Abstract

Cited by 2 (1 self)
The satisfiability (SAT) problem is a basic problem in computing theory. Presently, an active area of research on the SAT problem is to design efficient optimization algorithms for finding a solution for a satisfiable CNF formula. A new formulation, the Universal SAT problem model, which transforms the SAT problem on Boolean space into an optimization problem on real space has been developed [31, 35, 34, 32]. Many optimization techniques, such as the steepest descent method, Newton's method, and the coordinate descent method, can be used to solve the Universal SAT problem. In this paper, we prove that, when the initial solution is sufficiently close to the optimal solution, the steepest descent method has a linear convergence ratio β < 1, Newton's method has a convergence ratio of order two, and the convergence ratio of the coordinate descent method is approximately (1 - β/m) for the Universal SAT problem with m variables. An algorithm based on the coordinate descent method for the...
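The contrast between a linear convergence ratio and order-two convergence can be seen on a small smooth example. This uses a generic one-dimensional objective, not a UniSAT instance, and the function names are our own:

```python
import math

def steepest_descent(df, x, lr, n):
    # fixed-step gradient descent: the error shrinks by a roughly
    # constant factor per step (linear convergence)
    for _ in range(n):
        x -= lr * df(x)
    return x

def newton(df, d2f, x, n):
    # Newton's method: near the optimum the number of correct digits
    # roughly doubles per step (convergence of order two)
    for _ in range(n):
        x -= df(x) / d2f(x)
    return x

# f(x) = exp(x) - 2x has its unique minimizer at x* = ln 2
df = lambda x: math.exp(x) - 2.0
d2f = lambda x: math.exp(x)
```

Starting both methods from x = 0, five Newton steps already reach near machine accuracy, while five fixed-step descent steps still carry a visible error; this is exactly the linear-versus-order-two distinction the abstract quantifies.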
Parallel Heuristic Search in Haskell
Abstract

Cited by 1 (0 self)
Parallel heuristic search algorithms are widely used in artificial intelligence. This paper describes novel parallel variants of two standard sequential search algorithms: the standard Davis-Putnam algorithm (DP), and the same algorithm extended with conflict-directed backjumping (CBJ). Encouraging preliminary results for the GpH parallel dialect of the non-strict functional programming language Haskell suggest that modest real speedup can be achieved for the most interesting hard search cases.
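For reference, the sequential baseline being parallelized, Davis-Putnam-style search with unit propagation and variable splitting, can be sketched as follows (a minimal Python illustration; it includes neither the conflict-directed backjumping nor the GpH parallelism discussed in the paper):

```python
def dp_solve(clauses, assignment=None):
    """DPLL-style sketch: unit propagation plus variable splitting.

    Clauses use DIMACS-style signed ints; returns a satisfying
    assignment as a dict {var: bool}, or None if unsatisfiable."""
    if assignment is None:
        assignment = {}
    clauses = [list(c) for c in clauses]
    changed = True
    while changed:  # unit propagation: repeatedly assign forced literals
        changed = False
        units = [c[0] for c in clauses if len(c) == 1]
        for lit in units:
            assignment[abs(lit)] = lit > 0
            new = []
            for c in clauses:
                if lit in c:
                    continue          # clause satisfied: drop it
                if -lit in c:
                    c = [l for l in c if l != -lit]
                    if not c:
                        return None   # empty clause: conflict
                new.append(c)
            clauses = new
            changed = True
            break                     # clause list changed; recompute units
    if not clauses:
        return assignment             # all clauses satisfied
    # splitting: branch on the first variable of the first open clause
    v = abs(clauses[0][0])
    for lit in (v, -v):
        result = dp_solve(clauses + [[lit]], dict(assignment))
        if result is not None:
            return result
    return None
```

The two branches of the split are independent, which is what makes this search a natural fit for the semi-explicit parallelism of GpH: each branch can be sparked as a separate parallel task.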
A General Framework for Relaxation Processes
2001
Abstract
This paper addresses a major problem in pattern recognition: the unambiguous identification of objects. Comparing the characteristics of objects with the characteristics of possible interpretations, the identification of objects is frequently ambiguous. To reduce ambiguous assignments one can include a) more characteristics or b) contextual information in the identification. We use contextual information. Various approaches are suggested in the literature to describe and generalize discrete or continuous relaxation processes with respect to several objectives, and one can find a suitable approach for every problem. The wider problem addressed here is to find the best-suited relaxation process for a given assignment problem or, better still, to construct a task-dependent relaxation process. In this paper we describe an approach to generalizing relaxation processes. For this purpose, we develop a general framework for the theoretical foundations of relaxation processes in pattern recognition. The resulting structure enables 1) a description of all known relaxation processes in general terms and 2) the design of task-dependent relaxation processes. We show that the well-known standard relaxation formulas verify our approach.
Global Search Methods for Solving Nonlinear Optimization Problems
Abstract
In this thesis, we present new methods for solving nonlinear optimization problems. These problems are difficult to solve because the nonlinear constraints form feasible regions that are difficult to find, and the nonlinear objectives contain local minima that trap descent-type search methods. In order to find good solutions in nonlinear optimization, we focus on the following two key issues: how to handle nonlinear constraints and how to escape from local minima. We use a Lagrange-multiplier-based formulation to handle nonlinear constraints, and develop Lagrangian methods with dynamic control to provide faster and more robust convergence. We extend the traditional Lagrangian theory for the continuous space to the discrete space and develop efficient discrete Lagrangian methods. To overcome local minima, we design a new trace-based global-search method that relies on an external traveling trace to pull a search trajectory out of a local optimum in a continuous fashion without having to restart the search from a new starting point. Good starting points identified in the global search are used in the local search to identify true local optima. By combining these new methods, we develop a prototype, called Novel (Nonlinear Optimization Via External Lead),