Results 11 - 20 of 184
Model Counting
, 2008
"... Propositional model counting or #SAT is the problem of computing the number of models for a given propositional formula, i.e., the number of distinct truth assignments to variables for which the formula evaluates to true. For a propositional formula F, we will use #F to denote the model count of F. ..."
Abstract

Cited by 23 (0 self)
 Add to MetaCart
Propositional model counting or #SAT is the problem of computing the number of models for a given propositional formula, i.e., the number of distinct truth assignments to variables for which the formula evaluates to true. For a propositional formula F, we will use #F to denote the model count of F. This problem is also referred to as the solution counting problem for SAT. It generalizes SAT and is the canonical #P-complete problem. There has been significant theoretical work trying to characterize the worst-case complexity of counting problems, with some surprising results such as model counting being hard even for some polynomial-time solvable problems like 2-SAT. The model counting problem presents fascinating challenges for practitioners and poses several new research questions. Efficient algorithms for this problem will have a significant impact on many application areas that are inherently beyond SAT ('beyond' under standard complexity-theoretic assumptions), such as bounded-length adversarial and contingency planning, and probabilistic reasoning. For example, various probabilistic inference problems, such as Bayesian net reasoning, can be effectively translated into model counting problems [cf.
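To make the definition above concrete, here is a minimal brute-force model counter in Python (our illustration, not code from the paper; the DIMACS-style integer-literal clause representation is an assumed convention). It enumerates all 2^n assignments, i.e., exactly the naive baseline that practical model counters improve upon.

    from itertools import product

    def model_count(clauses, num_vars):
        """Compute #F for a CNF formula F by brute-force enumeration.

        Each clause is a list of nonzero ints: literal v means variable v
        is true, -v means it is false (DIMACS-style, assumed here).
        Feasible only for small num_vars, since all 2**num_vars
        assignments are tried.
        """
        count = 0
        for bits in product([False, True], repeat=num_vars):
            if all(any(bits[abs(l) - 1] == (l > 0) for l in clause)
                   for clause in clauses):
                count += 1
        return count

    # F = (x1 or x2) and (not x1 or x2) has models (F,T) and (T,T):
    print(model_count([[1, 2], [-1, 2]], 2))  # -> 2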
QBF modeling: Exploiting player symmetry for simplicity and efficiency
 In Proceedings of Theory and Applications of Satisfiability Testing (SAT 2006)
, 2006
"... Abstract. Quantified Boolean Formulas (QBFs) present the next big challenge for automated propositional reasoning. Not surprisingly, most of the present day QBF solvers are extensions of successful propositional satisfiability algorithms (SAT solvers). They directly integrate the lessons learned fr ..."
Abstract

Cited by 23 (1 self)
 Add to MetaCart
Quantified Boolean Formulas (QBFs) present the next big challenge for automated propositional reasoning. Not surprisingly, most present-day QBF solvers are extensions of successful propositional satisfiability algorithms (SAT solvers). They directly integrate the lessons learned from SAT research, thus avoiding reinventing the wheel. In particular, they use the standard conjunctive normal form (CNF), augmented with layers of variable quantification, for modeling tasks as QBF. We argue that while CNF is well suited to "existential reasoning," as demonstrated by the success of modern SAT solvers, it is far from ideal for the "universal reasoning" needed by QBF. The CNF restriction imposes an inherent asymmetry in QBF and artificially creates issues that have led to complex solutions which, in retrospect, were unnecessary and suboptimal. We take a step back and propose a new approach to QBF modeling based on a game-theoretic view of problems and on a dual CNF-DNF (disjunctive normal form) representation that treats the existential and universal parts of a problem symmetrically. It has several advantages: (1) it is generic, compact, and simpler; (2) unlike fully non-clausal encodings, it preserves the benefits of pure CNF and leverages the support for DNF already present in many QBF solvers; (3) it does not use the so-called indicator variables for conversion into CNF, thus circumventing the associated illegal search space issue; and (4) our QBF solver based on the dual encoding (Duaffle) consistently outperforms the best solvers by two orders of magnitude on a hard class of benchmarks, even without using standard learning techniques.
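As a concrete illustration of the indicator-variable issue in point (3) (our example, not one from the paper): to push a DNF term such as (l1 ∧ l2) into a CNF matrix, one standardly introduces a fresh variable t_i per term, with clauses (¬t_i ∨ l1) and (¬t_i ∨ l2) plus a clause (t1 ∨ t2 ∨ ...) asserting that some term holds. A QBF solver may then branch on the indicators t_i in ways that correspond to no actual move of either player, exploring an "illegal" region of the search space; keeping the universal part in DNF directly, as the dual encoding does, makes these auxiliary variables unnecessary.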
Tradeoffs in the complexity of backdoor detection
 In Principles and Practice of Constraint Programming (CP 2007)
, 2007
"... Abstract. There has been considerable interest in the identification of structural properties of combinatorial problems that lead to efficient algorithms for solving them. Some of these properties are “easily ” identifiable, while others are of interest because they capture key aspects of stateoft ..."
Abstract

Cited by 23 (5 self)
 Add to MetaCart
There has been considerable interest in the identification of structural properties of combinatorial problems that lead to efficient algorithms for solving them. Some of these properties are "easily" identifiable, while others are of interest because they capture key aspects of state-of-the-art constraint solvers. In particular, it was recently shown that the problem of identifying a strong Horn or 2-CNF backdoor can be solved by exploiting the equivalence with deletion backdoors, and is NP-complete. We prove that strong backdoor identification becomes harder than NP (unless NP = coNP) as soon as the inconsequential-sounding feature of empty clause detection (present in all modern SAT solvers) is added. More interestingly, in practice such a feature, as well as polynomial-time constraint propagation mechanisms, often leads to much smaller backdoor sets. In fact, despite the worst-case complexity results for strong backdoor detection, we show that SatzRand is remarkably good at finding small strong backdoors on a range of experimental domains. Our results suggest that structural notions explored for designing efficient algorithms for combinatorial problems should capture both statically and dynamically identifiable properties.
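For readers unfamiliar with the terminology, the following sketch (ours, not the paper's) pins down the deletion-backdoor notion used above for the 2-CNF case: a variable set B is a deletion backdoor if removing every literal over B from every clause leaves a 2-CNF formula. The brute-force search is exponential and serves only to illustrate the definition, not the algorithms analyzed in the paper.

    from itertools import combinations

    def is_deletion_backdoor_2cnf(clauses, B):
        """True iff deleting all literals over variable set B from every
        clause leaves each clause with at most 2 literals (i.e., 2-CNF)."""
        return all(len([l for l in clause if abs(l) not in B]) <= 2
                   for clause in clauses)

    def smallest_2cnf_deletion_backdoor(clauses, num_vars):
        """Brute-force smallest deletion backdoor into 2-CNF
        (DIMACS-style integer literals assumed, as an illustration)."""
        for k in range(num_vars + 1):
            for B in combinations(range(1, num_vars + 1), k):
                if is_deletion_backdoor_2cnf(clauses, set(B)):
                    return set(B)

    print(smallest_2cnf_deletion_backdoor([[1, 2, 3], [-1, 2, 4]], 4))  # -> {1}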
Maximizing the Spread of Cascades Using Network Design
"... We introduce a new optimization framework to maximize the expected spread of cascades in networks. Our model allows a rich set of actions that directly manipulate cascade dynamics by adding nodes or edges to the network. Our motivating application is one in spatial conservation planning, where a cas ..."
Abstract

Cited by 23 (10 self)
 Add to MetaCart
We introduce a new optimization framework to maximize the expected spread of cascades in networks. Our model allows a rich set of actions that directly manipulate cascade dynamics by adding nodes or edges to the network. Our motivating application is one in spatial conservation planning, where a cascade models the dispersal of wild animals through a fragmented landscape. We propose a mixed integer programming (MIP) formulation that combines elements from network design and stochastic optimization. Our approach results in solutions with stochastic optimality guarantees and points to conservation strategies that are fundamentally different from naive approaches.
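To convey the flavor of such a formulation, here is one generic sample-average sketch of a cascade-spread MIP (our reconstruction under a standard live-edge cascade model with sampled scenarios S; the paper's actual formulation may differ in its action set and constraints):

    maximize    (1/|S|) Σ_{s∈S} Σ_{v∈V} x_{v,s}
    subject to  Σ_e c_e y_e ≤ B                                  (design budget)
                x_{v,s} ≤ seed_v + Σ_{(u,v) live in s} w_{uv,s}   for all v, s
                w_{uv,s} ≤ x_{u,s},   w_{uv,s} ≤ y_{uv}           for all (u,v), s
                y_e, x_{v,s}, w_{uv,s} ∈ {0,1}

Here y_e decides whether design element e (an added node or edge) is purchased, x_{v,s} indicates that node v is reached by the cascade in scenario s, and w_{uv,s} linearizes "edge (u,v) is purchased and u is reached". Pre-existing edges have y_e fixed to 1, and acyclic scenario realizations are assumed (cycles would require extra ordering constraints). Averaging over the sampled scenarios is what yields stochastic optimality guarantees of the kind mentioned above.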
Leveraging belief propagation, backtrack search, and statistics for model counting
"... Abstract. We consider the problem of estimating the model count (number of solutions) of Boolean formulas, and present two techniques that compute estimates of these counts, as well as either lower or upper bounds with different tradeoffs between efficiency, bound quality, and correctness guarantee ..."
Abstract

Cited by 20 (6 self)
 Add to MetaCart
We consider the problem of estimating the model count (number of solutions) of Boolean formulas, and present two techniques that compute estimates of these counts, as well as either lower or upper bounds, with different trade-offs between efficiency, bound quality, and correctness guarantees. For lower bounds, we use a recent framework for probabilistic correctness guarantees and exploit message-passing techniques for marginal probability estimation, namely variations of Belief Propagation (BP). Our results suggest that BP provides useful information even on structured loopy formulas. For upper bounds, we perform multiple runs of the MiniSat SAT solver with a minor modification, and obtain statistical bounds on the model count based on the observation that the distribution of a certain quantity of interest is often very close to the normal distribution. Our experiments demonstrate that our model counters based on these two ideas, BPCount and MiniCount, can provide very good bounds in time significantly less than alternative approaches.
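The statistical upper-bound idea can be sketched as follows (our reconstruction of the general recipe, not the paper's exact procedure): if the logs of the per-run count estimates are approximately normally distributed, a one-sided confidence upper bound follows from standard Student-t machinery.

    import math
    from statistics import mean, stdev

    def statistical_upper_bound(estimates, t_crit):
        """One-sided statistical upper bound on a model count from
        per-run estimates, assuming (as the abstract observes often
        holds) that their base-2 logs are close to normally distributed.
        t_crit is the one-sided Student-t critical value for the desired
        confidence level with len(estimates) - 1 degrees of freedom.
        """
        logs = [math.log2(c) for c in estimates]
        n = len(logs)
        m, s = mean(logs), stdev(logs)
        return 2 ** (m + t_crit * s / math.sqrt(n))

    # e.g., 10 hypothetical run estimates; 2.821 is the one-sided 99%
    # t critical value for 9 degrees of freedom:
    runs = [3.1e6, 5.2e6, 1.8e6, 4.4e6, 2.9e6,
            6.0e6, 3.7e6, 2.2e6, 4.9e6, 3.3e6]
    print(statistical_upper_bound(runs, t_crit=2.821))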
Learning and Inference in Weighted Logic with Application to Natural Language Processing
, 2008
"... ..."
New wine into old wineskins: A survey of some pebbling classics with supplemental results
, 2010
"... Pebble games were extensively studied in the 1970s and 1980s in a number of different contexts. The last decade has seen renewed interest in pebbling in the field of proof complexity. This is a survey of some classical theorems in pebbling, as well as a couple of new ones, with a focus on results th ..."
Abstract

Cited by 15 (5 self)
 Add to MetaCart
Pebble games were extensively studied in the 1970s and 1980s in a number of different contexts. The last decade has seen renewed interest in pebbling in the field of proof complexity. This is a survey of some classical theorems in pebbling, as well as a couple of new ones, with a focus on results that have proven relevant in proof complexity applications.
Integrating Systematic and Local Search Paradigms: A New Strategy for MaxSAT
"... Systematic search and local search paradigms for combinatorial problems are generally believed to have complementary strengths. Nevertheless, attempts to combine the power of the two paradigms have had limited success, due in part to the expensive information communication overhead involved. We prop ..."
Abstract

Cited by 15 (3 self)
 Add to MetaCart
Systematic search and local search paradigms for combinatorial problems are generally believed to have complementary strengths. Nevertheless, attempts to combine the power of the two paradigms have had limited success, due in part to the expensive information communication overhead involved. We propose a hybrid strategy based on shared memory, ideally suited for multi-core processor architectures. This method enables continuous information exchange between two solvers without slowing down either of the two. Such a hybrid search strategy is surprisingly effective, leading to substantially better-quality solutions for many challenging Maximum Satisfiability (MaxSAT) instances than what the current best exact or heuristic methods yield, and it often achieves this within seconds. This hybrid approach is naturally best suited to MaxSAT instances for which proving unsatisfiability is already hard; otherwise the method falls back to pure local search.
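A minimal skeleton of the shared-memory exchange pattern described above (our sketch; the toy flip loop and exhaustive loop merely stand in for real local-search and branch-and-bound MaxSAT engines):

    import random
    import threading
    import time

    CLAUSES = [[1, 2], [-1, 3], [-2, -3], [1, -3], [2, 3]]
    NUM_VARS = 3

    def cost(a):
        """Number of falsified clauses (the MaxSAT objective, minimized)."""
        return sum(not any(a[abs(l)] == (l > 0) for l in c) for c in CLAUSES)

    class SharedBest:
        """Lock-guarded cell standing in for the shared-memory channel:
        either solver can offer a solution at any time without blocking
        the other for more than a moment."""
        def __init__(self):
            self._lock = threading.Lock()
            self.cost, self.assignment = float("inf"), None
        def offer(self, c, a):
            with self._lock:
                if c < self.cost:
                    self.cost, self.assignment = c, dict(a)

    def local_search(shared, stop):
        a = {v: random.random() < 0.5 for v in range(1, NUM_VARS + 1)}
        while not stop.is_set():
            a[random.randint(1, NUM_VARS)] ^= True   # random flip
            shared.offer(cost(a), a)

    def systematic_search(shared, stop):
        for bits in range(2 ** NUM_VARS):            # toy exhaustive pass
            if stop.is_set():
                return
            a = {v: bool(bits >> (v - 1) & 1) for v in range(1, NUM_VARS + 1)}
            if cost(a) < shared.cost:                # prune against shared bound
                shared.offer(cost(a), a)

    shared, stop = SharedBest(), threading.Event()
    ts = [threading.Thread(target=f, args=(shared, stop))
          for f in (local_search, systematic_search)]
    for t in ts: t.start()
    time.sleep(0.1); stop.set()
    for t in ts: t.join()
    print(shared.cost, shared.assignment)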
Towards an Optimal Separation of Space and Length in Resolution
 Electronic Colloquium on Computational Complexity
, 2008
"... Most stateoftheart satisfiability algorithms today are variants of the DPLL procedure augmented with clause learning. The main bottleneck for such algorithms, other than the obvious one of time, is the amount of memory used. In the field of proof complexity, the resources of time and memory corre ..."
Abstract

Cited by 14 (9 self)
 Add to MetaCart
Most state-of-the-art satisfiability algorithms today are variants of the DPLL procedure augmented with clause learning. The main bottleneck for such algorithms, other than the obvious one of time, is the amount of memory used. In the field of proof complexity, the resources of time and memory correspond to the length and space of resolution proofs. There has been a long line of research trying to understand these proof complexity measures, as well as to relate them to the width of proofs, i.e., the size of the largest clause in the proof, which has been shown to be intimately connected with both length and space. While strong results have been proven for length and width, our understanding of space is still quite poor. For instance, it has remained open whether the fact that a formula is provable in short length implies that it is also provable in small space (which is the case for length versus width), or whether on the contrary these measures are completely unrelated in the sense that short proofs can be arbitrarily complex with respect to space. In this paper, we present some evidence that the true answer should be that the latter case holds, and provide a possible roadmap for how such an optimal separation result could be obtained. We do this by proving a tight bound of Θ(√n) on the space needed for so-called pebbling contradictions over pyramid graphs of size n. This yields the first polynomial lower bound on space that is not a consequence of a corresponding lower bound on width, as well as an improvement of the weak separation of space and width in (Nordström 2006) from logarithmic to polynomial. Also, continuing the line of research initiated by (Ben-Sasson 2002) into trade-offs between different proof complexity measures, we present a simplified proof of the recent length-space trade-off result in (Hertel and Pitassi 2007), and show how our ideas can be used to prove a couple of other exponential trade-offs in resolution.
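For reference, the measures discussed above are standard (these definitions are not specific to this paper): for a resolution refutation π of a CNF formula F, the length L(π) is the number of clauses occurring in π, the width W(π) is the size of a largest clause in π, and the clause space Sp(π) is the maximum number of clauses that must be kept in memory simultaneously while deriving the empty clause. The main result can then be read as: pebbling contradictions over pyramid graphs of size n need clause space Θ(√n), even though such formulas are known to be refutable in constant width, which is why this space lower bound cannot follow from a width lower bound.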