Results 1-10 of 31
An Algorithm to Evaluate Quantified Boolean Formulae and its Experimental Evaluation
Journal of Automated Reasoning, 1999
Cited by 141 (2 self)

Abstract:
The high computational complexity of advanced reasoning tasks such as reasoning about knowledge and planning calls for efficient and reliable algorithms for reasoning problems harder than NP. In this paper we propose Evaluate, an algorithm for evaluating Quantified Boolean Formulae (QBFs), a language that extends propositional logic so that many advanced forms of propositional reasoning, e.g., circumscription, can be easily formulated as the evaluation of a QBF. Algorithms for evaluating QBFs are suitable for experimental analysis across a wide range of complexity classes, a property not easily found in other formalisms. Evaluate is based on a generalization of the Davis-Putnam procedure for SAT, and is guaranteed to work in polynomial space. Before presenting the algorithm, we discuss several abstract properties of QBFs that we singled out to make it more efficient. We also discuss various options that were investigated concerning heuristics and data structures, and report the main res...
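The core idea the abstract describes can be made concrete: recurse on the quantifier prefix, treating an existential variable as a disjunction over its two truth values and a universal variable as a conjunction. Below is a minimal sketch of that recursion, a generic Davis-Putnam-style evaluator rather than the paper's Evaluate algorithm; the function name and input encoding are invented for illustration.

```python
def evaluate_qbf(prefix, clauses, assignment=None):
    """Recursively evaluate a closed prenex QBF.

    prefix:  quantifier blocks as [('e', var), ('a', var), ...]
    clauses: CNF matrix, each clause a list of signed-int literals
             (negative = negated variable).
    Runs in polynomial space, exponential time in the worst case.
    """
    if assignment is None:
        assignment = {}
    # Simplify the matrix under the current partial assignment.
    simplified = []
    for clause in clauses:
        lits, satisfied = [], False
        for lit in clause:
            var = abs(lit)
            if var in assignment:
                if (lit > 0) == assignment[var]:
                    satisfied = True
                    break
            else:
                lits.append(lit)
        if satisfied:
            continue
        if not lits:
            return False          # an empty clause falsifies the matrix
        simplified.append(lits)
    if not simplified:
        return True               # every clause is already satisfied
    quantifier, var = prefix[0]   # non-empty for a closed QBF here
    branches = (evaluate_qbf(prefix[1:], simplified, {var: value})
                for value in (False, True))
    # Existential: one branch must succeed; universal: both must.
    return any(branches) if quantifier == 'e' else all(branches)
```

For example, the true formula forall x exists y (x or y)(not x or not y) is encoded as `evaluate_qbf([('a', 1), ('e', 2)], [[1, 2], [-1, -2]])`.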
Optimizing binary MRFs via extended roof duality
In Proc. CVPR, 2007
Cited by 97 (9 self)

Abstract:
Many computer vision applications rely on the efficient optimization of challenging, so-called non-submodular, binary pairwise MRFs. A promising graph-cut-based approach for optimizing such MRFs, known as “roof duality”, was recently introduced into computer vision. We study two methods which extend this approach. First, we discuss an efficient implementation of the “probing” technique introduced recently by Boros et al. [5]. It simplifies the MRF while preserving the global optimum. Our code is 400-700 times faster on some graphs than the implementation of [5]. Second, we present a new technique which takes an arbitrary input labeling and tries to improve its energy. We give theoretical characterizations of local minima of this procedure. We applied both techniques to many applications, including image segmentation, new view synthesis, super-resolution, diagram recognition, parameter learning, texture restoration, and image deconvolution. For several applications we see that we are able to find the global minimum very efficiently, and considerably outperform the original roof duality approach. In comparison to existing techniques, such as graph cut, TRW, BP, ICM, and simulated annealing, we nearly always find a lower energy.
Minimizing non-submodular functions with graph cuts - a review
TPAMI, 2007
Cited by 89 (6 self)

Abstract:
Optimization techniques based on graph cuts have become a standard tool for many vision applications. These techniques allow certain energy functions corresponding to pairwise Markov Random Fields (MRFs) to be minimized efficiently. Currently, there is an accepted view within the computer vision community that graph cuts can only be used for optimizing a limited class of MRF energies (e.g., submodular functions). In this survey we review some results which show that graph cuts can be applied to a much larger class of energy functions (in particular, non-submodular functions). While these results are well-known in the optimization community, to our knowledge they have not been used in the context of computer vision and MRF optimization. We demonstrate the relevance of these results to vision on the problem of binary texture restoration.
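The "limited class" the survey refers to is usually characterized by the submodularity condition on each pairwise term: theta(0,0) + theta(1,1) <= theta(0,1) + theta(1,0). A small sketch of that check, together with the energy of a binary pairwise MRF; the function names and the dictionary encoding are illustrative, not taken from the survey.

```python
def is_submodular(theta):
    """Check the standard submodularity condition for a 2x2 pairwise
    energy table theta[x][y]:
        theta(0,0) + theta(1,1) <= theta(0,1) + theta(1,0)."""
    return theta[0][0] + theta[1][1] <= theta[0][1] + theta[1][0]

def energy(unary, pairwise, labeling):
    """Total energy of a binary pairwise MRF.

    unary:    {node: (cost_if_label_0, cost_if_label_1)}
    pairwise: {(i, j): 2x2 cost table}
    labeling: {node: 0 or 1}
    """
    total = sum(unary[v][labeling[v]] for v in unary)
    total += sum(table[labeling[i]][labeling[j]]
                 for (i, j), table in pairwise.items())
    return total
```

A Potts-style term [[0, 1], [1, 0]] passes the check, while an "anti-Ising" term [[1, 0], [0, 1]] fails it, which is exactly the non-submodular case the survey addresses.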
Improvements to the evaluation of quantified Boolean formulae
In Proceedings of the Sixteenth International Joint Conference on Artificial Intelligence (IJCAI'99), July 31 - August 6, 1999
Cited by 74 (3 self)

Abstract:
We present a theorem-prover for quantified Boolean formulae and evaluate it on random quantified formulae and on formulae that represent problems from automated planning. Even though the notion of a quantified Boolean formula is theoretically important, automated reasoning with QBF has not been thoroughly investigated. Universal quantifiers are needed to represent many computational problems that cannot easily be translated to propositional logic and solved by satisfiability algorithms; efficient reasoning with QBF is therefore important. The Davis-Putnam procedure can be extended to evaluate quantified Boolean formulae, but a straightforward algorithm of this kind is not very efficient. We identify universal quantifiers as the main area where improvements to the basic algorithm can be made. We present a number of techniques for reducing the amount of search that is needed, and evaluate their effectiveness by running the algorithm on a collection of formulae obtained from planning and generated randomly. For the structured problems we consider, the techniques lead to a dramatic speedup.
Efficient reconstruction of haplotype structure via perfect phylogeny
Journal of Bioinformatics and Computational Biology, 2003
Cited by 68 (10 self)

Abstract:
Each person’s genome contains two copies of each chromosome, one inherited from the father and the other from the mother. A person’s genotype specifies the pair of bases at each site, but does not specify which base occurs on which chromosome. The sequence of each chromosome separately is called a haplotype. The determination of the haplotypes within a population is essential for understanding genetic variation and the inheritance of complex diseases. The haplotype mapping project, a successor to the human genome project, seeks to determine the common haplotypes in the human population. Since experimental determination of a person’s genotype is less expensive than determining its component haplotypes, algorithms are required for computing haplotypes from genotypes. Two observations aid in this process: first, the human genome contains short blocks within which only a few different haplotypes occur; second, as suggested by Gusfield, it is reasonable to assume that the haplotypes observed within a block have evolved according to a perfect phylogeny, in which at most one mutation event has occurred at any site and no recombination occurred in the given region. We present a simple and efficient polynomial-time algorithm for inferring haplotypes from the genotypes of a set of individuals, assuming a perfect phylogeny. Using a reduction to 2-SAT, we extend this algorithm to handle constraints that apply when we have genotypes from both parents and a child. We also present a hardness result for the problem of removing the minimum number of individuals from a population to ensure that the genotypes of the remaining individuals are consistent with a perfect phylogeny. Our algorithms have been tested on real data and give biologically meaningful results. Our webserver
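The 2-SAT reduction mentioned above rests on the classic linear-time satisfiability test for 2-CNF: for every clause (a or b) add implication edges (not a -> b) and (not b -> a), and the formula is satisfiable iff no variable shares a strongly connected component (SCC) with its negation. Below is a generic Aspvall-Plass-Tarjan-style sketch of that test; the haplotype-specific constraint encoding from the paper is not reproduced here.

```python
from collections import defaultdict

def two_sat(n_vars, clauses):
    """Decide satisfiability of a 2-CNF over variables 1..n_vars.

    clauses: pairs (a, b) of signed-int literals (negative = negated).
    Builds the implication graph and answers satisfiable iff no
    variable shares an SCC with its negation (Kosaraju's algorithm).
    """
    graph = defaultdict(list)
    for a, b in clauses:
        graph[-a].append(b)   # not a implies b
        graph[-b].append(a)   # not b implies a
    literals = [lit for v in range(1, n_vars + 1) for lit in (v, -v)]

    # Pass 1: order literals by DFS finish time (iterative DFS).
    order, visited = [], set()
    for start in literals:
        if start in visited:
            continue
        visited.add(start)
        stack = [(start, iter(graph[start]))]
        while stack:
            node, neighbours = stack[-1]
            for nxt in neighbours:
                if nxt not in visited:
                    visited.add(nxt)
                    stack.append((nxt, iter(graph[nxt])))
                    break
            else:
                order.append(node)
                stack.pop()

    # Pass 2: sweep the reversed graph in reverse finish order.
    reversed_graph = defaultdict(list)
    for node, targets in list(graph.items()):
        for target in targets:
            reversed_graph[target].append(node)
    component = {}
    for root in reversed(order):
        if root in component:
            continue
        component[root] = root
        stack = [root]
        while stack:
            node = stack.pop()
            for nxt in reversed_graph[node]:
                if nxt not in component:
                    component[nxt] = root
                    stack.append(nxt)

    return all(component[v] != component[-v] for v in range(1, n_vars + 1))
```

For instance, `two_sat(1, [(1, 1), (-1, -1)])` encodes x1 and not x1 and correctly returns False.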
List Partitions
Proc. 31st Ann. ACM Symp. on Theory of Computing, 2003
Cited by 27 (11 self)

Abstract:
List partitions generalize list colourings and list homomorphisms. Each symmetric matrix M over {0, 1, *} defines a list partition problem. Different choices of the matrix M lead to many well-known graph-theoretic problems, including the problem of recognizing split graphs and their generalizations, and finding homogeneous sets, joins, clique cutsets, stable cutsets, skew cutsets, and so on. We develop tools which allow us to classify the complexity of many list partition problems and, in particular, yield the complete classification for small matrices M. Along the way, we obtain a variety of specific results, including: generalizations of Lovász's communication bound on the number of clique-versus-stable-set separators; polynomial-time algorithms to recognize generalized split graphs; a polynomial algorithm for the list version of the Clique Cutset Problem; and the first subexponential algorithm for the Skew Cutset Problem of Chvátal. We also show that the dichotomy (NP-complete versus polynomial-time solvable) conjectured for certain graph homomorphism problems would, if true, imply a slightly weaker dichotomy (NP-complete versus quasi-polynomial) for our list partition problems.
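To make the matrix formulation concrete: an M-partition assigns each vertex a part, where a diagonal entry M[i][i] = 0 forces part i to be an independent set, M[i][i] = 1 forces it to be a clique, and an off-diagonal entry constrains edges between two parts (0 = no edges allowed, 1 = all edges required, * = unconstrained). The verifier below checks a candidate partition against M; the encoding is an illustration, not code from the paper.

```python
def check_partition(n, edges, M, parts):
    """Verify that parts (dict: vertex -> part index) is a valid
    M-partition of the graph on vertices 0..n-1.

    M is a symmetric matrix over {0, 1, '*'}:
      M[i][i] = 0  -> part i must be an independent set
      M[i][i] = 1  -> part i must be a clique
      M[i][j] = 0  -> no edges allowed between parts i and j
      M[i][j] = 1  -> all edges required between parts i and j
      '*'          -> no constraint
    """
    adjacent = {(u, v) for u, v in edges} | {(v, u) for u, v in edges}
    for u in range(n):
        for v in range(u + 1, n):
            constraint = M[parts[u]][parts[v]]
            if constraint == '*':
                continue
            if constraint == 1 and (u, v) not in adjacent:
                return False
            if constraint == 0 and (u, v) in adjacent:
                return False
    return True

# Split-graph recognition corresponds to M = [[0, '*'], ['*', 1]]:
# part 0 an independent set, part 1 a clique, cross edges unconstrained.
SPLIT = [[0, '*'], ['*', 1]]
```

On the path 0-1-2, the partition {0: 0, 1: 1, 2: 0} is a valid split partition under SPLIT, while putting the adjacent vertices 0 and 1 together in the independent part is not.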
Inferring AS relationships: Dead end or lively beginning?
In Proceedings of the 4th Workshop on Efficient and Experimental Algorithms (WEA '05), 2005
Cited by 26 (10 self)

Abstract:
Recent techniques for inferring business relationships between ASes [3, 8] have yielded maps that have extremely few invalid BGP paths in the terminology of Gao [9]. However, some relationships inferred by these newer algorithms are incorrect, leading to the deduction of unrealistic AS hierarchies. We investigate this problem and discover what causes it. Having obtained such insight, we generalize the problem of AS relationship inference as a multi-objective optimization problem with node-degree-based corrections to the original objective function of minimizing the number of invalid paths. We solve the generalized version of the problem using the semidefinite programming relaxation of the MAX2SAT problem. Keeping the number of invalid paths small, we obtain a more veracious solution than that yielded by recent heuristics.
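For intuition about the objective being relaxed: weighted MAX2SAT asks for a truth assignment maximizing the total weight of satisfied two-literal clauses. The exhaustive sketch below only illustrates that objective; the paper solves it via a semidefinite programming relaxation, not enumeration, and the names and encoding here are invented.

```python
from itertools import product

def max2sat_bruteforce(n_vars, clauses, weights=None):
    """Exhaustive weighted MAX2SAT: return (best_weight, assignment).

    clauses: pairs (a, b) of signed-int literals (negative = negated).
    Exponential in n_vars; only meant to make the objective concrete.
    """
    weights = weights if weights is not None else [1] * len(clauses)
    best_weight, best_assignment = -1, None
    for bits in product([False, True], repeat=n_vars):
        value = {v + 1: b for v, b in enumerate(bits)}
        satisfied = lambda lit: value[abs(lit)] == (lit > 0)
        weight = sum(w for (a, b), w in zip(clauses, weights)
                     if satisfied(a) or satisfied(b))
        if weight > best_weight:
            best_weight, best_assignment = weight, dict(value)
    return best_weight, best_assignment
```

With all four clauses over two variables, (x1 or x2)(not x1 or x2)(x1 or not x2)(not x1 or not x2), any assignment satisfies exactly three, so the optimum is 3.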
Semantics and complexity of abduction from default theories
Artificial Intelligence, 1997
Cited by 21 (2 self)

Abstract:
Since logical knowledge representation is commonly based on non-classical formalisms like default logic, autoepistemic logic, or circumscription, it is necessary to perform abductive reasoning from theories of non-classical logics. In this paper, we investigate how abduction can be performed from theories in default logic. Different modes of abduction are plausible, based on credulous and skeptical default reasoning; they appear useful for different applications such as diagnosis and planning. Moreover, we analyze the complexity of the main abductive reasoning tasks. They are intractable in the general case; we also present known classes of default theories for which abduction is tractable.