Fusion, Propagation, and Structuring in Belief Networks
 Artificial Intelligence
, 1986
"... Belief networks are directed acyclic graphs in which the nodes represent propositions (or variables), the arcs signify direct dependencies between the linked propositions, and the strengths of these dependencies are quantified by conditional probabilities. A network of this sort can be used to repre ..."
Abstract

Cited by 438 (7 self)
Belief networks are directed acyclic graphs in which the nodes represent propositions (or variables), the arcs signify direct dependencies between the linked propositions, and the strengths of these dependencies are quantified by conditional probabilities. A network of this sort can be used to represent the generic knowledge of a domain expert, and it turns into a computational architecture if the links are used not merely for storing factual knowledge but also for directing and activating the data flow in the computations which manipulate this knowledge. The first part of the paper deals with the task of fusing and propagating the impacts of new information through the networks in such a way that, when equilibrium is reached, each proposition will be assigned a measure of belief consistent with the axioms of probability theory. It is shown that if the network is singly connected (e.g. tree-structured), then probabilities can be updated by local propagation in an isomorphic network of parallel and autonomous processors and that the impact of new information can be imparted to all propositions in time proportional to the longest path in the network. The second part of the paper deals with the problem of finding a tree-structured representation for a collection of probabilistically coupled propositions using auxiliary (dummy) variables, colloquially called "hidden causes." It is shown that if such a tree-structured representation exists, then it is possible to uniquely uncover the topology of the tree by observing pairwise dependencies among the available propositions (i.e., the leaves of the tree). The entire tree structure, including the strengths of all internal relationships, can be reconstructed in time proportional to n log n, where n is the number of leaves.
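The local-propagation scheme summarized above can be sketched on the smallest singly connected case, a three-node chain A -> B -> C of binary variables. All variable names and CPT numbers below are illustrative assumptions, not values from the paper; the diagnostic (lambda) message passing is checked against brute-force enumeration over the joint distribution.

```python
# Priors and conditional probability tables: P(X=1 | parent value).
p_a = {1: 0.3, 0: 0.7}                      # P(A)
p_b_given_a = {1: 0.9, 0: 0.2}              # P(B=1 | A=a)
p_c_given_b = {1: 0.8, 0: 0.1}              # P(C=1 | B=b)

def bernoulli(p1, x):
    return p1 if x == 1 else 1.0 - p1

# Diagnostic (lambda) messages flow from the evidence C=1 up toward A.
lam_b = {b: bernoulli(p_c_given_b[b], 1) for b in (0, 1)}    # C's message to B
lam_a = {a: sum(bernoulli(p_b_given_a[a], b) * lam_b[b] for b in (0, 1))
         for a in (0, 1)}                                    # B's message to A

# Fuse the prior with the incoming message and normalize: BEL(A) = P(A | C=1).
unnorm = {a: p_a[a] * lam_a[a] for a in (0, 1)}
z = sum(unnorm.values())
bel_a = {a: unnorm[a] / z for a in (0, 1)}

# Brute-force check that the local result obeys the axioms of probability.
joint = lambda a, b, c: (p_a[a] * bernoulli(p_b_given_a[a], b)
                         * bernoulli(p_c_given_b[b], c))
num = sum(joint(1, b, 1) for b in (0, 1))
den = sum(joint(a, b, 1) for a in (0, 1) for b in (0, 1))
print(round(bel_a[1], 4), round(num / den, 4))
```

The message computation touches only each link once, which is the point of the paper's parallel, local update scheme; the enumeration it is checked against is exponential in the number of nodes.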
An assumption-based truth-maintenance system
 Artificial Intelligence
, 1986
"... In this paper we (1) define the concept of a Clause Managetnent System (CMS) — a generaizatiou of de Kleer’s ATMS, (2) motivate such systems in terms of efficiency of search and abductive reasoning, and (3) characterize the computation affected by a CMS in terms of the concept of prime implicants. ..."
Abstract

Cited by 328 (9 self)
In this paper we (1) define the concept of a Clause Management System (CMS), a generalization of de Kleer's ATMS, (2) motivate such systems in terms of efficiency of search and abductive reasoning, and (3) characterize the computation effected by a CMS in terms of the concept of prime implicants. 1. A Problem-Solving Architecture. Figure 1 illustrates an architecture for a problem-solving system consisting of a domain-dependent Reasoner coupled to a domain-independent Clause Management System (CMS). For our present purposes, the Reasoner is a black box which, in the process of doing whatever it does, occasionally transmits a propositional clause to the CMS. The Reasoner is also permitted to query the CMS any time it feels so inclined. A query takes the form of a propositional clause C. The CMS is expected to respond with every shortest clause S for which the clause S ∨ C is a logical consequence, but S is not a logical consequence, of the clauses thus far transmitted to the CMS by the Reasoner. In Section 2 we show why obtaining such S's is important for many AI systems. For example, for abductive reasoning, S will be a hypothesis which, if known, sanctions the conclusion C. For efficient search, S defines a most general context in which C holds. A traditional ATMS/TMS is a restricted CMS in which (1) the clauses transmitted to the CMS are limited to be either Horn (i.e., justifications) or negative (i.e., nogoods),
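The query semantics described above can be made concrete by brute force on a tiny example: given a clause database SIGMA and a query clause C, find every minimal clause S with SIGMA entailing S ∨ C but not S alone. Literals are (variable, sign) pairs; the example database is an illustrative assumption, and a real CMS would compute these from prime implicants rather than by enumerating models and candidate clauses.

```python
from itertools import combinations, product

def entails(sigma, clause, variables):
    # SIGMA |= clause iff every model of SIGMA satisfies a literal of clause.
    for bits in product((False, True), repeat=len(variables)):
        model = dict(zip(variables, bits))
        if all(any(model[v] == s for v, s in cl) for cl in sigma):
            if not any(model[v] == s for v, s in clause):
                return False
    return True

def minimal_supports(sigma, c, variables):
    literals = [(v, s) for v in variables for s in (True, False)]
    found = []
    for size in range(len(literals) + 1):       # shortest candidates first
        for cand in combinations(literals, size):
            s = frozenset(cand)
            union = s | frozenset(c)
            if any((v, not sign) in union for v, sign in union):
                continue                        # skip tautologous S v C
            if any(prev <= s for prev in found):
                continue                        # keep only minimal S
            if entails(sigma, union, variables) and \
               not entails(sigma, s, variables):
                found.append(s)
    return found

# SIGMA = { ~a v b } (i.e. a -> b), query C = (b): the sole minimal support
# is {~a}; its negation, the hypothesis a, sanctions concluding b.
sigma = [frozenset({('a', False), ('b', True)})]
print(minimal_supports(sigma, [('b', True)], ['a', 'b']))
```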
Remote Agent: To Boldly Go Where No AI System Has Gone Before
, 1998
"... Renewed motives for space exploration have inspired NASA to work toward the goal of establishing a virtual presence in space, through heterogeneous effets of robotic explorers. Information technology, and Artificial Intelligence in particular, will play a central role in this endeavor by endowing th ..."
Abstract

Cited by 226 (18 self)
Renewed motives for space exploration have inspired NASA to work toward the goal of establishing a virtual presence in space, through heterogeneous fleets of robotic explorers. Information technology, and Artificial Intelligence in particular, will play a central role in this endeavor by endowing these explorers with a form of computational intelligence that we call remote agents. In this paper we describe the Remote Agent, a specific autonomous agent architecture based on the principles of model-based programming, onboard deduction and search, and goal-directed closed-loop commanding, that takes a significant step toward enabling this future. This architecture addresses the unique characteristics of the spacecraft domain that require highly reliable autonomous operations over long periods of time with tight deadlines, resource constraints, and concurrent activity among tightly coupled subsystems. The Remote Agent integrates constraint-based temporal planning and scheduling, robust multithreaded execution, and model-based mode identification and reconfiguration. The demonstration of the integrated system as an onboard controller for Deep Space One, NASA's first New Millennium mission, is scheduled for a period of a week in late 1998. The development of the Remote Agent also provided the opportunity to reassess some of AI's conventional wisdom about the challenges of implementing embedded systems, tractable reasoning, and knowledge representation. We discuss these issues, and our often contrary experiences, throughout the paper.
Improvements To Propositional Satisfiability Search Algorithms
, 1995
"... ... quickly across a wide range of hard SAT problems than any other SAT tester in the literature on comparable platforms. On a Sun SPARCStation 10 running SunOS 4.1.3 U1, POSIT can solve hard random 400variable 3SAT problems in about 2 hours on the average. In general, it can solve hard nvariable ..."
Abstract

Cited by 174 (0 self)
... quickly across a wide range of hard SAT problems than any other SAT tester in the literature on comparable platforms. On a Sun SPARCstation 10 running SunOS 4.1.3U1, POSIT can solve hard random 400-variable 3-SAT problems in about 2 hours on the average. In general, it can solve hard n-variable random 3-SAT problems with search trees of size O(2^{n/18.7}). In addition to justifying these claims, this dissertation describes the most significant achievements of other researchers in this area, and discusses all of the widely known general techniques for speeding up SAT search algorithms. It should be useful to anyone interested in NP-complete problems or combinatorial optimization in general, and it should be particularly useful to researchers in either Artificial Intelligence or Operations Research.
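The splitting search that testers like POSIT refine can be sketched in its minimal DPLL form; the branching rule here is deliberately naive (no heuristics), so this is an illustrative sketch rather than POSIT itself. Clauses are sets of DIMACS-style integer literals: a positive n means variable n is true.

```python
def simplify(clauses, lit):
    out = []
    for c in clauses:
        if lit in c:
            continue                    # clause satisfied by lit, drop it
        reduced = c - {-lit}            # remove the falsified complement
        if not reduced:
            return None                 # empty clause: dead end
        out.append(reduced)
    return out

def dpll(clauses, assignment=frozenset()):
    while True:                         # unit propagation to a fixed point
        unit = next((next(iter(c)) for c in clauses if len(c) == 1), None)
        if unit is None:
            break
        clauses = simplify(clauses, unit)
        if clauses is None:
            return None
        assignment |= {unit}
    if not clauses:
        return assignment               # every clause satisfied: a model
    lit = next(iter(clauses[0]))        # splitting rule: branch on a literal
    for choice in (lit, -lit):
        reduced = simplify(clauses, choice)
        if reduced is not None:
            result = dpll(reduced, assignment | {choice})
            if result is not None:
                return result
    return None

# (a or b) and (not a or c) and (not b or not c) is satisfiable:
print(dpll([frozenset({1, 2}), frozenset({-1, 3}), frozenset({-2, -3})]))
# (a) and (not a) is not:
print(dpll([frozenset({1}), frozenset({-1})]))
```

The branching-heuristic and preprocessing work the dissertation describes is precisely what separates a solver like POSIT from this skeleton.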
Algorithms for the Satisfiability (SAT) Problem: A Survey
 DIMACS Series in Discrete Mathematics and Theoretical Computer Science
, 1996
"... . The satisfiability (SAT) problem is a core problem in mathematical logic and computing theory. In practice, SAT is fundamental in solving many problems in automated reasoning, computeraided design, computeraided manufacturing, machine vision, database, robotics, integrated circuit design, compute ..."
Abstract

Cited by 142 (3 self)
The satisfiability (SAT) problem is a core problem in mathematical logic and computing theory. In practice, SAT is fundamental in solving many problems in automated reasoning, computer-aided design, computer-aided manufacturing, machine vision, databases, robotics, integrated circuit design, computer architecture design, and computer network design. Traditional methods treat SAT as a discrete, constrained decision problem. In recent years, many optimization methods, parallel algorithms, and practical techniques have been developed for solving SAT. In this survey, we present a general framework (an algorithm space) that integrates existing SAT algorithms into a unified perspective. We describe sequential and parallel SAT algorithms, including variable splitting, resolution, local search, global optimization, mathematical programming, and practical SAT algorithms. We give a performance evaluation of some existing SAT algorithms. Finally, we provide a set of practical applications of the sat...
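One of the local search methods the survey covers can be sketched in WalkSAT-flavoured form: start from a random assignment and repeatedly flip a variable from an unsatisfied clause, mixing random-walk and greedy moves. The formula and parameters below are illustrative assumptions; the survey covers many more refined variants.

```python
import random

def walksat(clauses, n_vars, max_flips=10_000, p_random=0.5, seed=0):
    rng = random.Random(seed)
    assign = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}
    sat = lambda lit: assign[abs(lit)] == (lit > 0)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assign                           # found a model
        clause = rng.choice(unsat)
        if rng.random() < p_random:
            var = abs(rng.choice(sorted(clause)))   # random-walk move
        else:
            def broken(v):
                # Number of clauses left unsatisfied if we flipped v.
                assign[v] = not assign[v]
                score = sum(not any(sat(l) for l in c) for c in clauses)
                assign[v] = not assign[v]
                return score
            var = min(sorted(abs(l) for l in clause), key=broken)  # greedy
        assign[var] = not assign[var]
    return None                 # incomplete method: may give up on SAT input

clauses = [{1, 2}, {-1, 3}, {-2, -3}, {1, -3}]
model = walksat(clauses, 3)
print(model is not None and
      all(any(model[abs(l)] == (l > 0) for l in c) for c in clauses))
```

Unlike the splitting and resolution methods in the survey's framework, this procedure can never report unsatisfiability, only find models.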
Recent Advances in AI Planning
 AI Magazine
, 1999
"... The past five years have seen dramatic advances in planning algorithms, with an emphasis on propositional methods such as Graphplan and compilers that convert planning problems into propositional CNF formulae for solution via systematic or stochastic SAT methods. Related work on the Deep Space O ..."
Abstract

Cited by 121 (0 self)
The past five years have seen dramatic advances in planning algorithms, with an emphasis on propositional methods such as Graphplan and compilers that convert planning problems into propositional CNF formulae for solution via systematic or stochastic SAT methods. Related work on the Deep Space One spacecraft control algorithms advances our understanding of interleaved planning and execution. In this survey, we explain the latest techniques and suggest areas for future research.
From Local to Global Consistency
, 1992
"... In reasoning tasks involving the maintenance of consistent databases (socalled QQconstraint networks/Q/Q), it is customary to enforce local consistency conditions in order to simplify the subsequent construction of a globally coherent model of the data. In this paper we present a relationship betwe ..."
Abstract

Cited by 118 (7 self)
In reasoning tasks involving the maintenance of consistent databases (so-called "constraint networks"), it is customary to enforce local consistency conditions in order to simplify the subsequent construction of a globally coherent model of the data. In this paper we present a relationship between the sizes of the variables' domains, the constraints' arity, and the level of local consistency sufficient to ensure global consistency. Based on these parameters a new tractability classification of constraint networks is presented. We also show, based on this relationship, that any relation on bivalued variables which is not representable by a network of binary constraints cannot be represented by networks with any number of hidden variables.
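The lowest rung of the local-consistency enforcement discussed above is arc consistency; an AC-3-style sketch (the variables, domains, and constraints are toy assumptions for illustration, not from the paper) shows what "enforcing local consistency conditions" means in practice:

```python
from collections import deque

def ac3(domains, constraints):
    # constraints: dict mapping (x, y) -> predicate ok(vx, vy); assumed to
    # contain both directions of every binary constraint.
    queue = deque(constraints)
    while queue:
        x, y = queue.popleft()
        ok = constraints[(x, y)]
        revised = False
        for vx in list(domains[x]):
            if not any(ok(vx, vy) for vy in domains[y]):
                domains[x].remove(vx)   # vx has no support in y's domain
                revised = True
        if revised:                     # re-check arcs pointing into x
            for (a, b) in constraints:
                if b == x and a != y:
                    queue.append((a, b))
    return all(domains.values())        # False if some domain was wiped out

# x < y < z over small integer domains.
doms = {'x': {1, 2, 3}, 'y': {1, 2, 3}, 'z': {1, 2, 3}}
cons = {('x', 'y'): lambda a, b: a < b, ('y', 'x'): lambda a, b: a > b,
        ('y', 'z'): lambda a, b: a < b, ('z', 'y'): lambda a, b: a > b}
ac3(doms, cons)
print(doms)
```

Here local pruning alone pins down the unique global solution; the paper's contribution is characterizing, in terms of domain size and constraint arity, exactly how high up this consistency hierarchy one must go for that guarantee to hold in general.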
Conflict analysis in search algorithms for propositional satisfiability
 in Proceedings of the IEEE International Conference on Tools with Artificial Intelligence
, 1996
"... This paper introduces GRASP (Generic seaRch Algorithm for the Sati$ability Problem), a new search algorithm for Propositional Satisjability (SAT). GRASP incorporates several searchpruning techniques, some of which are spec$c to SAT whereas others find equivalent in other fieh ofArt$cial Intelligenc ..."
Abstract

Cited by 54 (2 self)
This paper introduces GRASP (Generic seaRch Algorithm for the Satisfiability Problem), a new search algorithm for Propositional Satisfiability (SAT). GRASP incorporates several search-pruning techniques, some of which are specific to SAT, whereas others find equivalents in other fields of Artificial Intelligence. GRASP is premised on the inevitability of conflicts during search, and its most distinguishing feature is the augmentation of basic backtracking search with a
Tractable Databases: How to Make Propositional Unit Resolution Complete through Compilation
, 1994
"... We present procedures to compile any propositional clausal database \Sigma into a logically equivalent "compiled" database \Sigma ? such that, for any clause C, \Sigma j= C if and only if there is a unit refutation of \Sigma ? [ :C. It follows that once the compilation process is compl ..."
Abstract

Cited by 50 (5 self)
We present procedures to compile any propositional clausal database Σ into a logically equivalent "compiled" database Σ* such that, for any clause C, Σ ⊨ C if and only if there is a unit refutation of Σ* ∪ ¬C. It follows that once the compilation process is complete, any query about the logical consequences of Σ can be correctly answered in time linear in the sum of the sizes of Σ* and the query. The compiled database Σ* is, for all but one of the procedures, a subset of the set PI(Σ) of prime implicates of Σ, but Σ* can be exponentially smaller than PI(Σ). Of independent interest, we prove the equivalence of unit-refutability with two restrictions of resolution, and provide a new sufficient condition for unit refutation completeness, thus identifying a new class of tractable theories, one which is of interest to abduction problems as well. Finally, we apply the results to the design of a complete LTMS.
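The linear-time query answering promised above reduces to unit resolution: add the negated query to the compiled database and propagate unit clauses until the empty clause appears (entailed) or no units remain (not entailed by unit resolution). A small sketch, with clauses as sets of DIMACS-style integer literals and an illustrative compiled database assumed for the example:

```python
def unit_refutable(clauses):
    clauses = [set(c) for c in clauses]
    while True:
        unit = next((next(iter(c)) for c in clauses if len(c) == 1), None)
        if unit is None:
            return False                # no unit left: no unit refutation
        nxt = []
        for c in clauses:
            if unit in c:
                continue                # clause satisfied by the unit, drop
            c = c - {-unit}             # unit-resolve away the complement
            if not c:
                return True             # derived the empty clause
            nxt.append(c)
        clauses = nxt

# Compiled database { a, a -> b } and query C = b: refuting the negation
# of the query succeeds, so b is entailed.
sigma_star = [{1}, {-1, 2}]
negated_query = [{-2}]
print(unit_refutable(sigma_star + negated_query))
```

Each pass shrinks the clause set, so the whole check is linear-time in the spirit of the result; the hard part, which the paper addresses, is producing a compiled database on which unit resolution is complete.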
Reason Maintenance and Belief Revision: Foundations vs. Coherence Theories
 Belief Revision
, 1992
"... this paper, we examine Gardenfors's criticisms of the foundations approach. We argue that the coherence and foundations approaches differ less than has been supposed, in that the fundamental concerns of the coherence approach for conservatism in belief revision apply in exactly the same way in ..."
Abstract

Cited by 49 (6 self)
In this paper, we examine Gärdenfors's criticisms of the foundations approach. We argue that the coherence and foundations approaches differ less than has been supposed, in that the fundamental concerns of the coherence approach for conservatism in belief revision apply in exactly the same way in the foundations approach. We also argue that the foundations approach represents the most direct way of mechanizing the coherence approach. Moreover, the computational costs of revisions based on epistemic entrenchment appear to equal or exceed those of revisions based on reasons, in the sense that any entrenchment ordering from which information about reasons may be recovered will be at least as costly to update as the reasons it represents. We conclude that while the coherence approach offers a valuable perspective on belief revision, it does not yet provide an adequate theoretical or practical basis for characterizing or mechanizing belief revision.