Results 1–10 of 28
Probabilistic Horn abduction and Bayesian networks
 Artificial Intelligence, 1993
Abstract

Cited by 298 (37 self)
This paper presents a simple framework for Horn-clause abduction, with probabilities associated with hypotheses. The framework incorporates assumptions about the rule base and independence assumptions amongst hypotheses. It is shown how any probabilistic knowledge representable in a discrete Bayesian belief network can be represented in this framework. The main contribution is in finding a relationship between logical and probabilistic notions of evidential reasoning. This provides a useful representation language in its own right, offering a compromise between heuristic and epistemic adequacy. It also shows how Bayesian networks can be extended beyond a propositional language. This paper also shows how a language with only (unconditionally) independent hypotheses can represent any probabilistic knowledge, and argues that it is better to invent new hypotheses to explain dependence than to have to worry about dependence in the language.
Automated Discourse Generation Using Discourse Structure Relations
 Artificial Intelligence, 1993
Abstract

Cited by 171 (1 self)
This paper summarizes work over the past five years on the automated planning and generation of multi-sentence texts using discourse structure relations, placing it in the context of ongoing efforts by Computational Linguists and Linguists to understand the structure of discourse. Based on a series of studies by the author and others, the paper describes how the orientation of generation toward communicative intentions illuminates the central structural role played by intersegment discourse relations. It outlines several facets of discourse structure relations as they are required by and used in text planners: their nature, number, and extension to associated tasks such as sentence planning and text formatting. In Artificial Intelligence 63, Special Issue on Natural Language Processing, 1993. This work was partially supported by the Rome Air Development Center under RADC contract FQ76198903326 0001.
A Prolog-like Inference System for Computing Minimum-Cost Abductive Explanations in Natural-Language Interpretation
1988
Abstract

Cited by 47 (1 self)
By determining what added assumptions would suffice to make the logical form of a sentence in natural language provable, abductive inference can be used in the interpretation of sentences to determine what information should be added to the listener's knowledge, i.e., what he should learn from the sentence. This is a comparatively new application of mechanized abduction. A new form of abduction, least specific abduction, is proposed as being more appropriate to the task of interpreting natural language than the forms that have been used in the traditional diagnostic and design-synthesis applications of abduction. The assignment of numerical costs to axioms and assumable literals permits specification of preferences among different abductive explanations. A new Prolog-like inference system that computes abductive explanations and their costs is given. To facilitate the computation of minimum-cost explanations, the inference system, unlike others such as Prolog, is designed to avoid the repeated use of the same instance of an axiom or assumption.
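The cost machinery described in this abstract can be sketched in a few lines: backward-chain from a goal, assume assumable literals at their stated costs, add each rule's cost, and keep the cheapest proof. All rule names, literals, and costs below are invented for illustration; the paper's actual system is a Prolog-like first-order prover that additionally avoids repeated use of the same instance of an axiom or assumption, which this toy propositional version does not model.

```python
# Toy cost-based abduction over propositional Horn rules (illustrative only).
from itertools import product

# Horn rules: head <- body, where each use of a rule adds its cost.
RULES = {
    "wet_grass": [(["rained"], 0.5), (["sprinkler_on"], 0.5)],
}
# Assumable literals with their assumption costs.
ASSUMABLE = {"rained": 2.0, "sprinkler_on": 3.0}

def explanations(goal):
    """Yield (assumption-set, cost) pairs whose assumptions prove `goal`."""
    if goal in ASSUMABLE:
        yield frozenset([goal]), ASSUMABLE[goal]
    for body, rule_cost in RULES.get(goal, []):
        # Prove every literal in the body and combine the assumption sets.
        sub_proofs = [list(explanations(lit)) for lit in body]
        for combo in product(*sub_proofs):
            assumptions = frozenset().union(*(a for a, _ in combo))
            yield assumptions, rule_cost + sum(c for _, c in combo)

def min_cost_explanation(goal):
    """Return the cheapest (assumption-set, cost) explanation of `goal`."""
    return min(explanations(goal), key=lambda e: e[1])

best, cost = min_cost_explanation("wet_grass")
```

Here the `rained` proof costs 0.5 + 2.0 and the `sprinkler_on` proof costs 0.5 + 3.0, so the minimum-cost explanation assumes `rained`.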
On the Generation of Alternative Explanations with Implications for Belief Revision
 In Uncertainty in Artificial Intelligence (UAI-91), 1991
Abstract

Cited by 41 (5 self)
In general, the best explanation for a given observation makes no promises on how good it is with respect to other alternative explanations. A major deficiency of message-passing schemes for belief revision in Bayesian networks is their inability to generate alternatives beyond the second best. In this paper, we present a general approach based on linear constraint systems that naturally generates alternative explanations in an orderly and highly efficient manner. This approach is then applied to cost-based abduction problems as well as belief revision in Bayesian networks.
Approaches to Abductive Reasoning: An Overview
 ARTIFICIAL INTELLIGENCE REVIEW, 1993
Abstract

Cited by 40 (1 self)
Abduction is a form of nonmonotonic reasoning that has gained increasing interest in the last few years. The key idea behind it can be represented by the following inference rule
$$\frac{\omega \qquad \varphi \to \omega}{\varphi},$$
i.e., from an occurrence of $\omega$ and the rule "$\varphi$ implies $\omega$", infer an occurrence of $\varphi$ as a plausible hypothesis or explanation for $\omega$. Thus, in contrast to deduction, abduction is, like induction, a form of "defeasible" inference, i.e., the formulae sanctioned are plausible and subject to verification.
In this paper, a formal description of current approaches is given. The underlying reasoning process is treated independently and divided into two parts: a description of methods for hypothesis generation, and methods for finding the best explanations among a set of possible ones. Furthermore, the complexity of the abductive task is surveyed in connection with its relationship to default reasoning. We conclude with a presentation of applications of the discussed approaches, focusing on plan recognition and plan generation.
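The abductive inference step surveyed in this abstract (given an observation and rules of the form "phi implies omega", propose each such phi as a candidate explanation) can be illustrated with a minimal sketch. All propositions below are invented for illustration, and, as the abstract notes, the inference is defeasible: the candidates still require verification.

```python
# Toy sketch of the abductive inference step: given an observation omega
# and rules "phi implies omega", every phi whose conclusion matches the
# observation is a candidate explanation. Propositions are invented.

RULES = [
    ("rain", "wet_street"),       # rain implies wet_street
    ("sprinkler", "wet_street"),  # sprinkler implies wet_street
    ("rain", "dark_clouds"),      # rain implies dark_clouds
]

def abduce(observation):
    """Return plausible hypotheses phi for which a rule phi -> observation exists."""
    return [phi for phi, omega in RULES if omega == observation]

# Both "rain" and "sprinkler" explain a wet street; unlike deduction,
# neither conclusion is guaranteed, only plausible.
candidates = abduce("wet_street")
```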
Logic Programming, Abduction and Probability: a top-down anytime algorithm for estimating prior and posterior probabilities
 New Generation Computing, 1993
Abstract

Cited by 39 (8 self)
Probabilistic Horn abduction is a simple framework that combines probabilistic and logical reasoning into a coherent practical whole. The numbers can be consistently interpreted probabilistically, and all of the rules can be interpreted logically. The relationship between probabilistic Horn abduction and logic programming is at two levels. At the first level, probabilistic Horn abduction is an extension of pure Prolog that is useful for diagnosis and other evidential reasoning tasks. At another level, current logic programming implementation techniques can be used to efficiently implement probabilistic Horn abduction. This forms the basis of an "anytime" algorithm for estimating arbitrary conditional probabilities. The focus of this paper is on the implementation.
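One way to read the "anytime" claim in this abstract is as best-first enumeration of explanations: pop explanations in decreasing prior probability, and the mass accumulated so far is a lower bound that tightens the longer the algorithm runs. The sketch below is a drastically simplified, single-level illustration of that idea; the hypothesis names and probabilities are invented, and the paper's actual algorithm works over full Horn-clause proofs.

```python
# Hedged sketch of the "anytime" idea: enumerate a goal's explanations
# best-first by prior probability; the probability mass accumulated so
# far is a lower bound on the goal's prior. Names/numbers are invented.

import heapq

HYPOTHESES = {"flu": 0.1, "cold": 0.3, "healthy": 0.6}  # disjoint priors
EXPLAINS = {"fever": ["flu", "cold"]}  # goal -> explaining hypotheses

def anytime_prior(goal, max_explanations):
    """Lower-bound P(goal) using the best `max_explanations` explanations."""
    # Max-heap via negated priors: the most probable explanation pops first.
    heap = [(-HYPOTHESES[h], h) for h in EXPLAINS.get(goal, [])]
    heapq.heapify(heap)
    mass = 0.0
    for _ in range(min(max_explanations, len(heap))):
        neg_p, _hyp = heapq.heappop(heap)
        mass -= neg_p
    return mass  # tightens toward the true prior as explanations are added

lower1 = anytime_prior("fever", 1)  # mass of the single best explanation
lower2 = anytime_prior("fever", 2)  # mass of both explanations
```

Stopping the enumeration at any point yields a usable estimate, which is what makes the algorithm "anytime": more computation only improves the bound.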
Abductive Theorem Proving for Analyzing Student Explanations
 Journal of Automated Reasoning, Special Issue, 2003
Abstract

Cited by 20 (9 self)
The Why2-Atlas tutoring system presents students with qualitative physics questions and encourages them to explain their answers via natural language. Although there are inexpensive techniques for analyzing explanations, we claim that better understanding is necessary to provide substantive feedback. In this paper we motivate and describe how the system creates and utilizes a proof-based representation of student essays, and provide some preliminary evaluation results.
Representing diagnostic knowledge for probabilistic Horn abduction
 Readings in model-based diagnosis, 1992
Abstract

Cited by 20 (6 self)
This paper presents a simple logical framework for abduction, with probabilities associated with hypotheses. The language is an extension to pure Prolog, and it has straightforward implementations using branch-and-bound search with either logic-programming technology or ATMS technology. The main focus of this paper is arguing for a form of representational adequacy of this very simple system for diagnostic reasoning. It is shown how it can represent model-based knowledge, with and without faults, and with and without non-intermittency assumptions. It is also shown how this representation can represent any probabilistic knowledge representable in a Bayesian belief network.
A Fast Hill-Climbing Approach Without an Energy Function for Probabilistic Reasoning
 In Proceedings of the 5th IEEE International Conference on Tools with Artificial Intelligence, 1993
Abstract

Cited by 14 (9 self)
Integer linear programming (ILP) has long been an important tool for Operations Research, akin to our AI search heuristics for NP-hard problems. However, there has been relatively little incentive to use it in AI even though it also deals with optimization. The problem stems from the misperception that because the general ILP problem is difficult to solve, it will be difficult in all cases. As we all know, AI search at first glance also seems this way until we begin to apply it to a specific domain. Clearly, there are many gains to be had from studying the problem with a different perspective like ILP. In this paper, we look at probabilistic reasoning with Bayesian networks. For some time now, we have been stalled by its computational complexities. Algorithms have been designed for small classes of networks, but have been mainly inextensible to the general case. In particular, we consider belief revision in Bayesian networks, which is the search for the most-probable explanation for...
Representing Bayesian networks within probabilistic Horn abduction
 In Proc. Seventh Conf. on Uncertainty in Artificial Intelligence, 1991
Abstract

Cited by 13 (4 self)
This paper presents a simple framework for Horn-clause abduction, with probabilities associated with hypotheses. It is shown how this representation can represent any probabilistic knowledge representable in a Bayesian belief network. The main contributions are in finding a relationship between logical and probabilistic notions of evidential reasoning. This can be used as a basis for a new way to implement Bayesian networks that allows for approximations to the value of the posterior probabilities, and also points to a way that Bayesian networks can be extended beyond a propositional language.