Results 1-10 of 20
Reasoning about Beliefs and Actions under Computational Resource Constraints
In Proceedings of the 1987 Workshop on Uncertainty in Artificial Intelligence, 1987
Cited by 179 (18 self)
Abstract:
Abstraction Modulation: In many cases, it may be more useful to do normative inference on a model that is deemed to be complete at a particular level of abstraction than to do an approximate or heuristic analysis of a model that is too large to be analyzed under specific resource constraints. It may prove useful in many cases to store several belief-network representations, each containing propositions at different levels of abstraction. In many domains, models at higher levels of abstraction are more tractable. As the time available for computation decreases, network modules of increasing abstraction can be employed. Local Reformulation: Local reformulation is the modification of specific troublesome topologies in a belief network. Approximation methods and heuristics designed to modify the microstructure of belief networks will undoubtedly be useful in the tractable solution of large uncertain-reasoning problems. Such strategies might be best applied at knowledge-encoding time. An...
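The abstraction-modulation idea above can be reduced to a simple selection rule: keep several models of the same domain at different abstraction levels and fall back to coarser, more tractable ones as the time budget shrinks. This is a hedged illustration, not the paper's method; the model names and cost estimates below are invented.

```python
# Models of one domain, ordered from least to most abstract.
# Each entry pairs an abstraction level with an invented estimate
# of its inference time in seconds.
models = [
    ("detailed", 12.0),
    ("intermediate", 3.0),
    ("coarse", 0.5),
]

def select_model(time_budget):
    """Return the least abstract model whose estimated inference
    time fits within the available budget; if none fits, fall back
    to the coarsest model."""
    for name, cost in models:
        if cost <= time_budget:
            return name
    return models[-1][0]
```

For example, a 20-second budget admits the detailed model, while a fraction of a second forces the coarse one.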
Decision Theory in Expert Systems and Artificial Intelligence
International Journal of Approximate Reasoning, 1988
Cited by 89 (18 self)
Abstract:
Despite their different perspectives, artificial intelligence (AI) and the disciplines of decision science have common roots and strive for similar goals. This paper surveys the potential for addressing problems in representation, inference, knowledge engineering, and explanation within the decision-theoretic framework. Recent analyses of the restrictions of several traditional AI reasoning techniques, coupled with the development of more tractable and expressive decision-theoretic representation and inference strategies, have stimulated renewed interest in decision theory and decision analysis. We describe early experience with simple probabilistic schemes for automated reasoning, review the dominant expert-system paradigm, and survey some recent research at the crossroads of AI and decision science. In particular, we present the belief network and influence diagram representations. Finally, we discuss issues that have not been studied in detail within the expert-systems sett...
Belief Networks Revisited
1994
Cited by 36 (4 self)
Abstract:
In this paper, Rumelhart presented compelling evidence that text comprehension must be a distributed process that combines both top-down and bottom-up inferences. Strangely, this dual mode of inference, so characteristic of Bayesian analysis, did not match the capabilities of either the "certainty factors" calculus or the inference networks of PROSPECTOR, the two major contenders for uncertainty management in the 1970s. I thus began to explore the possibility of achieving distributed computation in a "pure" Bayesian framework, so as not to compromise its basic capacity to combine bidirectional inferences (i.e., predictive and abductive). Not caring much about generality at that point, I picked the simplest structure I could think of (i.e., a tree) and tried to see if anything useful could be computed by assigning each variable a simple processor, forced to communicate only with its neighbors. This gave rise to the tree-propagation algorithm reported in [15] and, a year later, the Kim-Pearl algorithm [12], which supported not only bidirectional inferences but also intercausal interactions, such as "explaining away." These two algorithms were described in Section 2 of Fusion.
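The tree-propagation scheme described above can be illustrated on the smallest possible tree, a chain A -> B -> C: an abductive (lambda) message flows up from the evidence node toward the root, where it combines with the prior, the predictive (pi) component. The sketch below is a toy special case of the algorithm, with invented numbers.

```python
# A chain A -> B -> C of binary variables. Each table maps a parent
# value to a distribution over the child. Numbers are illustrative.
P_A = [0.6, 0.4]                  # P(A)
P_B_given_A = [[0.7, 0.3],        # P(B | A=0)
               [0.2, 0.8]]        # P(B | A=1)
P_C_given_B = [[0.9, 0.1],        # P(C | B=0)
               [0.4, 0.6]]        # P(C | B=1)

def posterior_A_given_C(c_obs):
    """Posterior over the root A after observing leaf C = c_obs."""
    # lambda message C -> B: likelihood of the evidence for each b
    lam_CB = [P_C_given_B[b][c_obs] for b in range(2)]
    # lambda message B -> A: marginalize B out
    lam_BA = [sum(P_B_given_A[a][b] * lam_CB[b] for b in range(2))
              for a in range(2)]
    # combine with the prior (the pi component at the root), normalize
    unnorm = [P_A[a] * lam_BA[a] for a in range(2)]
    z = sum(unnorm)
    return [u / z for u in unnorm]
```

For example, `posterior_A_given_C(1)` fuses the predictive prior on A with the abductive evidence from C=1, the bidirectional combination the passage describes.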
Why is Diagnosis Using Belief Networks Insensitive to Imprecision in Probabilities?
1996
Cited by 35 (0 self)
Abstract:
Recent research has found that diagnostic performance with Bayesian belief networks is often surprisingly insensitive to imprecision in the numerical probabilities. For example, the authors have recently completed an extensive study in which they applied random noise to the numerical probabilities in a set of belief networks for medical diagnosis: subsets of the CPCS network, itself a subset of the QMR (Quick Medical Reference) knowledge base focused on liver and bile diseases. The diagnostic performance, in terms of the average probabilities assigned to the actual diseases, showed little sensitivity even to large amounts of noise. In this paper, we summarize the findings of this study and discuss possible explanations of this low sensitivity. One reason is that the criterion for performance is the average probability of the true hypotheses, rather than the average error in probability, which is insensitive to symmetric noise distributions. However, we show that even asymmetric, log-odds-normal noise has modest effects. A ...
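The log-odds-normal noise model mentioned above can be written down directly: perturb each probability in logit space with Gaussian noise, then map back through the logistic function. A sketch of the perturbation alone; the study's networks and evaluation are not reproduced here.

```python
import math
import random

def perturb(p, sigma, rng):
    """Apply log-odds-normal noise to a probability p in (0, 1):
    add Normal(0, sigma) noise to logit(p), then map back through
    the logistic function. sigma = 0 leaves p unchanged."""
    logit = math.log(p / (1.0 - p))
    return 1.0 / (1.0 + math.exp(-(logit + rng.gauss(0.0, sigma))))

# Example: five noisy versions of a network parameter of 0.9.
rng = random.Random(0)
noisy = [perturb(0.9, 1.0, rng) for _ in range(5)]
```

Because the noise is added in log-odds space, perturbed values always remain valid probabilities, which is one reason this noise model suits belief-network parameters.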
Towards Normative Expert Systems: Part II, Probability-Based Representations for Efficient Knowledge Acquisition and Inference
Methods of Information in Medicine, 1992
Cited by 32 (0 self)
Abstract:
We address practical issues concerning the construction and use of decision-theoretic or normative expert systems for diagnosis. In particular, we examine Pathfinder, a normative expert system that assists surgical pathologists with the diagnosis of lymph-node diseases, and discuss the representation of dependencies among pieces of evidence within this system. We describe the belief network, a graphical representation of probabilistic dependencies. We see how Pathfinder uses a belief network to construct differential diagnoses efficiently, even when there are dependencies among pieces of evidence. In addition, we introduce an extension of the belief-network representation called a similarity network, a tool for constructing large and complex belief networks. The representation allows a user to construct independent belief networks for subsets of a given domain. A valid belief network for the entire domain can then be constructed from the individual belief networks. We also introduce the partition, a graphical representation that facilitates the assessment of probabilities associated with a belief network. Finally, we show that the similarity-network and partition representations made practical the construction of Pathfinder.
Ideal Reformulation of Belief Networks
Cited by 30 (6 self)
Abstract:
The intelligent reformulation or restructuring of a belief network can greatly increase the efficiency of inference. However, time expended on reformulation is not available for performing inference. Thus, under time pressure, there is a tradeoff between the time dedicated to reformulating the network and the time applied to the implementation of a solution. We investigate this partition of resources into time applied to reformulation and time used for inference. We first describe general principles for computing the ideal partition of resources under uncertainty. These principles have applicability to a wide variety of problems that can be divided into interdependent phases of problem solving. We then present results of our empirical study of the problem of determining the ideal amount of time to devote to searching for clusters in belief networks. In this work, we acquired and made use of probability distributions that characterize (1) the performance of alternative heuristic search methods for reformulating a network instance into a set of cliques, and (2) the time for executing inference procedures on various belief networks. Given a preference model describing the value of a solution as a function of the delay required for its computation, the system selects an ideal time to devote to reformulation.
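The resource-partition problem above can be illustrated with a deterministic toy: grid-search the split of a fixed time budget between reformulation, which buys an inference speedup, and inference itself. The speedup and value functions below are invented, and the paper's probabilistic treatment is reduced to point estimates.

```python
def ideal_reformulation_time(total, speedup, value, steps=100):
    """Grid-search the split of `total` seconds between reformulation
    and inference. `speedup(t_r)` gives the inference-rate multiplier
    bought by t_r seconds of reformulation; `value(work)` is the
    utility of the inference work completed in the remaining time.
    Returns the best reformulation time and its value."""
    best_t, best_v = 0.0, float("-inf")
    for i in range(steps + 1):
        t_r = total * i / steps
        v = value((total - t_r) * speedup(t_r))
        if v > best_v:
            best_t, best_v = t_r, v
    return best_t, best_v
```

With a linear speedup `1 + t_r` and value equal to work done, a 10-second budget is best split at 4.5 seconds of reformulation, illustrating the interior optimum the paper's tradeoff implies.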
Efficient Markov Network Structure Discovery Using Independence Tests
In Proc. SIAM Data Mining, 2006
Cited by 17 (2 self)
Abstract:
We present two algorithms for learning the structure of a Markov network from discrete data: GSMN and GSIMN. Both algorithms use statistical conditional independence tests on data to infer the structure by successively constraining the set of structures consistent with the results of these tests. GSMN is a natural adaptation of the Grow-Shrink algorithm of Margaritis and Thrun for learning the structure of Bayesian networks. GSIMN extends GSMN by additionally exploiting Pearl's well-known properties of conditional independence relations to infer novel independencies from known independencies, thus avoiding the need to perform these tests. Experiments on artificial and real data sets show GSIMN can yield savings of up to 70% with respect to GSMN, while generating a Markov network of comparable or, in several cases, considerably improved quality. In addition...
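The test-and-constrain idea above can be sketched in miniature: decide each candidate edge by an independence test on the data. The toy below uses empirical mutual information with a fixed threshold as a stand-in for a proper statistical test, and it considers only pairwise (unconditional) independence; it is not GSMN or GSIMN, which use conditional tests and, in GSIMN's case, infer further independencies via Pearl's axioms.

```python
import math
from collections import Counter
from itertools import combinations

def mutual_information(xs, ys):
    """Empirical mutual information (in nats) between two discrete
    variables, given as parallel lists of observed values."""
    n = len(xs)
    cxy, cx, cy = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    mi = 0.0
    for (x, y), c in cxy.items():
        mi += (c / n) * math.log((c * n) / (cx[x] * cy[y]))
    return mi

def learn_edges(data, threshold=0.05):
    """data: dict mapping variable name -> list of discrete values.
    Keep an edge when the empirical MI exceeds the threshold (a crude
    stand-in for a chi-square independence test)."""
    names = sorted(data)
    return [(a, b) for a, b in combinations(names, 2)
            if mutual_information(data[a], data[b]) > threshold]
```

On data where X and Y are copies and Z varies independently, only the X-Y edge survives, mirroring how the tests constrain the set of consistent structures.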
Optimal Monte Carlo Estimation of Belief Network Inference
In Proceedings of the 12th Conference on Uncertainty in Artificial Intelligence, 1996
Cited by 9 (1 self)
Abstract:
We present two Monte Carlo sampling algorithms for probabilistic inference that guarantee polynomial-time convergence for a larger class of networks than current sampling algorithms provide. These new methods are variants of the known likelihood weighting algorithm. We make use of recent advances in the theory of optimal stopping rules for Monte Carlo simulation to obtain an inference approximation with a specified relative error and a small failure probability. We present an empirical evaluation of the algorithms, which demonstrates their improved performance.
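The base likelihood weighting algorithm that these variants extend can be sketched on a two-node network A -> B: evidence nodes are clamped rather than sampled, and each sample is weighted by the likelihood of the evidence. The numbers are illustrative, and the optimal stopping rule itself is not shown.

```python
import random

# An invented two-node network A -> B with binary variables.
P_A = [0.6, 0.4]                  # P(A)
P_B_given_A = [[0.9, 0.1],        # P(B | A=0)
               [0.3, 0.7]]        # P(B | A=1)

def likelihood_weighting(b_obs, n_samples, rng):
    """Estimate P(A=1 | B=b_obs). The evidence node B is never
    sampled; each sample of A is weighted by P(B=b_obs | A=a)."""
    num = den = 0.0
    for _ in range(n_samples):
        a = 1 if rng.random() < P_A[1] else 0
        w = P_B_given_A[a][b_obs]   # weight = likelihood of evidence
        num += w * a
        den += w
    return num / den
```

A fixed `n_samples` is used here; the paper's contribution is, roughly, a principled rule for when sampling may stop while still meeting the error and failure-probability targets.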
Flexible Policy Construction by Information Refinement
In Proceedings of the Twelfth Conference on Uncertainty in Artificial Intelligence, 1996
Cited by 7 (2 self)
Abstract:
Decision making under uncertainty addresses the problem of deciding which actions to take in the world, when there is uncertainty about the state of the world, and uncertainty as to the outcome of these actions. A rational approach to making good choices is the principle of maximum expected utility: the decision maker should act so as to maximize the expected benefits of the possible outcomes. The "textbook" approaches to decision analysis typically make the assumption that the computational costs involved are negligible. This assumption is not always appropriate. When computational costs cannot be ignored, a decision maker must be able to choose a tradeoff between computational costs and object value. This thesis proposes an approach to decision making called information refinement. It is an iterative, heuristic process which a decision maker can use to build a policy. We present three algorithms which use information refinement to construct policies for decision problems expressed...
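The maximum-expected-utility principle stated above reduces to a one-line computation once the probabilities and utilities are tabulated, which is exactly the "textbook" setting whose neglect of computational cost the abstract critiques. A minimal sketch with an invented weather-and-umbrella decision problem:

```python
def max_expected_utility(p_state, utility, actions):
    """Return the action maximizing sum_s P(s) * U(action, s),
    together with its expected utility. Computation cost is ignored,
    as in the textbook setting."""
    def eu(a):
        return sum(p * utility[a][s] for s, p in p_state.items())
    best = max(actions, key=eu)
    return best, eu(best)

# Invented example: choose an action under uncertain weather.
p_state = {"rain": 0.3, "sun": 0.7}
utility = {
    "take_umbrella":  {"rain": 8, "sun": 5},
    "leave_umbrella": {"rain": 0, "sun": 10},
}
```

Information refinement, as proposed in the thesis, would instead build such a policy incrementally, trading the quality of the chosen action against the cost of further computation.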