Results 1-10 of 42
A Guide to the Literature on Learning Probabilistic Networks From Data
, 1996
Cited by 172 (0 self)
This literature review discusses different methods under the general rubric of learning Bayesian networks from data, and includes some overlapping work on more general probabilistic networks. Connections are drawn between the statistical, neural network, and uncertainty communities, and between the different methodological communities, such as Bayesian, description length, and classical statistics. Basic concepts for learning and Bayesian networks are introduced and methods are then reviewed. Methods are discussed for learning parameters of a probabilistic network, for learning the structure, and for learning hidden variables. The presentation avoids formal definitions and theorems, as these are plentiful in the literature, and instead illustrates key concepts with simplified examples.

Keywords: Bayesian networks, graphical models, hidden variables, learning, learning structure, probabilistic networks, knowledge discovery.

I. Introduction

Probabilistic networks or probabilistic gra...
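The parameter-learning methods this review surveys can be illustrated with a minimal sketch (not drawn from the review itself): Bayesian updating of a single conditional-probability entry of a discrete network under a Dirichlet prior. The prior pseudo-counts and observed counts below are made-up illustration values.

```python
def posterior_mean(prior_counts, data_counts):
    """Posterior mean of a multinomial parameter under a Dirichlet prior:
    add observed counts to the prior pseudo-counts and normalize."""
    total = sum(p + d for p, d in zip(prior_counts, data_counts))
    return [(p + d) / total for p, d in zip(prior_counts, data_counts)]

# Uniform Dirichlet(1, 1) prior for a binary CPT entry P(X | Pa = u),
# updated with 10 observed cases (7 of them had X = true):
theta = posterior_mean([1, 1], [7, 3])
print(theta)  # posterior mean (1+7)/12 for true, (1+3)/12 for false
```

With larger pseudo-counts the prior dominates; with larger data counts the estimate approaches the maximum-likelihood frequencies, which is the usual trade-off these methods exploit.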
Accounting for Model Uncertainty in Survival Analysis Improves Predictive Performance
 In Bayesian Statistics 5
, 1995
Cited by 39 (12 self)
Survival analysis is concerned with finding models to predict the survival of patients or to assess the efficacy of a clinical treatment. A key part of the model-building process is the selection of the predictor variables. It is standard to use a stepwise procedure guided by a series of significance tests to select a single model, and then to make inference conditionally on the selected model. However, this ignores model uncertainty, which can be substantial. We review the standard Bayesian model averaging solution to this problem and extend it to survival analysis, introducing partial Bayes factors to do so for the Cox proportional hazards model. In two examples, taking account of model uncertainty enhances predictive performance, to an extent that could be clinically useful.

1 Introduction

From 1974 to 1984 the Mayo Clinic conducted a double-blinded randomized clinical trial involving 312 patients to compare the drug DPCA with a placebo in the treatment of primary biliary cirrhosis...
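The model-averaging idea can be sketched in a few lines (the models, predictions, and BIC values below are made up, and the BIC-based weights are only a common approximation to the posterior model probabilities that Bayes factors would give):

```python
import math

def bma_predict(predictions, bics):
    """Bayesian model averaging: weight each model's prediction by its
    approximate posterior probability, here exp(-BIC/2) renormalized."""
    m = min(bics)  # shift for numerical stability before exponentiating
    weights = [math.exp(-0.5 * (b - m)) for b in bics]
    z = sum(weights)
    weights = [w / z for w in weights]
    return sum(w * p for w, p in zip(weights, predictions))

# Three candidate survival models predicting 5-year survival probability
# for one hypothetical patient:
p = bma_predict([0.70, 0.62, 0.55], bics=[100.0, 101.5, 104.0])
print(round(p, 3))
```

The averaged prediction is pulled toward the best-supported model but still reflects the plausible alternatives, which is exactly the uncertainty a single stepwise-selected model discards.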
Sensitivity analysis in discrete Bayesian networks
 IEEE Transactions on Systems, Man, and Cybernetics
, 1997
Cited by 37 (4 self)
The paper presents an efficient computational method for performing sensitivity analysis in discrete Bayesian networks. The method exploits the structure of conditional probabilities of a target node given the evidence. First, the set of parameters which are relevant to the calculation of the conditional probabilities of the target node is identified. Next, this set is reduced by removing those combinations of the parameters which either contradict the available evidence or are incompatible. Finally, using the canonical components associated with the resulting subset of parameters, the desired conditional probabilities are obtained. In this way, an important saving in the calculations is achieved. The proposed method can also be used to compute exact upper and lower bounds for the conditional probabilities, hence a sensitivity analysis can be easily performed. Examples are used to illustrate the proposed methodology.
Network Engineering for Complex Belief Networks
 In Proc. UAI
, 1996
Cited by 33 (4 self)
Developing a large belief network, like any large system, requires systems engineering to manage the design and construction process. We propose that network engineering follow a rapid prototyping approach to network construction. We describe criteria for identifying network modules and the use of 'stubs' within a belief network. We propose an object-oriented representation for belief networks which captures the semantic as well as representational knowledge embedded in the variables, their values and their parameters. Methods for evaluating complex networks are described. Throughout the discussion, tools which support the engineering of large belief networks are identified.

1. Introduction

As belief networks become more popular and well understood as a tool for modeling uncertainty, and as the computational power of belief network inference engines increases, belief networks are being applied to problems of increasing size and complexity. In the early 1990s, Pathfinder, at 109 nodes...
Building probabilistic networks: where do the numbers come from? A guide to the literature
 IEEE Transactions on Knowledge and Data Engineering
, 2000
Cited by 29 (3 self)
Probabilistic networks are now fairly well established as practical representations of knowledge for reasoning under uncertainty, as demonstrated by an increasing number of successful applications in such domains as (medical) diagnosis and prognosis, planning, vision,
When do Numbers Really Matter?
 Journal of Artificial Intelligence Research
, 2002
Cited by 28 (6 self)
Common wisdom has it that small distinctions in the probabilities (parameters) quantifying a belief network do not matter much for the results of probabilistic queries. Yet, one can develop realistic scenarios under which small variations in network parameters can lead to significant changes in computed queries. A pending theoretical question is then to analytically characterize parameter changes that do or do not matter. In this paper, we study the sensitivity of probabilistic queries to changes in network parameters and prove some tight bounds on the impact that such parameters can have on queries. Our analytic results pinpoint some interesting situations under which parameter changes do or do not matter. These results are important for knowledge engineers as they help them identify influential network parameters. They also help explain some of the previous experimental results and observations with regards to network robustness against parameter changes.
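A toy check of the kind of bound involved (the two-node network and all numbers below are made up, and the bound stated in the comment is our paraphrase, not a quotation of the paper's theorem): for a single parameter change, the shift in the query's log-odds is bounded by the shift in the parameter's log-odds.

```python
import math

def query_prob(theta):
    """P(x | y) in a hypothetical X -> Y network with P(x) = 0.5,
    P(y | ~x) = 0.2, and P(y | x) = theta."""
    return 0.5 * theta / (0.5 * theta + 0.5 * 0.2)

def log_odds(p):
    return math.log(p / (1.0 - p))

# Perturb the parameter P(y | x) from 0.60 to 0.65 and compare the
# log-odds shift of the query to the log-odds shift of the parameter.
theta0, theta1 = 0.60, 0.65
query_shift = abs(log_odds(query_prob(theta1)) - log_odds(query_prob(theta0)))
param_shift = abs(log_odds(theta1) - log_odds(theta0))
print(query_shift <= param_shift)  # the bound holds on this example
```

On this example the query is noticeably less sensitive than the parameter, but the paper's point is that near-extreme parameters (log-odds far from zero) can make small absolute changes matter a great deal.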
Sensitivity analysis in Bayesian networks: From single to multiple parameters
 In 20th Conference on Uncertainty in Artificial Intelligence (UAI)
, 2004
Cited by 13 (2 self)
Previous work on sensitivity analysis in Bayesian networks has focused on single parameters, where the goal is to understand the sensitivity of queries to single parameter changes, and to identify single parameter changes that would enforce a certain query constraint. In this paper, we expand the work to multiple parameters which may be in the CPT of a single variable, or the CPTs of multiple variables. Not only do we identify the solution space of multiple parameter changes that would be needed to enforce a query constraint, but we also show how to find the optimal solution, that is, the one which disturbs the current probability distribution the least (with respect to a specific measure of disturbance). We characterize the computational complexity of our new techniques and discuss their applications to developing and debugging Bayesian networks, and to the problem of reasoning about the value (reliability) of new information.
Properties of Sensitivity Analysis of Bayesian Belief Networks
 Proceedings of the Joint Session of the 6th Prague Symposium of Asymptotic Statistics and the 13th Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, Union of Czech Mathematicians and Physicists
, 1999
Cited by 11 (3 self)
The assessments obtained for the various conditional probabilities of a Bayesian belief network inevitably are inaccurate. The inaccuracies involved influence the reliability of the network's output. By subjecting the belief network to a sensitivity analysis with respect to its conditional probabilities, the reliability of the output can be investigated. Unfortunately, straightforward sensitivity analysis of a Bayesian belief network is highly time-consuming. In this paper, we show that, by qualitative considerations, several analyses can be identified as being uninformative as the conditional probabilities under study cannot affect the network's output. In addition, we show that the analyses that are informative comply with simple mathematical functions; more specifically, we show that the network's output can be expressed as a quotient of two functions that are linear in a conditional probability under study. These properties allow for considerably reducing the computational burden of se...
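The quotient-of-linear-functions property can be demonstrated on a toy query (the network and its numbers are made up, and the fitting procedure below is only an illustration of the property, not the authors' algorithm): evaluate the output at three values of the parameter, fit f(t) = (a*t + b) / (c*t + 1), and the fit then reproduces the output exactly at any other value.

```python
def solve3(A, y):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [rhs] for row, rhs in zip(A, y)]
    n = len(M)
    for i in range(n):
        piv = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(M[i][c] * x[c] for c in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

def query(t):
    """Toy network output as a function of one conditional probability t."""
    return t / (t + 0.2)

# Three probes of the network output; each gives one linear equation
# a*t + b - f*t*c = f for the unknown coefficients (with d normalized to 1).
xs = [0.1, 0.5, 0.9]
fs = [query(t) for t in xs]
A = [[t, 1.0, -f * t] for t, f in zip(xs, fs)]
a, b, c = solve3(A, fs)

t = 0.3  # an unprobed parameter value: the fitted quotient matches exactly
fitted = (a * t + b) / (c * t + 1.0)
```

This is why a sensitivity analysis needs only a handful of propagations per parameter rather than a dense sweep: three (in general, a constant number of) evaluations pin down the whole response curve.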
Bayesian Error-Bars for Belief Net Inference
 In Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence
, 2001
Cited by 11 (5 self)
A Bayesian Belief Network (BN) is a model of a joint distribution over a finite set of variables, with a DAG structure to represent the immediate dependencies between the variables, and a set of parameters (a.k.a. CP-tables) to represent the local conditional probabilities of a node, given each assignment to its parents. In many situations, the parameters are themselves treated as random variables, reflecting the uncertainty remaining after drawing on knowledge of domain experts and/or observing data generated by the network. A distribution over the CP-table parameters induces a distribution for the response the BN will return to any "What is P(H | E)?" query. This paper investigates the distribution of this response, shows that it is asymptotically normal, and derives expressions for its mean and asymptotic variance. We show that this computation has the same complexity as simply computing the (mean value of the) response, i.e., O(n exp(w)), where n is the number of variables and w is the effective tree width. We also provide empirical evidence showing that the error-bars computed from our estimates are fairly accurate in practice, over a wide range of belief net structures and queries.
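The quantity being approximated can be made concrete with a brute-force Monte Carlo baseline (this is not the paper's analytic method, and the toy network and Beta posterior below are made up): sample the uncertain parameter from its posterior, propagate each sample through the query, and read the error-bar off the spread of the resulting responses.

```python
import random
import statistics

random.seed(0)

def query(theta):
    """P(x | y) in a toy X -> Y network with P(x) = 0.5, P(y | ~x) = 0.2,
    and the uncertain parameter theta = P(y | x)."""
    return 0.5 * theta / (0.5 * theta + 0.5 * 0.2)

# Hypothetical posterior over the parameter: Beta(70, 30), i.e. roughly
# 0.7 with standard deviation about 0.046 after 100 observations.
samples = [query(random.betavariate(70, 30)) for _ in range(20000)]
mean = statistics.fmean(samples)
err = statistics.stdev(samples)
print(f"query response {mean:.3f} +/- {err:.3f}")
```

The paper's contribution is obtaining this mean and spread analytically, in the same O(n exp(w)) time as a single query, rather than by thousands of repeated propagations as above.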
Gradient descent training of Bayesian networks
 Proceedings of the Fifth European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty (ECSQARU
, 1999
Cited by 10 (0 self)
As shown by Russell et al., 1995 [7], Bayesian networks can be equipped with a gradient descent learning method similar to the training method for neural networks. The calculation of the required gradients can be performed locally along with propagation. We review how this can be done, and we show how the gradient descent approach can be used for various tasks like tuning and training with training sets of definite as well as non-definite classifications. We introduce tools for resistance and damping to guide the direction of convergence, and we use them for a new adaptation method which can also handle situations where parameters in the network covary.
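The gradient-based tuning idea reduces, in the simplest possible case, to gradient ascent on the log-likelihood of a single CPT parameter (the counts, learning rate, and clipping below are made-up illustration choices, not the paper's method):

```python
def log_lik_grad(theta, k, n):
    """Gradient of the binomial log-likelihood
    k*ln(theta) + (n-k)*ln(1-theta) with respect to theta."""
    return k / theta - (n - k) / (1.0 - theta)

# Ascend the likelihood of observing X = true in 7 of 10 cases,
# clipping theta into (0, 1) after each step.
theta, lr = 0.5, 1e-3
for _ in range(5000):
    theta += lr * log_lik_grad(theta, k=7, n=10)
    theta = min(max(theta, 1e-6), 1 - 1e-6)
print(round(theta, 3))  # converges toward the ML estimate 7/10
```

In a full network the same gradient is computed per CPT entry from quantities available during propagation; the resistance and damping tools the abstract mentions then shape how these per-parameter steps interact.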