## Parameter adjustment in Bayes networks. The generalized noisy OR-gate (1993)

Venue: Proceedings of the 9th Conference on Uncertainty in Artificial Intelligence

Citations: 69 (12 self)

### BibTeX

@INPROCEEDINGS{Diez93parameteradjustment,
  author    = {Francisco Javier D{\'i}ez},
  title     = {Parameter adjustment in {Bayes} networks. The generalized noisy {OR}-gate},
  booktitle = {Proceedings of the 9th Conference on Uncertainty in Artificial Intelligence},
  year      = {1993},
  pages     = {99--105},
  publisher = {Morgan Kaufmann}
}

### Abstract

Spiegelhalter and Lauritzen [15] studied sequential learning in Bayesian networks and proposed three models for the representation of conditional probabilities. A fourth model, presented here, assumes that the parameter distribution is given by a product of Gaussian functions and updates them from the λ and π messages of evidence propagation. We also generalize the noisy OR-gate to multivalued variables, develop an algorithm to compute probability in time proportional to the number of parents (even in networks with loops), and apply the learning model to this gate.
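As a concrete illustration of the linear-time property claimed in the abstract, here is a minimal sketch of the classical binary noisy OR-gate (the paper's contribution generalizes this to multivalued variables and adds parameter learning); the function name and interface are illustrative, not taken from the paper:

```python
def noisy_or(c, u):
    """P(X = 1 | u) under a binary noisy OR-gate.

    c[i] is the probability that cause i, acting alone, produces X;
    u[i] is 1 if cause i is present and 0 otherwise.
    Runs in time proportional to the number of parents.
    """
    p_inactive = 1.0  # probability that no present cause activates X
    for c_i, u_i in zip(c, u):
        if u_i:
            p_inactive *= 1.0 - c_i
    return 1.0 - p_inactive

# Two present causes with link probabilities 0.8 and 0.5:
# P(X = 1) = 1 - (1 - 0.8)(1 - 0.5) = 0.9
print(noisy_or([0.8, 0.5], [1, 1]))
```

Because the product factors over parents, there is no need to tabulate all parent configurations, which is what gives the OR-gate its storage and propagation advantage.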

### Citations

1283 | Local computations with probabilities on graphical structures and their application to expert systems
- Lauritzen, Spiegelhalter
- 1988
Citation context: "...of parents (even in networks with loops) and apply the learning model to this gate. 1 INTRODUCTION Knowledge acquisition is one of the bottlenecks in expert systems building. Bayesian networks (BNs) [10, 11, 8], besides having theoretical grounds supporting them as the soundest framework for uncertain reasoning, offer an important advantage for model construction. In this field, the task consists of buildin..."

373 | Fusion, Propagation and Structuring in Belief Networks
- Pearl
- 1986
Citation context: "...of parents (even in networks with loops) and apply the learning model to this gate. 1 INTRODUCTION Knowledge acquisition is one of the bottlenecks in expert systems building. Bayesian networks (BNs) [10, 11, 8], besides having theoretical grounds supporting them as the soundest framework for uncertain reasoning, offer an important advantage for model construction. In this field, the task consists of buildin..."

197 | Sequential updating of conditional probabilities on directed graphical structures
- Spiegelhalter, Lauritzen
- 1990
Citation context: "...ent in Bayes networks. The generalized noisy OR-gate. F. J. Díez, Dpto. Informática y Automática, UNED, Senda del Rey, 28040 Madrid, Spain <FJavier.Diez@uned.es>. Abstract: Spiegelhalter and Lauritzen [15] studied sequential learning in Bayesian networks and proposed three models for the representation of conditional probabilities. A fourth model, shown here, assumes that the parameter distribution is g..."

183 | Theory refinement on Bayesian networks
- Buntine
- 1991
Citation context: "...unobservable or unobserved variable was not obtained directly but inferred from the values of other variables. Unfortunately, the construction of general BNs from incomplete databases is very complex [1] and no normative algorithm exists yet. So we assume that an initial causal network has been elicited from human experts. The use of Gaussian distributions allows us to integrate easily subjective ass..."

133 | An algebra of Bayesian belief universes for knowledge-based systems
- Jensen, Olsen, et al.
- 1990
Citation context: "...X. acquisition. This model can also save storage space, but if a clustering method is chosen for evidence propagation, the conditional probability table of every family must be worked out in advance [7, 8], thus wasting the computational advantage of the OR-gate. For this reason, after formalizing the model, we will now develop an algorithm for computing probability in time proportional to the number..."

108 | Some practical issues in constructing belief networks
- Henrion
- 1987
Citation context: "...simplifies knowledge acquisition, saves storage space and allows evidence propagation in time proportional to the number of parents. A generalization for multivalued variables was introduced by Henrion [5] in order to simplify knowledge...¹ (¹Only eqs. (18) and (20) would be slightly modified. We have here chosen the original definition just for simplicity.)"

51 | Fusion and Propagation with Multiple Observations in Belief Networks
- Peot, Shachter
- 1991
Citation context: "...ed by a constant. It is not necessary to have normalized π's, and instead of defining π(x|Θ_X, Θ⁺_X) ≡ P(x|e⁺_X, Θ_X, Θ⁺_X) (27), after [10], we could have defined it after [13, 2]: π(x|Θ_X, Θ⁺_X) ≡ P(x, e⁺_X|Θ_X, Θ⁺_X) (28). Therefore, this formalism can also be applied when evidence is propagated using the local conditioning algorithm [2] and so..."

27 | Local conditioning in Bayesian networks
- Diez
- 1996
Citation context: "...take advantage of this possibility. Section 3.1 generalizes the noisy OR to multivalued variables and develops efficient formulas for propagating evidence. They allow the local conditioning algorithm [2] to exploit the OR-gate even in multiply-connected networks. 2 PARAMETER ADJUSTMENT 2.1 ASSUMPTIONS We introduce in this section the hypotheses which constitute the basis of our model. Every case i is..."

23 | Assessment, criticism and improvement of imprecise subjective probabilities for a medical expert system
- Spiegelhalter, Franklin, et al.
- 1989
Citation context: "...models: discretization of parameters, Dirichlet distributions, and Gaussian distributions for the log-odds relative to the probability of a reference state. The second approach is applied in [9] and [14]. The problem addressed in this paper is, as in S-L, to update sequentially the parameters of a probability distribution. The main difference from their work is that we assume a normal distribution f..."

12 | A statistical semantics for causation
- Pearl, Verma
- 1991
Citation context: "...n general anomalies, whose presence can produce X. In other words, a link in the OR-gate represents the intuitive notion of causation ('U produces X'), not only the statistical definition given in [12]. The main advantage of the OR-gate is that the number of parameters is proportional to the number of causes, while it was exponential in the general case. As a consequence, the OR-gate simplifies k..."
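The parameter-count advantage described in this citation context can be checked with a few lines (the helper names are illustrative, not from the paper):

```python
def full_cpt_params(n_parents):
    # A general binary node needs one independent probability
    # per configuration of its binary parents: 2**n entries.
    return 2 ** n_parents

def noisy_or_params(n_parents):
    # A noisy OR-gate needs only one link probability per cause.
    return n_parents

# With 10 causes: 1024 table entries versus 10 link parameters.
print(full_cpt_params(10), noisy_or_params(10))
```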

11 | Qualitative propagation and scenario-based schemes for explaining probabilistic reasoning
- Henrion, Druzdzel
- 1991
Citation context: "...te because it appears much more often. An additional advantage of these gates is that they enable us to generate explanations of why the evidence at hand has increased or reduced the probability of X [6]. 3.2 PARAMETER ADJUSTMENT FOR THE OR-GATE We are now going to develop a formalism for parameter adjustment in the OR-gate, similar to that of section 2.3 for the general case. The starting point is..."

8 | aHugin: A system creating adaptive causal probabilistic networks. In: Dubois et al.
- Olesen, Lauritzen
- 1992
Citation context: "...ifferent models: discretization of parameters, Dirichlet distributions, and Gaussian distributions for the log-odds relative to the probability of a reference state. The second approach is applied in [9] and [14]. The problem addressed in this paper is, as in S-L, to update sequentially the parameters of a probability distribution. The main difference from their work is that we assume a normal distr..."

8 | Probabilistic reasoning in expert systems
- Pearl
- 1988
Citation context: "...of parents (even in networks with loops) and apply the learning model to this gate. 1 INTRODUCTION Knowledge acquisition is one of the bottlenecks in expert systems building. Bayesian networks (BNs) [10, 11, 8], besides having theoretical grounds supporting them as the soundest framework for uncertain reasoning, offer an important advantage for model construction. In this field, the task consists of buildin..."

1 | Distributed reasoning in Bayesian expert systems
- Díez, Mira
- 1993
Citation context: "...cally, i.e. considering the messages received at node X and the parameters of its family. This is a consequence of the global independence assumption. It allows a distributed learning capability (see [3] and fig. 2). • Eq. (23), for P(x|u), is equivalent to eq. (17) for P(x|u, Θ_x). The only difference is that average values must be taken instead of the original distribution. The same is tr..."