## Objective Bayesian nets (2005)

Venue: We Will Show Them! Essays in Honour of Dov Gabbay

Citations: 12 (10 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Williamson05objectivebayesian,
  author    = {Jon Williamson},
  title     = {Objective Bayesian nets},
  booktitle = {We Will Show Them! Essays in Honour of Dov Gabbay},
  year      = {2005},
  pages     = {713--730},
  publisher = {College Publications}
}
```

### Abstract

I present a formalism that combines two methodologies: objective Bayesianism and Bayesian nets. According to objective Bayesianism, an agent’s degrees of belief (i) ought to satisfy the axioms of probability, (ii) ought to satisfy constraints imposed by background knowledge, and (iii) should otherwise be as non-committal as possible (i.e. have maximum entropy). Bayesian nets offer an efficient way of representing and updating probability functions. An objective Bayesian net is a Bayesian net representation of the maximum entropy probability function. I show how objective Bayesian nets can be constructed, updated and combined, and how they can deal with cases in which the agent’s background knowledge includes knowledge of qualitative influence relationships, e.g. causal influences. I then sketch a number of applications of the resulting formalism, showing how it can shed light on probability logic, causal modelling, logical reasoning, semantic reasoning, argumentation ...
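Clauses (i)–(iii) of the abstract can be illustrated with a small sketch. The numbers below are my own toy example, not from the paper: with Ω built from two binary variables A and B, and background knowledge imposing the single constraint p(a) = 0.8, the maximum entropy function spreads the remaining probability mass as evenly as the constraint allows.

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a distribution given as a list of probabilities."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Two binary variables A and B give four elementary outcomes:
# (a,b), (a,~b), (~a,b), (~a,~b).  Background knowledge imposes p(a) = 0.8.
# Grid-search over distributions satisfying the constraint, keeping the one
# of maximum entropy.
steps = 200
best_p, best_h = None, -1.0
for i in range(steps + 1):
    x = 0.8 * i / steps          # p(a,b); then p(a,~b) = 0.8 - x
    for j in range(steps + 1):
        y = 0.2 * j / steps      # p(~a,b); then p(~a,~b) = 0.2 - y
        p = [x, 0.8 - x, y, 0.2 - y]
        h = entropy(p)
        if h > best_h:
            best_p, best_h = p, h

# Maximum entropy spreads mass as evenly as the constraint allows:
# best_p -> [0.4, 0.4, 0.1, 0.1], i.e. A and B independent with p(b) = 0.5.
print([round(v, 2) for v in best_p], round(best_h, 3))
```

Note how the entropy maximiser renders A and B probabilistically independent: this is the kind of structure that the paper exploits to represent the maximum entropy function compactly as a Bayesian net.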

### Citations

6038 | A mathematical theory of communication - Shannon - 1948 |
Citation Context: ...principle is often called the maximum entropy principle; it considerably narrows down the values one can ascribe to p(…). [Footnotes: 1. We shall assume throughout that Ω is finite. The extension to the infinite is discussed in Williamson (2005c, §19). 2. (Shannon, 1948) 3. (Jaynes, 1957) 4. (Williamson, 2005b)] [Figure 1: A directed acyclic graph.] Two... |

1117 | Causality: Models, Reasoning, and Inference - Pearl - 2000 |

668 | Information theory and statistical mechanics - Jaynes - 1957 |
Citation Context: ...called the maximum entropy principle; it considerably narrows down the values one can ascribe to p(…). [Footnotes: 1. We shall assume throughout that Ω is finite. The extension to the infinite is discussed in Williamson (2005c, §19). 2. (Shannon, 1948) 3. (Jaynes, 1957) 4. (Williamson, 2005b)] [Figure 1: A directed acyclic graph.] Two questions remain... |

194 | Probabilistic Reasoning in Expert Systems: Theory and Algorithms - Neapolitan - 1990 |
Citation Context: ...following strategy: Step 1, determine conditional independencies that p ∈ Pβ must satisfy; Step 2, represent these by a directed acyclic graph G that satisfies the Markov Condition with respect to p; Step 3, maximise entropy to calculate the numerical parameters p(aᵢ | parᵢ) in the probability specification S. [Footnote 6: (Neapolitan, 1990)] [Figure 2: Constraint graph over A1–A5.] We ... |
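The three-step strategy quoted above ends with a probability specification of the form p(aᵢ | parᵢ). A minimal sketch of what such a specification buys, using a hypothetical chain A → B → C with conditional probability tables of my own invention (not from the paper):

```python
from itertools import product

# Hypothetical CPTs for a chain A -> B -> C.  Under the Markov Condition,
# the joint factorises as p(a, b, c) = p(a) * p(b | a) * p(c | b).
p_a = {True: 0.3, False: 0.7}
p_b_given_a = {True: {True: 0.9, False: 0.1},    # p(b | a)
               False: {True: 0.2, False: 0.8}}   # p(b | ~a)
p_c_given_b = {True: {True: 0.5, False: 0.5},    # p(c | b)
               False: {True: 0.4, False: 0.6}}   # p(c | ~b)

def joint(a, b, c):
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# The 2^3 joint probabilities must sum to 1 ...
total = sum(joint(a, b, c) for a, b, c in product([True, False], repeat=3))

# ... and marginals follow by summing out, e.g. p(C = true):
p_c_true = sum(joint(a, b, True) for a, b in product([True, False], repeat=2))
print(round(total, 10), round(p_c_true, 4))
```

The specification stores 5 numbers instead of the 7 needed for an unconstrained joint over three binary variables; the gap widens exponentially as variables are added, which is why a net representation of the maximum entropy function is worth constructing.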

60 | The Uncertain Reasoner’s Companion - Paris - 1994 |
Citation Context: ...[Footnote 16: (Neapolitan, 1990, Chapters 6–7)] [Footnote 17: N.b. β ⊆ β′. Here p′ ∈ Pβ′ is the function minimising d(p′, c), where c is the central function that gives the same probability to each elementary outcome (Paris, 1994, p. 120). As long as the constraints are all affine, this is the same function as that found by minimising d(p, c) first and then minimising d(p′, p); see Williams (1980, pp. 139–140).] [Footnote 18: (Williamson, 2...] |
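The "central function" c in the context above is the uniform distribution, and with d read as Kullback–Leibler divergence, minimising d(p, c) is the same as maximising entropy, since d(p, c) = log₂ n − H(p). A quick numerical check with an example distribution of my own:

```python
import math

def H(p):
    """Shannon entropy (bits)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl(p, q):
    """Kullback-Leibler divergence d(p, q) in bits."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# c is the central function: uniform over n elementary outcomes.
n = 4
c = [1 / n] * n
p = [0.4, 0.4, 0.1, 0.1]   # any distribution over the same n outcomes

# d(p, c) and log2(n) - H(p) coincide, so minimising distance to c
# over a constraint set is maximising entropy over that set.
print(round(kl(p, c), 6), round(math.log2(n) - H(p), 6))
```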

48 | Bayesian Nets and Causality: Philosophical and Computational Foundations - Williamson - 2005 |

21 | Motivating objective Bayesianism: From empirical constraints to objective probabilities - Williamson - 2007 |

20 | Bayesian conditionalisation and the principle of minimum information - Williams - 1980 |
Citation Context: ...associated conditional probability distributions; the rest of the net stays the same. [Footnotes: 12. (Williamson, 2005a, pp. 99–100) 13. (Williamson, 2005a, Theorem 5.7) 14. (Williamson, 2005a, Theorem 5.8) 15. (Williams, 1980, pp. 134–135) 16. (Neapolitan, 1990, Chapters 6–7) 17. N.b. β ⊆ β′. Here p′ ∈ Pβ′ is the function minimising d(p′, c), where c is the central function that gives the same probability to each elem...] |

6 | Philosophies of probability: objective Bayesianism and its challenges - Williamson - 2004 |

5 | Intersubjective probability and confirmation theory - Gillies - 1991 |
Citation Context: ...entin’s, then her constraint should override Quentin’s, and only the constraint p(a) = 0.8 should appear in the combined knowledge base β = γ ∪ δ. On the other hand, if neither reference class is narrower than the other then neither constraint is defeated by the other and the best one can do is include the constraint p(a) ∈ [0.7, 0.8] in β. [Footnote 19: (Gillies, 1991)] In general, we can say that Pβ is the smalles... |
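Under an interval constraint such as the p(a) ∈ [0.7, 0.8] quoted above, the maximum entropy function sits at the endpoint of the interval closest to 1/2, since binary entropy decreases as p(a) moves away from 1/2. A small check (a grid search of my own, using the quoted interval):

```python
import math

def H2(x):
    """Entropy (bits) of a binary distribution (x, 1 - x)."""
    return -sum(v * math.log2(v) for v in (x, 1 - x) if v > 0)

# Combined interval constraint: p(a) in [0.7, 0.8].  Entropy is maximised
# at the endpoint closest to 1/2, i.e. p(a) = 0.7.
grid = [0.7 + 0.1 * i / 1000 for i in range(1001)]
best = max(grid, key=H2)
print(round(best, 3))   # 0.7
```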