## Probability Update: Conditioning vs. Cross-Entropy (1997)

Venue: Proc. Thirteenth Conference on Uncertainty in Artificial Intelligence (UAI), Providence, Rhode Island, August 1-3, 1997

Citations: 16 (2 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Grove97probabilityupdate,
  author    = {Adam Grove and Joseph Halpern},
  title     = {Probability Update: Conditioning vs. Cross-Entropy},
  booktitle = {Proc. Thirteenth Conference on Uncertainty in Artificial Intelligence (UAI)},
  year      = {1997},
  pages     = {208--214}
}
```

### Abstract

Conditioning is the generally agreed-upon method for updating probability distributions when one learns that an event is certainly true. But it has been argued that we need other rules, in particular the rule of cross-entropy minimization, to handle updates that involve uncertain information. In this paper we re-examine such a case: van Fraassen's Judy Benjamin problem [1987], which in essence asks how one might update given the value of a conditional probability. We argue that---contrary to the suggestions in the literature---it is possible to use simple conditionalization in this case, and thereby obtain answers that agree fully with intuition. This contrasts with proposals such as cross-entropy, which are easier to apply but can give unsatisfactory answers. Based on the lessons from this example, we speculate on some general philosophical issues concerning probability update.
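As a rough numerical illustration of the cross-entropy update the abstract criticizes (a sketch under my own simplifying assumptions, not the paper's code): a toy three-region Judy Benjamin setup, where the agent learns only the conditional probability `P(R1 | R1 or R2) = q` and updates by minimizing KL divergence to the uniform prior. The region names and the grid-search helper are illustrative choices.

```python
import math

# Toy Judy Benjamin setup (illustrative, not taken verbatim from the paper):
# three equally likely regions -- R1 = Red HQ area, R2 = Red 2nd company
# area, B = Blue territory.
PRIOR = {"R1": 1 / 3, "R2": 1 / 3, "B": 1 / 3}

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    return sum(pi * math.log(pi / q[k]) for k, pi in p.items() if pi > 0)

def ce_update(q_cond, steps=100_000):
    """Cross-entropy update: among all distributions satisfying
    P(R1 | R1 or R2) = q_cond, pick the one closest to PRIOR in KL
    divergence, by a simple grid search over b = P(B)."""
    best_b, best_val = None, float("inf")
    for i in range(1, steps):
        b = i / steps
        p = {"R1": q_cond * (1 - b), "R2": (1 - q_cond) * (1 - b), "B": b}
        val = kl(p, PRIOR)
        if val < best_val:
            best_b, best_val = b, val
    return best_b

# For q_cond = 3/4, CE raises P(Blue) above the prior 1/3 (to about 0.363),
# even though the report said nothing about Blue territory -- the kind of
# counterintuitive answer discussed in the paper.
p_blue = ce_update(0.75)
print(round(p_blue, 3))  # ~0.363
```

The analytic optimum here satisfies b/(1-b) = exp(q ln q + (1-q) ln(1-q)), so any q other than 1/2 shifts probability onto Blue territory; the grid search just makes that visible numerically.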

### Citations

1421 | On information and sufficiency - Kullback, Leibler - 1951

Citation context: "...satisfies the new information and is in some sense the 'closest' to the original distribution Pr. Certainly the best known and most studied of these proposals is to use the rule of minimizing cross-entropy [Kullback and Leibler 1951] as a way of updating with general probabilistic information. This rule can also be shown to generalize Jeffrey's rule [Jeffrey 1983], which in turn generalizes conditioning. But is cross-entropy (CE)..."
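The hierarchy this context mentions (cross-entropy generalizes Jeffrey's rule, which generalizes conditioning) is easy to see concretely on a two-cell partition. A minimal sketch; the function name and the numeric values are illustrative:

```python
# Jeffrey's rule on the partition {E, not-E}: after learning that E now has
# probability q (rather than being learned with certainty), every event A
# gets the mixture
#   P'(A) = P(A | E) * q + P(A | not-E) * (1 - q).
def jeffrey_update(p_a_given_e, p_a_given_not_e, q):
    return p_a_given_e * q + p_a_given_not_e * (1 - q)

# q = 1 recovers ordinary conditioning on E:
assert jeffrey_update(0.9, 0.2, 1.0) == 0.9

# Uncertain evidence (q = 0.7) mixes the two conditionals:
print(jeffrey_update(0.9, 0.2, 0.7))
```

Cross-entropy minimization reproduces exactly this rule when the new information fixes the probabilities of a partition, which is why it is viewed as the further generalization.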

284 | Scientific Reasoning: The Bayesian Approach - Howson, Urbach - 1989

Citation context: "...with respect to the parameterization we have chosen, and vice versa. This is, of course, just an instance of the well-known impossibility of defining a unique notion of uniformity in a continuous space [Howson and Urbach 1989]. Since M(q) is an event in the new space PHQ, Judy should be able to condition on it. One might object that, since M(q) is an event of measure 0, conditioning is not well defined. This is true, but..."

243 | The Logic of Decision - Jeffrey - 1965

Citation context: "...probability theory so far, conditioning has been a mostly sufficient answer to the problem of update. But many people have argued that conditioning is not a philosophically adequate answer (in particular, [Jeffrey 1983]). Once we try to build a truly intelligent agent interacting in complex ways with a rich world, conditioning may end up being practically inadequate as well. The problem is that some of the information..."

212 | Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy - Shore, Johnson - 1980

Citation context: "...[Jeffrey 1983], which in turn generalizes conditioning. But is cross-entropy (CE) really such a good rule? The traditional justifications of CE are that it satisfies various sets of criteria (such as those of [Shore and Johnson 1980]) which, while plausible, are certainly not compelling [Uffink 1995]. Van Fraassen, in a paper entitled 'A problem for relative information [CE] minimizers in probability kinematics' [1981] instead a..."

60 | Updating Subjective Probability - Diaconis, Zabell - 1982

Citation context: "...it is not an event in S. Yet it is clearly useful information. So how should we incorporate it? There is in fact a rich literature on the subject (e.g., see [Bacchus, Grove, Halpern, and Koller 1994; Diaconis and Zabell 1982; Jeffrey 1983; Jaynes 1983; Paris and Vencovska 1992; Uffink 1995]). Most proposals attempt to find the probability distribution that satisfies the new information and is in some sense the 'closest'..."

20 | Can the maximum entropy principle be explained as a consistency requirement - Uffink - 1995

Citation context: "...incorporate it? There is in fact a rich literature on the subject (e.g., see [Bacchus, Grove, Halpern, and Koller 1994; Diaconis and Zabell 1982; Jeffrey 1983; Jaynes 1983; Paris and Vencovska 1992; Uffink 1995]). Most proposals attempt to find the probability distribution that satisfies the new information and is in some sense the 'closest' to the original distribution Pr. Certainly the best known and most..."

19 | Entropy and uncertainty - Seidenfeld - 1987

Citation context: "...Blue territory'; if q = 0, by simple conditionalization on 'Red 2nd company area or Blue territory'. We remark that this behavior of CE has also been discussed and criticized in a more general setting [Seidenfeld 1987]. This first principle already eliminates the intuitive rule..."

13 | Generating new beliefs from old - Bacchus, Grove, et al. - 1994

11 | The constraint rule of the maximum entropy principle - Uffink - 1996

Citation context: "...imperative: 'Make your beliefs be such that this is true!' This interpretation of probabilistic information as constraints is a common one (especially in the context of CE), but is difficult to justify [Uffink 1996]. Van Fraassen is, of course, quite aware of the philosophical issues raised by his interpretation; see [van Fraassen 1980]. But is the interpretation that HQ's statement should be regarded as a constraint..."

9 | A problem for relative information minimizers in probability kinematics - van Fraassen - 1981

5 |
A method for updating justifying minimum cross entropy
- Paris, Vencovska
- 1991
(Show Context)
Citation Context ...ormation. So how should we incorporate it? There is in fact a rich literature on the subject (e.g., see [Bacchus, Grove, Halpern, and Koller 1994; Diaconis and Zabell 1982; Jeffrey 1983; Jaynes 1983; =-=Paris and Vencovska 1992; Uffink 1-=-995]). Most proposals attempt to find the probability distribution that satisfies the new information and is in some sense the "closest" to the original distribution Pr. Certainly the best k... |

5 | Symmetries of personal probability kinematics - van Fraassen - 1987

4 | Rational belief and probability kinematics - van Fraassen - 1980