Results 1–10 of 32
Game Theory, Maximum Entropy, Minimum Discrepancy And Robust Bayesian Decision Theory
 ANNALS OF STATISTICS
, 2004
A Field Guide to Recent Work on the Foundations of Statistical Mechanics
 FORTHCOMING IN DEAN RICKLES (ED.): THE ASHGATE COMPANION TO CONTEMPORARY PHILOSOPHY OF PHYSICS. LONDON: ASHGATE.
, 2008
Probability Update: Conditioning vs. Cross-Entropy
 In Proc. Thirteenth Conference on Uncertainty in Artificial Intelligence (UAI)
, 1997
Abstract

Cited by 18 (2 self)
Conditioning is the generally agreed-upon method for updating probability distributions when one learns that an event is certainly true. But it has been argued that we need other rules, in particular the rule of cross-entropy minimization, to handle updates that involve uncertain information. In this paper we reexamine such a case: van Fraassen's Judy Benjamin problem [1987], which in essence asks how one might update given the value of a conditional probability. We argue that, contrary to the suggestions in the literature, it is possible to use simple conditionalization in this case, and thereby obtain answers that agree fully with intuition. This contrasts with proposals such as cross-entropy, which are easier to apply but can give unsatisfactory answers. Based on the lessons from this example, we speculate on some general philosophical issues concerning probability update.
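The cross-entropy rule this abstract contrasts with conditionalization can be sketched numerically. The prior (1/2, 1/4, 1/4) over {Blue, Red HQ, Red Second Company} and the constraint P(Red HQ | Red) = 3/4 are the standard presentation of the Judy Benjamin problem, not values stated in this abstract. Minimizing KL divergence to the prior subject to that constraint yields the often-cited P(Blue) ≈ 0.53 > 1/2, one of the results the paper finds counterintuitive:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Judy Benjamin setup (standard presentation, not from this abstract):
# prior over (Blue, Red HQ, Red 2nd) is (1/2, 1/4, 1/4); the new
# information fixes the conditional P(Red HQ | Red) = 3/4.
# Cross-entropy minimization picks, among all distributions satisfying
# the constraint, the one closest in KL divergence to the prior.
# Parametrize by b = P(Blue): the constraint forces the red mass (1 - b)
# to split 3/4 : 1/4 between the two red regions.
prior = np.array([0.5, 0.25, 0.25])

def kl_to_prior(b):
    post = np.array([b, 0.75 * (1 - b), 0.25 * (1 - b)])
    return float(np.sum(post * np.log(post / prior)))

res = minimize_scalar(kl_to_prior, bounds=(1e-9, 1 - 1e-9), method="bounded")
b_star = res.x  # stationarity gives 2b = 3**0.75 * (1 - b), so b ~ 0.533
```

The surprise is that `b_star` exceeds the prior P(Blue) = 1/2 even though the evidence says nothing about Blue directly.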
Relative entropy and inductive inference
 in AIP Conference Proceedings on Bayesian Inference and Maximum Entropy Methods in Science and Engineering
, 2004
The Constraint Rule of the Maximum Entropy Principle
, 1995
Abstract

Cited by 13 (0 self)
The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule of equating the expectation values of certain functions with their empirical averages. There are, however, various other ways in which one can construct constraints from empirical data, which make the maximum entropy principle lead to very different probability assignments. This paper shows that an argument by Jaynes to justify the usual constraint rule is unsatisfactory and investigates several alternative choices. The choice of a constraint rule is also show...
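The usual constraint rule discussed in this abstract, equating an expectation value with an empirical average, can be illustrated with a small sketch. The die alphabet and the target mean below are illustrative assumptions, not from the paper; maximizing entropy subject to E[x] = m yields the exponential family p_i ∝ exp(λx_i), with λ fixed by root-finding:

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative example (not from the paper): a distribution over die
# faces 1..6 whose entropy is maximized subject to the constraint rule
# "expectation value = empirical average". The maximizer is the
# exponential family p_i ∝ exp(lam * x_i); solve for lam so the
# constrained mean matches the empirical average.
x = np.arange(1, 7)
empirical_mean = 4.5  # hypothetical empirical average of observed rolls

def mean_at(lam):
    w = np.exp(lam * (x - x.mean()))  # centering only for numerical stability
    p = w / w.sum()
    return float(p @ x)

lam = brentq(lambda l: mean_at(l) - empirical_mean, -50.0, 50.0)
w = np.exp(lam * (x - x.mean()))
p = w / w.sum()  # maximum-entropy distribution with mean 4.5
```

With a different constraint rule (say, matching a variance instead of a mean) the same data would produce a quite different assignment, which is the paper's point.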
Probability distribution and entropy as a measure of uncertainty, arXiv:cond-mat/0612076
Abstract

Cited by 10 (4 self)
The relationship between three probability distributions and their maximizable entropy forms is discussed without postulating entropy properties. For this purpose, the entropy $I$ is defined as a measure of uncertainty of the probability distribution of a random variable $x$ by the variational relationship $dI = d\bar{x} - \overline{dx}$, a definition underlying the maximization of entropy for the corresponding distribution.
Optimal query forgery for private information retrieval
 IEEE Trans. Inform. Theory
, 2010
Abstract

Cited by 10 (4 self)
Abstract—We present a mathematical formulation for the optimization of query forgery for private information retrieval, in the sense that the privacy risk is minimized for a given traffic and processing overhead. The privacy risk is measured as an information-theoretic divergence between the user's query distribution and the population's, which includes the entropy of the user's distribution as a special case. We carefully justify and interpret our privacy criterion from diverse perspectives. Our formulation poses a mathematically tractable problem that bears substantial resemblance to rate-distortion theory. Index Terms—Entropy, Kullback–Leibler divergence, privacy risk, private information retrieval, query forgery.
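The privacy criterion in this abstract, a divergence between the user's apparent query distribution and the population's, can be sketched as follows. The four-query alphabet, the distributions, and the choice to forge queries from the population's distribution are illustrative assumptions; the paper solves for the optimal forgery, which this sketch does not attempt:

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence D(p || q) in nats; assumes q > 0 wherever p > 0.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical 4-query alphabet: the user's query distribution vs. the
# population's. The privacy risk is the divergence of the user's
# *apparent* distribution (genuine plus forged queries) from the population's.
p_user = np.array([0.70, 0.10, 0.10, 0.10])
q_pop = np.array([0.25, 0.25, 0.25, 0.25])

def privacy_risk(rho, r_forged):
    # rho = fraction of traffic that is forged; the apparent distribution
    # is the corresponding mixture of genuine and forged queries.
    apparent = (1.0 - rho) * p_user + rho * r_forged
    return kl(apparent, q_pop)

# Even the naive choice of forging from the population's own distribution
# drives the risk down as the traffic overhead rho grows.
risks = [privacy_risk(rho, q_pop) for rho in (0.0, 0.3, 0.6)]
```

The trade-off the paper optimizes is visible here: larger `rho` means lower risk but more wasted traffic, which is what gives the problem its rate-distortion flavor.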
Generalizing the lottery paradox
 The British Journal for the Philosophy of Science
Abstract

Cited by 9 (0 self)
This paper is concerned with formal solutions to the lottery paradox on which high probability defeasibly warrants acceptance. It considers some recently proposed solutions of this type and presents an argument showing that these solutions are trivial in that they boil down to the claim that perfect probability is sufficient for rational acceptability. The argument is then generalized, showing that a broad class of similar solutions faces the same problem. Over the past decades, there has been a steadily growing interest in utilizing probability theory to elucidate, or even analyze, concepts central to traditional epistemology. Special attention in this regard has been given to the notion of rational acceptability. Many have found the following thesis at least prima facie a promising starting point for a probabilistic elucidation of that notion: Sufficiency Thesis (ST) A proposition ϕ is rationally acceptable if Pr(ϕ) > t, where Pr is a probability distribution over propositions and t is a threshold value close to 1.
Seeing maximum entropy from the principle of virtual work
Abstract

Cited by 4 (3 self)
We propose an extension of the principle of virtual work of mechanics to the random dynamics of mechanical systems. The total virtual work of the interacting forces and inertial forces on every particle of the system is calculated by considering the motion of each particle. Then, according to the principle of Lagrange–d'Alembert for dynamical equilibrium, the vanishing ensemble average of the virtual work gives rise to the thermodynamic equilibrium state with maximization of thermodynamic entropy. This approach establishes a close relationship between the maximum entropy approach to statistical mechanics and a fundamental principle of mechanics, and constitutes an attempt to give the maximum entropy approach, considered by many as only an inference principle based on the subjectivity of probability and entropy, the status of a fundamental physics law.