Results 1–10 of 17
Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross entropy
 IEEE Trans. Information Theory
, 1980
"... dple of min imum cromentropy (mhlmum dire&d dfvergenoe) are shown tobeunfquelycomxtmethodsforhductiveinf~whennewinformnt ionlsghninthefomlofexpe&edvalues.ReviousjILstit icatioaslLve ..."
Abstract

Cited by 255 (2 self)
 Add to MetaCart
... principle of minimum cross-entropy (minimum directed divergence) are shown to be uniquely correct methods for inductive inference when new information is given in the form of expected values. Previous justifications have ...
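The setting described here, where new information arrives as expected values, can be illustrated with the classic die example: a minimal sketch (not the paper's axiomatic derivation) that finds the maximum-entropy distribution over six faces subject to a prescribed mean, solving for the Lagrange multiplier by bisection.

```python
import math

def maxent_die(target_mean, faces=6, tol=1e-12):
    """Maximum-entropy distribution over faces 1..faces subject to the
    expected-value constraint E[i] = target_mean. The solution has the
    exponential form p_i proportional to exp(-lam * i); we solve for lam
    by bisection, since the mean is monotone decreasing in lam."""
    def mean_for(lam):
        w = [math.exp(-lam * i) for i in range(1, faces + 1)]
        z = sum(w)
        return sum(i * wi for i, wi in zip(range(1, faces + 1), w)) / z

    lo, hi = -50.0, 50.0  # bracket: mean_for(lo) > target > mean_for(hi)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid  # mean too large: need a larger multiplier
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * i) for i in range(1, faces + 1)]
    z = sum(w)
    return [wi / z for wi in w]

# Die with prescribed mean 4.5: probabilities tilt toward high faces.
p = maxent_die(4.5)
```

Any other distribution matching the mean constraint has strictly lower entropy, which is the sense in which the inference is "uniquely correct".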
A Natural Law of Succession
, 1995
"... Consider the following problem. You are given an alphabet of k distinct symbols and are told that the i th symbol occurred exactly ni times in the past. On the basis of this information alone, you must now estimate the conditional probability that the next symbol will be i. In this report, we presen ..."
Abstract

Cited by 40 (3 self)
 Add to MetaCart
Consider the following problem. You are given an alphabet of k distinct symbols and are told that the i-th symbol occurred exactly n_i times in the past. On the basis of this information alone, you must now estimate the conditional probability that the next symbol will be i. In this report, we present a new solution to this fundamental problem in statistics and demonstrate that our solution outperforms standard approaches, both in theory and in practice.
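The "standard approaches" benchmarked here include Laplace's classic rule of succession; a minimal sketch of that baseline (the report's own estimator is not reproduced here):

```python
def laplace_estimate(counts):
    """Laplace's rule of succession (add-one smoothing): given counts
    n_1..n_k for a k-symbol alphabet with N = sum(n_i) observations,
    estimate the probability of next symbol i as (n_i + 1) / (N + k)."""
    n = sum(counts)
    k = len(counts)
    return [(ni + 1) / (n + k) for ni in counts]

# k = 3 symbols, N = 4 observations; the unseen symbol still gets mass.
probs = laplace_estimate([3, 1, 0])  # -> [4/7, 2/7, 1/7]
```

Note that the estimate never assigns zero probability to an unseen symbol, which is the property any succession law must preserve.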
Interval-Valued Probabilities
, 1998
"... 0 =h 0 in the diagram. The sawtooth line reflects the fact that even when the principle of indifference can be applied, there may be arguments whose strength can be bounded no more precisely than by an adjacent pair of indifference arguments. Note that a=h in the diagram is bounded numerically on ..."
Abstract

Cited by 24 (1 self)
 Add to MetaCart
(Show Context)
... a′/h′ in the diagram. The sawtooth line reflects the fact that even when the principle of indifference can be applied, there may be arguments whose strength can be bounded no more precisely than by an adjacent pair of indifference arguments. Note that a/h in the diagram is bounded numerically only by 0.0 and the strength of a″/h″. Keynes' ideas were taken up by B. O. Koopman [14, 15, 16], who provided an axiomatization for Keynes' probability values. The axioms are qualitative, and reflect what Keynes said about probability judgment. (It should be remembered that for Keynes probability judgment was intended to be objective in the sense that logic is objective. Although different people may accept different premises, whether or not a conclusion follows logically from a given set of premises is objective. Though Ramsey [26] attacked this aspect of Keynes' theory, it can be argued ...
A tutorial introduction to decision theory
 IEEE Transactions on Systems Science and Cybernetics
, 1968
"... AbstractDecision theory provides a rational framework for choosing between alternative courses of action when the consequences resulting from this choice are imperfectly known. Two ..."
Abstract

Cited by 15 (0 self)
 Add to MetaCart
Decision theory provides a rational framework for choosing between alternative courses of action when the consequences resulting from this choice are imperfectly known. Two ...
Dependency Language Modeling
, 1997
"... This report summarizes the work of the Dependency Language Modeling group at the 1996 Summer Speech Workshop at the Center for Language and Speech Processing at Johns Hopkins University (WS96). We motivate and descibe a novel statistical language model that models the syntactic dependencies between ..."
Abstract

Cited by 8 (3 self)
 Add to MetaCart
This report summarizes the work of the Dependency Language Modeling group at the 1996 Summer Speech Workshop at the Center for Language and Speech Processing at Johns Hopkins University (WS96). We motivate and describe a novel statistical language model that models the syntactic dependencies between words. The model is formulated in the maximum entropy framework, which expresses statistical constraints on the frequencies of various types of dependencies, as well as the standard N-gram statistics. We describe how this model was applied to the recognition of spontaneous English speech from the Switchboard corpus. Due to implementation constraints, only a reduced version of our model could be tested so far. The model gave a modest improvement over an N-gram baseline model. A byproduct of the project is the Maximum Entropy Modeling Toolkit (MEMT), a freely available software package for domain-independent maximum entropy modeling.
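A conditional maximum-entropy (log-linear) model of the kind this abstract describes can be sketched as follows; the feature names and weights here are hypothetical placeholders for illustration, not taken from the workshop system or from MEMT.

```python
import math

def maxent_prob(features, weights, candidates):
    """Conditional maximum-entropy (log-linear) model: the probability of
    each candidate y is proportional to exp(sum of the weights of its
    active features), normalized over all candidates."""
    scores = {y: math.exp(sum(weights.get(f, 0.0) for f in features(y)))
              for y in candidates}
    z = sum(scores.values())
    return {y: s / z for y, s in scores.items()}

# Toy example: score the next word using an N-gram feature together with
# a (hypothetical) syntactic-dependency feature, as the model combines both.
weights = {"bigram:the_dog": 1.2, "dep:head=barked": 0.8}

def features(word):
    feats = ["bigram:the_" + word]
    if word == "dog":
        feats.append("dep:head=barked")
    return feats

p = maxent_prob(features, weights, ["dog", "cat"])
```

Training would fit the weights so that expected feature frequencies match their observed frequencies; the sketch above only shows the resulting model form.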
Uncertain Inferences and Uncertain Conclusions
, 1996
"... Uncertainty may be taken to characterize inferences, their conclusions, their premises or all three. Under some treatments of uncertainty, the inference itself is never characterized by uncertainty. We explore both the significance of uncertainty in the premises and in the conclusion of an arg ..."
Abstract

Cited by 3 (0 self)
 Add to MetaCart
Uncertainty may be taken to characterize inferences, their conclusions, their premises, or all three. Under some treatments of uncertainty, the inference itself is never characterized by uncertainty. We explore the significance of uncertainty both in the premises and in the conclusion of an argument that involves uncertainty. We argue that it is natural for uncertainty to characterize the conclusion of an inference, but that there is an interplay between uncertainty in the premises and uncertainty in the procedure of argument itself. We show that it is possible in principle to incorporate all uncertainty in the premises, rendering uncertain arguments deductively valid. But we then argue (1) that this does not reflect human argument, (2) that it is computationally costly, and (3) that the gain in simplicity obtained by allowing uncertainty in inference can sometimes outweigh the loss of flexibility it entails.
Semantics for Interval Probabilities
 In Proc. of 15th International FLAIRS Conference, 253–257, AAAI
, 2002
"... We look at the several sorts of semantics that have been provided for “probability”, and explore the possibilities of generalizing them to imprecise probabilities. The Support Set of a statement is a useful auxiliary construct, however probability is interpreted. ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
(Show Context)
We look at the several sorts of semantics that have been provided for “probability”, and explore the possibilities of generalizing them to imprecise probabilities. The Support Set of a statement is a useful auxiliary construct, however probability is interpreted.
On derivation of the extreme value (EV) type III distribution for low flows using entropy
"... ABSTRACT The extreme value type III distribution was derived by using the principle of maximum entropy. The derivation required only two constraints to be determined from data, and yielded a procedure for estimation of distribution parameters. This method of parameter estimation was comparable to th ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
The extreme value type III distribution was derived by using the principle of maximum entropy. The derivation required only two constraints to be determined from data, and yielded a procedure for estimation of distribution parameters. This method of parameter estimation was comparable to the methods of moments (MOM) and maximum likelihood estimation (MLE) for the low flow data used.
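The general shape of such a derivation, assuming two data-determined constraints g_1, g_2 as the abstract states (the paper's specific constraint functions are not reproduced here), is the standard constrained entropy maximization:

```latex
\max_f \; H[f] = -\int f(x)\,\ln f(x)\,dx
\quad\text{s.t.}\quad
\int f(x)\,dx = 1,\qquad \int g_i(x)\,f(x)\,dx = c_i \;\;(i = 1, 2)
\;\;\Longrightarrow\;\;
f(x) = \exp\!\bigl(-\lambda_0 - \lambda_1 g_1(x) - \lambda_2 g_2(x)\bigr)
```

The multipliers λ0, λ1, λ2 are fixed by the normalization and the two constraints, which is what yields the parameter-estimation procedure from data.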
Information Physics: The New Frontier
"... Abstract. At this point in time, two major areas of physics, statistical mechanics and quantum mechanics, rest on the foundations of probability and entropy. The last century saw several significant fundamental advances in our understanding of the process of inference, which make it clear that these ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
(Show Context)
Abstract. At this point in time, two major areas of physics, statistical mechanics and quantum mechanics, rest on the foundations of probability and entropy. The last century saw several significant fundamental advances in our understanding of the process of inference, which make it clear that these are inferential theories. That is, rather than being a description of the behavior of the universe, these theories describe how observers can make optimal predictions about the universe. In such a picture, information plays a critical role. What is more, little clues, such as the fact that black holes have entropy, continue to suggest that information is fundamental to physics in general. In the last decade, our fundamental understanding of probability theory has led to a Bayesian revolution. In addition, we have come to recognize that the foundations go far deeper, and that Cox's approach of generalizing a Boolean algebra to a probability calculus is the first specific example of the more fundamental idea of assigning valuations to partially-ordered sets. By considering this as a natural way to introduce quantification to the more fundamental notion of ordering, one obtains an entirely new way of deriving physical laws. I will introduce this new way of thinking by demonstrating how one can quantify partially-ordered sets and, in the process, derive physical laws. The implication is that physical law does not reflect the order in the universe; instead, it is derived from the order imposed by our description of the universe. Information physics, which is based on understanding the ways in which we both quantify and process information about the world around us, is a fundamentally new approach to science.
STATISTICAL MECHANICS, Professor
, 1061
"... In his course on SELECTED TOPICS IN ..."