Results 1–4 of 4
The Maximum Entropy Approach and Probabilistic IR Models
ACM Transactions on Information Systems, 1998
Abstract

Cited by 12 (0 self)
The Principle of Maximum Entropy is discussed, and two classic probabilistic models of information retrieval, the Binary Independence Model of Robertson and Sparck Jones and the Combination Match Model of Croft and Harper, are derived using the maximum entropy approach. The assumptions on which the classical models are based are not made; in their place, the probability distribution of maximum entropy consistent with a set of constraints is determined. It is argued that this subjectivist approach is more philosophically coherent than the frequentist conceptualization of probability that is often assumed as the basis of probabilistic modeling, and that this philosophical stance has important practical consequences with respect to the realization of information retrieval research.
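The maximum entropy principle described in this abstract — choose the distribution with the greatest entropy among all those consistent with the known constraints — can be illustrated with a small brute-force sketch (the code and names below are illustrative, not from the paper). Among all distributions over the outcomes {1, 2, 3} constrained to have mean 2.0, the search recovers the uniform distribution, which is exactly what the principle predicts:

```python
import math

def entropy_bits(p):
    """Shannon entropy of a distribution, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Brute-force illustration of the maximum entropy principle:
# among all distributions (a, b, c) over outcomes {1, 2, 3}
# constrained to have mean 2.0, find the one with maximum entropy.
best_h, best_p = -1.0, None
for i in range(101):
    for j in range(101 - i):
        a, b = i / 100, j / 100
        c = 1.0 - a - b
        if abs(1 * a + 2 * b + 3 * c - 2.0) > 1e-6:
            continue  # mean constraint violated
        h = entropy_bits([a, b, c])
        if h > best_h:
            best_h, best_p = h, [a, b, c]

# The winner is (approximately) the uniform distribution,
# with entropy close to log2(3) ≈ 1.585 bits.
print(best_p, best_h)
```

On this 0.01 grid the search lands on a near-uniform distribution; with a finer grid it converges to (1/3, 1/3, 1/3). Adding further constraints (e.g. fixing a different mean) would tilt the maximum-entropy solution away from uniform, which is the mechanism the paper uses in place of the classical independence assumptions.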
Whatever happened to Information Theory in psychology?
Review of General Psychology, 2003
Abstract

Cited by 7 (0 self)
Although Shannon’s information theory is alive and well in a number of fields, after an initial fad in psychology during the 1950s and 1960s it is no longer much of a factor, beyond the word bit, in psychological theory. The author discusses what seem to him (and others) to be the root causes of an actual incompatibility between information theory and the psychological phenomena to which it has been applied. Claude Shannon, the creator of information theory, or communication theory as he preferred to call it, died on February 24, 2001, at age 84. So, I would like to dedicate this brief piece to his memory and in particular to recall his seminal contribution “A Mathematical Theory of Communication,” which was published in two ...
The Political Entropy of Vote Choice: An Empirical Test of Uncertainty Reduction
Presented at the 1997 Annual Meeting of the American Political Science Association, August 27–31, 1997
Abstract

Cited by 1 (0 self)
Recent literature in voting theory has developed the idea that individual voting preferences are probabilistic rather than strictly deterministic. This work builds upon spatial voting models (Enelow and Hinich 1981, Ferejohn and Fiorina 1974, Davis, DeGroot and Hinich 1972, Farquharson 1969) by introducing probabilistic uncertainty into the calculus of voting decisions at the individual level. Some suggest that the voting decision can be modeled with traditional probabilistic tools of uncertainty (Coughlin 1990, Coughlin and Nitzen 1981). Entropy is a measure of uncertainty that originated in statistical thermodynamics. Essentially, entropy indicates the amount of uncertainty in probability distributions (Soofi 1992), or it can be thought of as signifying a lack of human knowledge about some random event (Denbigh and Denbigh 1985). Entropy in statistics developed with Kolmogorov (1959), Kinchin (1957), and Shannon (1948), but has rarely been applied to social science problems. Exception ...
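The abstract's characterization of entropy as "the amount of uncertainty in probability distributions" is the standard Shannon quantity H(p) = -Σ pᵢ log₂ pᵢ. A minimal sketch (function name illustrative, not from the paper) makes the intuition concrete: an evenly split electorate is maximally uncertain, a lopsided one much less so:

```python
import math

def shannon_entropy(probs):
    """H(p) = -sum_i p_i * log2(p_i), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A 50/50 vote split is maximally uncertain: exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A 90/10 split carries far less uncertainty.
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.469
# A certain outcome has zero entropy.
print(shannon_entropy([1.0]))        # 0.0
```

The `if p > 0` guard follows the usual convention 0 · log 0 = 0, so degenerate (fully certain) distributions are handled without error.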
An Alternative to Entropy in the Measurement of Information
Entropy (www.mdpi.org/entropy/)
Abstract

Cited by 1 (1 self)
Entropy has been the main tool in the analysis of the concept of information since information theory was conceived in the work of Shannon more than fifty years ago. There have been some attempts to find a more general measure of information, but their outcomes were mainly of formal, theoretical interest, and none has provided better insight into the nature of ...