Results 1–10 of 83
Probabilistic Models for Information Retrieval based on Divergence from Randomness
ACM Transactions on Information Systems, 2002
Abstract

Cited by 149 (5 self)
We introduce and create a framework for deriving probabilistic models of Information Retrieval. The models are nonparametric models of IR obtained in the language model approach. We derive term-weighting models by measuring the divergence of the actual term distribution from that obtained under a random process. Among the random processes we study the binomial distribution and Bose–Einstein statistics. We define two types of term frequency normalization for tuning term weights in the document–query matching process. The first normalization assumes that documents have the same length and measures the information gain with the observed term once it has been accepted as a good descriptor of the observed document. The second normalization is related to the document length and to other statistics. These two normalization methods are applied to the basic models in succession to obtain weighting formulae. Results show that our framework produces different nonparametric models forming baseline alternatives to the standard tf-idf model.
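The core divergence-from-randomness idea can be sketched under the assumption of a Poisson model for the random process (one of the binomial-family models the framework studies); the function name and arguments below are illustrative, not the paper's notation:

```python
import math

def dfr_weight(tf, collection_freq, num_docs):
    """Informative content of observing a term tf times in a document,
    measured as -log2 of that event's probability under a random
    (Poisson) process -- a sketch, not the paper's exact formula."""
    lam = collection_freq / num_docs  # expected frequency under randomness
    # log of the Poisson probability of observing tf occurrences
    log_prob = -lam + tf * math.log(lam) - math.lgamma(tf + 1)
    return -log_prob / math.log(2)  # convert to bits
```

A term occurring far more often in a document than the collection-wide rate predicts receives a high weight, which is the divergence-from-randomness intuition.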
Perspectives on the Theory and Practice of Belief Functions
International Journal of Approximate Reasoning, 1990
Abstract

Cited by 86 (7 self)
The theory of belief functions provides one way to use mathematical probability in subjective judgment. It is a generalization of the Bayesian theory of subjective probability. When we use the Bayesian theory to quantify judgments about a question, we must assign probabilities to the possible answers to that question. The theory of belief functions is more flexible; it allows us to derive degrees of belief for a question from probabilities for a related question. These degrees of belief may or may not have the mathematical properties of probabilities; how much they differ from probabilities will depend on how closely the two questions are related. Examples of what we would now call belief-function reasoning can be found in the late seventeenth and early eighteenth centuries, well before Bayesian ideas were developed. In 1689, George Hooper gave rules for combining testimony that can be recognized as special cases of Dempster's rule for combining belief functions (Shafer 1986a). Similar rules were formulated by Jakob Bernoulli in his Ars Conjectandi, published posthumously in 1713, and by Johann Heinrich Lambert in his Neues Organon, published in 1764 (Shafer 1978). Examples of belief-function reasoning can also be found in more recent work, by authors
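Dempster's rule mentioned in the abstract can be sketched as follows; the representation of mass functions as dicts over frozensets, and the example sets, are our own illustration:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) by
    Dempster's rule: multiply masses on intersecting focal sets,
    then renormalize by the non-conflicting mass."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass assigned to the empty set
    k = 1.0 - conflict
    return {s: v / k for s, v in combined.items()}
```

Combining two partially committed testimonies for the same answer concentrates belief on it, echoing Hooper's rules for combining testimony.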
From Laplace to Supernova SN 1987A: Bayesian Inference in Astrophysics
, 1990
Abstract

Cited by 51 (2 self)
The Bayesian approach to probability theory is presented as an alternative to the currently used long-run relative frequency approach, which does not offer clear, compelling criteria for the design of statistical methods. Bayesian probability theory offers unique and demonstrably optimal solutions to well-posed statistical problems, and is historically the original approach to statistics. The reasons for earlier rejection of Bayesian methods are discussed, and it is noted that the work of Cox, Jaynes, and others answers earlier objections, giving Bayesian inference a firm logical and mathematical foundation as the correct mathematical language for quantifying uncertainty. The Bayesian approaches to parameter estimation and model comparison are outlined and illustrated by application to a simple problem based on the Gaussian distribution. As further illustrations of the Bayesian paradigm, Bayesian solutions to two interesting astrophysical problems are outlined: the measurement of wea...
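The kind of simple Gaussian illustration the abstract mentions can be sketched as a standard conjugate-prior update for the mean of a Gaussian with known noise variance; the function and parameter names are ours, not the paper's:

```python
def posterior_mean_normal(data, prior_mean, prior_var, noise_var):
    """Conjugate-normal posterior for the mean of a Gaussian with known
    noise variance: precisions add, and the posterior mean is a
    precision-weighted blend of the prior mean and the data."""
    n = len(data)
    post_prec = 1.0 / prior_var + n / noise_var
    post_var = 1.0 / post_prec
    post_mean = post_var * (prior_mean / prior_var + sum(data) / noise_var)
    return post_mean, post_var
```

With a vague prior the posterior mean approaches the sample mean, and each additional observation shrinks the posterior variance, which is the Bayesian parameter-estimation story in miniature.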
Computations with Imprecise Parameters in Engineering Design: Background and Theory
ASME Journal of Mechanisms, Transmissions, and Automation in Design, 1989
Abstract

Cited by 51 (18 self)
A technique to perform design calculations on imprecise representations of parameters has been developed and is presented. The level of imprecision in the description of design elements is typically high in the preliminary phase of engineering design. This imprecision is represented using the fuzzy calculus. Calculations can be performed using this method, to produce (imprecise) performance parameters from imprecise (input) design parameters. The Fuzzy Weighted Average technique is used to perform these calculations. A new metric, called the γ-level measure, is introduced to determine the relative coupling between imprecise inputs and outputs. The background and theory supporting this approach are presented, along with one example.
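At a fixed membership level, the Fuzzy Weighted Average reduces to a computation on intervals; a brute-force sketch over interval endpoints might look like this (valid because the weighted average is monotone in each argument when the others are held fixed; the inputs are invented):

```python
from itertools import product

def fuzzy_weighted_average(weight_intervals, value_intervals):
    """Range of sum(w*x)/sum(w) when each weight w and value x is an
    (lo, hi) interval, i.e. one alpha-cut of a fuzzy number. Extremes
    occur at interval endpoints, so enumerate endpoint combinations."""
    results = []
    for ws in product(*weight_intervals):
        for xs in product(*value_intervals):
            results.append(sum(w * x for w, x in zip(ws, xs)) / sum(ws))
    return min(results), max(results)
```

Repeating this at several membership levels recovers the fuzzy (imprecise) performance parameter from the imprecise design parameters.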
Engineering Design Calculations with Fuzzy Parameters
Fuzzy Sets and Systems, 1992
Abstract

Cited by 34 (13 self)
Uncertainty in engineering analysis usually pertains to stochastic uncertainty, i.e., variance in product or process parameters characterized by probability (uncertainty in truth). Methods for calculating under stochastic uncertainty are well documented. It has been proposed by the authors that other forms of uncertainty exist in engineering design. Imprecision, or the concept of uncertainty in choice, is one such form. This paper considers real-time techniques for calculating with imprecise parameters. These techniques utilize interval mathematics and the notion of α-cuts from the fuzzy calculus. The extremes or anomalies of the techniques are also investigated, particularly the evaluation of singular or multi-valued functions. It will be shown that realistic engineering functions can be used in imprecision calculations, with reasonable computational performance.
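The interval mathematics underlying the α-cut techniques can be sketched as follows; the singular case the abstract alludes to shows up as a divisor interval containing zero (function names are illustrative):

```python
def interval_mul(a, b):
    """Product of two (lo, hi) intervals: extremes occur among the
    four endpoint products."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return min(products), max(products)

def interval_div(a, b):
    """Quotient of two intervals; undefined when the divisor spans
    zero -- the singular case that needs special handling."""
    if b[0] <= 0.0 <= b[1]:
        raise ValueError("divisor interval contains zero")
    return interval_mul(a, (1.0 / b[1], 1.0 / b[0]))
```

Applying these operations at each α-cut of the fuzzy inputs propagates imprecision through an engineering function at modest computational cost.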
Lattice duality: The origin of probability and entropy
In press: Neurocomputing, 2005
Abstract

Cited by 19 (6 self)
Bayesian probability theory is an inference calculus, which originates from a generalization of inclusion on the Boolean lattice of logical assertions to a degree of inclusion represented by a real number. Dual to this lattice is the distributive lattice of questions constructed from the ordered set of down-sets of assertions, which forms the foundation of the calculus of inquiry, a generalization of information theory. In this paper we introduce this novel perspective on these spaces in which machine learning is performed and discuss the relationship between these results and several proposed generalizations of information theory in the literature.
Bayesian Analysis. I. Parameter Estimation Using Quadrature NMR Models
J. Magn. Reson., 1990
Abstract

Cited by 17 (4 self)
In the analysis of magnetic resonance data, a great deal of prior information is available which is ordinarily not used. For example, considering high resolution NMR spectroscopy, one knows in general terms what functional form the signal will take (e.g., sum of exponentially decaying sinusoids) and that, for quadrature measurements, it will be the same in both channels except for a 90° phase shift. When prior information is incorporated into the analysis of time domain data, the frequencies, decay rate constants, and amplitudes may be estimated much more precisely than by direct use of discrete Fourier transforms. Here, Bayesian probability theory is used to estimate parameters using quadrature models of NMR data. The calculation results in an interpretation of the quadrature model fitting that allows one to understand on an intuitive level what frequencies and decay rates will be estimated and why.
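The signal model the abstract describes, a sum of exponentially decaying sinusoids identical in both channels up to a 90° phase shift, can be sketched as a complex-valued signal whose real and imaginary parts are the two quadrature channels (parameter names are ours):

```python
import cmath

def quadrature_model(components, times):
    """Sum of exponentially decaying complex sinusoids. Each component
    is (frequency_hz, amplitude, decay_rate); the real and imaginary
    parts of the result are the two quadrature channels, which differ
    only by a 90-degree phase shift."""
    return [sum(amp * cmath.exp((2j * cmath.pi * freq - decay) * t)
                for freq, amp, decay in components)
            for t in times]
```

Fitting the frequencies, amplitudes, and decay rates of this model to the measured time-domain data is the parameter-estimation problem the Bayesian calculation addresses.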
A hierarchical Bayesian model of human decision-making on an optimal stopping problem
Cognitive Science, 2006
Abstract

Cited by 14 (3 self)
Wiener diffusion accounts of human decision-making are among the most successful and best developed formal models in the psychological sciences. We reconsider these models from a Bayesian perspective, using graphical modeling, and Markov Chain Monte Carlo methods for posterior sampling. By analyzing seminal data from a brightness discrimination task, we show how the Bayesian approach offers several avenues for extending and improving diffusion models. These possibilities include the hierarchical modeling of stimulus properties, and modeling the role of contaminant processes in generating experimental data. We also argue that the Bayesian approach challenges some basic assumptions of previous diffusion models, involving how variability in decision-making should be interpreted. We conclude that adopting a Bayesian approach to relating diffusion models and human decision-making data will sharpen the theoretical and empirical questions, and improve our understanding of a basic human cognitive ability.
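A Wiener diffusion decision process of the kind reconsidered here can be simulated with a simple Euler random walk; the drift, boundary, and step size below are illustrative values, not fitted parameters:

```python
import random

def diffuse(drift, boundary, dt=0.001, seed=0):
    """Accumulate noisy evidence until it crosses +boundary or
    -boundary; return the chosen response and the decision time."""
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    step_sd = dt ** 0.5  # Wiener noise scales with sqrt(dt)
    while abs(x) < boundary:
        x += drift * dt + rng.gauss(0.0, step_sd)
        t += dt
    return ("upper" if x >= boundary else "lower"), t
```

With a strong positive drift most simulated trials terminate at the upper boundary, mirroring accuracy in a discrimination task; the spread of decision times across trials is the response-time distribution the models are fit to.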
Philosophy and the practice of Bayesian statistics
, 2010
Abstract

Cited by 13 (5 self)
A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework.