Results 1 - 7 of 7
Discriminative Models for Information Retrieval
 SIGIR '04, 2004
Abstract

Cited by 75 (1 self)
Discriminative models have been preferred over generative models in many machine learning problems in the recent past owing to some of their attractive theoretical properties. In this paper, we explore the applicability of discriminative classifiers for IR. We have compared the performance of two popular discriminative models, namely the maximum entropy model and support vector machines, with that of language modeling, the state-of-the-art generative model for IR. Our experiments on ad hoc retrieval indicate that although maximum entropy is significantly worse than language models, support vector machines are on par with language models. We argue that the main reason to prefer SVMs over language models is their ability to learn arbitrary features automatically, as demonstrated by our experiments on the homepage finding task of TREC-10.
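The language-modeling baseline this abstract compares against can be sketched compactly. The following is a minimal, illustrative query-likelihood scorer with Dirichlet smoothing; the function name and the default mu = 2000 are assumptions for illustration, not taken from the paper:

```python
import math
from collections import Counter

def query_likelihood(query, doc_tokens, collection_tf, collection_len, mu=2000.0):
    """Score a document under the Dirichlet-smoothed query-likelihood model:
    p(w|d) = (tf(w, d) + mu * p(w|C)) / (|d| + mu)."""
    tf = Counter(doc_tokens)
    dlen = len(doc_tokens)
    score = 0.0
    for w in query:
        p_wc = collection_tf.get(w, 0) / collection_len  # background model p(w|C)
        p_wd = (tf.get(w, 0) + mu * p_wc) / (dlen + mu)
        if p_wd > 0:
            score += math.log(p_wd)
    return score
```

Ranking documents by this log-likelihood is the standard query-likelihood form of the language-modeling approach; the discriminative alternatives discussed in the paper would instead learn feature weights from relevance judgments.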
Term Dependence: Truncating the Bahadur Lazarsfeld Expansion
 Information Processing and Management, 1994
Abstract

Cited by 16 (7 self)
The performance of probabilistic information retrieval systems is studied where differing statistical dependence assumptions are used when estimating the probabilities inherent in the retrieval model. Experimental results using the Bahadur-Lazarsfeld expansion suggest that the greatest degree of performance increase is achieved by incorporating term dependence information in estimating . It is suggested that incorporating dependence up to degree 3 be used; incorporating more dependence information results in relatively little increase in performance. Experiments examine the span of dependence in natural language text, the window of terms in which dependencies are computed, and their effect on information retrieval performance. Results provide additional support for the notion of a window of to terms in width; terms in this window may be most useful when computing dependence. 1 Introduction Those who study information retrieval often assume that the features or terms use...
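As a rough illustration of the expansion this abstract discusses, the second-order (degree-2) truncation of the Bahadur-Lazarsfeld expansion for a binary term vector can be sketched as follows; the marginals p and pairwise correlations r are hypothetical inputs, not the paper's experimental setup:

```python
import math
from itertools import combinations

def bl_second_order(x, p, r):
    """Second-order truncation of the Bahadur-Lazarsfeld expansion for a
    binary vector x: the independence term times a pairwise correction.
    p[i] = marginal P(x_i = 1); r[(i, j)] = correlation of terms i and j."""
    # First-order (independence) factor: prod_i p_i^{x_i} (1 - p_i)^{1 - x_i}.
    indep = 1.0
    for xi, pi in zip(x, p):
        indep *= pi if xi else (1.0 - pi)
    # Standardized variables z_i = (x_i - p_i) / sqrt(p_i (1 - p_i)).
    z = [(xi - pi) / math.sqrt(pi * (1.0 - pi)) for xi, pi in zip(x, p)]
    # Second-order correction: 1 + sum_{i<j} r_ij z_i z_j.
    corr = 1.0 + sum(r.get((i, j), 0.0) * z[i] * z[j]
                     for i, j in combinations(range(len(x)), 2))
    return indep * corr
```

With all correlations zero the correction factor is 1 and the estimate reduces to the independence model, which is the sense in which the truncation degree controls how much dependence information is incorporated.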
The Maximum Entropy Approach and Probabilistic IR Models
 ACM Transactions on Information Systems, 1998
Abstract

Cited by 12 (0 self)
The Principle of Maximum Entropy is discussed, and two classic probabilistic models of information retrieval, the Binary Independence Model of Robertson and Sparck Jones and the Combination Match Model of Croft and Harper, are derived using the maximum entropy approach. The assumptions on which the classical models are based are not made. In their place, the probability distribution of maximum entropy consistent with a set of constraints is determined. It is argued that this subjectivist approach is more philosophically coherent than the frequentist conceptualization of probability that is often assumed as the basis of probabilistic modeling, and that this philosophical stance has important practical consequences with respect to the realization of information retrieval research.
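The derivation pattern this abstract describes can be stated compactly: among all distributions satisfying a set of expectation constraints, the maximum entropy choice has exponential-family form. A generic sketch, where the features f_i and constants c_i stand in for whatever constraints a particular retrieval model imposes:

```latex
\max_{p}\; -\sum_{x} p(x)\log p(x)
\quad\text{subject to}\quad
\sum_{x} p(x)\, f_i(x) = c_i \;\; (i = 1,\dots,k),
\qquad \sum_{x} p(x) = 1,
```

whose solution is

```latex
p^{*}(x) = \frac{1}{Z(\lambda)}\,
\exp\!\Big(\sum_{i=1}^{k} \lambda_i f_i(x)\Big),
\qquad
Z(\lambda) = \sum_{x} \exp\!\Big(\sum_{i=1}^{k} \lambda_i f_i(x)\Big),
```

with the multipliers λ chosen so that the expectation constraints hold. The classical models are recovered by choosing constraints in place of the usual independence assumptions.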
Optimum Probability Estimation from Empirical Distributions
 Information Processing and Management, 1989
Abstract

Cited by 7 (4 self)
Probability estimation is important for the application of probabilistic models as well as for any evaluation in IR. We discuss the interdependencies between parameter estimation and certain properties of probabilistic models: dependence assumptions, binary vs. non-binary features, and estimation sample selection. Then we define an optimum estimate for binary features which can be applied to various typical estimation problems in IR. A method for computing this estimate using empirical data is described. Some experiments show the applicability of our method, whereas comparable approaches are partially based on false assumptions or yield biased estimates. 1 Parameter estimation in IR In IR the development of theoretical models and their evaluation in experiments is of equal importance: a model which cannot be evaluated (applied) is of very little use, while an evaluation can show its weaknesses and strengths and give evidence for further developments. As will be discussed below, any evaluation in IR involves some kind of parameter estimation, even for non-probabilistic models. So it is interesting to note that the problem of parameter estimation has been discussed by only a few authors ([Rijsbergen 77], [Robertson & Bovey 82], [Bookstein 83], [?]). In this paper, an attempt is...
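The paper's specific optimum estimator is not reproduced in the abstract. As a generic illustration of why raw relative frequencies are problematic for the small samples typical of relevance feedback, compare the maximum-likelihood estimate with a Beta-prior point estimate; both functions and the Jeffreys-prior defaults are illustrative assumptions, not the paper's method:

```python
def mle(r, n):
    """Maximum-likelihood estimate r/n of a binary feature's probability;
    degenerate (exactly 0 or 1) whenever r = 0 or r = n."""
    return r / n

def beta_estimate(r, n, a=0.5, b=0.5):
    """Point estimate under a Beta(a, b) prior (the posterior mean):
    (r + a) / (n + a + b).  a = b = 0.5 is the Jeffreys prior."""
    return (r + a) / (n + a + b)
```

With r = 0 relevant documents out of n observed, the MLE assigns probability zero, which is exactly the kind of biased or degenerate behavior that motivates a principled estimator; the prior-smoothed estimate stays strictly inside (0, 1).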
Probabilistic Information Retrieval Model for Dependency Structured Indexing System
 In Proceedings of the ACM SIGIR '02 Workshop on Mathematical/Formal Methods in Information Retrieval / Proceedings of the Third NTCIR Workshop, 2002
Abstract

Cited by 2 (1 self)
... statistically independent of one another. However, the independence assumption is widely understood to be wrong, so we present a new method of incorporating term dependence in a probabilistic retrieval model by adapting a structural index system using a dependency parse tree and the Chow Expansion to compensate for the weakness of the assumption. In this paper, we describe a theoretical process to apply the Chow Expansion to general probabilistic models and the state-of-the-art 2-Poisson model, and we reexamine the weight of phrase terms. Through experiments on a Korean document collection, ETRI-KEMONG, we demonstrate that the incorporation of term dependences using the Chow Expansion contributes to the improvement of performance in probabilistic IR systems. Keywords: term dependence, phrasal indexing, Chow Expansion, probabilistic model, 2-Poisson model
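The first-order dependence structure underlying a Chow-style expansion can be illustrated as a tree factorization, where each term depends on a single parent term taken from the dependency parse. The node names and probability tables below are hypothetical, not the paper's model:

```python
def tree_probability(x, root, parent, p_root, p_cond):
    """First-order (tree) dependence factorization in the spirit of the
    Chow expansion: P(x) = P(x_root) * prod_i P(x_i | x_parent(i)).
    x indexes term occurrences (0/1); parent maps node -> parent node;
    p_cond[(i, x_i, x_parent)] is the conditional probability table."""
    prob = p_root[x[root]]
    for i, pa in parent.items():
        prob *= p_cond[(i, x[i], x[pa])]
    return prob
```

Because each non-root node contributes exactly one conditional factor, the factorization remains a proper probability distribution while still capturing the pairwise dependencies along the tree edges, which is the compensation for the independence assumption that the abstract describes.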
Distance, Minimum Cross-Entropy, and Path Methods. Background and Purpose of the Study
, 1988
Abstract
The maximum entropy principle may be applied to the design of probabilistic retrieval systems. When there are inconsistent expert judgments, the resulting optimization problem cannot be solved. The inconsistency of the expert judgments can be revealed by solving a linear programming formulation. In the case of inconsistent judgments, four plausible schemes are proposed in order to find revised judgments which are consistent with the true data structure but still reflect the original expert judgment. These schemes are the Interactive, Minimum...
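As a toy illustration of detecting inconsistent judgments (a much-simplified stand-in for the linear programming check the abstract mentions), suppose each expert judgment is an interval constraint on the probability of one of several mutually exclusive, exhaustive events. Feasibility then reduces to a bracketing test; the function name and setup are assumptions for illustration:

```python
def consistent(intervals):
    """Check whether interval judgments [lo, hi] on the probabilities of
    mutually exclusive, exhaustive events admit any valid distribution:
    each interval must be non-empty, and since the probabilities must sum
    to 1, the lows must not exceed 1 and the highs must reach it."""
    if any(lo > hi for lo, hi in intervals):
        return False
    lo_sum = sum(lo for lo, _ in intervals)
    hi_sum = sum(hi for _, hi in intervals)
    return lo_sum <= 1.0 <= hi_sum
```

When this test fails, no probability assignment satisfies all the judgments at once, which is the situation in which the maximum entropy problem has no solution and the revision schemes the paper proposes become necessary.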