Results 1 – 10 of 73,870
Bayes Factors
, 1995
"... In a 1935 paper, and in his book Theory of Probability, Jeffreys developed a methodology for quantifying the evidence in favor of a scientific theory. The centerpiece was a number, now called the Bayes factor, which is the posterior odds of the null hypothesis when the prior probability on the null ..."
Abstract

Cited by 1766 (74 self)
is one-half. Although there has been much discussion of Bayesian hypothesis testing in the context of criticism of P values, less attention has been given to the Bayes factor as a practical tool of applied statistics. In this paper we review and discuss the uses of Bayes factors in the context of five ...
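The posterior-odds arithmetic the abstract alludes to is easy to make concrete. Below is a minimal sketch, assuming a binomial model with a point null H0: theta = 0.5 against a uniform Beta(1, 1) prior under H1; the model choice and the counts are illustrative assumptions, not taken from the paper.

import math

# Bayes factor B01 = P(D | H0) / P(D | H1) for k successes in n trials.
# With prior odds P(H0)/P(H1) = 1 (prior probability one-half on the null),
# the posterior odds of H0 equal B01 itself.
def bayes_factor_01(k, n):
    # Marginal likelihood under H0: binomial with theta fixed at 1/2.
    m0 = math.comb(n, k) * 0.5 ** n
    # Marginal likelihood under H1: the binomial likelihood averaged over a
    # uniform Beta(1, 1) prior on theta; the integral reduces to 1 / (n + 1).
    m1 = 1.0 / (n + 1)
    return m0 / m1

bf = bayes_factor_01(k=60, n=100)
print(f"B01 = {bf:.3f}; posterior odds of H0 = {bf:.3f}")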
Graphical models, exponential families, and variational inference
, 2008
"... The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building largescale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fiel ..."
Abstract

Cited by 800 (26 self)
The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fields, including bioinformatics, communication theory, statistical physics, combinatorial optimization, signal and image processing, information retrieval and statistical machine learning. Many problems that arise in specific instances — including the key problems of computing marginals and modes of probability distributions — are best studied in the general setting. Working with exponential family representations, and exploiting the conjugate duality between the cumulant function and the entropy for exponential families, we develop general variational representations of the problems of computing likelihoods, marginal probabilities and most probable configurations. We describe how a wide variety of algorithms — among them sum-product, cluster variational methods, expectation propagation, mean field methods, max-product and linear programming relaxation, as well as conic programming relaxations — can all be understood in terms of exact or approximate forms of these variational representations. The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
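One of the algorithms named above can be shown in a few lines. The following is a minimal sketch of sum-product message passing on a three-node chain x1 - x2 - x3 with binary states; the potential tables are made-up numbers for illustration, not from the paper. On a tree, the computed marginal is exact, as the brute-force check confirms.

import numpy as np

psi12 = np.array([[1.0, 0.5], [0.5, 2.0]])  # pairwise potential on (x1, x2)
psi23 = np.array([[1.5, 1.0], [0.2, 1.0]])  # pairwise potential on (x2, x3)

# Messages into x2 from each side: sum out the neighboring variable.
m1_to_2 = psi12.sum(axis=0)             # sum over x1
m3_to_2 = psi23.sum(axis=1)             # sum over x3
belief2 = m1_to_2 * m3_to_2
marg2 = belief2 / belief2.sum()         # marginal p(x2)

# Brute-force check over all eight joint configurations.
joint = np.einsum('ij,jk->ijk', psi12, psi23)
print(marg2, joint.sum(axis=(0, 2)) / joint.sum())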
A Signal Processing Approach To Fair Surface Design
, 1995
"... In this paper we describe a new tool for interactive freeform fair surface design. By generalizing classical discrete Fourier analysis to twodimensional discrete surface signals  functions defined on polyhedral surfaces of arbitrary topology , we reduce the problem of surface smoothing, or fai ..."
Abstract

Cited by 668 (15 self)
In this paper we describe a new tool for interactive free-form fair surface design. By generalizing classical discrete Fourier analysis to two-dimensional discrete surface signals (functions defined on polyhedral surfaces of arbitrary topology), we reduce the problem of surface smoothing, or fairing, to low-pass filtering. We describe a very simple surface signal low-pass filter algorithm that applies to surfaces of arbitrary topology. As opposed to other existing optimization-based fairing methods, which are computationally more expensive, this is a linear time and space complexity algorithm. With this algorithm, fairing very large surfaces, such as those obtained from volumetric medical data, becomes affordable. By combining this algorithm with surface subdivision methods we obtain a very effective fair surface design technique. We then extend the analysis, and modify the algorithm accordingly, to accommodate different types of constraints. Some constraints can be imposed without any modification of the algorithm, while others require the solution of a small associated linear system of equations. In particular, vertex location constraints, vertex normal constraints, and surface normal discontinuities across curves embedded in the surface, can be imposed with this technique. CR Categories and Subject Descriptors: I.3.3 [Computer Graphics]: Picture/image generation – display algorithms; I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling – curve, surface, solid, and object representations; J.6 [Computer Applications]: Computer-Aided Engineering – computer-aided design. General Terms: Algorithms, Graphics.
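As a rough illustration of the kind of linear-time low-pass filtering described, here is a minimal sketch of an alternating lambda/mu Laplacian smoothing step (in the spirit of the paper's filter, but the umbrella-operator weighting, the coefficient values, and the toy closed polyline are my assumptions, not the paper's algorithm verbatim).

import numpy as np

def smooth(vertices, neighbors, lam=0.5, mu=-0.53, passes=10):
    # One shrink step (lam > 0) then one inflate step (mu < -lam) per pass;
    # each pass touches every vertex once, so cost is linear in mesh size.
    v = vertices.astype(float).copy()
    for _ in range(passes):
        for step in (lam, mu):
            # Umbrella Laplacian: average of the neighbors minus the vertex.
            delta = np.array([v[nbrs].mean(axis=0) - v[i]
                              for i, nbrs in enumerate(neighbors)])
            v += step * delta
    return v

# Toy example: a noisy closed polyline with eight vertices.
pts = np.column_stack([np.arange(8.0), np.random.randn(8)])
ring = [[(i - 1) % 8, (i + 1) % 8] for i in range(8)]
print(smooth(pts, ring)[:3])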
Capitalism, Socialism and Democracy
, 1942
"... or method of economic change and not only never is but never can be stationary. ..."
Abstract

Cited by 2162 (0 self)
or method of economic change and not only never is but never can be stationary.
Cumulated Gain-based Evaluation of IR Techniques
 ACM Transactions on Information Systems
, 2002
"... Modem large retrieval environments tend to overwhelm their users by their large output. Since all documents are not of equal relevance to their users, highly relevant documents should be identified and ranked first for presentation to the users. In order to develop IR techniques to this direction, i ..."
Abstract

Cited by 656 (3 self)
Modern large retrieval environments tend to overwhelm their users by their large output. Since all documents are not of equal relevance to their users, highly relevant documents should be identified and ranked first for presentation to the users. In order to develop IR techniques in this direction, it is necessary to develop evaluation approaches and methods that credit IR methods for their ability to retrieve highly relevant documents. This can be done by extending traditional evaluation methods, i.e., recall and precision based on binary relevance assessments, to graded relevance assessments. Alternatively, novel measures based on graded relevance assessments may be developed. This paper proposes three novel measures that compute the cumulative gain the user obtains by examining the retrieval result up to a given ranked position. The first one accumulates the relevance scores of retrieved documents along the ranked result list. The second one is similar but applies a discount factor on the relevance scores in order to devaluate late-retrieved documents. The third one computes the relative-to-the-ideal performance of IR techniques, based on the cumulative gain they are able to yield. The novel measures are defined and discussed and then their use is demonstrated in a case study using TREC data (sample system run results for 20 queries in TREC-7). As relevance base we used novel graded relevance assessments on a four-point scale. The test results indicate that the proposed measures credit IR methods for their ability to retrieve highly relevant documents and allow testing of statistical significance of effectiveness differences. The graphs based on the measures also provide insight into the performance of IR techniques and allow interpretation, e.g., from the user point of ...
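The three measures are simple enough to state in code. A minimal sketch follows, assuming the common log-based discount (no discount before the log base kicks in) and a made-up run of graded relevance scores on a four-point scale; the exact parameter choices in the paper may differ.

import math

def cg(gains):
    # Cumulated gain: running sum of relevance scores down the ranked list.
    return [sum(gains[:i + 1]) for i in range(len(gains))]

def dcg(gains, base=2):
    # Discounted cumulated gain: devalue late-retrieved documents.
    out, total = [], 0.0
    for rank, g in enumerate(gains, start=1):
        total += g / max(1.0, math.log(rank, base))
        out.append(total)
    return out

def ndcg(gains):
    # Relative-to-the-ideal performance: divide by the best possible ordering.
    ideal = dcg(sorted(gains, reverse=True))
    return [d / i for d, i in zip(dcg(gains), ideal)]

run = [3, 2, 3, 0, 1, 2]   # graded relevance of one ranked result list
print(cg(run), dcg(run), ndcg(run))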
Induction of Decision Trees
 MACH. LEARN
, 1986
"... The technology for building knowledgebased systems by inductive inference from examples has been demonstrated successfully in several practical applications. This paper summarizes an approach to synthesizing decision trees that has been used in a variety of systems, and it describes one such syste ..."
Abstract

Cited by 4303 (4 self)
The technology for building knowledge-based systems by inductive inference from examples has been demonstrated successfully in several practical applications. This paper summarizes an approach to synthesizing decision trees that has been used in a variety of systems, and it describes one such system, ID3, in detail. Results from recent studies show ways in which the methodology can be modified to deal with information that is noisy and/or incomplete. A reported shortcoming of the basic algorithm is discussed and two means of overcoming it are compared. The paper concludes with illustrations of current research directions.
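The attribute-selection step at the heart of the approach is compact enough to sketch. The toy weather-style dataset and attribute names below are illustrative assumptions; the gain criterion itself (entropy reduction) is the one ID3 uses.

import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    # Entropy of the class labels minus the expected entropy after
    # partitioning the examples on attribute `attr`.
    total, remainder = len(labels), 0.0
    for value in set(r[attr] for r in rows):
        subset = [l for r, l in zip(rows, labels) if r[attr] == value]
        remainder += len(subset) / total * entropy(subset)
    return entropy(labels) - remainder

rows = [{"outlook": "sunny", "windy": True},
        {"outlook": "sunny", "windy": False},
        {"outlook": "rain",  "windy": True},
        {"outlook": "rain",  "windy": False}]
labels = ["no", "yes", "no", "yes"]
best = max(("outlook", "windy"), key=lambda a: info_gain(rows, labels, a))
print(best)  # "windy" splits the classes perfectly here, so it wins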
Universals in the content and structure of values: theoretical advances and empirical tests in 20 countries
 ADVANCES IN EXPERIMENTAL SOCIAL PSYCHOLOGY
, 1992
"... ..."
Knowledge management and knowledge management systems: Conceptual foundations and an agenda . . .
, 1998
"... ..."
Verb Semantics And Lexical Selection
, 1994
"... ... structure. As Levin has addressed (Levin 1985), the decomposition of verbs is proposed for the purposes of accounting for systematic semanticsyntactic correspondences. This results in a series of problems for MT systems: inflexible verb sense definitions; difficulty in handling metaphor and new ..."
Abstract

Cited by 520 (4 self)
... structure. As Levin has addressed (Levin 1985), the decomposition of verbs is proposed for the purposes of accounting for systematic semantic-syntactic correspondences. This results in a series of problems for MT systems: inflexible verb sense definitions; difficulty in handling metaphor and new usages; imprecise lexical selection and insufficient system coverage. It seems one approach is to apply probability methods and statistical models for some of these problems. However, the question remains: has PSR exhausted the potential of the knowledge-based approach? If not, are there any alternatives that can improve the handling of these problems? We suggest an alternative that represents verb semantic knowledge and accounts for not only fine-tuned systematic semantic-syntactic correspondences, but also semantic-interpretation correspondences. A verb is not represented by a predicate or simple primitives, but by a set of semantic components that are sensitive to the syntactic altern ...