Results 1–10 of 10
Context-Specific Independence in Bayesian Networks
, 1996
Cited by 296 (30 self)
Abstract: Bayesian networks provide a language for qualitatively representing the conditional independence properties of a distribution. This allows a natural and compact representation of the distribution, eases knowledge acquisition, and supports effective inference algorithms.
Discriminative Reranking for Natural Language Parsing
, 2005
Cited by 268 (9 self)
Abstract: This article considers approaches which rerank the output of an existing probabilistic parser. The base parser produces a set of candidate parses for each input sentence, with associated probabilities that define an initial ranking of these parses. A second model then attempts to improve upon this initial ranking, using additional features of the tree as evidence. The strength of our approach is that it allows a tree to be represented as an arbitrary set of features, without concerns about how these features interact or overlap and without the need to define a derivation or a generative model which takes these features into account. We introduce a new method for the reranking task, based on the boosting approach to ranking problems described in Freund et al. (1998). We apply the boosting method to parsing the Wall Street Journal treebank. The method combined the log-likelihood under a baseline model (that of Collins [1999]) with evidence from an additional 500,000 features over parse trees that were not included in the original model. The new model achieved 89.75% F-measure, a 13% relative decrease in F-measure error over the baseline model's score of 88.2%. The article also introduces a new algorithm for the boosting approach which takes advantage of the sparsity of the feature space in the parsing data. Experiments show significant efficiency gains for the new algorithm over the obvious implementation of the boosting approach. We argue that the method is an appealing alternative, in terms of both simplicity and efficiency, to work on feature selection methods within log-linear (maximum-entropy) models. Although the experiments in this article are on natural language parsing (NLP), the approach should be applicable to many other NLP problems which are naturally framed as ranking tasks, for example, speech recognition, machine translation, or natural language generation.
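The two-stage setup this abstract describes can be sketched in a few lines. This is an illustrative toy, not the paper's boosting implementation: the function names, feature strings, and weights below are invented, and the combined score is simply the baseline log-probability plus a weighted sum of arbitrary, possibly overlapping features on each candidate.

```python
# Hypothetical sketch of a two-stage reranker: a base parser's log-probability
# combined with weighted evidence from arbitrary features over each candidate.

def rerank(candidates, weights, w0=1.0):
    """candidates: list of (base_log_prob, feature_dict) pairs.
    Returns the candidates sorted best-first by combined score."""
    def score(cand):
        base_log_prob, features = cand
        # Baseline log-likelihood plus feature evidence; overlapping
        # features simply contribute additively.
        return w0 * base_log_prob + sum(
            weights.get(f, 0.0) * v for f, v in features.items()
        )
    return sorted(candidates, key=score, reverse=True)

# Example: two candidate parses for one sentence (made-up features).
cands = [
    (-12.0, {"rule:NP->DT NN": 1.0}),
    (-12.5, {"rule:NP->DT NN": 1.0, "head_pair:saw-man": 1.0}),
]
best = rerank(cands, weights={"head_pair:saw-man": 0.8})
```

Here the second candidate wins despite a lower baseline probability, because the extra feature's weight outweighs the 0.5 log-probability gap.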
SPUDD: Stochastic planning using decision diagrams
 In Proceedings of the Fifteenth Conference on Uncertainty in Artificial Intelligence
, 1999
Cited by 178 (17 self)
Abstract: Structured methods for solving factored Markov decision processes (MDPs) with large state spaces have recently been proposed to allow dynamic programming to be applied without the need for complete state enumeration. We propose and examine a new value iteration algorithm for MDPs that uses algebraic decision diagrams (ADDs) to represent value functions and policies, assuming an ADD input representation of the MDP. Dynamic programming is implemented via ADD manipulation. We demonstrate our method on a class of large MDPs (up to 63 million states) and show that significant gains can be had when compared to tree-structured representations (with up to a thirtyfold reduction in the number of nodes required to represent optimal value functions).
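For context, the dynamic-programming step that SPUDD carries out over ADD-represented value functions is ordinary value iteration. The sketch below shows that step in plain tabular form on a made-up 2-state, 2-action MDP; it deliberately omits the ADD machinery that is the paper's actual contribution.

```python
# Minimal tabular value iteration (the Bellman backup SPUDD implements
# via ADD manipulation). The tiny MDP below is invented for illustration.

def value_iteration(P, R, gamma=0.9, tol=1e-6):
    """P[a][s][s2]: transition probability; R[s]: reward.
    Returns the converged value function as a list."""
    n = len(R)
    V = [0.0] * n
    while True:
        # Bellman backup: maximize expected discounted value over actions.
        V_new = [
            R[s] + gamma * max(
                sum(P[a][s][s2] * V[s2] for s2 in range(n))
                for a in range(len(P))
            )
            for s in range(n)
        ]
        if max(abs(V_new[s] - V[s]) for s in range(n)) < tol:
            return V_new
        V = V_new

P = [  # two actions over two states
    [[0.9, 0.1], [0.0, 1.0]],  # action 0
    [[0.2, 0.8], [0.5, 0.5]],  # action 1
]
R = [0.0, 1.0]
V = value_iteration(P, R)
```

SPUDD's gain comes from representing `V` as an ADD over state variables rather than as this explicit per-state table, so states with identical values share structure.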
Application of Spreading Activation Techniques in Information Retrieval
 Artificial Intelligence Review
, 1997
Cited by 115 (3 self)
Abstract: This paper surveys the use of Spreading Activation techniques on Semantic Networks in Associative Information Retrieval. The major Spreading Activation models are presented and their applications to IR are surveyed. A number of works in this area are critically analyzed in order to study the relevance of Spreading Activation for associative IR.
Key words: spreading activation, information storage and retrieval, semantic networks, associative information retrieval, information processing, knowledge representation.
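A minimal version of the pure spreading-activation idea can be sketched as follows. The network, node names, decay factor, and pulse count are all invented for illustration; real models surveyed in this area add constraints such as fan-out normalization and activation thresholds.

```python
# Toy spreading activation over a weighted semantic network (hypothetical
# nodes and weights): activation pulses outward from source nodes,
# attenuated by edge weight and a global decay factor.

def spread_activation(graph, sources, decay=0.5, pulses=2):
    """graph: {node: [(neighbor, weight), ...]}. Returns activation map."""
    activation = {n: 0.0 for n in graph}
    for s in sources:
        activation[s] = 1.0
    for _ in range(pulses):
        incoming = {n: 0.0 for n in graph}
        for node, links in graph.items():
            for neighbor, weight in links:
                incoming[neighbor] += activation[node] * weight * decay
        for n in graph:
            activation[n] += incoming[n]
    return activation

net = {
    "query:jaguar": [("cat", 0.9), ("car", 0.6)],
    "cat": [("animal", 0.8)],
    "car": [("vehicle", 0.8)],
    "animal": [],
    "vehicle": [],
}
act = spread_activation(net, sources=["query:jaguar"])
```

After two pulses, nodes closer to the query and connected by stronger links carry more activation, which is the ranking signal associative IR systems read off.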
A Probabilistic Framework for Perceptual Grouping of Features for Human Face Detection
 INT. CONF. AUTOMATIC FACE AND GESTURE RECOGNITION
, 1996
Cited by 20 (1 self)
Abstract: Present approaches to human face detection have made several assumptions that restrict their ability to be extended to general imaging conditions. We identify that the key factor in a generic and robust system is that of exploiting a large amount of evidence, related and reinforced by model knowledge through a probabilistic framework. In this paper, we propose a face detection framework that groups image features into meaningful entities using perceptual organization, assigns probabilities to each of them, and reinforces these probabilities using Bayesian reasoning techniques. True hypotheses of faces will be reinforced to a high probability. The detection of faces under scale, orientation and viewpoint variations will be examined in a subsequent paper.
Certainty-Factor-like Structures in Bayesian Belief Networks
 Knowledge-Based Systems
, 2001
Cited by 6 (1 self)
Abstract: The certainty-factor model was one of the most popular models for the representation and manipulation of uncertain knowledge in the early rule-based expert systems of the 1980s. After the model was criticised by researchers in artificial intelligence and statistics as being ad hoc in nature, researchers and developers stopped looking at the model. Nowadays, it is often stated that the model is merely interesting from a historical point of view. Its place has been taken over by more expressive formalisms for the representation and manipulation of uncertain knowledge, in particular, by the formalism of Bayesian belief networks. In this paper, it is shown that this view underestimates the importance of the principles underlying the certainty-factor model. In particular, it is shown that certainty-factor-like structures occur frequently in practical Bayesian network models as causal independence assumptions. In fact, the noisy-OR and noisy-AND models, two probabilistic models frequently employed, appear to be reinventions of combination functions previously introduced as part of the certainty-factor model. This insight may lead to a reappraisal of the certainty-factor model. © 2001 Elsevier Science B.V. All rights reserved.
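The correspondence this abstract claims is easy to check numerically: the MYCIN-style parallel-combination rule for two positive certainty factors, CF = CF1 + CF2·(1 − CF1), is algebraically identical to the noisy-OR combination 1 − (1 − p1)(1 − p2). A short sketch:

```python
# The certainty-factor combination rule for two positive certainty
# factors coincides with the noisy-OR model's combination function.

def cf_combine(cf1, cf2):
    # MYCIN-style parallel combination for two positive certainty factors.
    return cf1 + cf2 * (1 - cf1)

def noisy_or(p1, p2):
    # Probability the effect occurs given two independent active causes.
    return 1 - (1 - p1) * (1 - p2)

# Both expand to p1 + p2 - p1*p2, so they agree for any inputs in [0, 1].
assert abs(cf_combine(0.6, 0.5) - noisy_or(0.6, 0.5)) < 1e-12  # both 0.8
```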
Structured arc reversal and simulation of dynamic probabilistic networks
 In UAI
, 1997
Cited by 4 (1 self)
Abstract: We present an algorithm for arc reversal in Bayesian networks with tree-structured conditional probability tables (CPTs), and consider some of its advantages, especially for the simulation of dynamic probabilistic networks (DPNs). In particular, the method allows one to produce CPTs for nodes involved in the reversal that exploit regularities in the conditional distributions. We argue that this approach alleviates some of the overhead associated with arc reversal, plays an important role in evidence integration, and can be used to restrict sampling of variables in DPNs. We also provide an algorithm that detects the dynamic irrelevance of state variables in forward simulation. This algorithm exploits the structured CPTs in a reversed network to determine, in a time-independent fashion, the conditions under which a variable does or does not need to be sampled.
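The basic arc-reversal operation underlying this work can be illustrated on the smallest possible case: a two-node network A → B with binary variables, reversed via Bayes' rule. This sketch uses invented numbers and omits the tree-structured CPTs that are the paper's actual focus.

```python
# Arc reversal for a tiny network A -> B (binary variables): reversing
# the arc replaces P(A) and P(B|A) with P(B) and P(A|B) via Bayes' rule.

def reverse_arc(p_a, p_b_given_a):
    """p_a: P(A=1). p_b_given_a: (P(B=1|A=0), P(B=1|A=1)).
    Returns P(B=1) and (P(A=1|B=0), P(A=1|B=1))."""
    # Marginalize A out to get the new root's distribution.
    p_b = (1 - p_a) * p_b_given_a[0] + p_a * p_b_given_a[1]
    # Bayes' rule for each value of B.
    p_a_given_b1 = p_a * p_b_given_a[1] / p_b
    p_a_given_b0 = p_a * (1 - p_b_given_a[1]) / (1 - p_b)
    return p_b, (p_a_given_b0, p_a_given_b1)

p_b, p_a_given_b = reverse_arc(p_a=0.3, p_b_given_a=(0.2, 0.9))
```

In general each reversal makes both nodes inherit the other's parents, which is the overhead the structured-CPT approach is designed to mitigate.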
Plausibility Measures and Default Reasoning: An Overview
 Proceedings, 14th Symposium on Logic in Computer Science
, 1999
Cited by 2 (0 self)
Abstract: We introduce a new approach to modeling uncertainty based on plausibility measures. This approach is easily seen to generalize other approaches to modeling uncertainty, such as probability measures, belief functions, and possibility measures. We then consider one application of plausibility measures: default reasoning. In recent years, a number of different semantics for defaults have been proposed, such as preferential structures, ε-semantics, possibilistic structures, and rankings, that have been shown to be characterized by the same set of axioms, known as the KLM properties. While this was viewed as a surprise, we show here that it is almost inevitable. In the framework of plausibility measures, we can give a necessary condition for the KLM axioms to be sound, and an additional condition necessary and sufficient to ensure that the KLM axioms are complete. This additional condition is so weak that it is almost always met whenever the axioms are sound. In particular, it is easily ...
Probabilistic Aspect Graphs (Graphes d'aspects probabilisés)
Abstract: Direct perception is incomplete: objects may show ambiguous appearances, and sensors have limited sensitivity. As a consequence, the recognition of complex 3D objects must have an exploratory nature. Appearance variations when the viewpoint is modified or when the sensor parameters are changed are characteristic of the object. They can be organized in the form of an aspect graph. Standard geometric aspect graphs are difficult to build. This article presents a generalized probabilistic version of this concept. Three-dimensional object recognition is transformed here into a problem of Markov chain estimation, which yields a statistical characterization of aspect graph complexity.
Keywords: active vision, aspect graphs, Markov chains, large deviations of hypothesis testing.
1 Geometric aspect graphs. Three-dimensional scenes are apprehended through two-dimensional sensory organs, either natural (retina) or artificial (camera). This reduction of dimensionality, gener...