Results 1–10 of 1,404
Loopy belief propagation for approximate inference: An empirical study. In: Proceedings of Uncertainty in AI, 1999.
"... Recently, researchers have demonstrated that "loopy belief propagation" (the use of Pearl's polytree algorithm in a Bayesian network with loops) can perform well in the context of error-correcting codes. The most dramatic instance of this is the near-Shannon-limit performance ..."
Cited by 676 (15 self)
"... with a single loop:
• Unless all the conditional probabilities are deterministic, belief propagation will converge.
• There is an analytic expression relating the correct marginals to the loopy marginals. The approximation error is related to the convergence rate of the messages: the faster ..."
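The single-loop setting this entry describes is easy to reproduce. The sketch below (all potentials and the network are made-up toy values, not taken from the paper) runs loopy sum-product message passing on a three-node cycle and compares the resulting beliefs against exact marginals obtained by enumeration.

```python
import itertools
import numpy as np

# Toy pairwise MRF on a single loop 0-1-2-0; all potentials are illustrative.
edges = [(0, 1), (1, 2), (2, 0)]
unary = {i: np.array([1.0, 1.0]) for i in range(3)}
unary[0] = np.array([3.0, 1.0])                                # bias node 0 toward state 0
pair = {e: np.array([[2.0, 1.0], [1.0, 2.0]]) for e in edges}  # attractive coupling

def neighbors(i):
    return [j for (a, b) in edges for j in ((b,) if a == i else (a,) if b == i else ())]

# Loopy sum-product: msgs[(i, j)] is the message from i to j, a function of x_j.
msgs = {(i, j): np.ones(2) for (a, b) in edges for (i, j) in [(a, b), (b, a)]}
for _ in range(50):
    new = {}
    for (i, j) in msgs:
        psi = pair[(i, j)] if (i, j) in pair else pair[(j, i)].T  # psi[x_i, x_j]
        incoming = [msgs[(k, i)] for k in neighbors(i) if k != j]
        prod = unary[i] * np.prod(incoming, axis=0)
        m = prod @ psi                                # marginalize out x_i
        new[(i, j)] = m / m.sum()                     # normalize for stability
    msgs = new

def bp_marginal(i):
    b = unary[i] * np.prod([msgs[(k, i)] for k in neighbors(i)], axis=0)
    return b / b.sum()

def exact_marginal(i):
    p = np.zeros(2)
    for x in itertools.product([0, 1], repeat=3):     # enumerate all worlds
        w = np.prod([unary[v][x[v]] for v in range(3)])
        w *= np.prod([pair[(a, b)][x[a], x[b]] for (a, b) in edges])
        p[x[i]] += w
    return p / p.sum()
```

With this asymmetric unary at node 0, the exact marginal is P(x0 = 0) = 0.75, while the converged loopy belief is close to, but not exactly, that value, illustrating the relationship between correct and loopy marginals that the excerpt mentions.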
Approximating probabilistic inference in Bayesian belief networks is NP-hard, 1991.
"... A belief network comprises a graphical representation of dependencies between variables of a domain and a set of conditional probabilities associated with each dependency. Unless P=NP, an efficient, exact algorithm does not exist to compute probabilistic inference in belief networks. Stoch ..."
Cited by 291 (4 self)
Probabilistic robot navigation in partially observable environments. In Proc. of the International Joint Conference on Artificial Intelligence (IJCAI), 1995.
"... Autonomous mobile robots need very reliable navigation capabilities in order to operate unattended for long periods of time. This paper reports on first results of a research program that uses partially observable Markov models to robustly track a robot’s location in office environments and to direc ..."
Cited by 293 (13 self)
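The location-tracking idea in this entry can be illustrated with a discrete Bayes filter: the classic predict/correct recursion over a histogram of positions. Everything below (the corridor size, motion noise, door map, and observation sequence) is invented for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical 1-D corridor with 5 cells; the robot tries to move right one
# cell per step but may fail, and its door sensor is noisy.
n = 5
belief = np.full(n, 1.0 / n)            # uniform prior over location

# Motion model: stay with prob 0.2, advance with prob 0.8 (wall at the end).
T = np.zeros((n, n))
for s in range(n):
    T[s, s] += 0.2
    T[s, min(s + 1, n - 1)] += 0.8

# Sensor model: P(observe "door" | cell). Doors at cells 0 and 3 (assumed map).
p_door = np.array([0.9, 0.1, 0.1, 0.9, 0.1])

def step(belief, saw_door):
    predicted = belief @ T                       # prediction (motion update)
    like = p_door if saw_door else 1.0 - p_door  # measurement likelihood
    posterior = like * predicted                 # correction (Bayes rule)
    return posterior / posterior.sum()

b = belief
for z in [True, False, False, True]:   # assumed observation sequence
    b = step(b, z)
# after door, move, move, door the mass concentrates near the second door
```

Here the filter's mode lands on cell 3, the second door, which is the kind of location tracking under sensor and motion uncertainty the abstract describes.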
Learning first-order probabilistic models with combining rules. In Proceedings of the International Conference on Machine Learning, 2005.
"... Many real-world domains exhibit rich relational structure and stochasticity and motivate the development of models that combine predicate logic with probabilities. These models describe probabilistic influences between attributes of objects that are related to each other through known domain relatio ..."
Cited by 38 (15 self)
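As a concrete instance of a combining rule, the sketch below implements noisy-OR, one standard way to combine a variable number of parent influences into a single conditional probability. The function and its parameters are illustrative; the paper studies combining rules more generally, including learning their parameters.

```python
def noisy_or(probs, leak=0.0):
    """Noisy-OR combining rule: each cause i independently triggers the
    effect with probability probs[i]; `leak` covers unmodeled causes."""
    q = 1.0 - leak                 # probability the effect is absent so far
    for p in probs:
        q *= 1.0 - p               # cause i independently fails with prob 1 - p
    return 1.0 - q

# Two causes with strengths 0.8 and 0.5: P(effect) = 1 - 0.2 * 0.5 = 0.9
noisy_or([0.8, 0.5])
```

Because the rule factors over parents, it handles the varying numbers of related objects that first-order models produce, which is exactly why combining rules are needed in this setting.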
Approximate Inference for First-Order Probabilistic Languages. In Proc. International Joint Conference on Artificial Intelligence, 2001.
"... A new, general approach is described for approximate inference in first-order probabilistic languages, using Markov chain Monte Carlo (MCMC) techniques in the space of concrete possible worlds underlying any given knowledge base. The simplicity of the approach and its lazy construction of poss ..."
Cited by 48 (4 self)
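The possible-worlds MCMC idea can be sketched in a few lines: propose flipping one ground atom, then accept or reject with the Metropolis rule using the ratio of world scores. The atoms, the two weighted rules, and the weights below are all hypothetical; the paper's languages and proposal scheme are considerably richer than this.

```python
import math
import random

random.seed(0)

ATOMS = ["smokes_B", "cancer_B"]   # free ground atoms; smokes(A) is evidence = True

def score(world):
    """exp(sum of weights of satisfied weighted formulas) for a possible world."""
    s = 0.0
    if world["smokes_B"]:                                  # smokes(A) -> smokes(B), weight 1
        s += 1.0
    if (not world["smokes_B"]) or world["cancer_B"]:       # smokes(B) -> cancer(B), weight 1
        s += 1.0
    return math.exp(s)

world = {a: False for a in ATOMS}
count = 0
n_samples = 20000
for _ in range(n_samples):
    a = random.choice(ATOMS)               # propose flipping one ground atom
    proposal = dict(world)
    proposal[a] = not proposal[a]
    if random.random() < min(1.0, score(proposal) / score(world)):
        world = proposal                   # Metropolis accept/reject
    count += world["smokes_B"]

estimate = count / n_samples   # MCMC estimate of P(smokes(B) | evidence)
```

For this two-atom toy model the marginal can be checked by hand, P(smokes_B) = (1 + e)/(3 + e) ≈ 0.65, and the chain's estimate lands close to it; the point of the approach is that the same sampler applies when the set of ground atoms is too large to enumerate.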
Irrelevance and Conditioning in First-Order Probabilistic Logic. In Proceedings, Thirteenth National Conference on Artificial Intelligence (AAAI '96), 1996.
"... First-order probabilistic logic is a powerful knowledge representation language. Unfortunately, deductive reasoning based on the standard semantics for this logic does not support certain desirable patterns of reasoning, such as indifference to irrelevant information or substitution of constants int ..."
Cited by 6 (1 self)
First-order probabilistic models for coreference resolution. In HLT/NAACL, 2007.
"... Traditional noun phrase coreference resolution systems represent features only of pairs of noun phrases. In this paper, we propose a machine learning method that enables features over sets of noun phrases, resulting in a first-order probabilistic model for coreference. We outline a set of approximat ..."
Cited by 86 (20 self)
Lifted first-order belief propagation. In Association for the Advancement of Artificial Intelligence (AAAI), 2008.
"... Unifying first-order logic and probability is a long-standing goal of AI, and in recent years many representations combining aspects of the two have been proposed. However, inference in them is generally still at the level of propositional logic, creating all ground atoms and formulas and applying s ..."
Cited by 115 (15 self)
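The grounding blowup this entry refers to is easy to quantify: a predicate of arity k over a domain of n constants produces n^k ground atoms, and that is what propositional-level inference must enumerate. A tiny illustrative helper (the predicate names are made up):

```python
# Each predicate of arity k over a domain of size n grounds to n**k atoms.
def num_ground_atoms(predicates, domain_size):
    """predicates maps predicate name -> arity; returns total ground atoms."""
    return sum(domain_size ** arity for arity in predicates.values())

# e.g. smokes/1 and friends/2 over 100 constants: 100 + 100**2 ground atoms
num_ground_atoms({"smokes": 1, "friends": 2}, 100)
```

Lifted inference aims to avoid this enumeration by reasoning over groups of interchangeable ground atoms at once.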
Asymptotic Conditional Probabilities for First-Order Logic. In Proc. 24th ACM Symp. on Theory of Computing, 1992.
"... Motivated by problems that arise in computing degrees of belief, we consider the problem of computing asymptotic conditional probabilities for first-order formulas. That is, given first-order formulas φ and ψ, we consider the number of structures with domain {1, …, N} that satisfy ψ, and c ..."
Cited by 12 (6 self)
First-Order Bayesian Logic, 2005.
"... Uncertainty is a fundamental and irreducible aspect of our knowledge about the world. Until recently, classical first-order logic has reigned as the de facto standard logical foundation for artificial intelligence. The lack of a built-in, semantically grounded capability for reasoning under uncertainty renders classical first-order logic inadequate for many important classes of problems. General-purpose languages are beginning to emerge for which the fundamental logical basis is probability. Increasingly expressive probabilistic languages demand a theoretical foundation that fully integrates ..."
Cited by 8 (3 self)