Results 11–20 of 71
Bayesian Modality Fusion: Probabilistic Integration Of Multiple Vision Algorithms for Head Tracking
 FOURTH ASIAN CONFERENCE ON COMPUTER VISION (ACCV)
, 2000
Abstract

Cited by 56 (4 self)
We describe a head-tracking system that harnesses Bayesian modality fusion, a technique for integrating the analyses of multiple visual tracking algorithms within a probabilistic framework. At the heart of the approach is a Bayesian network model that includes random variables that serve as context-sensitive indicators of the reliability of the different tracking algorithms. Parameters of the Bayesian model are learned from data in an offline training phase using ground-truth data from a Polhemus tracking device. In our implementation for a real-time head-tracking task, algorithms centering on color, motion, and background-subtraction modalities are fused into a single estimate of head position in an image. Results demonstrate the effectiveness of Bayesian modality fusion in environments undergoing a variety of visual perturbations.
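The fusion step this abstract describes can be pictured, in its simplest form, as a reliability-weighted average of the per-modality position estimates. The sketch below is a minimal illustration under that assumption, not the paper's learned Bayesian network: the modality names come from the abstract, but the reliability weights and coordinates are invented toy values (in the actual system, reliabilities are inferred from context nodes in the network).

```python
# Minimal sketch of reliability-weighted fusion. Numbers are
# illustrative toy values, not the paper's learned parameters.

def fuse(estimates, reliabilities):
    """Fuse per-modality (x, y) position estimates, weighting each
    modality by its context-sensitive reliability."""
    total = sum(reliabilities.values())
    x = sum(reliabilities[m] * estimates[m][0] for m in estimates) / total
    y = sum(reliabilities[m] * estimates[m][1] for m in estimates) / total
    return (x, y)

estimates = {"color": (120.0, 80.0),
             "motion": (118.0, 84.0),
             "background": (125.0, 78.0)}
# In the paper these weights come from the Bayesian network's context
# variables; here they are fixed hypothetical values.
reliabilities = {"color": 0.5, "motion": 0.3, "background": 0.2}
print(fuse(estimates, reliabilities))
```

A modality whose reliability indicator drops (say, color under changed lighting) simply contributes less to the fused estimate.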
Learning Bayesian Networks: A unification for discrete and Gaussian domains
 PROCEEDINGS OF THE ELEVENTH CONFERENCE ON UNCERTAINTY IN ARTIFICIAL INTELLIGENCE
, 1995
Abstract

Cited by 53 (5 self)
We examine Bayesian methods for learning Bayesian networks from a combination of prior knowledge and statistical data. In particular, we unify the approaches we presented at last year's conference for discrete and Gaussian domains. We derive a general Bayesian scoring metric, appropriate for both domains. We then use this metric in combination with well-known statistical facts about the Dirichlet and normal-Wishart distributions to derive our metrics for discrete and Gaussian domains.
Asymptotic model selection for directed networks with hidden variables
, 1996
Abstract

Cited by 51 (15 self)
We extend the Bayesian Information Criterion (BIC), an asymptotic approximation for the marginal likelihood, to Bayesian networks with hidden variables. This approximation can be used to select models given large samples of data. The standard BIC as well as our extension punishes the complexity of a model according to the dimension of its parameters. We argue that the dimension of a Bayesian network with hidden variables is the rank of the Jacobian matrix of the transformation between the parameters of the network and the parameters of the observable variables. We compute the dimensions of several networks including the naive Bayes model with a hidden root node.
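The complexity penalty this abstract refers to is the familiar BIC trade-off: a model's log-likelihood minus half its parameter dimension times log N. A minimal sketch follows; the likelihood values and dimensions are purely illustrative, and the paper's actual contribution (computing the effective dimension of a hidden-variable network via the rank of the Jacobian) is not reproduced here.

```python
import math

def bic(log_likelihood, dim, n):
    """BIC approximation to the log marginal likelihood: the fit term
    minus a penalty of (dim / 2) * ln(n) for parameter dimension `dim`
    and sample size `n`."""
    return log_likelihood - 0.5 * dim * math.log(n)

# Toy comparison: the richer model fits slightly better but pays a
# larger penalty, so the simpler model wins at this sample size.
n = 1000
score_simple = bic(log_likelihood=-1500.0, dim=3, n=n)
score_rich = bic(log_likelihood=-1495.0, dim=10, n=n)
print(score_simple, score_rich)
```

For a network with hidden variables, the point of the paper is that `dim` should be the rank of the Jacobian of the parameter map, which can be strictly smaller than the raw parameter count.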
User-expertise modeling with empirically derived probabilistic implication networks
, 1996
Theory-based causal inference
, 2003
Abstract

Cited by 31 (3 self)
People routinely make sophisticated causal inferences unconsciously, effortlessly, and from very little data – often from just one or a few observations. We argue that these inferences can be explained as Bayesian computations over a hypothesis space of causal graphical models, shaped by strong top-down prior knowledge in the form of intuitive theories. We present two case studies of our approach, including quantitative models of human causal judgments and brief comparisons with traditional bottom-up models of inference.
Likelihoods and Parameter Priors for Bayesian Networks
, 1995
Abstract

Cited by 24 (0 self)
We develop simple methods for constructing likelihoods and parameter priors for learning about the parameters and structure of a Bayesian network. In particular, we introduce several assumptions that permit the construction of likelihoods and parameter priors for a large number of Bayesian-network structures from a small set of assessments. The most notable assumption is that of likelihood equivalence, which says that data cannot help to discriminate network structures that encode the same assertions of conditional independence. We describe the constructions that follow from these assumptions, and also present a method for directly computing the marginal likelihood of a random sample with no missing observations. We also show how these assumptions lead to a general framework for characterizing parameter priors of multivariate distributions. Keywords: Bayesian network, learning, likelihood equivalence, Dirichlet, normal-Wishart.
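For a single discrete variable with complete data, the marginal likelihood under a Dirichlet prior has a well-known closed form in terms of gamma functions; Dirichlet-based network scores of the kind this abstract describes multiply such terms over variables and parent configurations. The sketch below shows only that per-variable building block, under the assumption of complete data; the counts and prior are toy values, not assessments from the paper.

```python
from math import lgamma

def log_marginal_likelihood(counts, alphas):
    """Closed-form log marginal likelihood of multinomial counts under
    a Dirichlet(alphas) prior:
        log Gamma(a) - log Gamma(a + n)
        + sum_k [log Gamma(a_k + n_k) - log Gamma(a_k)]
    where a = sum(alphas) and n = sum(counts)."""
    a = sum(alphas)
    n = sum(counts)
    score = lgamma(a) - lgamma(a + n)
    for n_k, a_k in zip(counts, alphas):
        score += lgamma(a_k + n_k) - lgamma(a_k)
    return score

# Toy binary variable: 7 successes, 3 failures under a uniform
# Dirichlet(1, 1) prior.
print(log_marginal_likelihood([7, 3], [1.0, 1.0]))
```

With the uniform prior above, the marginal likelihood reduces to the Beta function B(8, 4) = 1/1320, which is a convenient check on the gamma-function form.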
Computational inference of neural information flow networks. PLoS Computational Biology 2:e161
, 2006
Abstract

Cited by 22 (6 self)
Determining how information flows along anatomical brain pathways is a fundamental requirement for understanding how animals perceive their environments, learn, and behave. Attempts to reveal such neural information flow have been made using linear computational methods, but neural interactions are known to be nonlinear. Here, we demonstrate that a dynamic Bayesian network (DBN) inference algorithm we originally developed to infer nonlinear transcriptional regulatory networks from gene expression data collected with microarrays is also successful at inferring nonlinear neural information flow networks from electrophysiology data collected with microelectrode arrays. The inferred networks we recover from the songbird auditory pathway are correctly restricted to a subset of known anatomical paths, are consistent with timing of the system, and reveal both the importance of reciprocal feedback in auditory processing and greater information flow to higher-order auditory areas when birds hear natural as opposed to synthetic sounds. A linear method applied to the same data incorrectly produces networks with information flow to non-neural tissue and over paths known not to exist. To our knowledge, this study represents the first biologically validated demonstration of an algorithm to successfully infer neural information flow networks.
Learning Causal Networks from Data: A survey and a new algorithm for recovering possibilistic causal networks
, 1997
Abstract

Cited by 20 (5 self)
Reasoning in terms of cause and effect is a strategy that arises in many tasks. For example, diagnosis is usually defined as the task of finding the causes (illnesses) from the observed effects (symptoms). Similarly, prediction can be understood as the description of a future plausible situation where observed effects will be in accordance with the known causal structure of the phenomenon being studied. Causal models are a summary of the knowledge about a phenomenon expressed in terms of causation. Many areas of the applied sciences (econometrics, biomedicine, engineering, etc.) have used such a term to refer to models that yield explanations, allow for prediction, and facilitate planning and decision making. Causal reasoning can be viewed as inference guided by a causation theory. That kind of inference can be further specialised into induc... (This work has been partially supported by the Spanish Comisión Interministerial de Ciencia y Tecnología, project CICYT-TIC96-0878.)
Finding Temporal Relations: Causal Bayesian Networks vs. C4.5
 The 12th International Symposium on Methodologies for Intelligent Systems (ISMIS'2000)
Abstract

Cited by 20 (10 self)
Observing the world and finding trends and relations among the variables of interest is an important and common learning activity. In this paper we apply TETRAD, a program that uses Bayesian networks to discover causal rules, and C4.5, which creates decision trees, to the problem of discovering relations among a set of variables in the controlled environment of an Artificial Life simulator. All data in this environment are generated by a single entity over time. The rules in the domain are known, so we are able to assess the effectiveness of each method. The agent's sensing of its environment and its own actions are saved in data records over time. We first compare TETRAD and C4.5 in discovering the relations between variables in a single record. We next attempt to find temporal relations among the variables of consecutive records. Since both these programs disregard the passage of time among the records, we introduce the flattening operation as a way to span time and bring the variables of interest together in a new single record. We observe that flattening allows C4.5 to discover relations among variables over time, while it does not improve TETRAD's output.
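The flattening operation this abstract introduces can be sketched as joining each run of consecutive records into one wide record, with variable names suffixed by their relative time step, so a time-unaware learner sees variables from different steps side by side. The field names and values below are hypothetical; the simulator's actual variables are not given in the abstract.

```python
def flatten(records, window):
    """Join each run of `window` consecutive records into one wide
    record, suffixing keys with their relative time step (_t0, _t1, ...)
    so temporal relations appear within a single record."""
    flat = []
    for i in range(len(records) - window + 1):
        row = {}
        for t in range(window):
            for key, value in records[i + t].items():
                row[f"{key}_t{t}"] = value  # suffix marks relative time
        flat.append(row)
    return flat

# Hypothetical agent records (not the paper's actual variables).
records = [
    {"food_seen": 1, "moved": 0},
    {"food_seen": 0, "moved": 1},
    {"food_seen": 1, "moved": 1},
]
print(flatten(records, window=2))
```

A learner such as C4.5 can then test whether, say, `food_seen_t0` predicts `moved_t1`, which it could never do when the two values lived in separate records.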