Results 1-10 of 11
Building Large-Scale Bayesian Networks
, 1999
Cited by 55 (16 self)
Bayesian Networks (BNs) model problems that involve uncertainty. A BN is a directed graph whose nodes are the uncertain variables and whose edges are the causal or influential links between the variables. Associated with each node is a set of conditional probability functions that model the uncertain relationship between the node and its parents. The benefits of using BNs to model uncertain domains are well known, especially since the recent breakthroughs in algorithms and tools to implement them. However, there have been serious problems for practitioners trying to use BNs to solve realistic problems, because although the tools make it possible to execute large-scale BNs efficiently, there have been no guidelines on building BNs. Specifically, practitioners face two significant barriers. The first is specifying the graph structure such that it is a sensible model of the types of reasoning being applied. The second is eliciting the conditional probability values, from a domain expert, for a graph containing many combinations of nodes, where each node may have a large number of discrete or continuous values. We have tackled both of these practical problems in recent research projects and have produced partial solutions that have been applied extensively in a number of real-life applications. In this paper we concentrate on the first problem: specifying a sensible BN graph structure. Our solution is based on the notion of generally applicable 'building blocks', called idioms, which can be combined into modular subnets. These can in turn be combined into larger BNs, using simple combination rules and by exploiting recent ideas on modular and Object-Oriented BNs (OOBNs). This appr...
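The abstract's description of a BN (a directed graph of uncertain variables, each with conditional probability functions over its parents) can be sketched in plain Python. Everything below, the variables, edges, and probabilities, is a hypothetical toy for illustration, not a model from the paper.

```python
# Minimal BN sketch: a directed graph given by parent lists, plus a
# conditional probability table (CPT) per node. All binary variables;
# a CPT maps a tuple of parent values to P(node = True).

parents = {"Cloudy": [], "Rain": ["Cloudy"], "WetGrass": ["Rain"]}

cpt = {
    "Cloudy":   {(): 0.5},
    "Rain":     {(True,): 0.8, (False,): 0.2},
    "WetGrass": {(True,): 0.9, (False,): 0.1},
}

def joint(assignment):
    """P(full assignment) = product over nodes of P(node | its parents)."""
    p = 1.0
    for node, pa in parents.items():
        pv = tuple(assignment[x] for x in pa)
        p_true = cpt[node][pv]
        p *= p_true if assignment[node] else 1.0 - p_true
    return p

print(joint({"Cloudy": True, "Rain": True, "WetGrass": True}))
# 0.5 * 0.8 * 0.9 = 0.36
```

The factorization in `joint` is exactly what makes a BN compact: each node needs a CPT only over its parents, not over all other variables.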
Building probabilistic networks: "Where do the numbers come from?" A guide to the literature
 IEEE Transactions on Knowledge and Data Engineering
, 2000
Cited by 33 (3 self)
Probabilistic networks are now fairly well established as practical representations of knowledge for reasoning under uncertainty, as demonstrated by an increasing number of successful applications in such domains as (medical) diagnosis and prognosis, planning, vision,
Learning recursive Bayesian multinets for data clustering by means of constructive induction
, 2001
Cited by 19 (8 self)
This paper introduces and evaluates a new class of knowledge model, the recursive Bayesian multinet (RBMN), which encodes the joint probability distribution of a given database. RBMNs extend Bayesian networks (BNs) as well as partitional clustering systems. Briefly, an RBMN is a decision tree with component BNs at the leaves. An RBMN is learnt using a greedy, heuristic approach akin to that used by many supervised decision tree learners, but where BNs are learnt at the leaves using constructive induction. A key idea is to treat expected data as real data. This allows us to complete the database and take advantage of a closed form for the marginal likelihood of the expected complete data, which factorizes into separate marginal likelihoods for each family (a node and its parents). Our approach is evaluated on synthetic and real-world databases.
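The "decision tree with component BNs at the leaves" idea can be illustrated with a minimal sketch: a single split routes a case to a leaf, and each leaf holds a degenerate component model (independent per-variable probabilities standing in for a full BN). All structure, variable names, and numbers here are hypothetical.

```python
# RBMN-style sketch: a one-split decision tree whose leaves hold
# per-variable Bernoulli parameters (a BN with no edges, the simplest
# possible component model).

tree = {
    "split": "Age>40",
    True:  {"leaf": {"Smokes": 0.4, "Exercises": 0.3}},
    False: {"leaf": {"Smokes": 0.2, "Exercises": 0.6}},
}

def prob(case):
    """P(case) under the leaf model that the split value routes it to."""
    branch = tree[case[tree["split"]]]
    p = 1.0
    for var, p_true in branch["leaf"].items():
        p *= p_true if case[var] else 1.0 - p_true
    return p

print(prob({"Age>40": True, "Smokes": True, "Exercises": False}))
# routed to the True leaf: 0.4 * (1 - 0.3) = 0.28
```

In a real RBMN the leaves would hold full BNs with their own structure, and the tree would be grown greedily by comparing the factorized marginal likelihoods the abstract mentions.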
Geographical Clustering of Cancer Incidence by Means of Bayesian Networks and Conditional Gaussian Networks
, 2001
Cited by 4 (4 self)
With the aim of improving knowledge of the geographical distribution and characterization of malignant tumors in the Autonomous Community of the Basque Country (Spain), age-standardized cancer incidence rates of the six most frequent cancer types for patients of each sex between 1986 and 1994 are analyzed in relation to the towns of the Community. Specifically, we perform a geographical clustering of the towns of the Community by means of Bayesian networks and conditional Gaussian networks. We present several maps that show the clusterings encoded by the learnt models, and we outline the cancer incidence profile for each of the obtained clusters.
Study of Four Types of Learning Bayesian Networks Cases
, 2014
Learning Bayesian networks can be examined as the combination of parameter learning and structure learning: parameter learning is the estimation of the dependencies in the network, while structural learning is the estimation of the links of the network. Depending on whether the structure of the network is known and whether the variables are all observable, there are four cases of learning Bayesian networks. In this paper, we first introduce the two cases of learning Bayesian networks from complete data: known structure with observable variables, and unknown structure with observable variables. Next, we study the two cases of learning from incomplete data: known network structure with unobservable variables, and unknown network structure with unobservable variables.
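The simplest of the four cases, known structure with fully observed (complete) data, reduces parameter learning to maximum-likelihood counting: for each node, tally how often it is true under each configuration of its parents. The two-node network and the data rows below are hypothetical.

```python
# Case 1 sketch: known structure, complete data -> MLE parameter
# learning by counting parent-configuration frequencies.
from collections import Counter

parents = {"A": [], "B": ["A"]}
data = [
    {"A": True,  "B": True},
    {"A": True,  "B": False},
    {"A": True,  "B": True},
    {"A": False, "B": False},
]

def learn_cpt(node):
    """Estimate P(node=True | parent values) from fully observed rows."""
    counts, trues = Counter(), Counter()
    for row in data:
        key = tuple(row[p] for p in parents[node])
        counts[key] += 1
        if row[node]:
            trues[key] += 1
    return {k: trues[k] / counts[k] for k in counts}

print(learn_cpt("B"))  # P(B=True | A=True) = 2/3, P(B=True | A=False) = 0
```

The other three cases are harder precisely because this direct counting breaks down: hidden variables call for EM-style expected counts, and an unknown structure adds a search over graphs on top.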
A Bayesian Network Approach to the Self-Organization and Learning in Intelligent Agents
, 2000
A Bayesian network approach to self-organization and learning is introduced for use with intelligent agents. Bayesian networks, with the help of influence diagrams, are employed to create a decision-theoretic intelligent agent. Influence diagrams combine Bayesian networks with utility theory. In this research, an intelligent agent is modeled by its belief, preference, and capability attributes. Each agent is assumed to have its own belief about its environment; this belief aspect is accomplished by a Bayesian network. The goal of an intelligent agent is said to be the preference of the agent and is represented with a utility function in the decision-theoretic agent. Capabilities are represented by a set of possible actions of the decision-theoretic agent. Influence diagrams have utility nodes and decision nodes to handle the preference and capabilities of the agent, respectively.
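The belief/preference/capability decomposition can be sketched directly: belief is a probability distribution over world states (in the paper, a BN marginal), preference is a utility function over state-action pairs, and capabilities are the candidate actions; evaluating the influence diagram amounts to picking the action with maximum expected utility. The states, actions, probabilities, and utilities below are hypothetical.

```python
# Decision-theoretic agent sketch: maximize expected utility.

belief  = {"sunny": 0.7, "rainy": 0.3}            # belief (e.g. a BN marginal)
actions = ["walk", "drive"]                        # capabilities
utility = {("sunny", "walk"): 10, ("rainy", "walk"): -5,   # preference
           ("sunny", "drive"): 4,  ("rainy", "drive"): 6}

def expected_utility(action):
    """EU(a) = sum over states s of P(s) * U(s, a)."""
    return sum(p * utility[(s, action)] for s, p in belief.items())

best = max(actions, key=expected_utility)
print(best, expected_utility(best))  # walk: 0.7*10 + 0.3*(-5) = 5.5
```

In a full influence diagram the belief would be recomputed by BN inference as evidence arrives, so the chosen action can change as the agent's view of its environment changes.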
London UK
, 2005
Although a number of approaches have been taken to quality prediction for software, none have achieved widespread applicability. Our aim here is to produce a single method to combine the diverse forms of, often causal, evidence available in software development in a more natural and efficient way than done previously. We use graphical probability models (also known as Bayesian Networks) as the appropriate formalism for representing this evidence. We can use the subjective judgements of experienced project managers to build the probability models and use these models to produce forecasts about software quality throughout the development life cycle. Moreover, the causal or influence structure of the model mirrors the real-world sequence of events and relations more naturally than can be achieved with other formalisms. The work described was the result of an extended research programme involving Philips and a number of other partners. It led to both a set of easily tailorable models and a tool (AgenaRisk) that enables the models to be run efficiently by non-expert end-users. We describe the results of extensive industrial evaluation, focusing in particular on the defect prediction models evaluated at the Philips Software Centre, Bangalore. This produced very encouraging results: a correlation of 95% between predicted and actual defects found across a highly diverse range of projects.