Results 1–10 of 28
A dynamic Bayesian network for diagnosing ventilator-associated pneumonia in ICU patients
, 2007
Abstract
Cited by 6 (4 self)
Diagnosing ventilator-associated pneumonia in mechanically ventilated patients in intensive care units is seen as a clinical challenge. The difficulty in diagnosing ventilator-associated pneumonia stems from the lack of a simple yet accurate diagnostic test. To assist clinicians in diagnosing and treating patients with pneumonia, a decision-theoretic network was designed with the help of domain experts. A major limitation of this network is that it does not represent pneumonia as a dynamic process that evolves over time. In this paper, we construct a dynamic Bayesian network that explicitly captures the development of the disease over time. We discuss how probability elicitation from domain experts served to quantify the dynamics involved and how the nature of the patient data helps reduce the computational burden of inference. We evaluate the diagnostic performance of our dynamic model for a number of real patients and report promising results.
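The predict-update cycle that such a dynamic model performs per time slice can be sketched as follows. The two-state pneumonia variable, the transition matrix, and the fever observation model below are all invented for illustration; they are not the elicited probabilities from the paper.

```python
# Minimal sketch of dynamic Bayesian network filtering, assuming a
# two-state pneumonia variable and made-up (not elicited) numbers.

def normalize(d):
    s = sum(d.values())
    return {k: v / s for k, v in d.items()}

# Hypothetical transition model P(pneumonia_t | pneumonia_{t-1}):
# the disease persists or resolves between days.
TRANSITION = {
    "present": {"present": 0.85, "absent": 0.15},
    "absent":  {"present": 0.05, "absent": 0.95},
}

# Hypothetical observation model P(fever_t | pneumonia_t).
OBSERVATION = {
    "present": {"fever": 0.80, "no_fever": 0.20},
    "absent":  {"fever": 0.10, "no_fever": 0.90},
}

def filter_step(belief, evidence):
    """One predict-update cycle over a single time slice."""
    # Predict: push yesterday's belief through the transition model.
    predicted = {s: sum(belief[p] * TRANSITION[p][s] for p in belief)
                 for s in TRANSITION}
    # Update: weight each state by the likelihood of today's evidence.
    return normalize({s: predicted[s] * OBSERVATION[s][evidence]
                      for s in predicted})

belief = {"present": 0.10, "absent": 0.90}  # prior on day 0
for obs in ["fever", "fever", "no_fever"]:
    belief = filter_step(belief, obs)
```

Because evidence arrives slice by slice, only the previous belief state needs to be carried forward, which is one way the nature of the patient data can keep inference cheap.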
Using ranked nodes to model qualitative judgements in Bayesian Networks
 IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING
, 2007
Abstract
Cited by 6 (5 self)
Although Bayesian Nets (BNs) are increasingly being used to solve real-world risk problems, their use is still constrained by the difficulty of constructing the node probability tables (NPTs). A key challenge is to construct relevant NPTs using the minimal amount of expert elicitation, recognising that it is rarely cost-effective to elicit complete sets of probability values. We describe a simple approach to defining NPTs for a large class of commonly occurring nodes (called ranked nodes). The approach is based on the doubly truncated Normal distribution with a central tendency that is invariably some weighted function of the parent nodes. In extensive real-world case studies we have found that this approach is sufficient for generating the NPTs of a very large class of nodes. We describe one such case study for validation purposes. The approach has been fully automated in a commercial tool, called AgenaRisk, and is thus accessible to all types of domain experts. We believe this work represents a useful contribution to BN research and technology, since applying it can make the difference between being able to build realistic BN models and not.
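The central construct described here, a doubly truncated Normal on [0, 1] whose mean is a weighted average of the parent states, can be sketched as below. The state scale, weights, and variance are illustrative assumptions, and this is not AgenaRisk's actual implementation.

```python
import math

def normal_cdf(x, mu, sigma):
    """Standard Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ranked_npt_column(parent_values, weights, sigma=0.15, n_states=5):
    """One NPT column for a ranked node: discretize a doubly truncated
    Normal on [0, 1] whose mean is the weighted mean of the parent
    values. All numbers here are illustrative, not elicited."""
    mu = sum(w * v for w, v in zip(weights, parent_values)) / sum(weights)
    # Truncate to [0, 1]: renormalize by the mass inside the interval.
    z = normal_cdf(1.0, mu, sigma) - normal_cdf(0.0, mu, sigma)
    edges = [i / n_states for i in range(n_states + 1)]
    return [(normal_cdf(edges[i + 1], mu, sigma) -
             normal_cdf(edges[i], mu, sigma)) / z
            for i in range(n_states)]

# Two parents both in their top state (mapped to 1.0), weighted 2:1 —
# the child's probability mass concentrates in the upper states.
column = ranked_npt_column([1.0, 1.0], [2.0, 1.0])
```

The appeal for elicitation is that an expert supplies only the weights and a variance rather than every cell of the table; the rest of the NPT is generated mechanically.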
Rigorously defining and analyzing medical processes: An experience report
 In First International Workshop on Model-Based Design of Trustworthy Health Information Systems
, 2007
Abstract
Cited by 4 (3 self)
Abstract. This paper describes experiences in creating a precise definition of a process for chemotherapy administration and using it as the basis for analyses aimed at finding and correcting defects, leading to improvements in efficiency and patient safety. The work is a collaboration between Computer Science researchers and members of the professional staff of a major regional cancer center. It entails the use of the Little-JIL process definition language for creating the precise definitions, the PROPEL system for creating precise specifications of process requirements, and the FLAVERS system for analyzing process definitions. The paper describes the details of using these technologies by demonstrating how they have been applied to successfully identify defects in the chemotherapy process. Although this work is still ongoing, early experiences suggest that our approach is viable and promising. The work has also helped us to learn about the desiderata for process definition and analysis technologies that are expected to be more broadly applicable to other domains.
Context-specific Sign-propagation in Qualitative Probabilistic Networks
 Artificial Intelligence
, 2001
Abstract
Cited by 4 (0 self)
Qualitative probabilistic networks represent probabilistic influences between variables. Due to the level of representation detail provided, knowledge about influences that hold only in specific contexts cannot be expressed. The results computed from a qualitative network, as a consequence, can be quite weak and uninformative. We extend the basic formalism of qualitative probabilistic networks by providing for the inclusion of context-specific information about influences and show that exploiting this information upon inference has the ability to forestall unnecessarily weak results.
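The weak results mentioned here arise from the standard sign calculus for qualitative probabilistic networks, which can be sketched with the usual chaining (⊗) and combination (⊕) operators; the tiny three-node example below is invented, not taken from the paper.

```python
# Standard QPN sign operators over the signs '+', '-', '0' and the
# ambiguous sign '?'.

def sign_product(a, b):
    """Chain two influences along a trail (the ⊗ operator)."""
    if a == "0" or b == "0":
        return "0"
    if a == "?" or b == "?":
        return "?"
    return "+" if a == b else "-"

def sign_sum(a, b):
    """Combine parallel influences on the same node (the ⊕ operator)."""
    if a == "0":
        return b
    if b == "0":
        return a
    if a == b:
        return a
    return "?"  # opposing influences yield the uninformative sign

# Invented network: A --+--> B --+--> C, plus a direct A --(-)--> C.
# The trail through B contributes '+' ⊗ '+' = '+', which combined
# with the direct '-' influence gives the ambiguous sign '?'.
via_b = sign_product("+", "+")
net_effect = sign_sum(via_b, "-")
```

The `'?'` produced here is exactly the kind of uninformative result that context-specific sign information is meant to forestall: if one of the opposing signs vanishes in the context at hand, the combination becomes informative again.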
Engineering Medical Processes to Improve Their Safety: An Experience Report
Abstract
Cited by 3 (1 self)
Abstract. This paper describes experiences in using precise definitions of medical processes as the basis for analyses aimed at finding and correcting defects, leading to improvements in patient safety. The work entails the use of the Little-JIL process definition language for creating the precise definitions, the Propel system for creating precise specifications of process requirements, and the FLAVERS system for analyzing process definitions. The paper describes the details of using these technologies, employing a blood transfusion process as an example. Although this work is still ongoing, early experiences suggest that our approach is viable and promising. The work has also helped us to learn about the desiderata for process definition and analysis technologies that are intended to be used to engineer medical processes.
The Elicitation of Probabilities: A Review of the Statistical Literature
, 2005
Abstract
Cited by 3 (0 self)
“We live in an uncertain world, and probability risk assessment deals as directly with that fact as anything we do. Uncertainty arises partly because we are fallible.
Building knowledge-based systems by credal networks: a tutorial
 ADVANCES IN MATHEMATICS RESEARCH
, 2010
Abstract
Cited by 3 (3 self)
Knowledge-based systems are computer programs achieving expert-level competence in solving problems for specific task areas. This chapter is a tutorial on the implementation of such systems in the framework of credal networks. Credal networks are a generalization of Bayesian networks where credal sets, i.e., closed convex sets of probability measures, are used instead of precise probabilities. This allows for a more flexible model of the knowledge, which can represent ambiguity, contrast and contradiction in a natural and realistic way. The discussion guides the reader through the different steps involved in the specification of a system, from the evocation and elicitation of the knowledge to the interaction with the system by adequate inference algorithms. Our approach is characterized by a sharp distinction between the domain knowledge and the process linking this knowledge to the perceived evidence, which we call the observational process. This distinction leads to a very flexible representation of both domain knowledge and knowledge about the way the information is collected, together with a technique to aggregate information coming from different sources. The overall procedure is illustrated throughout the chapter by a simple knowledge-based system for the prediction of the result of a football match.
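The key computational idea behind credal sets can be sketched very simply: because a credal set is closed and convex, the lower and upper probabilities of any event are attained at its extreme points, so enumerating those vertices suffices. The football-style credal set below is invented for illustration and is unrelated to the tutorial's actual model.

```python
# Minimal sketch of credal-set inference: bounds on P(event) over a
# closed convex set of distributions, computed at its extreme points.

def probability(dist, event):
    """P(event) under one precise distribution (a dict outcome -> p)."""
    return sum(p for outcome, p in dist.items() if outcome in event)

def credal_bounds(vertices, event):
    """Lower and upper probability of an event over a credal set
    specified by its extreme points."""
    values = [probability(v, event) for v in vertices]
    return min(values), max(values)

# Invented credal set for a match outcome: three extreme points of a
# convex set of distributions over {win, draw, loss}.
vertices = [
    {"win": 0.4, "draw": 0.35, "loss": 0.25},
    {"win": 0.6, "draw": 0.25, "loss": 0.15},
    {"win": 0.5, "draw": 0.20, "loss": 0.30},
]
lower, upper = credal_bounds(vertices, {"win"})
```

The gap between `lower` and `upper` is what lets a credal model express ambiguity that a single precise distribution cannot.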
Local monotonicity in probabilistic networks
 Ninth European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty. October 31 – November 2, 2007, Hammamet, Tunisia, volume 4724 of LNCS
, 2007
Abstract
Cited by 2 (2 self)
Abstract. It is often desirable that a probabilistic network is monotone, e.g., more severe symptoms increase the likelihood of a more serious disease. Unfortunately, determining whether a network is monotone is highly intractable. Often, approximation algorithms are employed that work on a local scale. For these algorithms, the monotonicity of the arcs (rather than of the network as a whole) is determined. However, in many situations monotonicity depends on the ordering of the values of the nodes, which is sometimes rather arbitrary. Thus, it is desirable to order the values of these variables such that as many arcs as possible are monotone. We introduce the concept of local monotonicity, discuss the computational complexity of finding an optimal ordering of the values of the nodes in a network, and sketch an exact branch-and-bound algorithm to find such an optimal solution.
Analyzing Medical Processes
Abstract
Cited by 2 (0 self)
This paper shows how software engineering technologies used to define and analyze complex software systems can also be effective in detecting defects in human-intensive processes used to administer healthcare. The work described here builds upon earlier work demonstrating that healthcare processes can be defined precisely. This paper describes how finite-state verification can be used to help find defects in such processes as well as find errors in the process definitions and property specifications. The paper includes a detailed example, based upon a real-world process for transfusing blood, where the process defects that were found led to improvements in the process.
Multidimensional Bayesian Network Classifiers
Abstract
Cited by 2 (0 self)
We introduce the family of multidimensional Bayesian network classifiers, which include one or more class variables and multiple feature variables. The family does not require that every feature variable is modelled as being dependent on every class variable, which results in better modelling capabilities than families of models with a single class variable. Yet, our family of multidimensional classifiers includes as special cases the well-known naive Bayesian and tree-augmented classifiers. We describe the learning problem for a subfamily of multidimensional classifiers and show that the complexity of the solution algorithm is polynomial in the number of variables involved. Preliminary experimental results illustrate the benefits of the multidimensionality of our classifiers.
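The property that not every feature must depend on every class variable can be sketched with a drastically simplified member of this family: each class variable gets its own naive Bayes model over only its assigned features, and the class variables are additionally treated as independent of one another. That independence, the toy data, and the binary-feature smoothing denominator are simplifying assumptions of this sketch, not the paper's learning algorithm.

```python
from collections import Counter, defaultdict

def train_naive_bayes(rows, class_idx, feature_idxs, alpha=1.0):
    """Laplace-smoothed naive Bayes for one class variable over the
    subset of features assigned to it (features assumed binary, hence
    the `2 * alpha` in the denominator)."""
    class_counts = Counter(r[class_idx] for r in rows)
    cond = defaultdict(Counter)  # (feature, class) -> value counts
    for r in rows:
        for f in feature_idxs:
            cond[(f, r[class_idx])][r[f]] += 1

    def predict(row):
        def score(c):
            n_c = class_counts[c]
            p = n_c / len(rows)
            for f in feature_idxs:
                p *= (cond[(f, c)][row[f]] + alpha) / (n_c + 2 * alpha)
            return p
        return max(class_counts, key=score)

    return predict

# Toy data: columns = (class1, class2, feat1, feat2, feat3). Each
# class variable is connected only to *its* features, illustrating
# that a feature need not depend on every class variable.
rows = [
    ("A", "x", 1, 0, 1), ("A", "x", 1, 0, 0),
    ("B", "y", 0, 1, 0), ("B", "y", 0, 1, 1),
]
clf1 = train_naive_bayes(rows, 0, [2, 3])  # class1 uses feat1, feat2
clf2 = train_naive_bayes(rows, 1, [4])     # class2 uses feat3
```

Predicting both class variables then amounts to querying each per-class model on the same feature row, which is the degenerate case of the joint inference the full family supports.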