Results 1–10 of 10
Context-Specific Independence in Bayesian Networks
, 1996
Abstract

Cited by 338 (28 self)
Bayesian networks provide a language for qualitatively representing the conditional independence properties of a distribution. This allows a natural and compact representation of the distribution, eases knowledge acquisition, and supports effective inference algorithms.
Random Algorithms for the Loop Cutset Problem
 Journal of Artificial Intelligence Research
, 1999
Abstract

Cited by 93 (1 self)
We show how to find a minimum loop cutset in a Bayesian network with high probability. Finding such a loop cutset is the first step in Pearl's method of conditioning for inference. Our random algorithm for finding a loop cutset, called RepeatedWGuessI, outputs a minimum loop cutset after O(c · 6^k · kn) steps, with probability at least 1 − (1 − 1/6^k)^(c·6^k), where c > 1 is a constant specified by the user, k is the size of a minimum weight loop cutset, and n is the number of vertices. We also show empirically that a variant of this algorithm, called WRA, often finds a loop cutset that is closer to the minimum loop cutset than the ones found by the best deterministic algorithms known.
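The abstract above describes repeated randomized guessing of a loop cutset. A minimal sketch of the idea, under my own assumptions (the function names `random_cutset` and `strip_leaves` are hypothetical, the degree-weighted guess is a simplification of the paper's WGuess step, and the loop cutset is treated as a feedback vertex set of the underlying undirected graph):

```python
import random

def strip_leaves(adj):
    """Repeatedly delete degree-<=1 vertices; such vertices lie on no cycle."""
    changed = True
    while changed:
        changed = False
        for v in list(adj):
            if len(adj[v]) <= 1:
                for u in adj[v]:
                    adj[u].discard(v)
                del adj[v]
                changed = True

def random_cutset(edges, trials=200, seed=0):
    """Repeated weighted guessing: while cycles remain, remove a random
    vertex chosen with probability proportional to its current degree,
    and keep the smallest cutset found over all trials."""
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        adj = {}
        for u, v in edges:
            adj.setdefault(u, set()).add(v)
            adj.setdefault(v, set()).add(u)
        cutset = set()
        strip_leaves(adj)
        while adj:  # min degree >= 2 here, so a cycle exists
            verts = list(adj)
            weights = [len(adj[v]) for v in verts]
            v = rng.choices(verts, weights=weights)[0]
            cutset.add(v)
            for u in adj[v]:
                adj[u].discard(v)
            del adj[v]
            strip_leaves(adj)
        if best is None or len(cutset) < len(best):
            best = cutset
    return best
```

Since the residual graph is emptied by leaf-stripping once the guessed vertices are gone, every returned set is a valid cutset; repetition is what drives the probability of hitting a minimum one.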
Approximation Algorithms for the Feedback Vertex Set Problem with Applications to Constraint Satisfaction and Bayesian Inference
, 1998
Abstract

Cited by 36 (4 self)
A feedback vertex set of an undirected graph is a subset of vertices that intersects the vertex set of each cycle in the graph. Given an undirected graph G with n vertices and weights on its vertices, polynomial-time algorithms are provided for approximating the problem of finding a feedback vertex set of G with smallest weight. When the weights of all vertices in G are equal, the performance ratio attained by these algorithms is 4 − 2/n. This improves a previous algorithm which achieved an approximation factor of O(√(log n)) for this case. For general vertex weights, the performance ratio becomes min{2Δ^2, 4 log_2 n}, where Δ denotes the maximum degree in G. For the special case of planar graphs this ratio is reduced to 10. An interesting special case of weighted graphs where a performance ratio of 4 − 2/n is achieved is the one where a prescribed subset of the vertices, so-called blackout vertices, is not allowed to participate in any feedback verte...
Approximation Algorithms for the Vertex Feedback Set Problem with Applications to Constraint Satisfaction and Bayesian Inference
Abstract

Cited by 29 (5 self)
A vertex feedback set of an undirected graph is a subset of vertices that intersects the vertex set of each cycle in the graph. Given an undirected graph G with n vertices and weights on its vertices, polynomial-time algorithms are provided for approximating the problem of finding a vertex feedback set of G with smallest weight. When the weights of all vertices in G are equal, the performance ratio attained by these algorithms is 4 − 2/n. This improves a previous algorithm which achieved an approximation factor of O(√(log n)) for this case. For general vertex weights, the performance ratio becomes min{2Δ^2, 4 log_2 n}, where Δ denotes the maximum degree in G. For the special case of planar graphs this ratio is reduced to 10. An interesting special case of weighted graphs where a performance ratio of 4 − 2/n is achieved is
Local Conditioning in Bayesian Networks
 Artificial Intelligence
, 1996
Abstract

Cited by 29 (6 self)
Local conditioning (LC) is an exact algorithm for computing probability in Bayesian networks, developed as an extension of Kim and Pearl's algorithm for singly-connected networks. A list of variables associated with each node guarantees that only the nodes inside a loop are conditioned on the variable which breaks it. The main advantage of this algorithm is that it computes the probability directly on the original network instead of building a cluster tree, and this can save time when debugging a model and when the sparsity of evidence allows a pruning of the network. The algorithm is also advantageous when some families in the network interact through AND/OR gates. A parallel implementation of the algorithm with a processor for each node is possible even in the case of multiply-connected networks.
1 Introduction
A Bayesian network is an acyclic directed graph in which every node represents a random variable, together with a probability distribution such that P(x_1, ..., x_n) = ...
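The formula truncated at the end of this snippet is presumably the standard chain-rule factorization that defines a Bayesian network, in which each variable depends only on its parents in the graph:

```latex
P(x_1, \dots, x_n) = \prod_{i=1}^{n} P\bigl(x_i \mid \mathrm{pa}(x_i)\bigr)
```

where pa(x_i) denotes the values taken by the parents of node i.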
Optimization of Pearl's Method of Conditioning and Greedy-Like Approximation Algorithms for the Vertex Feedback Set Problem
 Artificial Intelligence
, 1997
Abstract

Cited by 27 (4 self)
We show how to find a small loop cutset in a Bayesian network.
Planning to Lead to Mission Success, CSER
Abstract
One critical, yet seldom measured, factor in mission success is the leadership preferences of those in positions of authority. While project and program management methods are assessed and correlated to mission success, the leadership factor has been neglected. This work uses existing personality and leadership preference measuring instruments (MBTI and MLQ) to give a general quantification of this leadership factor, and phenotypes to define systems by their complexity and precedented/unprecedented nature. By examining that system in a current context, leaders can generate a risk profile of how their actual style of leadership may not be well-matched to the challenges that tend to be experienced by systems of that phenotype. By identifying these leadership style/system phenotype mismatches, the leader can create a leadership development plan that targets developing specific styles in priority order.
Exact Algorithms for LOOP CUTSET
Abstract
The LOOP CUTSET problem was historically posed by Pearl as a subroutine in Pearl's algorithm for computing inference in probabilistic networks. The efficiency of the algorithm that solves the probabilistic inference depends highly on the size of the smallest known LOOP CUTSET. This justifies the search for exact algorithms for finding a minimum LOOP CUTSET. In this thesis we investigate the algorithmic complexity of the problem. We look at both the unparameterized problem and the problem parameterized by the treewidth of the input graph. For both we give an exact exponential-time algorithm. The running times of these algorithms are O*(1.7548^n) and O*(4^tw) respectively, where tw is the treewidth of the input graph. Finally, we prove a lower bound of 3^tw for the parameterized problem.
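The thesis's O*(1.7548^n) algorithm is far more refined than anything sketchable here, but the trivial O*(2^n) baseline it improves on can be illustrated: enumerate vertex subsets in order of size and return the first whose removal leaves a forest (a sketch under my own assumptions, treating the loop cutset as a feedback vertex set of the underlying undirected graph; function names are hypothetical):

```python
from itertools import combinations

def is_forest(vertices, edges):
    """Union-find acyclicity check for an undirected graph."""
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:        # u and v already connected: this edge closes a cycle
            return False
        parent[ru] = rv
    return True

def min_cutset_bruteforce(vertices, edges):
    """Try all vertex subsets in order of increasing size; O*(2^n)."""
    for k in range(len(vertices) + 1):
        for cut in combinations(vertices, k):
            rem = set(vertices) - set(cut)
            kept = [(u, v) for u, v in edges if u in rem and v in rem]
            if is_forest(rem, kept):
                return set(cut)
```

Because subsets are tried smallest-first, the first valid set returned is a minimum one, which is exactly the guarantee the exact algorithms above achieve with a far better exponential base.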