Results 1 - 7 of 7
Belief Networks Revisited
, 1994
Abstract

Cited by 36 (6 self)
this paper, Rumelhart presented compelling evidence that text comprehension must be a distributed process that combines both top-down and bottom-up inferences. Strangely, this dual mode of inference, so characteristic of Bayesian analysis, did not match the capabilities of either the "certainty factors" calculus or the inference networks of PROSPECTOR, the two major contenders for uncertainty management in the 1970s. I thus began to explore the possibility of achieving distributed computation in a "pure" Bayesian framework, so as not to compromise its basic capacity to combine bidirectional inferences (i.e., predictive and abductive). Not caring much about generality at that point, I picked the simplest structure I could think of (i.e., a tree) and tried to see if anything useful could be computed by assigning each variable a simple processor, forced to communicate only with its neighbors. This gave rise to the tree-propagation algorithm reported in [15] and, a year later, the Kim-Pearl algorithm [12], which supported not only bidirectional inferences but also intercausal interactions, such as "explaining-away." These two algorithms were described in Section 2 of Fusion.
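As a sketch of the idea, the simplest tree is a chain: a hypothetical three-variable chain A -> B -> C with illustrative (invented) probability tables shows how a node's belief fuses a predictive pi message from its parent with an abductive lambda message from its child, using only neighbor-to-neighbor communication.

```python
def normalize(v):
    s = sum(v)
    return [x / s for x in v]

# Illustrative (invented) tables: P(A), P(B|A), P(C|B); binary variables,
# index 0 = false, 1 = true.
p_a = [0.7, 0.3]
p_b_given_a = [[0.9, 0.1],   # row: value of A, column: value of B
               [0.2, 0.8]]
p_c_given_b = [[0.8, 0.2],
               [0.3, 0.7]]

# Evidence: C = 1. Each node exchanges messages only with its neighbors.
# Predictive (top-down) pi message from A toward B:
pi_b = [sum(p_a[a] * p_b_given_a[a][b] for a in range(2)) for b in range(2)]

# Abductive (bottom-up) lambda message from C toward B:
lam_b = [p_c_given_b[b][1] for b in range(2)]

# The belief at B fuses both directions of inference.
belief_b = normalize([pi_b[b] * lam_b[b] for b in range(2)])
print(belief_b)
```

With these invented numbers, evidence on C alone raises the belief in B = 1 from its prior of 0.31 to roughly 0.61 — exactly the bidirectional fusion described above.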
Probabilistic Arithmetic
, 1989
Abstract

Cited by 13 (0 self)
This thesis develops the idea of probabilistic arithmetic. The aim is to replace arithmetic operations on numbers with arithmetic operations on random variables. Specifically, we are interested in numerical methods of calculating convolutions of probability distributions. The long-term goal is to be able to handle random problems (such as the determination of the distribution of the roots of random algebraic equations) using algorithms which have been developed for the deterministic case. To this end, in this thesis we survey a number of previously proposed methods for calculating convolutions and representing probability distributions and examine their defects. We develop some new results for some of these methods (the Laguerre transform and the histogram method), but ultimately find them unsuitable. We find that the details on how the ordinary convolution equations are calculated are ...
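The elementary operation the thesis is concerned with can be sketched for the discrete (histogram-style) case: adding two independent random variables becomes a convolution of their probability mass functions. The dice example below is illustrative, not from the thesis.

```python
from collections import defaultdict

def convolve(px, py):
    """Probability mass function of X + Y for independent discrete X and Y."""
    pz = defaultdict(float)
    for x, p in px.items():
        for y, q in py.items():
            pz[x + y] += p * q
    return dict(pz)

# Two independent fair dice: the distribution of their sum.
die = {k: 1 / 6 for k in range(1, 7)}
total = convolve(die, die)
print(total[7])   # the most probable sum
```

Probabilistic arithmetic replaces `+` on numbers with `convolve` on distributions in this way; the numerical difficulties the thesis surveys arise when the distributions are continuous and must be represented approximately.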
Techniques for the Fast Simulation of Models of Highly Dependable Systems
 IEEE Transactions on Reliability
, 2001
Abstract

Cited by 9 (0 self)
this paper, we review some of the importance-sampling techniques that have been developed in recent years to efficiently estimate dependability measures in Markovian and non-Markovian models of highly dependable systems. Acronyms: MTTF, mean time to failure; MTBF, mean time between failures; CTMC, continuous-time Markov chain; DTMC, discrete-time Markov chain; GSMP, generalized semi-Markov process; SAVE, System AVailability Estimator; CLT, central limit theorem; VRR, variance reduction ratio; TRR, total effort reduction ratio; MSDIS, measure-specific dynamic importance sampling; BLBLR, balance over links balanced likelihood ratio; BLBLRC, balance over links balanced likelihood ratio with cuts. Introduction: High dependability requirements of today's critical and/or commercial systems often lead to complicated and costly designs. The ability to predict relevant dependability measures for such complex systems is essential, not only to guarantee high ...
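The core idea of importance sampling can be sketched on a toy rare event standing in for a rare system failure: sample from a biased law under which the event is frequent, then reweight each hit by the likelihood ratio. The Exp(1) target and the biasing rate below are illustrative choices, not taken from the paper.

```python
import math
import random

random.seed(0)

def is_estimate(a, lam, n):
    """Importance-sampling estimate of P(X > a) for X ~ Exp(1): draw from the
    biased density g(x) = lam * exp(-lam * x), under which {x > a} is frequent,
    and reweight each hit by the likelihood ratio f(x) / g(x)."""
    total = 0.0
    for _ in range(n):
        x = random.expovariate(lam)          # draw from the biased law
        if x > a:
            total += math.exp(-x) / (lam * math.exp(-lam * x))
    return total / n

# True value exp(-20) is about 2.1e-9; naive simulation with 1e5 samples
# would almost never see the event, yet the reweighted estimate is accurate.
est = is_estimate(a=20.0, lam=0.1, n=100_000)
print(est, math.exp(-20))
```

Failure biasing in the dependability setting plays the role of `lam` here: it makes system-failure paths likely under simulation while the likelihood ratio keeps the estimator unbiased.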
A DiameterConstrained Network Reliability model to determine the Probability that a Communication Network meets Delay Constraints
Abstract
Abstract: In this paper we provide a summary of results and applications pertaining to a diameter-constrained network reliability model. Classical network reliability models measure the probability that there exist end-to-end paths between network nodes, not taking into account the length of these paths. For many applications this is inadequate, because the connection will only be established or attain the required quality if the distance between the connecting nodes does not exceed a given value. The diameter-constrained reliability of a network (DCR), introduced recently, considers not only the underlying topology but also imposes a bound on the diameter, which is the maximum distance between the nodes of the network. We present a synopsis of the known results and applications of the DCR for networks modeled by either directed or undirected graphs. Moreover, important combinatorial and computational properties of this reliability measure are discussed. As the DCR subsumes the classical reliability measure (i.e., where no distance constraints are imposed on the paths connecting the nodes), as a byproduct we prove well-known classical results.
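A crude Monte Carlo estimator makes the DCR definition concrete for the two-terminal case: sample edge survivals and count the trials in which s and t are joined by a path of at most D hops. The graph and probabilities below are illustrative.

```python
import random
from collections import deque

def bfs_dist(adj, s, t):
    """Hop distance from s to t over surviving edges (inf if disconnected)."""
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        if u == t:
            return dist[u]
        for v in adj.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return float("inf")

def dcr_estimate(edges, p, s, t, D, n=50_000, seed=1):
    """Monte Carlo estimate of the two-terminal DCR with equal edge reliability p."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        adj = {}
        for u, v in edges:
            if rng.random() < p:             # edge survives with probability p
                adj.setdefault(u, []).append(v)
                adj.setdefault(v, []).append(u)
        if bfs_dist(adj, s, t) <= D:
            hits += 1
    return hits / n

# 4-cycle s-a-t-b-s: s and t are within D = 2 iff one of the two edge-disjoint
# 2-hop paths survives, so the exact value is 1 - (1 - 0.81)**2 = 0.9639.
edges = [("s", "a"), ("a", "t"), ("t", "b"), ("b", "s")]
est = dcr_estimate(edges, p=0.9, s="s", t="t", D=2)
print(est)
```

With D set to the number of vertices this reduces to the classical two-terminal connectivity measure, which is how the DCR subsumes it.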
© Hindawi Publishing Corp. RELIABILITY OF COMMUNICATION NETWORKS WITH DELAY CONSTRAINTS: COMPUTATIONAL COMPLEXITY AND COMPLETE TOPOLOGIES
, 2003
Abstract
Let G = (V,E) be a graph with a distinguished set of terminal vertices K ⊆ V. We define the K-diameter of G as the maximum distance between any pair of vertices of K. If the edges fail randomly and independently with known probabilities (vertices are always operational), the diameter-constrained K-terminal reliability of G, RK(G,D), is defined as the probability that surviving edges span a subgraph whose K-diameter does not exceed D. In general, the computational complexity of evaluating RK(G,D) is NP-hard, as this measure subsumes the classical K-terminal reliability RK(G), known to belong to this complexity class. In this note, we show that even though for two terminal vertices s and t and D = 2, R{s,t}(G,D) can be determined in polynomial time, the problem of calculating R{s,t}(G,D) for fixed values of D, D ≥ 3, is NP-hard. We also generalize this result for any fixed number of terminal vertices. Although it is very unlikely that general efficient algorithms exist, we present a recursive formulation for the calculation of R{s,t}(G,D) that yields a polynomial time evaluation algorithm in the case of complete topologies where the edge set can be partitioned into at most four equireliable classes. 2000 Mathematics Subject Classification: 05C99, 90B25.
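The tractability of the D = 2 two-terminal case can be illustrated directly: every s-t path of length at most 2 is either the direct edge or a two-hop path s-v-t, these paths are pairwise edge-disjoint, and with independent edge failures they therefore fail independently, giving a closed form computable in time linear in |V|. A small sketch with illustrative numbers (the paper's recursive formulation for general D is a different, more involved construction):

```python
def r_st_diameter2(p_direct, two_hop_pairs):
    """R_{s,t}(G, 2): p_direct is the reliability of the direct s-t edge
    (0.0 if absent); two_hop_pairs lists (p_sv, p_vt) for each intermediate
    vertex v adjacent to both s and t."""
    q = 1.0 - p_direct                       # the direct edge fails
    for p_sv, p_vt in two_hop_pairs:
        q *= 1.0 - p_sv * p_vt               # the path s-v-t fails
    return 1.0 - q                           # some path of length <= 2 survives

# No direct edge; two intermediate vertices, every edge reliable with
# probability 0.9, so R = 1 - (1 - 0.81)**2 = 0.9639.
print(r_st_diameter2(0.0, [(0.9, 0.9), (0.9, 0.9)]))
```

For D ≥ 3 this independence between paths breaks down (paths of length 3 or more can share edges), which is consistent with the NP-hardness result stated above.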