Results 1 – 8 of 8
Robust message-passing for statistical inference in sensor networks
 IN: PROCEEDINGS OF THE 6TH INTERNATIONAL CONFERENCE ON INFORMATION PROCESSING IN SENSOR NETWORKS (IPSN ’07)
, 2007
Abstract

Cited by 9 (1 self)
Large-scale sensor network applications require in-network processing and data fusion to compute statistically relevant summaries of the sensed measurements. This paper studies distributed message-passing algorithms for performing statistical inference, in which neighboring nodes in the network pass local information relevant to a global computation. We focus on the class of reweighted belief propagation (RBP) algorithms, which includes as special cases the standard sum-product and max-product algorithms for general networks with cycles, but which, in contrast to the standard algorithms, has attractive theoretical properties (uniqueness of fixed points, convergence, and robustness). Our main contribution is to design and implement a practical and modular architecture for implementing RBP algorithms in real networks. In addition, we show how intelligent scheduling of RBP messages can be used to minimize communication between motes and prolong the lifetime of the network. Our simulation and Mica2 mote deployment indicate that the proposed algorithms achieve accurate results despite real-world problems such as dying motes, dead and asymmetric links, and dropped messages. Overall, the class of RBP algorithms provides an ideal fit for sensor networks due to their distributed nature, requiring only local knowledge and coordination and placing few requirements on other services such as reliable transmission.
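The reweighted updates this abstract refers to generalize the standard sum-product message update with per-edge weights. A minimal sketch of one such update, in the tree-reweighted sum-product form (this is an illustration of the general reweighted update, not the paper's implementation; all names and the edge weights rho are assumptions):

```python
import numpy as np

def trw_message(psi_uv, psi_u, in_msgs, in_rhos, msg_vu, rho_uv):
    """One reweighted sum-product message from node u to node v.
    psi_uv: d x d pairwise potential; psi_u: node potential at u;
    in_msgs / in_rhos: incoming messages m_{wu} and edge weights rho_{wu}
    for neighbors w != v; msg_vu: reverse message; rho_uv: this edge's weight."""
    # Combine incoming messages, each raised to its edge appearance weight.
    belief = psi_u.astype(float).copy()
    for m, r in zip(in_msgs, in_rhos):
        belief *= m ** r
    # Divide out the reverse message, raised to (1 - rho_uv).
    belief /= msg_vu ** (1.0 - rho_uv)
    # The pairwise potential is raised to 1/rho_uv in the reweighted update.
    new_msg = (psi_uv ** (1.0 / rho_uv)).T @ belief
    return new_msg / new_msg.sum()
```

Setting every rho to 1 recovers the ordinary sum-product update; weights below 1 are what yield the uniqueness and convergence properties the abstract mentions.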
Stochastic belief propagation: A low-complexity alternative to the sum-product algorithm
 COMPUTING RESEARCH REPOSITORY
, 2011
Abstract

Cited by 4 (3 self)
The belief propagation (BP) or sum-product algorithm is a widely used message-passing method for computing marginal distributions in graphical models. At the core of the BP message updates, when applied to a graphical model involving discrete variables with pairwise interactions, lies a matrix-vector product with complexity that is quadratic in the state dimension d, and each node must transmit a (d − 1)-dimensional vector of real numbers (messages) to each of its neighbors. Since various applications involve very large state dimensions, such computation and communication costs can be prohibitive. In this paper, we propose a low-complexity variant of BP, referred to as stochastic belief propagation (SBP). As suggested by the name, it is an adaptively randomized version of the BP message updates in which each node passes randomly chosen information to each of its neighbors. The SBP message updates reduce the computational complexity (per iteration) from quadratic to linear in d, without assuming any particular structure of the potentials, and also significantly reduce the communication complexity, requiring the transmission of only log₂ d bits per edge. Moreover, we establish a number of theoretical guarantees for the performance of SBP, showing that it converges almost surely to the BP fixed point for any tree-structured graph, and for any graph with cycles satisfying a contractivity condition. In addition, for these graphical models, we provide non-asymptotic upper bounds on the convergence rate, showing that the ℓ∞ norm of the error vector decays no slower than O(1/√t) with the number of iterations t on trees, and that the normalized mean-squared error decays as O(1/t) for general graphs. This analysis, also supported by experimental results, shows that SBP can provably yield reductions in computational and communication complexities for various classes of graphical models.
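The quadratic-to-linear reduction described above comes from replacing the full matrix-vector product with a single sampled column of the pairwise potential, combined with step-size averaging. A minimal sketch of one such randomized edge update, assuming a discrete pairwise model (function and variable names are illustrative and simplified, not the paper's exact scheme):

```python
import numpy as np

def sbp_update(msg, psi_uv, sender_marginal, step, rng):
    """One stochastic BP update on directed edge u -> v.
    msg: current d-vector message; psi_uv: d x d pairwise potential;
    sender_marginal: distribution over u's states used for sampling;
    step: Robbins-Monro step size, e.g. 1/t."""
    d = len(msg)
    # Sample one sender state instead of summing over all d states;
    # only this index (log2 d bits) would need to be transmitted.
    i = rng.choice(d, p=sender_marginal)
    col = psi_uv[i, :] / psi_uv[i, :].sum()
    # Averaging step: cost is linear in d, not quadratic.
    new_msg = (1.0 - step) * msg + step * col
    return new_msg / new_msg.sum()
```

With a decaying step size such as 1/t, the running average smooths out the sampling noise, which is what the almost-sure convergence guarantee in the abstract formalizes.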
Belief Propagation for Continuous State Spaces: Stochastic Message-Passing with Quantitative Guarantees
, 2012
Abstract

Cited by 1 (1 self)
The sum-product or belief propagation (BP) algorithm is a widely used message-passing technique for computing approximate marginals in graphical models. We introduce a new technique, called stochastic orthogonal series message-passing (SOSMP), for computing the BP fixed point in models with continuous random variables. It is based on a deterministic approximation of the messages via orthogonal series expansion, and a stochastic approximation via Monte Carlo estimates of the integral updates of the basis coefficients. We prove that the SOSMP iterates converge to a δ-neighborhood of the unique BP fixed point for any tree-structured graph, and for any graph with cycles in which the BP updates satisfy a contractivity condition. In addition, we demonstrate how to choose the number of basis coefficients as a function of the desired approximation accuracy δ and the smoothness of the compatibility functions. We illustrate our theory with simulated examples and with an application to optical flow estimation.
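The two approximations combined in SOSMP — truncating a message to a finite orthogonal series and estimating the coefficients by sampling — can be illustrated on a toy problem: estimating the first r coefficients of a density on [0, 1] in an orthonormal cosine basis from draws of that density. This is a sketch of the idea only, not the paper's message update, and the names are assumptions:

```python
import numpy as np

def cosine_basis(x, r):
    """First r orthonormal cosine basis functions phi_j(x) on [0, 1]."""
    js = np.arange(1, r + 1)
    return np.sqrt(2.0) * np.cos(np.pi * np.outer(js, x))  # shape (r, len(x))

def mc_coefficients(samples, r):
    """Monte Carlo estimate of the series coefficients a_j = E[phi_j(X)]:
    the population integral is replaced by a sample average."""
    return cosine_basis(np.asarray(samples), r).mean(axis=1)
```

The truncation level r plays the role of the number of basis coefficients in the abstract: a larger r captures less smooth messages at the price of more coefficients to estimate.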
What Cannot be Learned with Bethe Approximations
Abstract
We address the problem of learning the parameters in graphical models when inference is intractable. A common strategy in this case is to replace the partition function with its Bethe approximation. We show that there exists a regime of empirical marginals where such Bethe learning will fail. By failure we mean that the empirical marginals cannot be recovered from the approximated maximum likelihood parameters (i.e., moment matching is not achieved). We provide several conditions on empirical marginals that yield outer and inner bounds on the set of Bethe learnable marginals. An interesting implication of
Toulouse
, 2013
Abstract
specialty: Information and Communication Sciences and Technologies, by