Results 1 – 2 of 2
On the Optimality of Solutions of the Max-Product Belief Propagation Algorithm in Arbitrary Graphs
, 2001
Abstract

Cited by 185 (15 self)
Graphical models, such as Bayesian networks and Markov random fields, represent statistical dependencies of variables by a graph. The max-product "belief propagation" algorithm is a local message-passing algorithm on this graph that is known to converge to a unique fixed point when the graph is a tree. Furthermore, when the graph is a tree, the assignment based on the fixed point yields the most probable a posteriori (MAP) values of the unobserved variables given the observed ones. Recently, good …
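As a concrete illustration of the tree-structured case described in the abstract above, here is a minimal sketch of max-product message passing on a toy three-node chain. All potential values and function names are hypothetical, chosen only for the example; on a tree the decoded assignment matches the brute-force MAP solution:

```python
import itertools

# Hypothetical toy model: a 3-node chain x0 - x1 - x2 with binary states.
# phi[i][x] are unary potentials; psi[k][(a, b)] are pairwise potentials
# for edges (0,1) and (1,2).
phi = [
    [0.7, 0.3],  # node 0
    [0.4, 0.6],  # node 1
    [0.5, 0.5],  # node 2
]
psi = [
    {(a, b): (0.9 if a == b else 0.1) for a in (0, 1) for b in (0, 1)},  # edge (0,1)
    {(a, b): (0.8 if a == b else 0.2) for a in (0, 1) for b in (0, 1)},  # edge (1,2)
]

def brute_force_map():
    """Exhaustive MAP search over all 2^3 assignments, for reference."""
    best, best_x = -1.0, None
    for x in itertools.product((0, 1), repeat=3):
        p = phi[0][x[0]] * phi[1][x[1]] * phi[2][x[2]]
        p *= psi[0][(x[0], x[1])] * psi[1][(x[1], x[2])]
        if p > best:
            best, best_x = p, x
    return best_x

def max_product_map():
    """Max-product BP on the chain: one forward and one backward sweep,
    then decode each node from its max-marginal belief."""
    # Forward messages: m01 (node 0 -> 1), then m12 (node 1 -> 2).
    m01 = [max(phi[0][a] * psi[0][(a, b)] for a in (0, 1)) for b in (0, 1)]
    m12 = [max(phi[1][b] * m01[b] * psi[1][(b, c)] for b in (0, 1)) for c in (0, 1)]
    # Backward messages: m21 (node 2 -> 1), then m10 (node 1 -> 0).
    m21 = [max(phi[2][c] * psi[1][(b, c)] for c in (0, 1)) for b in (0, 1)]
    m10 = [max(phi[1][b] * m21[b] * psi[0][(a, b)] for b in (0, 1)) for a in (0, 1)]
    # Belief at each node = local potential times all incoming messages.
    b0 = [phi[0][a] * m10[a] for a in (0, 1)]
    b1 = [phi[1][b] * m01[b] * m21[b] for b in (0, 1)]
    b2 = [phi[2][c] * m12[c] for c in (0, 1)]
    return tuple(max((0, 1), key=lambda s: bel[s]) for bel in (b0, b1, b2))

assert brute_force_map() == max_product_map()  # exact on a tree
```

Because the chain is a tree, the fixed point is reached after a single forward/backward sweep; on graphs with cycles (the "arbitrary graphs" of the title) the messages must be iterated and convergence is no longer guaranteed.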
Low-complexity approaches to Slepian–Wolf near-lossless distributed data compression
 IEEE TRANS. INFORM. THEORY
, 2006
Abstract

Cited by 23 (6 self)
This paper discusses the Slepian–Wolf problem of distributed near-lossless compression of correlated sources. We introduce practical new tools for communicating at all rates in the achievable region. The technique employs a simple "source-splitting" strategy that does not require common sources of randomness at the encoders and decoders. This approach allows for pipelined encoding and decoding, so that the system operates with the complexity of a single-user encoder and decoder. Moreover, when this splitting approach is used in conjunction with iterative decoding methods, it produces a significant simplification of the decoding process. We demonstrate this approach for synthetically generated data. Finally, we consider the Slepian–Wolf problem when linear codes are used as syndrome-formers and consider a linear programming relaxation to maximum-likelihood (ML) sequence decoding. We note that the fractional vertices of the relaxed polytope compete with the optimal solution in a manner analogous to that observed when the "min-sum" iterative decoding algorithm is applied. This relaxation exhibits the ML-certificate property: if an integral solution is found, it is the ML solution. For symmetric binary joint distributions, we show that selecting easily constructible "expander"-style low-density parity-check (LDPC) codes as syndrome-formers admits a positive error exponent and therefore provably good performance.
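The syndrome-former idea in the abstract above can be illustrated with a toy sketch. The setup here is assumed, not from the paper: a (7,4) Hamming parity-check matrix stands in for the syndrome-former, the decoder's side information differs from the source in at most one bit, and a brute-force ML search replaces the paper's LP relaxation and iterative decoding:

```python
import itertools

# Hypothetical syndrome-former: parity-check matrix H of the (7,4) Hamming
# code. The encoder transmits only the 3-bit syndrome of its 7-bit source;
# the decoder recovers the source using correlated side information.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(x):
    """Compute s = Hx over GF(2)."""
    return tuple(sum(h * xi for h, xi in zip(row, x)) % 2 for row in H)

def encode(x):
    """Encoder: send the 3-bit syndrome instead of the 7-bit source."""
    return syndrome(x)

def decode(s, y):
    """ML decoding with side information y: find the minimum-weight error
    pattern e with H e = s XOR H y, then output y corrected by e."""
    target = tuple(a ^ b for a, b in zip(s, syndrome(y)))
    e = min(
        (e for e in itertools.product((0, 1), repeat=7) if syndrome(e) == target),
        key=sum,  # lowest Hamming weight = most likely error pattern
    )
    return tuple(yi ^ ei for yi, ei in zip(y, e))

x = (1, 0, 1, 1, 0, 0, 1)         # source observed by the encoder
y = (1, 0, 1, 1, 0, 1, 1)         # side information at the decoder (1 bit flipped)
assert decode(encode(x), y) == x  # x recovered from 3 bits plus y
```

The compression gain is 7 bits down to 3; the paper's contribution is replacing the exhaustive coset search above with tractable methods (iterative min-sum decoding and the LP relaxation with its ML-certificate property) so that long expander-style LDPC codes can be used in place of this tiny Hamming code.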