Results 1–10 of 68
Dependency parsing by belief propagation
 In Proceedings of EMNLP, 2008
Abstract

Cited by 65 (7 self)
We formulate dependency parsing as a graphical model with the novel ingredient of global constraints. We show how to apply loopy belief propagation (BP), a simple and effective tool for approximate learning and inference. As a parsing algorithm, BP is both asymptotically and empirically efficient. Even with second-order features or latent variables, which would make exact parsing considerably slower or NP-hard, BP needs only O(n^3) time with a small constant factor. Furthermore, such features significantly improve parse accuracy over exact first-order methods. Incorporating additional features would increase the runtime additively rather than multiplicatively.
MRF energy minimization and beyond via dual decomposition
 In IEEE PAMI, 2011
Abstract

Cited by 35 (2 self)
This paper introduces a new rigorous theoretical framework to address discrete MRF-based optimization in computer vision. Such a framework exploits the powerful technique of Dual Decomposition. It is based on a projected subgradient scheme that attempts to solve an MRF optimization problem by first decomposing it into a set of appropriately chosen subproblems and then combining their solutions in a principled way. In order to determine the limits of this method, we analyze the conditions that these subproblems have to satisfy and we demonstrate the extreme generality and flexibility of such an approach. We thus show that, by appropriately choosing what subproblems to use, one can design novel and very powerful MRF optimization algorithms. For instance, in this manner we are able to derive algorithms that: 1) generalize and extend state-of-the-art message-passing methods, 2) optimize very tight LP relaxations to MRF optimization, and 3) take full advantage of the special structure that may exist in particular MRFs, allowing the use of efficient inference techniques such as, e.g., graph-cut based methods. Theoretical analysis of the bounds related to the different algorithms derived from our framework, together with experimental results and comparisons using synthetic and real data for a variety of tasks in computer vision, demonstrates the great potential of our approach.
A self-managed distributed channel selection algorithm for WLANs
 In Proceedings of the 4th International Symposium on Modeling and Optimization in Mobile, Ad Hoc and Wireless Networks, 2006
Abstract

Cited by 33 (10 self)
In this paper we consider the problem of a wireless LAN selecting a channel to minimise interference with other WLANs. We focus on interfering infrastructure-mode networks, where each access point (AP) or base station has a wired backhaul link. We introduce a new fully distributed and self-managed channel selection algorithm that requires neither direct communication between APs nor explicit estimation of the network interference graph. The sole information required by the algorithm is feedback to each WLAN on the presence of interference on a chosen channel; such feedback is already commonly provided by WLAN protocols such as 802.11. We establish that convergence of the distributed algorithm is guaranteed provided that the channel selection problem is feasible. Extensive simulation results are presented that demonstrate rapid convergence under a wide range of network conditions and topologies. While the scope of the present paper is confined to infrastructure networks with static topology, the utility of the proposed algorithm in situations where the network topology is time-varying is briefly discussed.
How much parallelism is there in irregular applications?
 In PPoPP, 2009
Abstract

Cited by 20 (3 self)
Irregular programs are programs organized around pointer-based data structures such as trees and graphs. Recent investigations by the Galois project have shown that many irregular programs have a generalized form of data-parallelism called amorphous data-parallelism. However, in many programs, amorphous data-parallelism cannot be uncovered using static techniques, and its exploitation requires runtime strategies such as optimistic parallel execution. This raises a natural question: how much amorphous data-parallelism actually exists in irregular programs? In this paper, we describe the design and implementation of a tool called ParaMeter that produces parallelism profiles for irregular programs. Parallelism profiles are an abstract measure of the amount of amorphous data-parallelism at different points in the execution of an algorithm, independent of implementation-dependent details such as the number of cores, cache sizes, load-balancing, etc. ParaMeter can also generate constrained parallelism profiles for a fixed number of cores. We show parallelism profiles for seven irregular applications, and explain how these profiles provide insight into the behavior of these applications.
Complete convergence of message passing algorithms for some satisfiability problems
, 2008
Belief Propagation for Weighted b-Matchings on Arbitrary Graphs and its Relation to Linear Programs with Integer Solutions
 In arXiv, http://www.arxiv.org/abs/0709.1190v1, 2007
Abstract

Cited by 13 (0 self)
We consider the general problem of finding the minimum weight b-matching on arbitrary graphs. We prove that, whenever the linear programming (LP) relaxation of the problem has no fractional solutions, the belief propagation (BP) algorithm converges to the correct solution. This result is notable in several regards: (1) It is one of a very small number of proofs showing correctness of BP without any constraint on the graph structure. (2) Instead of showing that BP leads to a PTAS, we give a finite bound on the number of iterations after which BP has converged to the exact solution. (3) Variants of the proof work for both synchronous and asynchronous BP; to the best of our knowledge, it is the first proof of convergence and correctness of an asynchronous BP algorithm for a combinatorial optimization problem. (4) It works for both ordinary b-matchings and the more difficult case of perfect b-matchings. (5) Together with the recent work of Sanghavi, Malioutov and Willsky [41], these are the first complete proofs showing that tightness of the LP implies correctness of BP.
Characterizing propagation methods for Boolean satisfiability
 In Proc. of the 9th International Conference on Theory and Applications of Satisfiability Testing (SAT '06), 2006
Abstract

Cited by 13 (5 self)
Iterative algorithms such as Belief Propagation and Survey Propagation can handle some of the largest randomly generated satisfiability (SAT) problems created to this point. But they can make inaccurate estimates or fail to converge on instances whose underlying constraint graphs contain small loops, a particularly strong concern with structured problems. More generally, their behavior is only well understood in terms of statistical physics on a specific underlying model. Our alternative characterization of propagation algorithms presents them as value and variable ordering heuristics whose operation can be codified in terms of the Expectation Maximization (EM) method. Besides explaining failure to converge in the general case, understanding the equivalence between propagation and EM yields new versions of such algorithms. When these are applied to SAT, such an understanding even yields a slight modification that guarantees convergence.
Structure-driven optimizations for amorphous data-parallel programs
 In PPoPP ’10: Proceedings of the 15th ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming, ACM, 2010
Abstract

Cited by 11 (2 self)
Irregular algorithms are organized around pointer-based data structures such as graphs and trees, and they are ubiquitous in applications. Recent work by the Galois project has provided a systematic approach for parallelizing irregular applications based on the idea of optimistic or speculative execution of programs. However, the overhead of optimistic parallel execution can be substantial. In this paper, we show that many irregular algorithms have structure that can be exploited and present three key optimizations that take advantage of algorithmic structure to reduce speculative overheads. We describe the implementation of these optimizations in the Galois system and present experimental results to demonstrate their benefits. To the best of our knowledge, this is the first system to exploit algorithmic structure to optimize the execution of irregular programs.
A Parallel Framework For Loopy Belief Propagation
Abstract

Cited by 11 (0 self)
Many innovative proposals have been introduced in the literature in the field of evolutionary computation, among them estimation of distribution algorithms (EDAs). Their main characteristic is the use of probabilistic models to represent the (in)dependencies between the variables of a concrete problem. Such probabilistic models have also been applied to the theoretical analysis of EDAs, providing a platform for the implementation of other optimization methods that can be incorporated into the EDA framework. Some of these methods, typically used for probabilistic inference, are belief propagation algorithms. In this paper we present a parallel approach for one of these inference-based algorithms, the loopy belief propagation algorithm for factor graphs. Our parallel implementation was designed to provide an algorithm that can be executed on clusters of computers or multiprocessors in order to reduce the total execution time. In addition, this framework was also designed as a flexible tool in which many parameters, such as scheduling rules or stopping criteria, can be adjusted according to the requirements of each particular experiment and problem.
Solving constraint satisfaction problems through belief propagation-guided decimation
 in Proc. of the Allerton Conf. on Commun., Control, and Computing
Abstract

Cited by 11 (1 self)
Message passing algorithms have proved surprisingly successful in solving hard constraint satisfaction problems on sparse random graphs. In such applications, variables are fixed sequentially to satisfy the constraints. Message passing is run after each step, and its outcome provides a heuristic for making choices at the next step. This approach has been referred to as ‘decimation,’ with reference to analogous procedures in statistical physics. The behavior of decimation procedures is poorly understood. Here we consider a simple randomized decimation algorithm based on belief propagation (BP), and analyze its behavior on random k-satisfiability formulae. In particular, we propose a tree model for its analysis and we conjecture that it provides asymptotically exact predictions in the limit of large instances. This conjecture is confirmed by numerical simulations.