Results 1–9 of 9
RANDOM SAMPLING IN CUT, FLOW, AND NETWORK DESIGN PROBLEMS
, 1999
Abstract

Cited by 70 (11 self)
We use random sampling as a tool for solving undirected graph problems. We show that the sparse graph, or skeleton, that arises when we randomly sample a graph’s edges will accurately approximate the value of all cuts in the original graph with high probability. This makes sampling effective for problems involving cuts in graphs. We present fast randomized (Monte Carlo and Las Vegas) algorithms for approximating and exactly finding minimum cuts and maximum flows in unweighted, undirected graphs. Our cut-approximation algorithms extend unchanged to weighted graphs, while our weighted-graph flow algorithms are somewhat slower. Our approach gives a general paradigm with potential applications to any packing problem. It has since been used in a near-linear-time algorithm for finding minimum cuts, as well as faster cut and flow algorithms. Our sampling theorems also yield faster algorithms for several other cut-based problems, including approximating the best balanced cut of a graph, finding a k-connected orientation of a 2k-connected graph, and finding integral multicommodity flows in graphs with a great deal of excess capacity. Our methods also improve the efficiency of some parallel cut and flow algorithms. Our methods also apply to the network design problem, where we wish to build a network satisfying certain connectivity requirements between vertices. We can purchase edges of various costs and wish to satisfy the requirements at minimum total cost. Since our sampling theorems apply even when the sampling probabilities are different for different edges, we can apply randomized rounding to solve network design problems. This gives approximation algorithms that guarantee much better approximations than previous algorithms whenever the minimum connectivity requirement is large. As a particular example, we improve the best approximation bound for the minimum k-connected subgraph problem from 1.85 to 1 + O(√((log n)/k)).
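The skeleton construction described in this abstract amounts to keeping each edge independently with some probability p, so that every cut of value c has expected value p·c in the sample. A minimal sketch (names and the flat sampling rate are illustrative, not the paper's notation, which ties p to the minimum cut and the desired error):

```python
import random

def skeleton(edges, p, seed=0):
    """Keep each edge independently with probability p.

    Illustrative sketch of uniform edge sampling: a cut of value c in
    the original graph has expected value p*c in the skeleton, and the
    paper shows all cuts are near their expectation w.h.p. when p is
    large enough relative to the minimum cut.
    """
    rng = random.Random(seed)
    return [e for e in edges if rng.random() < p]
```

Scaling a skeleton cut value back up by 1/p then gives an estimate of the corresponding cut in the original graph.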
Experimental Study of Minimum Cut Algorithms
 PROCEEDINGS OF THE EIGHTH ANNUAL ACM-SIAM SYMPOSIUM ON DISCRETE ALGORITHMS (SODA)
, 1997
Abstract

Cited by 40 (2 self)
Recently, several new algorithms have been developed for the minimum cut problem. These algorithms are very different from the earlier ones and from each other, and they substantially improve worst-case time bounds for the problem. We conduct an experimental evaluation of the relative performance of these algorithms. In the process, we develop heuristics and data structures that substantially improve the practical performance of the algorithms. We also develop problem families for testing minimum cut algorithms. Our work leads to a better understanding of the practical performance of minimum cut algorithms and produces very efficient codes for the problem.
Lower bounds on two-terminal network reliability
, 1985
Abstract

Cited by 12 (0 self)
One measure of two-terminal network reliability, termed probabilistic connectedness, is the probability that two specified communication centers can communicate. A standard model of a network is a graph in which nodes represent communication centers and edges represent links between communication centers. Edges are assumed to have statistically independent probabilities of failing, and nodes are assumed to be perfectly reliable. Exact calculation of two-terminal reliability for general networks has been shown to be #P-complete. As a result, it is desirable to compute upper and lower bounds that avoid the exponential computation likely required by exact algorithms. Two methods are considered for computing lower bounds on two-terminal reliability.
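The failure model in this abstract (independent edge failures, perfectly reliable nodes) is easy to simulate. A minimal Monte Carlo sketch of the quantity being bounded, not of the paper's bounding methods, with all names hypothetical:

```python
import random
from collections import defaultdict

def two_terminal_reliability(edges, fail_prob, s, t, trials=10000, seed=0):
    """Monte Carlo estimate of P(s and t can communicate).

    Illustrative sketch only: the paper derives analytic lower bounds
    rather than simulating.  `edges` is a list of (u, v) pairs and
    `fail_prob[(u, v)]` is the independent failure probability of that
    edge; nodes are assumed perfectly reliable.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        adj = defaultdict(list)          # edges surviving this trial
        for u, v in edges:
            if rng.random() >= fail_prob[(u, v)]:
                adj[u].append(v)
                adj[v].append(u)
        seen, stack = {s}, [s]           # DFS from s over survivors
        while stack:
            x = stack.pop()
            for y in adj[x]:
                if y not in seen:
                    seen.add(y)
                    stack.append(y)
        hits += t in seen
    return hits / trials
```

For a two-edge series path with each edge failing with probability 0.5, the estimate converges to 0.5 × 0.5 = 0.25, matching the exact value.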
Randomized Approximation Schemes for Cuts and Flows in Capacitated Graphs
, 2011
Abstract

Cited by 9 (0 self)
We describe random sampling techniques for approximately solving problems that involve cuts and flows in graphs. We give a near-linear-time randomized combinatorial construction that transforms any graph on n vertices into an O(n log n)-edge graph on the same vertices whose cuts have approximately the same value as the original graph’s. In this new graph, for example, we can run the Õ(m^{3/2})-time maximum flow algorithm of Goldberg and Rao to find an s–t minimum cut in Õ(n^{3/2}) time. This corresponds to a (1 + ε)-times minimum s–t cut in the original graph. A related approach leads to a randomized divide-and-conquer algorithm producing an approximately maximum flow in Õ(m√n) time. Our algorithm is also used to improve the running time of sparsest cut algorithms from Õ(mn) to Õ(n²). Our approach also accelerates several other recent cut and flow algorithms. Our algorithms are based on a general theorem analyzing the concentration of cut values near their expectation in random graphs.
Reliability polynomials and their asymptotic limits for families of graphs
 J. Statist. Phys
, 2003
Abstract

Cited by 7 (0 self)
We present exact calculations of reliability polynomials R(G,p) for lattice strips G of fixed width and arbitrarily great length with various boundary conditions. We introduce the notion of a reliability per vertex, r({G},p) = lim_{|V|→∞} R(G,p)^{1/|V|}, where |V| denotes the number of vertices in G and {G} denotes the formal limit lim_{|V|→∞} G. We calculate this exactly for various families of graphs. We also study the zeros of R(G,p) in the complex p plane and determine exactly the asymptotic accumulation set B of these zeros, across which r({G}) is non-analytic.
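For intuition about what R(G,p) is, the all-terminal reliability polynomial can be evaluated at a point by brute-force enumeration of edge states. This is an illustrative exponential-time sketch (the paper instead computes R(G,p) exactly for lattice strips, e.g. via transfer-matrix methods); all names here are hypothetical:

```python
def reliability(n, edges, p):
    """All-terminal reliability R(G, p) by enumerating edge subsets.

    Illustrative only (exponential in |E|).  Each edge operates
    independently with probability p; R(G, p) sums, over every subset
    of operating edges that spans and connects all n vertices, the
    probability of exactly that subset operating.
    """
    m = len(edges)
    total = 0.0
    for mask in range(1 << m):
        up = [edges[i] for i in range(m) if mask >> i & 1]
        parent = list(range(n))              # union-find over vertices

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x

        comps = n
        for u, v in up:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                comps -= 1
        if comps == 1:                       # subgraph connects all vertices
            k = len(up)
            total += p**k * (1 - p)**(m - k)
    return total
```

For a triangle, R(G,p) = p³ + 3p²(1 − p), so R(G, 0.5) = 0.5; the sketch reproduces this.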
Experimental Study of Minimum Cut Algorithms
 M.S. DISSERTATION, MIT
, 1997
Abstract

Cited by 6 (0 self)
Recently, several new algorithms have been developed for the minimum cut problem that substantially improve worst-case time bounds for the problem. These algorithms are very different from the earlier ones and from each other. We conduct an experimental evaluation of the relative performance of these algorithms. In the process, we develop heuristics and data structures that substantially improve the practical performance of the algorithms. We also develop problem families for testing minimum cut algorithms. Our work leads to a better understanding of the practical performance of minimum cut algorithms and produces very efficient codes for the problem.
Linear codes and character sums
 Combinatorica
, 1999
Abstract

Cited by 4 (0 self)
2. Suppose the smallest Hamming weight of nonzero vectors in V is d. (In coding-theoretic terminology, V is a linear code of length n, rate r and distance d.) We settle two extremal problems on such spaces. First, we prove a weak form of a conjecture by Kalai and Linial and show that the fraction of vectors in V with weight d is exponentially small. Specifically, in the interesting case of a small r, this fraction does not exceed 2^{−Ω(r²n/(log(1/r)+1))}. We also answer a question of Ben-Or and show that if r > 1
Combining Network Reductions and Simulation to Estimate Network Reliability
Abstract
Network reduction techniques are mainly used with exact approaches, such as factoring, to compute network reliability. However, exact computation of network reliability is feasible only for small networks. Simulation is an alternative approach to estimating network reliability. This paper discusses the effect of applying network reductions before estimating network reliability by simulation. Theoretical and empirical results are provided to explain the source of the variance reduction in simulation due to network reductions.
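A classic example of the kind of reduction this abstract refers to is the series reduction: a non-terminal node of degree 2 with incident edge reliabilities r₁ and r₂ can be replaced by a single edge of reliability r₁·r₂ before simulating. A minimal sketch under that standard rule (the data layout and names are illustrative, not from the paper):

```python
from collections import defaultdict

def series_reduce(edges, terminals):
    """Apply one series reduction, if possible.

    Illustrative sketch: `edges` maps frozenset({u, v}) -> operation
    probability of the u-v edge.  A non-terminal node w with exactly
    two incident edges (w-u at r1, w-v at r2) is eliminated and
    replaced by a u-v edge of reliability r1*r2; a pre-existing
    parallel u-v edge is merged via 1 - (1-r)(1-r').
    """
    incident = defaultdict(list)
    for e in edges:
        for x in e:
            incident[x].append(e)
    for w, inc in incident.items():
        if w in terminals or len(inc) != 2:
            continue
        e1, e2 = inc
        (u,) = e1 - {w}
        (v,) = e2 - {w}
        if u == v:
            continue  # parallel pair, not a series case; skip here
        reduced = dict(edges)
        r = reduced.pop(e1) * reduced.pop(e2)  # series: both must work
        key = frozenset({u, v})
        if key in reduced:
            reduced[key] = 1 - (1 - reduced[key]) * (1 - r)
        else:
            reduced[key] = r
        return reduced
    return edges  # no reduction applies
```

Shrinking the network this way before sampling leaves fewer random edge states per trial, which is one source of the variance reduction the paper analyzes.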
Sparse Reliable Graph Backbones
Abstract
Given a connected graph G and a failure probability p(e) for each edge e in G, the reliability of G is the probability that G remains connected when each edge e is removed independently with probability p(e). In this paper it is shown that every n-vertex graph contains a sparse backbone, i.e., a spanning subgraph with O(n log n) edges whose reliability is at least (1 − n^{−Ω(1)}) times that of G. Moreover, for any pair of vertices s, t in G, the (s, t)-reliability of the backbone, namely, the probability that s and t remain connected, is also at least (1 − n^{−Ω(1)}) times that of G. Our proof is based on a polynomial-time randomized algorithm for constructing the backbone. In addition, it is shown that the constructed backbone has nearly the same Tutte polynomial as the original graph (in the quarter-plane x ≥ 1, y > 1), and hence the graph and its backbone share many additional features encoded by the Tutte polynomial.