Results 1–10 of 25
The size of the giant component of a random graph with a given degree sequence
 Combin. Probab. Comput., 2000
Abstract

Cited by 170 (0 self)
Given a sequence of nonnegative real numbers λ_0, λ_1, … which sum to 1, we consider a random graph having approximately λ_i n vertices of degree i. In [12] the authors essentially show that if ∑ i(i − 2)λ_i > 0 then the graph a.s. has a giant component, while if ∑ i(i − 2)λ_i < 0 then a.s. all components in the graph are small. In this paper we analyze the size of the giant component in the former case, and the structure of the graph formed by deleting that component. We determine …
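The threshold condition quoted in this abstract is easy to evaluate for a concrete degree distribution. A minimal illustrative sketch (function names are my own, not from the paper):

```python
# The criterion above is easy to evaluate for a concrete degree distribution
# (function names here are mine, purely for illustration).

def molloy_reed_q(lam):
    """Q = sum_i i*(i-2)*lam[i], where lam[i] is the fraction of degree-i
    vertices; a giant component is expected a.s. iff Q > 0."""
    return sum(i * (i - 2) * l for i, l in enumerate(lam))

def has_giant_component(lam):
    return molloy_reed_q(lam) > 0

# 3-regular graphs: Q = 3*(3-2)*1 = 3 > 0, so a giant component exists.
print(has_giant_component([0, 0, 0, 1.0]))   # True
# All vertices of degree 1 (a perfect matching): Q = -1 < 0, all small.
print(has_giant_component([0, 1.0]))         # False
```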
The phase transition in inhomogeneous random graphs, preprint available from http://www.arxiv.org/abs/math.PR/0504589
Abstract

Cited by 156 (32 self)
The ‘classical’ random graph models, in particular G(n, p), are ‘homogeneous’, in the sense that the degrees (for example) tend to be concentrated around a typical value. Many graphs arising in the real world do not have this property, having, for example, power-law degree distributions. Thus there has been a lot of recent interest in defining and studying ‘inhomogeneous’ random graph models. One of the most studied properties of these new models is their ‘robustness’, or, equivalently, the ‘phase transition’ as an edge density parameter is varied. For G(n, p), p = c/n, the phase transition at c = 1 has been a central topic in the study of random graphs for well over 40 years. Many of the new inhomogeneous models are rather complicated; although there are exceptions, in most cases precise questions such as determining exactly the critical point of the phase transition are approachable only when there is independence between the edges. Fortunately, some models studied have this already, and others can be approximated by models with …
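As a concrete illustration of an inhomogeneous model with independent edges, here is a sketch of a simple rank-one (Chung–Lu-style) construction; the weights and parameters below are purely illustrative, not from the paper:

```python
# Hedged sketch of a rank-one inhomogeneous random graph: each vertex v gets
# a weight w_v, and each edge {u, v} appears independently with probability
# min(1, w_u * w_v / sum(w)), so the expected degree of v is roughly w_v.
import random

def rank_one_graph(weights, rng):
    n, total = len(weights), sum(weights)
    edges = []
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < min(1.0, weights[u] * weights[v] / total):
                edges.append((u, v))
    return edges

rng = random.Random(0)
# Decaying weights give an inhomogeneous (hub-heavy) degree sequence:
weights = [10.0 / (k + 1) ** 0.5 for k in range(200)]
edges = rank_one_graph(weights, rng)
deg = [0] * len(weights)
for u, v in edges:
    deg[u] += 1
    deg[v] += 1
print(max(deg), sorted(deg)[len(deg) // 2])  # hub degree vs. typical degree
```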
A Parallelization of Dijkstra's Shortest Path Algorithm
 In Proc. 23rd MFCS'98, Lecture Notes in Computer Science, 1998
Abstract

Cited by 35 (6 self)
The single source shortest path (SSSP) problem lacks parallel solutions which are fast and simultaneously work-efficient. We propose simple criteria which divide Dijkstra's sequential SSSP algorithm into a number of phases, such that the operations within a phase can be done in parallel. We give a PRAM algorithm based on these criteria and analyze its performance on random digraphs with random edge weights uniformly distributed in [0, 1]. We use …
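One natural criterion of this kind can be sketched as follows (this is an illustration in the spirit of the abstract, not necessarily the paper's exact criteria): in each phase, settle every queued node u whose tentative distance is at most min over queued v of tent(v) plus the lightest edge leaving v, since no further relaxation can improve such a distance.

```python
# Illustrative sketch of phase-wise Dijkstra using an "OUT"-style rule:
# every queued node u with
#   tent(u) <= min over queued v of (tent(v) + lightest edge leaving v)
# already has its final distance, so a whole phase can be settled in parallel.

def dijkstra_phases(graph, source):
    """graph: {u: [(v, w), ...]} with w >= 0. Returns (dist, list of phases)."""
    INF = float("inf")
    dist = {u: INF for u in graph}
    dist[source] = 0.0
    # Lightest outgoing edge per node (INF for sinks).
    min_out = {u: min((w for _, w in graph[u]), default=INF) for u in graph}
    queued, settled, phases = {source}, set(), []
    while queued:
        threshold = min(dist[v] + min_out[v] for v in queued)
        phase = sorted(u for u in queued if dist[u] <= threshold)
        phases.append(phase)
        for u in phase:                      # conceptually done in parallel
            queued.discard(u)
            settled.add(u)
            for v, w in graph[u]:
                if v not in settled and dist[u] + w < dist[v]:
                    dist[v] = dist[u] + w
                    queued.add(v)
    return dist, phases

g = {'a': [('b', 1), ('c', 1)], 'b': [('d', 2)], 'c': [('d', 3)], 'd': []}
dist, phases = dijkstra_phases(g, 'a')
print(phases)   # 'b' and 'c' fall into one phase and could run in parallel
```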
The scaling window of the 2-SAT transition
, 1999
Abstract

Cited by 24 (1 self)
We consider the random 2-satisfiability problem, in which each instance is a formula that is the conjunction of m clauses of the form x ∨ y, chosen uniformly at random from among all 2-clauses on n Boolean variables and their negations. As m and n tend to infinity in the ratio m/n → α, the problem is known to have a phase transition at α_c = 1, below which the probability that the formula is satisfiable tends to one and above which it tends to zero. We determine the finite-size scaling about this transition, namely the scaling of the maximal window W(n, δ) = (α_−(n, δ), α_+(n, δ)) such that the probability of satisfiability is greater than 1 − δ for α < α_− and is less than δ for α > α_+. We show that W(n, δ) = (1 − Θ(n^{−1/3}), 1 + Θ(n^{−1/3})), where the constants implicit in Θ depend on δ. We also determine the rates at which the probability of satisfiability approaches one and zero at the boundaries of the window. Namely, for m = (1 + ε)n, where ε may depend on n as long as |ε| is sufficiently small and |ε|n^{1/3} is sufficiently large, we show that the probability of satisfiability decays like exp(−Θ(nε^3)) above the window, and goes to one like 1 − Θ(n^{−1}|ε|^{−3}) below the window. We prove these results by defining an order parameter for the transition and establishing its scaling behavior in n both inside and outside the window. Using this order parameter, we prove that the 2-SAT phase transition is continuous with an order parameter critical exponent of 1. We also determine the values of two other critical exponents, showing that the exponents of 2-SAT are identical to those of the random graph.
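The transition described in this abstract can be observed, very roughly, in a toy experiment. The brute-force check below is only feasible for tiny n and is purely illustrative; at this scale finite-size effects blur the threshold considerably.

```python
# A small, hedged toy experiment on the random 2-SAT transition at m/n = 1.
# Brute force over all 2^n assignments, so only feasible for tiny n.
import itertools
import random

def random_2sat(n, m, rng):
    """m random clauses (x or y); literals are +/-(1..n), variables distinct."""
    clauses = []
    for _ in range(m):
        a, b = rng.sample(range(1, n + 1), 2)
        clauses.append((a * rng.choice((-1, 1)), b * rng.choice((-1, 1))))
    return clauses

def satisfiable(n, clauses):
    def lit(assign, x):
        return assign[x - 1] if x > 0 else not assign[-x - 1]
    return any(all(lit(a, x) or lit(a, y) for x, y in clauses)
               for a in itertools.product((False, True), repeat=n))

def p_sat(n, m, trials, rng):
    return sum(satisfiable(n, random_2sat(n, m, rng))
               for _ in range(trials)) / trials

rng = random.Random(0)
below = p_sat(12, m=6, trials=50, rng=rng)    # m/n = 0.5, below alpha_c = 1
above = p_sat(12, m=30, trials=50, rng=rng)   # m/n = 2.5, above alpha_c = 1
print(below, above)   # satisfiability is far likelier below the threshold
```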
Experimental evaluation of classical automata constructions
 In LPAR 2005, LNCS 3835, 2005
Abstract

Cited by 24 (3 self)
There are several algorithms for producing the canonical DFA from a given NFA. While the theoretical complexities of these algorithms are known, there has not been a systematic empirical comparison between them. In this work we propose a probabilistic framework for testing the performance of automata-theoretic algorithms. We conduct a direct experimental comparison between Hopcroft’s and Brzozowski’s algorithms. We show that while Hopcroft’s algorithm has better overall performance, Brzozowski’s algorithm performs better for “high-density” NFA. We also consider the universality problem, which is traditionally solved explicitly via the subset construction. We propose an encoding that allows this problem to be solved symbolically via a model-checker. We compare the performance of this approach to that of the standard explicit algorithm, and show that the explicit approach performs significantly better.
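The explicit approach mentioned in this abstract can be sketched as follows: determinize on the fly via the subset construction and search for a reachable non-accepting subset. This is a minimal illustration, not the paper's implementation.

```python
# A minimal sketch of the explicit universality check: run the subset
# construction breadth-first and look for a reachable non-accepting subset
# (such a subset witnesses a rejected string).
from collections import deque

def nfa_universal(alphabet, delta, start, accepting):
    """delta maps (state, symbol) -> set of successor states; start and
    accepting are sets of states. True iff every string is accepted."""
    initial = frozenset(start)
    seen, queue = {initial}, deque([initial])
    while queue:
        cur = queue.popleft()
        if not (cur & accepting):        # this subset rejects some string
            return False
        for a in alphabet:
            nxt = frozenset(s for q in cur for s in delta.get((q, a), ()))
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True

# Toy NFA over {a, b}: state 1 is reached once a 'b' has been read.
delta = {(0, 'a'): {0}, (0, 'b'): {0, 1}, (1, 'a'): {1}, (1, 'b'): {1}}
print(nfa_universal('ab', delta, {0}, {1}))     # False: 'aaa...' is rejected
print(nfa_universal('ab', delta, {0}, {0, 1}))  # True
```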
The phase transition in random graphs – a simple proof
, 2012
Abstract

Cited by 20 (10 self)
The classical result of Erdős and Rényi asserts that the random graph G(n, p) experiences a sharp phase transition around p = 1/n: for p = (1 − ε)/n, the connected components of G(n, p) are typically of size O_ε(log n), while for p = (1 + ε)/n …
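The dichotomy stated in this abstract is easy to observe empirically. The union–find simulation below is purely illustrative and the parameters are arbitrary:

```python
# A quick, purely illustrative union-find simulation of the dichotomy above:
# largest component of G(n, p) just below vs. just above p = 1/n.
import random

def largest_component(n, p, rng):
    parent, size = list(range(n)), [1] * n
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]    # path halving
            x = parent[x]
        return x
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                ru, rv = find(u), find(v)
                if ru != rv:
                    parent[ru] = rv
                    size[rv] += size[ru]
    return max(size[find(u)] for u in range(n))

rng = random.Random(1)
n = 2000
sub = largest_component(n, 0.5 / n, rng)   # subcritical: logarithmic-ish
sup = largest_component(n, 1.5 / n, rng)   # supercritical: linear in n
print(sub, sup)
```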
The critical random graph, with martingales
, 2006
Abstract

Cited by 16 (5 self)
We give a short proof that the largest component of the random graph G(n, 1/n) is of size approximately n^{2/3}. The proof gives explicit bounds for the probability that the ratio is very large or very small.
Critical percolation on random regular graphs
, 2007
Abstract

Cited by 16 (6 self)
We describe the component sizes in critical independent p-bond percolation on a random d-regular graph on n vertices, where d ≥ 3 is fixed and n grows. We prove mean-field behavior around the critical probability p_c = 1/(d − 1). In particular, we show that there is a scaling window of width n^{−1/3} around p_c in which the sizes of the largest components are roughly n^{2/3}, and we describe their limiting joint distribution. We also show that for the subcritical regime, i.e. p = (1 − ε(n))p_c where ε(n) = o(1) but ε(n)n^{1/3} → ∞, the sizes of the largest components are concentrated around an explicit function of n and ε(n) which is of order o(n^{2/3}). In the supercritical regime, i.e. p = (1 + ε(n))p_c where ε(n) = o(1) but ε(n)n^{1/3} → ∞, the size of the largest component is concentrated around the value (2d/(d − 2))ε(n)n, and a duality principle holds: other component sizes are distributed as in the subcritical regime.
A Dynamic Algorithm for Topologically Sorting Directed Acyclic Graphs
, 2004
Abstract

Cited by 15 (1 self)
We consider how to maintain the topological order of a directed acyclic graph (DAG) in the presence of edge insertions and deletions. We present a new algorithm and, although it has marginally inferior time complexity compared with the best previously known result, we find that its simplicity leads to better performance in practice. In addition, we provide an empirical comparison against three alternatives over a large number of random DAGs. The results show our algorithm is the best for sparse graphs and, surprisingly, that an alternative with poor theoretical complexity performs marginally better on dense graphs.
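A simplified sketch in the spirit of such dynamic algorithms, reordering only the "affected region" between the endpoints of a violating insertion; the details here are illustrative, not necessarily the paper's exact algorithm.

```python
# Hedged sketch of dynamic topological-order maintenance: on inserting an
# edge u -> v that violates the current order, only nodes between ord[v]
# and ord[u] can be affected, so we reorder just that region.

class DynamicTopo:
    def __init__(self, nodes):
        self.adj = {x: set() for x in nodes}            # outgoing edges
        self.radj = {x: set() for x in nodes}           # incoming edges
        self.ord = {x: i for i, x in enumerate(nodes)}  # order index

    def _dfs(self, start, adj, lo, hi):
        """Nodes reachable from start whose order index lies in [lo, hi]."""
        seen, stack = set(), [start]
        while stack:
            x = stack.pop()
            if x not in seen:
                seen.add(x)
                stack.extend(y for y in adj[x] if lo <= self.ord[y] <= hi)
        return seen

    def add_edge(self, u, v):
        """Insert u -> v, restoring a valid order; False if it makes a cycle."""
        if self.ord[u] > self.ord[v]:                # order violated: repair
            lo, hi = self.ord[v], self.ord[u]
            fwd = self._dfs(v, self.adj, lo, hi)     # must come after v
            if u in fwd:                             # v already reaches u
                return False
            back = self._dfs(u, self.radj, lo, hi)   # must come before u
            region = sorted(back | fwd, key=lambda x: self.ord[x])
            slots = [self.ord[x] for x in region]
            # Keep relative order within back and within fwd, back first.
            new_seq = ([x for x in region if x in back] +
                       [x for x in region if x in fwd])
            for x, i in zip(new_seq, slots):
                self.ord[x] = i
        self.adj[u].add(v)
        self.radj[v].add(u)
        return True

dt = DynamicTopo(['a', 'b', 'c'])
dt.add_edge('b', 'c')
dt.add_edge('c', 'a')            # violates the order a, b, c: region reordered
print(dt.ord['b'] < dt.ord['c'] < dt.ord['a'])   # True
print(dt.add_edge('a', 'b'))     # False: would close a cycle
```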