Results 1–10 of 13
All Pairs Almost Shortest Paths
SIAM Journal on Computing, 1996
Abstract

Cited by 96 (7 self)
Let G = (V, E) be an unweighted undirected graph on n vertices. A simple argument shows that computing all distances in G with an additive one-sided error of at most 1 is as hard as Boolean matrix multiplication. Building on recent work of Aingworth, Chekuri and Motwani, we describe g) time algorithm APASP2 for computing all distances in G with an additive one-sided error of at most 2. The algorithm APASP2 is simple, easy to implement, and faster than the fastest known matrix multiplication algorithm. Furthermore, for every even k > 2, we describe an g) time algorithm APASPk for computing all distances in G with an additive one-sided error of at most k.
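The hardness claim in this abstract has a short constructive core. The sketch below (all function and variable names are our own) encodes two Boolean matrices as a three-layer graph in which the product entry (AB)[x][y] is 1 exactly when the layer-1-to-layer-3 distance is 2, and at least 4 otherwise; since the gap is 2, even a distance oracle with additive one-sided error 1 determines the product:

```python
from collections import deque
from itertools import product

def bool_matmul_via_distances(A, B):
    """Illustrates why additive-1 APSP is at least as hard as Boolean
    matrix multiplication: encode square matrices A and B as a 3-layer
    graph and read the product off the layer-1 -> layer-3 distances."""
    n = len(A)
    # Vertices: ('i', x) left layer, ('j', x) middle, ('k', x) right.
    adj = {(layer, x): [] for layer in 'ijk' for x in range(n)}
    for x, y in product(range(n), repeat=2):
        if A[x][y]:                         # edge i_x -- j_y encodes A
            adj[('i', x)].append(('j', y))
            adj[('j', y)].append(('i', x))
        if B[x][y]:                         # edge j_x -- k_y encodes B
            adj[('j', x)].append(('k', y))
            adj[('k', y)].append(('j', x))

    def bfs(src):
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist

    C = [[0] * n for _ in range(n)]
    for x in range(n):
        dist = bfs(('i', x))
        for y in range(n):
            # (AB)[x][y] = 1 iff dist = 2; otherwise dist >= 4 (or
            # unreachable), so a +1 error cannot confuse the two cases.
            if dist.get(('k', y), float('inf')) <= 3:
                C[x][y] = 1
    return C
```

Any algorithm reporting distances with additive error at most 1 would answer "≤ 3 or not" correctly, hence compute the Boolean product.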
Regularity lemmas and combinatorial algorithms
 In Proc. FOCS
Abstract

Cited by 19 (3 self)
We present new combinatorial algorithms for Boolean matrix multiplication (BMM) and for preprocessing a graph to answer independent set queries. We give the first asymptotic improvements on combinatorial algorithms for dense BMM in many years, improving on the "Four Russians" O(n^3/(w log n)) bound for machine models with word size w. (For a pointer machine, we can set w = log n.) The algorithms utilize notions from Regularity Lemmas for graphs in a novel way.

We give two randomized combinatorial algorithms for BMM. The first algorithm is essentially a reduction from BMM to the Triangle Removal Lemma. The best known bounds for the Triangle Removal Lemma only imply an O((n^3 log β)/(βw log n)) time algorithm for BMM, where β = (log* n)^δ for some δ > 0, but improvements on the Triangle Removal Lemma would yield corresponding runtime improvements. The second algorithm applies the Weak Regularity Lemma of Frieze and Kannan along with several information compression ideas, running in O(n^3 (log log n)^2/(log n)^{9/4}) time with probability exponentially close to 1. When w ≥ log n, it can be implemented in O(n^3 (log log n)^2/(w log n)^{7/6}) time. Our results immediately imply improved combinatorial methods for CFG parsing, detecting triangle-freeness, and transitive closure.

Using Weak Regularity, we also give an algorithm for answering queries of the form "is S ⊆ V an independent set?" in a graph. Improving on prior work, we show how to randomly preprocess a graph in O(n^{2+ε}) time (for all ε > 0) so that with high probability, all subsequent batches of log n independent set queries can be answered deterministically in O(n^2 (log log n)^2/(log n)^{5/4}) time. When w ≥ log n, w queries can be answered in O(n^2 (log log n)^2/(log n)^{7/6}) time. In addition to its nice applications, this problem is interesting in that it is not known how to do better than O(n^2) using "algebraic" methods.
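The role of the word size w in the O(n^3/(w log n)) bound can be made concrete. The sketch below is our own illustration, not the paper's algorithm: it packs each row into a Python integer so that one OR combines w output bits at a time, which is the word-parallel ingredient of the Four Russians approach (the table-lookup and regularity ideas are omitted):

```python
def bmm_bitset(A, B):
    """Boolean matrix product using machine-word parallelism: each row of
    B is packed into one integer, so OR-ing packed rows handles w bits at
    a time. Only the word-parallel part of the O(n^3/(w log n)) bound."""
    n = len(A)
    # Bit y of Brow[x] is B[x][y].
    Brow = [sum(b << y for y, b in enumerate(row)) for row in B]
    C = []
    for row in A:
        acc = 0
        for k, a in enumerate(row):
            if a:                 # row x of C is the OR of rows k of B with A[x][k] = 1
                acc |= Brow[k]
        C.append([(acc >> y) & 1 for y in range(n)])
    return C
```

Each inner OR touches n bits in O(n/w) machine words, which is where the 1/w factor in the running time comes from.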
Arc Minimization in Finite State Decoding Graphs with Cross-Word Acoustic Context
In Proc. ICSLP'02, 2002
Abstract

Cited by 6 (2 self)
Recent approaches to large vocabulary decoding with finite state graphs have focused on the use of state minimization algorithms to produce relatively compact graphs. This paper extends the finite state approach by developing complementary arc-minimization techniques. The use of these techniques in concert with state minimization allows us to statically compile decoding graphs in which the acoustic models utilize a full word of cross-word context. This is in significant contrast to typical systems, which use only a single phone. We show that the particular arc-minimization problem that arises is in fact an NP-complete combinatorial optimization problem, and describe the reduction from 3-SAT. We present experimental results that illustrate the moderate sizes and runtimes of graphs for the Switchboard task.
A Simplification Algorithm for Visualizing the Structure of Complex Graphs
Abstract

Cited by 3 (0 self)
Complex graphs, ones containing thousands of nodes of high degree, are difficult to visualize. Displaying all of the nodes and edges of these graphs can create an incomprehensible, cluttered output. This paper presents a simplification algorithm that may be applied to a complex graph in order to produce a controlled thinning of the graph. Using importance metrics, the simplification process removes nodes from the graph, leaving the central structure for visualization and evaluation. The simplification algorithm consists of two steps: calculation of the importance metrics, and pruning. Several metrics based on various topological graph properties are described. The metrics are then used in a pruning process to simplify the graph. Nodes, along with their corresponding edges, are removed from the graph, while maintaining the graph's overall connectivity. This simplified graph provides a cleaner, more meaningful visual representation of the graph's structure, thus aiding the analysis of the graph's underlying data.
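As a toy version of the two-step pipeline described above (metric calculation, then pruning), the following sketch, with all names our own, scores nodes by degree, one simple topological importance metric, and removes a low-scoring node only when the removal keeps the remaining graph connected:

```python
from collections import deque

def simplify(adj, keep_fraction=0.5):
    """Importance-based pruning sketch. `adj` maps node -> set of
    neighbours (undirected). Nodes are scored by degree (one possible
    importance metric) and dropped least-important-first, skipping any
    removal that would disconnect the remaining graph."""
    adj = {u: set(vs) for u, vs in adj.items()}   # private copy

    def connected(nodes):
        nodes = set(nodes)
        if not nodes:
            return True
        start = next(iter(nodes))
        seen, q = {start}, deque([start])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v in nodes and v not in seen:
                    seen.add(v)
                    q.append(v)
        return seen == nodes

    target = max(1, int(len(adj) * keep_fraction))
    for u in sorted(adj, key=lambda u: len(adj[u])):   # least important first
        if len(adj) <= target:
            break
        if connected(set(adj) - {u}):                  # preserve connectivity
            for v in adj.pop(u):
                adj[v].discard(u)
    return adj
```

On a 5-node path with keep_fraction 0.6, the degree-1 endpoints are pruned and the connected middle path survives.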
Linear Time Approximation Algorithms for Degree Constrained Subgraph Problems
Abstract

Cited by 3 (0 self)
Summary. Many real-world problems require graphs of such large size that polynomial time algorithms are too costly as soon as their runtime is superlinear. Examples include problems in VLSI design or problems in bioinformatics. For such problems the question arises: what is the best solution that can be obtained in linear time? We survey linear time approximation algorithms for some classical problems from combinatorial optimization, e.g. matchings and branchings. For many combinatorial optimization problems arising from real-world applications, efficient, i.e., polynomial time, algorithms are known for computing an optimum solution. However, there exist several applications for which the input size can easily exceed 10^9. In such cases polynomial time algorithms with a runtime that is quadratic
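A standard illustration of the linear-time theme, our choice rather than one taken from the survey, is greedy matching: a single pass over the edge list produces a maximal matching, which is well known to contain at least half as many edges as a maximum matching:

```python
def greedy_matching(n, edges):
    """One linear pass over the edges of a graph on vertices 0..n-1:
    take an edge whenever both endpoints are still free. The result is
    maximal, hence a 1/2-approximation of a maximum matching."""
    matched = [False] * n
    matching = []
    for u, v in edges:
        if not matched[u] and not matched[v]:
            matched[u] = matched[v] = True
            matching.append((u, v))
    return matching
```

The 1/2 guarantee follows because every edge of an optimal matching shares an endpoint with some chosen edge.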
A Weight-Scaling Algorithm for Min-Cost Imperfect Matchings in Bipartite Graphs
, 2012
Abstract

Cited by 2 (0 self)
Call a bipartite graph G = (X, Y, E) balanced when |X| = |Y|. Given a balanced bipartite graph G with edge costs, the assignment problem asks for a perfect matching in G of minimum total cost. The Hungarian Method can solve assignment problems in time O(mn + n^2 log n), where n := |X| = |Y| and m := |E|. If the edge weights are integers bounded in magnitude by C > 1, then algorithms using weight scaling, such as that of Gabow and Tarjan, can lower the time to O(m√n log(nC)). There are important applications in which G is unbalanced, with |X| ≠ |Y|, and we require a min-cost matching in G of size r := min(|X|, |Y|) or, more generally, of some specified size s ≤ r. The Hungarian Method extends easily to find such a matching in time O(ms + s^2 log r), but weight-scaling algorithms do not extend so easily. We introduce new machinery that allows us to find such a matching in time O(m√s log(nC)) via weight scaling. Our results also provide insight into the design space of efficient weight-scaling matching algorithms.
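To pin down the problem being solved, here is a brute-force solver for the balanced case (exponential time, for illustration only; the function name is our own). It enumerates all n! perfect matchings of an n × n cost matrix, exactly the search space that the Hungarian Method explores in O(mn + n^2 log n) time:

```python
from itertools import permutations

def min_cost_assignment(cost):
    """Brute-force assignment: try every perfect matching of a balanced
    bipartite graph given as an n x n cost matrix. Returns (total cost,
    tuple mapping row i -> column perm[i]). Exponential; for exposition,
    not a substitute for Hungarian or weight-scaling methods."""
    n = len(cost)
    best = None
    for perm in permutations(range(n)):
        c = sum(cost[i][perm[i]] for i in range(n))
        if best is None or c < best[0]:
            best = (c, perm)
    return best
```

The unbalanced variant discussed in the abstract would instead choose only s of the min(|X|, |Y|) possible pairs, which enlarges the search space and is precisely where the weight-scaling machinery becomes delicate.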
An introduction to Stream Data Management on Large Information Networks
Abstract
In recent times there has been a surge of large scale information networks arising in various application domains: communication networks, cell-phone call networks, social networks, email networks, road traffic networks, and financial transaction networks, to name a few. In such applications there is a need to manage and process large data streams in near-real time. Examples of such queries include finding breaking news in the Twitter stream, detecting network intrusion from network communication data, detecting congestion from traffic information, and so on. Some of the key data management challenges in supporting such real-time stream processing are very high update rates, dynamic structural changes in the underlying networks, complex relationships between data items, and low latency requirements for processing queries. In this paper we present an introduction to the problem of stream data management in large scale information networks. We present some of the related work in the areas of stream data management, continuous aggregation queries in sensor networks, intrusion detection, and processing social network data streams. We also present a glimpse of our ongoing work to manage large graphs and efficiently evaluate real-time aggregation queries on them.
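A minimal example of a continuous aggregation query on a graph stream, a toy of our own far below the scale discussed above, is a sliding-window degree counter: each node's aggregate is the number of incident edges that arrived within the last `window` time units:

```python
from collections import deque

class WindowedEdgeCounter:
    """Continuous aggregation over a graph stream: maintain, per node,
    the number of incident edges seen in the last `window` time units.
    Expired events are evicted lazily from the front of a deque."""
    def __init__(self, window):
        self.window = window
        self.events = deque()        # (timestamp, u, v), oldest first
        self.degree = {}             # node -> count within the window

    def _expire(self, now):
        while self.events and self.events[0][0] <= now - self.window:
            _, u, v = self.events.popleft()
            for x in (u, v):
                self.degree[x] -= 1
                if self.degree[x] == 0:
                    del self.degree[x]

    def add_edge(self, t, u, v):
        """Ingest one stream event; timestamps must be non-decreasing."""
        self._expire(t)
        self.events.append((t, u, v))
        for x in (u, v):
            self.degree[x] = self.degree.get(x, 0) + 1

    def query(self, now, node):
        self._expire(now)
        return self.degree.get(node, 0)
```

Both update and query are amortized O(1) per event, which is the property that matters at the update rates the paper describes.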
Finding one of many Disjoint Perfect Matchings in a Bipartite Graph
, 2002
Abstract
We demonstrate how to find a perfect matching in a bipartite graph containing √n σ^3 disjoint perfect matchings in time O(√n m/σ).
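For background (this is the textbook augmenting-path method, not the paper's algorithm), a perfect matching in a bipartite graph can be found by repeatedly augmenting along alternating paths; Hopcroft and Karp refine this idea to O(√n · m) time, the baseline that the O(√n m/σ) bound above improves on when σ is large:

```python
def bipartite_matching(n_left, n_right, adj):
    """Kuhn's augmenting-path algorithm, O(V * E) worst case. `adj[u]`
    lists the right-side neighbours of left vertex u. Returns the
    matching size and match_right, where match_right[v] is the left
    vertex matched to right vertex v (or -1 if v is unmatched)."""
    match_right = [-1] * n_right

    def try_augment(u, seen):
        # Try to match u, possibly re-matching previously matched vertices.
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            if match_right[v] == -1 or try_augment(match_right[v], seen):
                match_right[v] = u
                return True
        return False

    size = sum(try_augment(u, set()) for u in range(n_left))
    return size, match_right
```

When many disjoint perfect matchings exist, augmenting paths are short and plentiful, which is the structural fact the paper's σ-dependent bound exploits.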
Rajeev Motwani (1962–2009)
, 2012
Abstract
Rajeev Motwani was a preeminent theoretical computer scientist of his generation, a technology thought leader, an insightful venture capitalist, and a mentor to some of the most influential entrepreneurs in Silicon Valley in the first decade of the 21st century. This article presents an overview of Rajeev’s research, and provides a window to his early life and the various influences that shaped his research and professional career—it is a small celebration of his wonderful life and many achievements.
Covering a Tree by a Forest
Abstract
Abstract. Consider a tree T and a forest F. The paper discusses the following new problems. The Forest vertex-cover problem (FVC): cover the vertices of T by a minimum number of copies of trees of F, such that every vertex of T is covered exactly once. The Forest edge-cover problem (FEC): cover the edges of T by a minimum number of copies of trees of F, such that every edge of T is covered exactly once. For a solution to always exist, we assume that F contains a one-vertex (respectively, one-edge) tree. Two versions of Problem FVC are considered: ordered covers (OFVC) and unordered covers (UFVC). Three versions of Problem FEC are considered: ordered covers (OFEC), unordered covers (UFEC), and consecutive covers (CFEC). We describe polynomial time algorithms for Problems OFVC, UFVC, and CFEC, and prove that Problems OFEC and UFEC are NP-complete.