Results 1–8 of 8
Assessing the Vulnerability of the Fiber Infrastructure to Disasters
Cited by 14 (4 self)
Abstract—Communication networks are vulnerable to natural disasters, such as earthquakes or floods, as well as to physical attacks, such as an Electromagnetic Pulse (EMP) attack. Such real-world events happen in specific geographical locations and disrupt specific parts of the network. Therefore, the geographical layout of the network determines the impact of such events on the network’s connectivity. In this paper, we focus on assessing the vulnerability of (geographical) networks to such disasters. In particular, we aim to identify the most vulnerable parts of the network, that is, the locations of disasters that would have the maximum disruptive effect on the network in terms of capacity and connectivity. We consider graph models in which nodes and links are geographically located on a plane, and model the disaster event as a line segment or a circular cut. We develop algorithms that find a worst-case line segment cut and a worst-case circular cut. Then, we obtain numerical results for a specific backbone network, thereby demonstrating the applicability of our algorithms to real-world networks. Our novel approach provides a promising new direction for network design to avert geographical disasters or attacks.
Index Terms—Network survivability, geographic networks, fiber-optic, Internet, Electromagnetic Pulse (EMP).
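The paper's algorithms for worst-case cuts are polynomial-time and search a structured candidate set; purely as a rough illustration of the circular-cut objective, the sketch below brute-forces it over a user-supplied grid of candidate disk centers (the function names, grid, and radius are hypothetical, not from the paper):

```python
import math

def seg_point_dist(p, a, b):
    # Euclidean distance from point p to the segment with endpoints a, b.
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))  # clamp projection onto the segment
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def worst_circular_cut(links, radius, grid):
    # Return the candidate center whose disk of the given radius
    # intersects the most links (each link is a pair of endpoints).
    best_center, best_hits = None, -1
    for c in grid:
        hits = sum(1 for a, b in links if seg_point_dist(c, a, b) <= radius)
        if hits > best_hits:
            best_center, best_hits = c, hits
    return best_center, best_hits
```

A finer grid trades running time for accuracy; the paper instead characterizes a polynomial-size set of candidate cut locations.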
An integrated model of traffic, geography and economy in the Internet
 SIGCOMM Comput. Commun. Rev., 2008
Cited by 12 (3 self)
Modeling Internet growth is important both for understanding the current network and for predicting and improving its future. To date, Internet models have typically attempted to explain a subset of the following characteristics: network structure, traffic flow, geography, and economy. In this paper we present a discrete, agent-based model that integrates all of them. We show that the model generates networks with topologies, dynamics, and (more speculatively) spatial distributions that are similar to those of the Internet.
Network Reliability With Geographically Correlated Failures
Cited by 10 (0 self)
Abstract—Fiber-optic networks are vulnerable to natural disasters, such as tornadoes or earthquakes, as well as to physical failures, such as an anchor cutting underwater fiber cables. Such real-world events occur in specific geographical locations and disrupt specific parts of the network. Therefore, the geography of the network determines the effect of physical events on the network’s connectivity and capacity. In this paper, we develop tools to analyze network failures after a ‘random’ geographic disaster. The random location of the disaster allows us to model situations where the physical failures are not targeted attacks. In particular, we consider disasters that take the form of a ‘random’ line in a plane. Using results from geometric probability, we are able to calculate some network performance metrics under such a disaster in polynomial time. In particular, we can evaluate the average two-terminal reliability in polynomial time under ‘random’ line cuts. This is in contrast to the case of independent link failures, for which no polynomial-time algorithm is known to calculate this reliability metric. We also present some numerical results to show the significance of geometry for the survivability of the network and discuss network design in the context of random line cuts. Our novel approach provides a promising new direction for modeling and designing networks to lessen the effects of geographical disasters or attacks.
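The paper evaluates reliability under ‘random’ line cuts exactly, in polynomial time, via geometric probability. As a hedged illustration only, the sketch below instead Monte-Carlo-samples lines (a uniform point in the nodes’ bounding box plus a uniform angle, a simplification of the kinematic measure) and estimates the average two-terminal reliability; all names and the sampling measure are assumptions, not the paper's construction:

```python
import math
import random
from collections import defaultdict

def side(line, p):
    # Signed side of point p relative to the line through point q at angle theta.
    (qx, qy), theta = line
    dx, dy = math.cos(theta), math.sin(theta)
    return (p[0] - qx) * dy - (p[1] - qy) * dx

def survives(line, a, b):
    # A link survives if the infinite line does not separate its endpoints.
    return side(line, a) * side(line, b) > 0

def two_terminal_reliability(nodes, links, s, t, trials=2000, seed=0):
    # Fraction of sampled line cuts after which s and t stay connected.
    rng = random.Random(seed)
    xs = [p[0] for p in nodes.values()]
    ys = [p[1] for p in nodes.values()]
    ok = 0
    for _ in range(trials):
        line = ((rng.uniform(min(xs), max(xs)), rng.uniform(min(ys), max(ys))),
                rng.uniform(0, math.pi))
        adj = defaultdict(list)
        for u, v in links:           # keep only links the cut misses
            if survives(line, nodes[u], nodes[v]):
                adj[u].append(v)
                adj[v].append(u)
        seen, stack = {s}, [s]       # DFS reachability from s
        while stack:
            u = stack.pop()
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        ok += t in seen
    return ok / trials
```

The geometric-probability machinery in the paper replaces this sampling with a closed-form integration over the measure of lines, which is what makes the polynomial-time exact computation possible.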
Approximating the number of Network Motifs
Cited by 3 (2 self)
Abstract. The World Wide Web, the Internet, coupled biological and chemical systems, neural networks, and socially interacting species are only a few examples of systems composed of a large number of highly interconnected dynamical units. These networks contain characteristic patterns, termed network motifs, which occur far more often than in randomized networks with the same degree sequence. Several algorithms have been suggested for counting or detecting the number of induced or non-induced occurrences of network motifs in the form of trees and bounded-treewidth subgraphs of size O(log n), and of size at most 7 for some motifs. In addition, counting the number of motifs a node is part of was recently suggested as a method to classify nodes in the network. The promise is that the distribution of motifs a node participates in is an indication of its function in the network. Therefore, counting the number of network motifs a node is part of provides a major challenge. However, no such practical algorithm exists. We present several algorithms with time complexity O(e^{2k} · k · n · |E| · log(1/δ) / ε²) that, for the first time, approximate for every vertex the number of non-induced occurrences of the motif the vertex is part of, for k-length cycles, k-length cycles with a chord, and (k − 1)-length paths, where k = O(log n), and for all motifs of size at most four. In addition, we show algorithms that approximate the total number of non-induced occurrences of these network motifs, when no efficient algorithm exists. Some of our algorithms use the color coding technique.
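As a hedged illustration of the color coding technique mentioned at the end of the abstract (not the paper's own estimator), the sketch below counts colorful k-vertex paths by dynamic programming over color subsets and rescales by the probability k!/k^k that a fixed path becomes colorful under a random k-coloring:

```python
import random
from math import factorial

def colorful_path_count(adj, colors, k):
    # dp[(v, colorset)] = number of colorful paths ending at v that use
    # exactly the colors in colorset. Counts each direction of traversal.
    dp = {(v, frozenset([colors[v]])): 1 for v in adj}
    for size in range(2, k + 1):
        ndp = {}
        for (v, cs), cnt in dp.items():
            if len(cs) != size - 1:
                continue
            for w in adj[v]:
                if colors[w] not in cs:      # extend only with a fresh color
                    key = (w, cs | {colors[w]})
                    ndp[key] = ndp.get(key, 0) + cnt
        dp.update(ndp)
    total = sum(cnt for (v, cs), cnt in dp.items() if len(cs) == k)
    return total // 2  # each undirected path was counted once per direction

def approx_path_count(adj, k, trials=300, seed=1):
    # A k-vertex path is colorful with probability k!/k^k, so the average
    # colorful count rescaled by k^k/k! estimates the true path count.
    rng = random.Random(seed)
    acc = 0
    for _ in range(trials):
        colors = {v: rng.randrange(k) for v in adj}
        acc += colorful_path_count(adj, colors, k)
    return acc / trials * (k ** k / factorial(k))
```

The subset-indexed DP is what keeps each trial polynomial for k = O(log n); the number of trials controls the (ε, δ) guarantee in the stated complexity.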
BOUNDING THE BIAS OF TREE-LIKE SAMPLING IN IP TOPOLOGIES, 2008
Cited by 1 (0 self)
Abstract. It is widely believed that the Internet’s AS-graph degree distribution obeys a power-law form. However, it was recently argued that since Internet data is collected in a tree-like fashion, it only produces a sample of the degree distribution, and this sample may be biased. This argument was backed by simulation data and mathematical analysis, which demonstrated that under certain conditions a tree sampling procedure can produce an artificial power-law in the degree distribution. Thus, although the observed degree distribution of the AS-graph follows a power-law, this phenomenon may be an artifact of the sampling process. In this work we provide some evidence to the contrary. We show, by analysis and simulation, that when the underlying graph degree distribution obeys a power-law with an exponent γ > 2, a tree-like sampling process produces a negligible bias in the sampled degree distribution. Furthermore, recent data collected from the DIMES project, which is not based on single-source sampling, indicates that the Internet indeed obeys a power-law degree distribution with an exponent γ > 2. Combining this empirical data with our simulation of traceroute experiments on the DIMES-measured AS-graph as the underlying graph, and with our analysis, we conclude that the bias in the degree distribution calculated from BGP data is negligible.
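A minimal sketch of the kind of tree-like sampling experiment described here: take one BFS tree from a single source (a stand-in for single-source traceroute sampling) and record each reached node's degree in the tree rather than in the graph. The names are illustrative only, and the paper's simulations are of course far more faithful to traceroute behavior:

```python
from collections import deque

def bfs_tree_degrees(adj, root):
    # Sample the graph as a single-source BFS tree and return each
    # reached node's degree *in the tree*; comparing these against the
    # true degrees exposes any sampling bias.
    parent = {root: None}
    order = deque([root])
    tree_deg = {root: 0}
    while order:
        u = order.popleft()
        for v in adj[u]:
            if v not in parent:       # first discovery fixes the tree edge
                parent[v] = u
                tree_deg[v] = 1       # edge to its parent
                tree_deg[u] += 1      # edge to this child
                order.append(v)
    return tree_deg
```

Running this on a synthetic power-law graph and comparing the tree-degree histogram to the true one is the basic shape of the bias experiment the abstract refers to.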
Project Summary
A growing consensus among experts is that the routing system is approaching a critical architectural breaking point [1], which any significant deployment of IPv6 will only exacerbate. The issue has recently drawn so much concern from the engineering, operational, and policy communities that the Internet Architecture Board [2] held a workshop in November 2006 to identify the factors that limit routing scalability and to formulate a coherent statement of the problem [1]. Their conclusion was expected: the most acutely scale-limiting parameter of the current routing system is routing table size, not so much for its memory requirements as for its reaction to network dynamics. Specifically, topology changes require recalculation of routing tables, a computational burden as well as a performance hit, since traffic is often delayed or even lost as nodes converge to the updated routing state. Having already articulated the need for a fundamental re-examination of the routing and concomitant addressing architecture, we used our previous NeTS proposal to rigorously examine and evaluate known routing schemes in pursuit of one that would work on Internet-like scale-free topologies without a radical architectural shift. We learned that there are no existing dynamic routing schemes with reasonable scalability bounds on Internet-like graphs. Our current work [3],
Evolution of the Internet …, 2009
We present an analytically tractable model of Internet evolution at the level of Autonomous Systems (ASs). We call our model the multiclass preferential attachment (MPA) model. As its name suggests, it is based on preferential attachment. All of its parameters are measurable from available Internet topology data. Given the estimated values of these parameters, our analytic results predict a definitive set of statistics characterizing the AS topology structure. These statistics are not part of the model formulation. The MPA model thus closes the “measure-model-validate-predict” loop, and provides further evidence that preferential attachment is the main driving force behind Internet evolution.
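The MPA model adds node classes and measured parameters on top of preferential attachment; as a much simpler illustration of that core mechanism only, the sketch below grows a graph in which each new node attaches to targets chosen with probability proportional to current degree (all names and parameters are hypothetical):

```python
import random

def preferential_attachment(n, m, seed=0):
    # Grow an n-node graph: each new node adds m edges to m distinct
    # existing nodes chosen with probability proportional to degree,
    # implemented by sampling from a list of all prior edge endpoints.
    rng = random.Random(seed)
    seed_nodes = list(range(m))   # initial target pool before any edges exist
    endpoint_pool = []            # each node appears once per incident edge
    edges = []
    for new in range(m, n):
        chosen = set()
        while len(chosen) < m:    # resample until m distinct targets found
            pool = endpoint_pool if endpoint_pool else seed_nodes
            chosen.add(rng.choice(pool))
        for t in chosen:
            edges.append((new, t))
            endpoint_pool += [new, t]
    return edges
```

Degree-proportional choice is what produces the heavy-tailed degree distribution; the MPA model refines this mechanism with per-class attachment behavior fitted to Internet topology data.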
Assessing the Vulnerability of the Fiber Infrastructure to Disasters
 IEEE/ACM Transactions on Networking
Abstract—Communication networks are vulnerable to natural disasters, such as earthquakes or floods, as well as to physical attacks, such as an electromagnetic pulse (EMP) attack. Such real-world events happen in specific geographical locations and disrupt specific parts of the network. Therefore, the geographical layout of the network determines the impact of such events on the network’s connectivity. In this paper, we focus on assessing the vulnerability of (geographical) networks to such disasters. In particular, we aim to identify the most vulnerable parts of the network, that is, the locations of disasters that would have the maximum disruptive effect on the network in terms of capacity and connectivity. We consider graph models in which nodes and links are geographically located on a plane. First, we consider a simplistic bipartite graph model and present a polynomial-time algorithm for finding a worst-case vertical line segment cut. We then generalize the network model to graphs with nodes at arbitrary locations. We model the disaster event as a line segment or a disk and develop polynomial-time algorithms that find a worst-case line segment cut and a worst-case circular cut. Finally, we obtain numerical results for a specific backbone network, thereby demonstrating the applicability of our algorithms to real-world networks. Our novel approach provides a promising new direction for network design to avert geographical disasters or attacks.
Index Terms—Electromagnetic pulse (EMP), fiber-optic, geographically correlated failures, network survivability.
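For the simplest case this abstract mentions, a worst-case vertical cut, the problem reduces to interval stabbing: a vertical line at position x cuts every link whose x-extent contains x, so a sweep over interval endpoints finds a worst position in O(|E| log |E|) time. This sketch is a deliberate simplification (full vertical lines rather than the paper's bounded vertical segments), with illustrative names:

```python
def worst_vertical_line_cut(links):
    # Each link spans the x-interval [min(x1, x2), max(x1, x2)]; sweep
    # the sorted interval endpoints, tracking how many intervals are
    # open, to find an x that stabs the most links.
    events = []
    for (x1, _), (x2, _) in links:
        lo, hi = min(x1, x2), max(x1, x2)
        events.append((lo, 0))  # 0 = open; sorts before close at same x
        events.append((hi, 1))  # 1 = close
    events.sort()
    best = cur = 0
    best_x = None
    for x, kind in events:
        if kind == 0:
            cur += 1
            if cur > best:
                best, best_x = cur, x
        else:
            cur -= 1
    return best_x, best
```

Bounded segment cuts and circular cuts need the two-dimensional machinery the paper develops; the sweep above only captures the one-dimensional special case.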