Results 1–10 of 435
Self-Similarity Through High-Variability: Statistical Analysis of Ethernet LAN Traffic at the Source Level
 IEEE/ACM TRANSACTIONS ON NETWORKING
, 1997
Abstract

Cited by 597 (24 self)
A number of recent empirical studies of traffic measurements from a variety of working packet networks have convincingly demonstrated that actual network traffic is self-similar or long-range dependent in nature (i.e., bursty over a wide range of time scales), in sharp contrast to commonly made traffic modeling assumptions. In this paper, we provide a plausible physical explanation for the occurrence of self-similarity in LAN traffic. Our explanation is based on new convergence results for processes that exhibit high variability (i.e., infinite variance) and is supported by detailed statistical analyses of real-time traffic measurements from Ethernet LANs at the level of individual sources. This paper is an extended version of [53] and differs from it in significant ways. In particular, we develop here the mathematical results concerning the superposition of strictly alternating ON/OFF sources. Our key mathematical result states that the superposition of many ON/OFF sources (also k...
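The ON/OFF mechanism described above can be illustrated with a small simulation (a sketch, not the authors' code; all function names are hypothetical): superpose many binary sources whose period lengths are drawn from a heavy-tailed Pareto distribution, then examine how the variance of block means decays as the series is aggregated.

```python
import random

def pareto_period(alpha, xm=1.0, rng=random):
    # Pareto(alpha) period length; infinite variance when alpha < 2
    return xm / ((1.0 - rng.random()) ** (1.0 / alpha))

def on_off_source(n_slots, alpha, rng):
    # One packet train: strictly alternating ON/OFF periods,
    # emitting 1 per time slot while ON and 0 while OFF
    out, on = [], True
    while len(out) < n_slots:
        length = max(1, int(pareto_period(alpha, rng=rng)))
        out.extend([1 if on else 0] * length)
        on = not on
    return out[:n_slots]

def aggregate_traffic(n_sources, n_slots, alpha, seed=0):
    # Superposition of n_sources independent ON/OFF sources
    rng = random.Random(seed)
    total = [0] * n_slots
    for _ in range(n_sources):
        for i, x in enumerate(on_off_source(n_slots, alpha, rng)):
            total[i] += x
    return total

def variance_at_scale(series, m):
    # Variance of non-overlapping block means at aggregation level m;
    # for short-range-dependent traffic this falls roughly like 1/m
    blocks = [sum(series[i:i + m]) / m
              for i in range(0, len(series) - m + 1, m)]
    mu = sum(blocks) / len(blocks)
    return sum((b - mu) ** 2 for b in blocks) / len(blocks)
```

For a short-range-dependent process the block-mean variance falls like 1/m; with heavy-tailed periods (alpha around 1.5 here) it decays markedly more slowly, which is the variance-time signature of self-similarity.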
How bad is selfish routing?
 JOURNAL OF THE ACM
, 2002
Abstract

Cited by 516 (27 self)
We consider the problem of routing traffic to optimize the performance of a congested network. We are given a network, a rate of traffic between each pair of nodes, and a latency function for each edge specifying the time needed to traverse the edge given its congestion; the objective is to route traffic such that the sum of all travel times—the total latency—is minimized. In many settings, it may be expensive or impossible to regulate network traffic so as to implement an optimal assignment of routes. In the absence of regulation by some central authority, we assume that each network user routes its traffic on the minimum-latency path available to it, given the network congestion caused by the other users. In general such a “selfishly motivated” assignment of traffic to paths will not minimize the total latency; hence, this lack of regulation carries the cost of decreased network performance. In this article, we quantify the degradation in network performance due to unregulated traffic. We prove that if the latency of each edge is a linear function of its congestion, then the total latency of the routes chosen by selfish network users is at most 4/3 times the minimum possible total latency (subject to the condition that all traffic must be routed). We also consider the more general setting in which edge latency functions are assumed only to be continuous and nondecreasing in the edge congestion. Here, the total
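The 4/3 bound for linear latencies is already tight on the classic two-link Pigou network; the following sketch (hypothetical names, with a grid search standing in for calculus) reproduces the ratio:

```python
def pigou_total_latency(x):
    """Total latency on the two-link Pigou network when a fraction x of
    one unit of traffic uses the variable link (latency l(y) = y) and
    the rest uses the constant link (latency 1)."""
    return x * x + (1.0 - x)

# Nash equilibrium: the variable link never costs more than 1, so every
# user takes it; x = 1 and the total latency is 1.
nash_cost = pigou_total_latency(1.0)

# Social optimum of x^2 + (1 - x) over x in [0, 1], by grid search.
steps = 10_000
opt_cost = min(pigou_total_latency(i / steps) for i in range(steps + 1))

ratio = nash_cost / opt_cost
```

At the optimum, half the traffic pays latency 1/2 and half pays 1, for a total of 3/4; the Nash flow pays 1, and the ratio 4/3 matches the worst-case bound for linear latencies.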
Improved algorithms for topic distillation in a hyperlinked environment
 In SIGIR Conference on Research and Development in Information Retrieval
, 1998
Abstract

Cited by 403 (7 self)
This paper addresses the problem of topic distillation on the World Wide Web: given a typical user query, find quality documents related to the query topic. Connectivity analysis has been shown to be useful in identifying high-quality pages within a topic-specific graph of hyperlinked documents. The essence of our approach is to augment a previous connectivity-analysis-based algorithm with content analysis. We identify three problems with the existing approach and devise algorithms to tackle them. We report the results of a user evaluation, which show an improvement of precision at 10 documents by at least 45% over pure connectivity analysis.
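One way to picture "connectivity analysis augmented with content analysis" is a HITS-style hub/authority iteration in which each page's contribution is scaled by a content-relevance weight. This is a hypothetical sketch of the general idea, not the paper's actual algorithm; `links` and `relevance` are assumed inputs.

```python
def weighted_hits(links, relevance, iterations=50):
    """HITS-style hub/authority iteration where each document's
    contribution is scaled by a content-relevance weight in [0, 1].
    `links` maps doc -> set of docs it points to; `relevance` maps
    doc -> weight (both hypothetical inputs in this sketch)."""
    docs = set(links) | {d for tgts in links.values() for d in tgts}
    hub = {d: 1.0 for d in docs}
    auth = {d: 1.0 for d in docs}
    for _ in range(iterations):
        # Authority: relevance-weighted sum of hub scores of in-linkers
        auth = {d: sum(hub[s] * relevance.get(s, 0.0)
                       for s in docs if d in links.get(s, ()))
                for d in docs}
        # Hub: relevance-weighted sum of authority scores of out-links
        hub = {d: sum(auth[t] * relevance.get(t, 0.0)
                      for t in links.get(d, ()))
               for d in docs}
        for scores in (auth, hub):  # L2-normalize to keep values bounded
            norm = sum(v * v for v in scores.values()) ** 0.5 or 1.0
            for d in scores:
                scores[d] /= norm
    return hub, auth
```

Setting a page's relevance weight to zero removes its vote entirely, which is one plausible way content analysis can suppress off-topic but highly linked pages.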
The Variable Discharge of Cortical Neurons: Implications for Connectivity, Computation, and Information Coding
 J. Neurosci
, 1998
Abstract

Cited by 219 (1 self)
In this paper we propose that the irregular ISI arises as a consequence of a specific problem that cortical neurons must solve: the problem of dynamic range or gain control. Cortical neurons receive 3000–10,000 synaptic contacts, 85% of which are asymmetric and hence presumably excitatory (Peters, 1987; Braitenberg and Schuz, 1991). More than half of these contacts are thought to arise from neurons within a 100–200 μm radius of the target cell, reflecting the stereotypical columnar organization of neocortex. Because neurons within a cortical column typically share similar physiological properties, the conditions that excite one neuron are likely to excite a considerable fraction of its afferent input as well (Mountcastle, 1978; Peters and Sethares, 1991), creating a scenario in which saturation of the neuron's firing rate could easily occur. This problem is exacerbated by the fact that EPSPs from individual axons appear to exert substantial impact on the membrane potential (Mason et al., 1991; Otmakhov ...
Packet Loss Correlation in the MBone Multicast Network
, 1996
Abstract

Cited by 213 (17 self)
The recent success of multicast applications such as Internet teleconferencing illustrates the tremendous potential of applications built upon wide-area multicast communication services. A critical issue for such multicast applications and the higher-layer protocols required to support them is the manner in which packet losses occur within the multicast network. In this paper we present and analyze packet loss data collected on multicast-capable hosts at 17 geographically distinct locations in Europe and the US and connected via the MBone. We experimentally and quantitatively examine the spatial and temporal correlation in packet loss among participants in a multicast session. Our results show that there is some spatial correlation in loss among the multicast sites. However, the shared loss in the backbone of the MBone is, for the most part, low. We find a fairly significant amount of burst loss (consecutive losses) at most sites. In every dataset, at least one receiver experienced ...
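The two quantities the study measures, burst loss at a single receiver and shared loss across receivers, can be computed from binary loss traces with a few lines (hypothetical helper names; a sketch, not the authors' tooling):

```python
def loss_burst_lengths(trace):
    """Lengths of maximal runs of consecutive losses in a binary
    loss trace (1 = packet lost, 0 = packet received)."""
    bursts, run = [], 0
    for lost in trace:
        if lost:
            run += 1
        elif run:
            bursts.append(run)
            run = 0
    if run:
        bursts.append(run)
    return bursts

def shared_loss_fraction(trace_a, trace_b):
    """Fraction of packets lost simultaneously at two receivers,
    a crude indicator of spatially correlated (shared) loss."""
    shared = sum(1 for a, b in zip(trace_a, trace_b) if a and b)
    return shared / len(trace_a)
```

Comparing the shared-loss fraction against the product of the two individual loss rates gives a first-cut test for spatial correlation: independent losses would make the two roughly equal.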
Proof of a Fundamental Result in Self-Similar Traffic Modeling
 COMPUTER COMMUNICATION REVIEW
, 1997
Abstract

Cited by 206 (8 self)
We state and prove the following key mathematical result in self-similar traffic modeling: the superposition of many ON/OFF sources (also known as packet trains) with strictly alternating ON- and OFF-periods and whose ON-periods or OFF-periods exhibit the Noah Effect (i.e., have high variability or infinite variance) can produce aggregate network traffic that exhibits the Joseph Effect (i.e., is self-similar or long-range dependent). There is, moreover, a simple relation between the parameters describing the intensities of the Noah Effect (high variability) and the Joseph Effect (self-similarity). This provides a simple physical explanation for the presence of self-similar traffic patterns in modern high-speed network traffic that is consistent with traffic measurements at the source level. We illustrate how this mathematical result can be combined with modern high-performance computing capabilities to yield a simple and efficient linear-time algorithm for generating self-similar traf...
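The "simple relation" mentioned here is, in the form usually quoted for this result, H = (3 − α) / 2, where α is the smaller of the ON- and OFF-period tail indices (each in (1, 2)). Treat the exact formula as an assumption to be checked against the paper; the hypothetical helper below just makes the mapping explicit.

```python
def hurst_from_tail_index(alpha_on, alpha_off):
    """Hurst parameter of the aggregate traffic implied by the tail
    indices of the ON- and OFF-period distributions (each in (1, 2)):
    H = (3 - min(alpha_on, alpha_off)) / 2, so heavier tails
    (smaller alpha) yield stronger long-range dependence."""
    a = min(alpha_on, alpha_off)
    if not 1.0 < a < 2.0:
        raise ValueError("tail index must lie in (1, 2) for this relation")
    return (3.0 - a) / 2.0
```

For sources with α near 1.5 this gives H near 0.75, squarely in the long-range-dependent regime H > 1/2; as α approaches 2 (finite variance), H falls back toward 1/2.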
The price of anarchy is independent of the network topology
 JOURNAL OF COMPUTER AND SYSTEM SCIENCES
, 2002
Abstract

Cited by 178 (14 self)
We study the degradation in network performance caused by the selfish behavior of noncooperative network users. We consider a model of selfish routing in which the latency experienced by network traffic on an edge of the network is a function of the edge congestion, and network users are assumed to selfishly route traffic on minimum-latency paths. The quality of a routing of traffic is measured by the sum of travel times, also called the total latency. The outcome of selfish routing—a Nash equilibrium—does not in general minimize the total latency; hence, selfish behavior carries the cost of decreased network performance. We quantify this degradation in network performance via the price of anarchy, the worst-possible ratio between the total latency of a Nash equilibrium and of an optimal routing of the traffic. We show that the price of anarchy is determined only by the simplest of networks. Specifically, we prove that under weak hypotheses on the class of allowable edge latency functions, the worst-case ratio between the total latency of a Nash equilibrium and of a minimum-latency routing for any multicommodity flow network is achieved by a single-commodity
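The claim that the simplest networks already realize the worst case can be illustrated on the classic two-link Pigou topology with polynomial latencies l(x) = x^p (a hedged sketch; `pigou_poa` is a hypothetical name): the price of anarchy is 4/3 at p = 1 and grows as the latency functions steepen.

```python
def pigou_poa(p, steps=10_000):
    """Price of anarchy of the two-link Pigou network with latency
    functions l1(x) = x**p and l2(x) = 1 for one unit of traffic.
    At Nash equilibrium everyone takes link 1 (its latency never
    exceeds 1), for a total latency of 1; the social optimum of
    x**(p+1) + (1 - x) is found by grid search."""
    nash = 1.0
    opt = min((i / steps) ** (p + 1) + (1 - i / steps)
              for i in range(steps + 1))
    return nash / opt
```

This matches the abstract's message: no complicated multicommodity instance is needed to exhibit the worst-case ratio for a given class of latency functions.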
Sending Messages to Mobile Users in Disconnected Ad-Hoc Wireless Networks
 In ACM MOBICOM
, 2000
Abstract

Cited by 127 (1 self)
An ad-hoc network is formed by a group of mobile hosts communicating over a wireless network interface. Previous research in this area has concentrated on routing algorithms designed for fully connected networks. The usual way to deal with a disconnected ad-hoc network is to let the mobile computer passively wait for network reconnection, which may lead to unacceptable transmission delays. In this paper, we propose an approach that guarantees message transmission in minimal time. In this approach, mobile hosts actively modify their trajectories to transmit messages. We develop algorithms that minimize the trajectory modifications under two different assumptions: (a) the movements of all the nodes in the system are known, and (b) the movements of the hosts in the system are not known.
Time-Changed Lévy Processes and Option Pricing
, 2002
Abstract

Cited by 89 (12 self)
As is well known, the classic Black-Scholes option pricing model assumes that returns follow Brownian motion. It is widely recognized that return processes differ from this benchmark in at least three important ways. First, asset prices jump, leading to non-normal return innovations. Second, return volatilities vary stochastically over time. Third, returns and their volatilities are correlated, often negatively for equities. We propose that time-changed Lévy processes be used to simultaneously address these three facets of the underlying asset return process. We show that our framework encompasses almost all of the models proposed in the option pricing literature. Despite the generality of our approach, we show that it is straightforward to select and test a particular option pricing model through the use of characteristic function technology.
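For reference, the Brownian-motion benchmark that time-changed Lévy processes generalize admits the closed-form Black-Scholes call price; a minimal sketch of the standard formula (function names are ours):

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(spot, strike, rate, vol, maturity):
    """European call price under the Black-Scholes benchmark:
    lognormal returns driven by Brownian motion, constant volatility,
    no jumps and no stochastic volatility (the three departures the
    abstract lists are absent by construction)."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * maturity) \
         / (vol * sqrt(maturity))
    d2 = d1 - vol * sqrt(maturity)
    return spot * norm_cdf(d1) - strike * exp(-rate * maturity) * norm_cdf(d2)
```

The empirical deviations described in the abstract show up precisely as systematic mispricings of this formula across strikes and maturities.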
Of Smiles and Smirks: A Term-Structure Perspective
 JOURNAL OF FINANCIAL AND QUANTITATIVE ANALYSIS
, 1998
Abstract

Cited by 79 (3 self)
An extensive empirical literature in finance has documented not only the presence of anomalies in the Black-Scholes model, but also the "term-structures" of these anomalies (for instance, the behavior of the volatility smile or of unconditional returns at different maturities). Theoretical efforts in the literature at addressing these anomalies have largely focused on two extensions of the Black-Scholes model: introducing jumps into the return process, and allowing volatility to be stochastic. This paper employs commonly used versions of these two classes of models to examine the extent to which the models are theoretically capable of resolving the observed anomalies. We find that each model exhibits some "term-structure" patterns that are fundamentally inconsistent with those observed in the data. As a consequence, neither class of models constitutes an adequate explanation of the empirical evidence, although stochastic volatility models fare better than jumps in this regard.