Random Early Detection Gateways for Congestion Avoidance
 IEEE/ACM TRANSACTIONS ON NETWORKING
, 1993
"... This paper presents Random Early Detection (RED) gateways for congestion avoidance in packetswitched networks. The gateway detects incipient congestion by computing the average queue size. The gateway could notify connections of congestion either by dropping packets arriving at the gateway or by ..."
Abstract

Cited by 2214 (32 self)
 Add to MetaCart
This paper presents Random Early Detection (RED) gateways for congestion avoidance in packet-switched networks. The gateway detects incipient congestion by computing the average queue size. The gateway could notify connections of congestion either by dropping packets arriving at the gateway or by setting a bit in packet headers. When the average queue size exceeds a preset threshold, the gateway drops or marks each arriving packet with a certain probability, where the exact probability is a function of the average queue size. RED gateways keep the average queue size low while allowing occasional bursts of packets in the queue. During congestion, the probability that the gateway notifies a particular connection to reduce its window is roughly proportional to that connection's share of the bandwidth through the gateway. RED gateways are designed to accompany a transport-layer congestion control protocol such as TCP. The RED gateway has no bias against bursty traffic and avoids the global synchronization of many connections decreasing their window at the same time. Simulations of a TCP/IP network are used to illustrate the performance of RED gateways.
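The marking rule the abstract describes — a drop/mark probability that rises with the average queue size, computed as an exponentially weighted moving average — can be sketched as follows. The threshold names, the linear ramp, and the EWMA weight are illustrative simplifications, not the paper's exact scheme.

```python
def red_drop_probability(avg_queue, min_th, max_th, max_p):
    """Drop/mark probability as a function of the average queue size.

    Below min_th no packets are dropped; between the thresholds the
    probability rises linearly up to max_p; above max_th every arriving
    packet is dropped or marked.
    """
    if avg_queue < min_th:
        return 0.0
    if avg_queue >= max_th:
        return 1.0
    return max_p * (avg_queue - min_th) / (max_th - min_th)

def update_avg_queue(avg, instantaneous, weight=0.002):
    # Exponentially weighted moving average of the instantaneous queue size,
    # which is what lets RED absorb short bursts without reacting to them.
    return (1 - weight) * avg + weight * instantaneous
```

Because the probability depends on the smoothed average rather than the instantaneous queue, transient bursts pass through unmarked while sustained congestion triggers increasingly frequent notifications.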
Chernoff-Hoeffding bounds for applications with limited independence
 ACM-SIAM Symposium on Discrete Algorithms
, 1993
"... ..."
Majority Gates vs. General Weighted Threshold Gates
 Computational Complexity
, 1992
"... . In this paper we study small depth circuits that contain threshold gates (with or without weights) and parity gates. All circuits we consider are of polynomial size. We prove several results which complete the work on characterizing possible inclusions between many classes defined by small depth c ..."
Abstract

Cited by 91 (6 self)
 Add to MetaCart
In this paper we study small depth circuits that contain threshold gates (with or without weights) and parity gates. All circuits we consider are of polynomial size. We prove several results which complete the work on characterizing possible inclusions between many classes defined by small depth circuits. These results are the following: 1. A single threshold gate with weights cannot in general be replaced by a polynomial fan-in unweighted threshold gate of parity gates. 2. On the other hand it can be replaced by a depth 2 unweighted threshold circuit of polynomial size. An extension of this construction is used to prove that whatever can be computed by a depth d polynomial size threshold circuit with weights can be computed by a depth d + 1 polynomial size unweighted threshold circuit, where d is an arbitrary fixed integer. 3. A polynomial fan-in threshold gate (with weights) of parity gates cannot in general be replaced by a depth 2 unweighted threshold circuit of polynomial size...
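The three gate types these results compare can be made concrete with a minimal sketch; the function names are ours, not from the paper.

```python
def weighted_threshold_gate(weights, threshold, inputs):
    """Output 1 iff the weighted sum of the 0/1 inputs meets the threshold."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

def majority_gate(inputs):
    """Unweighted special case: every weight is 1 and the threshold is
    half the fan-in, i.e. output 1 iff at least half the inputs are 1."""
    return int(2 * sum(inputs) >= len(inputs))

def parity_gate(inputs):
    """Output the XOR (sum mod 2) of the 0/1 inputs."""
    return sum(inputs) % 2
```

The paper's question is which of these primitives can simulate which, at what depth and size — e.g. result 2 says a weighted gate can always be traded for a depth-2 unweighted circuit of polynomial size.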
Private vs. common random bits in communication complexity
 Information Processing Letters
, 1991
"... ..."
Measurement-Based Connection Admission Control
, 1997
"... ... In this paper we continue the development of a modelling approach which attempts to integrate these several timescales, and illustrate its application to the analysis of a family of simple and robust measurementbased admission controls. A subsidiary aim of the paper is to shed light on the rel ..."
Abstract

Cited by 81 (2 self)
 Add to MetaCart
... In this paper we continue the development of a modelling approach which attempts to integrate these several timescales, and illustrate its application to the analysis of a family of simple and robust measurement-based admission controls. A subsidiary aim of the paper is to shed light on the relationship between the admission control proposed for ATM networks by Gibbens et al [9] and that proposed for controlled-load Internet services by Floyd [7]. We shall see that their common origin in Chernoff bounds allows the definition of a simple and general family of admission controls, capable of being tailored to several implementation scenarios.
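A minimal sketch of a Chernoff-bound admission test of the kind the abstract alludes to: estimate the moment generating function of the aggregate load from measurements, and admit a new flow only if the resulting overflow bound stays below a target. The tilt parameter, the target, and the peak-rate adjustment are illustrative assumptions, not the exact procedure of either cited scheme.

```python
import math

def chernoff_overflow_estimate(samples, capacity, s):
    """Chernoff upper bound on P(load > capacity) at tilt parameter s > 0,
    with the moment generating function estimated from measured
    aggregate-load samples: E[exp(s*X)] * exp(-s*capacity)."""
    mgf = sum(math.exp(s * x) for x in samples) / len(samples)
    return math.exp(-s * capacity) * mgf

def admit(samples, peak_rate, capacity, target=1e-3, s=1.0):
    """Admit a new flow of the given peak rate if, after adding its peak
    to every measured load sample, the Chernoff estimate still meets the
    target overflow probability."""
    shifted = [x + peak_rate for x in samples]
    return chernoff_overflow_estimate(shifted, capacity, s) <= target
```

A production controller would also optimize over s and manage the measurement window; the point here is only the shape of the test.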
Comments on Measurement-based Admissions Control for Controlled-Load Services
, 1996
"... This paper considers measurementbased admissions control procedures for ControlledLoad services [Wro96]. First, we discuss some of the factors that reduce the requirements on a measurementbased admissions control procedure in an Internet environment with a strong mix of realtime and data traffic. ..."
Abstract

Cited by 69 (0 self)
 Add to MetaCart
This paper considers measurement-based admissions control procedures for Controlled-Load services [Wro96]. First, we discuss some of the factors that reduce the requirements on a measurement-based admissions control procedure in an Internet environment with a strong mix of real-time and data traffic. Next, we discuss measurement-based admissions control procedures for Controlled-Load services that are based on the approach of equivalent capacity. Finally, we show some measurements of audio traffic for small audio conferences, and discuss why such traffic could be among the most problematic for measurement-based admissions control procedures. 1 Introduction This paper makes several observations about measurement-based admissions control procedures for Controlled-Load services [Wro96]. We begin with a general discussion of the factors that would allow a measurement-based admissions control procedure to succeed even in the presence of traffic that is difficult to predict. Next, we discuss ...
Probability Metrics and Recursive Algorithms
"... In this paper it is shown by several examples that probability metrics are a useful tool to study the asymptotic behaviour of (stochastic) recursive algorithms. The basic idea of this approach is to find a `suitable ' probability metric which yields contraction properties of the transformation ..."
Abstract

Cited by 48 (9 self)
 Add to MetaCart
In this paper it is shown by several examples that probability metrics are a useful tool to study the asymptotic behaviour of (stochastic) recursive algorithms. The basic idea of this approach is to find a 'suitable' probability metric which yields contraction properties of the transformations describing the limits of the algorithm. In order to demonstrate the wide range of applicability of this contraction method we investigate examples from various fields, some of which have already been analyzed in the literature.
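The contraction idea can be illustrated on a toy recursion; the recursion and the coupling below are our own example, not one from the paper. If the map driving the recursion is Lipschitz with constant less than 1 in its state argument, two copies run with common random input approach each other geometrically, which is the mechanism a contraction argument in a probability metric exploits.

```python
import random

def step(x, u):
    # One step of a toy stochastic recursion X_{n+1} = (X_n + U_n) / 2,
    # Lipschitz in x with constant 1/2.
    return (x + u) / 2.0

def coupled_distance(x0, y0, n, seed=0):
    """Run two copies of the recursion from different starts but with the
    SAME random inputs (a coupling); the distance between them contracts
    by exactly the Lipschitz factor 1/2 at every step."""
    rng = random.Random(seed)
    x, y = x0, y0
    for _ in range(n):
        u = rng.random()
        x, y = step(x, u), step(y, u)
    return abs(x - y)
```

Here the coupling gives |X_n - Y_n| = |X_0 - Y_0| / 2^n regardless of the random inputs, so the distributions converge to a common limit in the minimal L1 (Wasserstein) metric.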
An adaptive sampling algorithm for solving Markov decision processes
 Operations Research
, 2005
"... Based on recent results for multiarmed bandit problems, we propose an adaptive sampling algorithm that approximates the optimal value of a finite horizon Markov decision process (MDP) with infinite state space but finite action space and bounded rewards. The algorithm adaptively chooses which actio ..."
Abstract

Cited by 24 (6 self)
 Add to MetaCart
Based on recent results for multi-armed bandit problems, we propose an adaptive sampling algorithm that approximates the optimal value of a finite horizon Markov decision process (MDP) with infinite state space but finite action space and bounded rewards. The algorithm adaptively chooses which action to sample as the sampling process proceeds, and it is proven that the estimate produced by the algorithm is asymptotically unbiased and the worst possible bias is bounded by a quantity that converges to zero at rate O(H ln N / N), where H is the horizon length and N is the total number of samples that are used per state sampled in each stage. The worst-case running-time complexity of the algorithm is O((|A|N)^H), independent of the state space size, where |A| is the size of the action space. The algorithm can be used to create an approximate receding horizon control to solve infinite horizon MDPs.
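A one-state, single-stage sketch of the UCB-style adaptive action sampling the abstract builds on: try each action once, then repeatedly sample the action with the highest empirical mean plus an exploration bonus. The constant c and the reward interface are illustrative; the paper applies this machinery recursively over the horizon.

```python
import math

def ucb_sample_actions(reward_fn, num_actions, total_samples, c=1.0):
    """Adaptively allocate total_samples reward samples over the actions,
    UCB-style: sample the action maximizing
    empirical mean + c * sqrt(ln t / n_a).
    Returns the per-action empirical means."""
    counts = [0] * num_actions
    sums = [0.0] * num_actions
    for t in range(total_samples):
        if t < num_actions:
            a = t  # initialization: sample each action once
        else:
            a = max(range(num_actions),
                    key=lambda i: sums[i] / counts[i]
                    + c * math.sqrt(math.log(t) / counts[i]))
        sums[a] += reward_fn(a)
        counts[a] += 1
    return [sums[i] / counts[i] for i in range(num_actions)]
```

The bonus term shrinks for frequently sampled actions, so sampling effort concentrates on apparently good actions while every action keeps being revisited — the source of the adaptive, asymptotically unbiased estimate.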
Designing overlay multicast networks for streaming
 In Proceedings of ACM Symposium on Parallel Algorithms and Architectures
, 2003
"... In this paper we present a polynomial time approximation algorithm for designing a multicast overlay network. The algorithm finds a solution that satisfies capacity and reliability constraints to within a constant factor of optimal, and cost to within a logarithmic factor. The class of networks that ..."
Abstract

Cited by 18 (4 self)
 Add to MetaCart
In this paper we present a polynomial time approximation algorithm for designing a multicast overlay network. The algorithm finds a solution that satisfies capacity and reliability constraints to within a constant factor of optimal, and cost to within a logarithmic factor. The class of networks that our algorithm applies to includes the one used by Akamai Technologies to deliver live media streams over the Internet. In particular, we analyze networks consisting of three stages of nodes. The nodes in the first stage are the sources where live streams originate. A source forwards each of its streams to one or more nodes in the second stage, which are called reflectors. A reflector can split an incoming stream into multiple identical outgoing streams, which are then sent on to nodes in the third and final stage, which are called the sinks. As the packets in a stream travel from one stage to the next, some of them may be lost. The job of a sink is to combine the packets from multiple instances of the same stream (by reordering packets and discarding duplicates) to form a single instance of the stream with minimal loss. We assume that the loss rate between any pair of nodes in the network is known, and that losses between different pairs are independent, but discuss extensions in which some losses may be correlated.
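The independent-loss assumption in the last sentence makes a sink's residual loss rate straightforward to compute: a packet is missing at the sink only if it is lost on every source-reflector-sink path carrying an instance of the stream. A sketch (function names are ours):

```python
def path_delivery(p_source_reflector, p_reflector_sink):
    """Probability that a packet survives both hops of one
    source -> reflector -> sink path, with independent hop losses."""
    return (1 - p_source_reflector) * (1 - p_reflector_sink)

def sink_loss_rate(paths):
    """Residual loss rate at a sink that merges all instances of a stream.
    `paths` is a list of (source->reflector loss, reflector->sink loss)
    pairs, one per reflector forwarding the stream; losses are assumed
    independent across paths, as in the paper's base model."""
    miss = 1.0
    for p_sr, p_rs in paths:
        miss *= 1 - path_delivery(p_sr, p_rs)
    return miss
```

This is why adding reflectors drives the sink's loss down multiplicatively, and why the reliability constraint in the design problem trades off against reflector capacity and cost.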