Results 21–30 of 2,623
A new approach for allocating buffers and bandwidth to heterogeneous, regulated traffic in an ATM node
 IEEE Journal on Selected Areas in Communications
, 1995
Cited by 150 (9 self)
A new approach to determining the admissibility of variable bit rate (VBR) traffic in buffered digital networks is developed. In this approach all traffic presented to the network is assumed to have been subjected to leaky-bucket regulation, and extremal, periodic, on-off regulated traffic is considered; the analysis is based on fluid models. Each regulated traffic stream is allocated bandwidth and buffer resources which are independent of other traffic. Bandwidth and buffer allocations are traded off in a manner optimal for an adversarial situation involving minimal knowledge of other traffic. This leads to a single-resource statistical-multiplexing problem which is solved using techniques previously used for unbuffered traffic. VBR traffic is found to be divisible into two classes, one for which statistical multiplexing is effective and one for which statistical multiplexing is ineffective in the sense that accepting small losses provides no advantage over requiring lossless performance. The boundary of the set of admissible traffic sources is examined, and is found to be sufficiently linear that an effective bandwidth can be meaningfully assigned to each VBR source, so long as only statistically-multiplexable sources are considered, or only non-statistically-multiplexable sources are considered. If these two types of sources are intermixed, then nonlinear interactions occur and fewer sources can be admitted than a linear theory would predict. A qualitative characterization of the nonlinearities is presented. The complete analysis involves conservative approximations; however, admission decisions based on this work are expected to be less overly conservative than decisions based on alternative approaches.
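The leaky-bucket regulation this abstract assumes can be sketched as a (sigma, rho) token bucket — bucket depth bounds the burst, token rate bounds the long-run rate. A minimal illustration, with class and parameter names chosen here rather than taken from the paper:

```python
class LeakyBucket:
    """(sigma, rho) leaky-bucket regulator: sigma = bucket depth
    (maximum burst), rho = token refill rate (sustained rate)."""

    def __init__(self, sigma, rho):
        self.sigma = sigma
        self.rho = rho
        self.tokens = sigma  # bucket starts full
        self.last = 0.0      # time of the previous arrival

    def conforms(self, t, size):
        """Return True if a packet of `size` arriving at time t conforms;
        non-conforming packets leave the bucket state unchanged."""
        # Refill tokens at rate rho since the last arrival, capped at sigma.
        self.tokens = min(self.sigma, self.tokens + self.rho * (t - self.last))
        self.last = t
        if size <= self.tokens:
            self.tokens -= size
            return True
        return False
```

A stream that always conforms to such a regulator is exactly the kind of traffic for which the paper's worst case — extremal, periodic, on-off sources — is analyzed.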
Fitting Mixtures Of Exponentials To Long-Tail Distributions To Analyze Network Performance Models
, 1997
Cited by 144 (13 self)
Traffic measurements from communication networks have shown that many quantities characterizing network performance have long-tail probability distributions, i.e., with tails that decay more slowly than exponentially. File lengths, call holding times, scene lengths in MPEG video streams, and intervals between connection requests in Internet traffic all have been found to have long-tail distributions, being well described by distributions such as the Pareto and Weibull. It is known that long-tail distributions can have a dramatic effect upon performance, e.g., long-tail service-time distributions cause long-tail waiting-time distributions in queues, but it is often difficult to describe this effect in detail, because performance models with component long-tail distributions tend to be difficult to analyze. We address this problem by developing an algorithm for approximating a long-tail distribution by a hyperexponential distribution (a finite mixture of exponentials). We first prove tha...
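The approximating object itself is simple to write down: a hyperexponential tail probability is a weighted sum of exponential tails, and the component with the smallest rate dominates far out in the tail. A small sketch — the weights and rates below are hand-picked for illustration, not produced by the paper's fitting algorithm:

```python
import math

def hyperexp_ccdf(t, probs, rates):
    """P(X > t) for a hyperexponential distribution, i.e. a finite
    mixture of exponentials with mixing weights `probs` and rates `rates`."""
    return sum(p * math.exp(-r * t) for p, r in zip(probs, rates))

# Illustrative three-phase mixture: a slow component with small weight
# mimics a heavy tail over a wide (but finite) range of t.
probs = [0.9, 0.09, 0.01]
rates = [10.0, 1.0, 0.1]
```

For large t the term `0.01 * exp(-0.1 t)` dominates, which is how a mixture of exponentials can track a slowly decaying tail over many orders of magnitude even though each component decays exponentially.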
Longest increasing subsequences: from patience sorting to the Baik-Deift-Johansson theorem
 Bull. Amer. Math. Soc. (N.S.)
, 1999
Cited by 137 (2 self)
Abstract. We describe a simple one-person card game, patience sorting. Its analysis leads to a broad circle of ideas linking Young tableaux with the longest increasing subsequence of a random permutation via the Schensted correspondence. A recent highlight of this area is the work of Baik-Deift-Johansson which yields limiting probability laws via hard analysis of Toeplitz determinants.
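Patience sorting is also the standard O(n log n) way to compute the length of a longest increasing subsequence: deal each value onto the leftmost pile whose top is at least that value, starting a new pile when none qualifies; the final number of piles equals the LIS length. A minimal sketch:

```python
import bisect

def lis_length(seq):
    """Length of a longest (strictly) increasing subsequence of `seq`,
    computed via patience sorting."""
    tops = []  # top card of each pile; always kept in sorted order
    for x in seq:
        i = bisect.bisect_left(tops, x)  # leftmost pile with top >= x
        if i == len(tops):
            tops.append(x)  # x starts a new pile
        else:
            tops[i] = x     # x replaces that pile's top
    return len(tops)
```

Using `bisect_left` gives strictly increasing subsequences; `bisect_right` would allow ties (non-decreasing).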
Linear Regression Limit Theory for Nonstationary Panel Data
 Econometrica
, 1999
Cited by 137 (13 self)
This paper develops a regression limit theory for nonstationary panel data with large numbers of cross section (n) and time series (T) observations. The limit theory allows for both sequential limits, wherein T → ∞ followed by n → ∞, and joint limits where T, n → ∞ simultaneously; and the relationship between these multidimensional limits is explored. The panel structures considered allow for no time series cointegration, heterogeneous cointegration, homogeneous cointegration, and near-homogeneous cointegration. The paper explores the existence of long-run average relations between integrated panel vectors when there is no individual time series cointegration and when there is heterogeneous cointegration. These relations are parameterized in terms of the matrix regression coefficient of the long-run average covariance matrix. In the case of homogeneous and near-homogeneous cointegrating panels, a panel fully modified regression estimator is developed and studied. The limit theory enables us to test hypotheses about the long-run average parameters both within and between subgroups of the full population.
Capacity and Optimal Resource Allocation for Fading Broadcast Channels: Part I: Ergodic Capacity
Single Crossing Properties And The Existence Of Pure Strategy Equilibria In Games Of Incomplete Information
 Econometrica
, 1997
Cited by 127 (6 self)
This paper analyzes a class of games of incomplete information where each agent has
The Variational Formulation of the Fokker-Planck Equation
 SIAM J. Math. Anal
, 1999
Cited by 127 (18 self)
The Fokker-Planck equation, or forward Kolmogorov equation, describes the evolution of the probability density for a stochastic process associated with an Itô stochastic differential equation. It pertains to a wide variety of time-dependent systems in which randomness plays a role. In this paper, we are concerned with Fokker-Planck equations for which the drift term is given by the gradient of a potential. For a broad class of potentials, we construct a time-discrete, iterative variational scheme whose solutions converge to the solution of the Fokker-Planck equation. The major novelty of this iterative scheme is that the time step is governed by the Wasserstein metric on probability measures. This formulation enables us to reveal an appealing, and previously unexplored, relationship between the Fokker-Planck equation and the associated free energy functional. Namely, we demonstrate that the dynamics may be regarded as a gradient flux, or a steepest descent, for the free energy wi...
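The time-discrete variational scheme the abstract describes (now commonly called the JKO scheme) is usually stated as follows; the notation here (τ for the time step, Ψ for the potential, β⁻¹ for the diffusion coefficient) follows the standard presentation of the result rather than being quoted from the paper:

```latex
% Fokker-Planck equation with gradient drift:
\frac{\partial\rho}{\partial t}
  = \operatorname{div}\!\left(\rho\,\nabla\Psi\right) + \beta^{-1}\Delta\rho

% One step of the variational scheme: given \rho^{(k)} and time step \tau,
\rho^{(k+1)} \in \arg\min_{\rho}\;
  \frac{1}{2\tau}\, W_2\!\left(\rho^{(k)},\rho\right)^{2} + F(\rho),
\qquad
F(\rho) = \int \Psi\,\rho\,dx + \beta^{-1}\!\int \rho\log\rho\,dx
```

Here W₂ is the Wasserstein metric on probability measures mentioned in the abstract, and F is the free energy (potential energy plus entropy) whose steepest descent the dynamics realizes.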
Some Impossibility Theorems In Econometrics With Applications To Instrumental Variables, Dynamic Models And Cointegration
 Econometrica
, 1995
Cited by 124 (16 self)
General characterizations of valid confidence sets and tests in problems which involve locally almost unidentified (LAU) parameters are provided and applied to several econometric models. Two types of inference problems are studied: (1) inference about parameters which are not identifiable on certain subsets of the parameter space, and (2) inference about parameter transformations with singularities (discontinuities). When a LAU parameter or parametric function has an unbounded range, it is shown under general regularity conditions that any valid confidence set with level 1 − α for this parameter should be unbounded with probability close to 1 − α in the neighborhood of non-identification subsets and should as well have a nonzero probability of being unbounded under any distribution compatible with the model: no valid confidence set that is bounded with probability one exists. These properties hold even if "identifying restrictions" are imposed. Similar results also ob...
A Racing Algorithm for Configuring Metaheuristics
, 2002
Cited by 117 (34 self)
This paper describes a racing procedure for finding, in a limited amount of time, a configuration of a metaheuristic that performs as well as possible on a given instance class of a combinatorial optimization problem. Taking inspiration from methods proposed in the machine learning literature for model selection through cross-validation, we propose a procedure that empirically evaluates a set of candidate configurations by discarding bad ones as soon as statistically sufficient evidence is gathered against them. We empirically evaluate our procedure using as an example the configuration of an ant colony optimization algorithm applied to the traveling salesman problem.
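The racing idea can be sketched in a few lines. This is a simplified illustration, not the authors' exact procedure (the published method eliminates candidates with rank-based statistical tests; here a simple mean-difference rule stands in for the statistical test):

```python
import statistics

def race(candidates, instances, evaluate, min_rounds=5, z=2.0):
    """Racing sketch: evaluate every surviving candidate on each instance
    in turn, and discard a candidate once its mean cost exceeds the current
    best candidate's by more than z standard errors of the paired
    per-instance differences. `evaluate(cand, inst)` returns a cost
    (lower is better) and is supplied by the caller."""
    alive = list(candidates)
    costs = {c: [] for c in alive}
    for k, inst in enumerate(instances, start=1):
        for c in alive:
            costs[c].append(evaluate(c, inst))
        if k < min_rounds or len(alive) < 2:
            continue  # gather some evidence before eliminating anyone
        best = min(alive, key=lambda c: statistics.mean(costs[c]))
        survivors = [best]
        for c in alive:
            if c == best:
                continue
            diffs = [a - b for a, b in zip(costs[c], costs[best])]
            mean = statistics.mean(diffs)
            se = statistics.stdev(diffs) / len(diffs) ** 0.5
            if not (se > 0 and mean > z * se):  # not clearly worse: keep
                survivors.append(c)
        alive = survivors
    return min(alive, key=lambda c: statistics.mean(costs[c]))
```

The budget is spent where it matters: weak configurations drop out early, so the surviving ones accumulate evaluations on many more instances.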