Results 1–10 of 192
Benchmarks for Basic Scheduling Problems
, 1989
Abstract
Cited by 152 (0 self)
In this paper, we propose 260 scheduling problems whose size is greater than that of the rare examples published. Such sizes correspond to real dimensions of industrial problems.
Random number generation
Abstract
Cited by 136 (30 self)
Random numbers are the nuts and bolts of simulation. Typically, all the randomness required by the model is simulated by a random number generator whose output is assumed to be a sequence of independent and identically distributed (IID) U(0, 1) random variables (i.e., continuous random variables distributed uniformly over the interval
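The role of the U(0,1) stream can be illustrated with a minimal sketch, here using Python's standard-library Mersenne Twister to stand in for the generator; the exponential-interarrival example is illustrative, not from the paper:

```python
import math
import random

def iid_uniform_stream(n, seed=12345):
    """Draw n (approximately) IID U(0,1) variates from Python's
    Mersenne Twister; fixing the seed makes the stream reproducible."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(n)]

def exponential_by_inversion(u, lam):
    """Turn one U(0,1) draw into an Exp(lam) variate by inverting the
    CDF, the usual way a simulation consumes its uniform stream."""
    return -math.log(1.0 - u) / lam

# E.g. five exponential interarrival times with rate 2:
us = iid_uniform_stream(5)
interarrivals = [exponential_by_inversion(u, lam=2.0) for u in us]
```

All of the model's randomness is derived from the one uniform stream, which is why the statistical quality of the generator matters for every downstream quantity.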
Admission Control for Statistical QoS: Theory and Practice
, 1999
Abstract
Cited by 106 (12 self)
In networks that support Quality of Service (QoS), an admission control algorithm determines whether or not a new traffic flow can be admitted to the network such that all users will receive their required performance. Such an algorithm is a key component of future multiservice networks, as it determines the extent to which network resources are utilized and whether the promised QoS parameters are actually delivered. Our goals in this paper are threefold. First, we describe and classify a broad set of proposed admission control algorithms. Second, we evaluate the accuracy of these algorithms via experiments using both on-off sources and long traces of compressed video; we compare the admissible regions and QoS parameters predicted by our implementations of the algorithms with those obtained from trace-driven simulations. Finally, we identify the key aspects of an admission control algorithm necessary for achieving a high degree of accuracy and hence a high statistical multiplexing gain...
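The admissible-region idea can be sketched with a deliberately simple declaration-based test (the function names and flow descriptors are illustrative, not from the paper): allocating by declared peak rate never violates QoS but wastes capacity, while allocating by mean rate maximizes multiplexing gain at the risk of loss.

```python
def admissible(flows, capacity, mode="peak"):
    """Return True if the flow set fits the link under a given rule.
    'peak' sums declared peak rates (conservative, loss-free);
    'mean' sums declared mean rates (aggressive, maximal multiplexing)."""
    key = "peak" if mode == "peak" else "mean"
    return sum(f[key] for f in flows) <= capacity

def admit(flows, new_flow, capacity, mode="peak"):
    """Admission test: accept the new flow only if the enlarged set
    still lies inside the admissible region."""
    return admissible(flows + [new_flow], capacity, mode)

# Ten on-off sources (peak 1.0 Mbps, mean 0.3 Mbps) on a 5 Mbps link:
flows = [{"peak": 1.0, "mean": 0.3} for _ in range(10)]
```

Real algorithms of the kind the paper surveys sit between these two extremes, using measured or modeled statistics to bound loss and delay rather than raw rate sums.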
Designing and reporting on computational experiments with heuristic methods
 Journal of Heuristics
, 1995
Abstract
Cited by 105 (1 self)
This report discusses the design of computational experiments to test heuristic methods and provides reporting guidelines for such experimentation. The goal is to promote thoughtful, well-planned, and extensive testing of heuristics, full disclosure of experimental conditions, and integrity in and reproducibility of the reported results.
Regeneration in Markov Chain Samplers
, 1994
Abstract
Cited by 87 (5 self)
Markov chain sampling has received considerable attention in the recent literature, in particular in the context of Bayesian computation and maximum likelihood estimation. This paper discusses the use of Markov chain splitting, originally developed as a tool for the theoretical analysis of general state space Markov chains, to introduce regeneration times into Markov chain samplers. This allows the use of regenerative methods for analyzing the output of these samplers, and can also provide a useful diagnostic of the performance of the samplers. The general approach is applied to several different samplers and is illustrated in a number of examples.
1 Introduction
In Markov chain Monte Carlo, a distribution π is examined by obtaining sample paths from a Markov chain constructed to have equilibrium distribution π. This approach was introduced by Metropolis et al. (1953) and has recently received considerable attention as a method for examining posterior distributions in Bayesian infer...
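The kind of sampler the paper analyzes can be sketched minimally: a random-walk Metropolis chain whose equilibrium distribution is the target (this is a generic textbook sampler, not the paper's splitting construction; names and tuning values are illustrative).

```python
import math
import random

def metropolis(logpi, x0, step, n, seed=1):
    """Random-walk Metropolis: propose x' = x + N(0, step), accept with
    probability min(1, pi(x')/pi(x)).  The chain's equilibrium
    distribution is pi (known only up to a constant via logpi)."""
    rng = random.Random(seed)
    x, lp = x0, logpi(x0)
    path = []
    for _ in range(n):
        xp = x + rng.gauss(0.0, step)
        lpp = logpi(xp)
        # Accept/reject on the log scale to avoid under/overflow.
        if lpp - lp >= 0 or rng.random() < math.exp(lpp - lp):
            x, lp = xp, lpp
        path.append(x)
    return path

# Target: standard normal, via its log-density up to a constant.
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, step=1.0, n=20000)
```

Regeneration times, the paper's subject, would split such a path into IID tours, letting classical regenerative output analysis replace the usual correlated-output diagnostics.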
NIST Net: A Linux-based Network Emulation Tool
 Computer Communication Review
, 2003
Abstract
Cited by 85 (0 self)
Testing of network protocols and distributed applications has become increasingly complex as the diversity of networks and underlying technologies increases and the adaptive behavior of applications becomes more sophisticated. In this paper, we present NIST Net, a tool to facilitate testing and experimentation with network code through emulation. NIST Net enables experimenters to model and effect arbitrary performance dynamics (packet delay, jitter, bandwidth limitations, congestion, packet loss and duplication) on live IP packets passing through a commodity Linux-based PC router. We describe the emulation capabilities of NIST Net; examine its architecture; and discuss some of the implementation challenges encountered in building such a tool to operate at very high network data rates while imposing minimal processing overhead. Calibration results are provided to quantify the fidelity and performance of NIST Net over a wide range of offered loads (up to 1 Gbps) and a diverse set of emulated performance dynamics.
Good Parameters And Implementations For Combined Multiple Recursive Random Number Generators
, 1998
Abstract
Cited by 78 (18 self)
this paper is to provide good CMRGs of different sizes, selected via the spectral test up to 32 (or 24) dimensions, and a faster implementation than in L'Ecuyer (1996) using floating-point arithmetic. Why do we need different parameter sets? Firstly, different types of implementations require different constraints on the modulus and multipliers. For example, a floating-point implementation with 53 bits of precision allows moduli of more than 31 bits, and this can be exploited to increase the period length for free. Secondly, as 64-bit computers become more widespread, there is demand for generators implemented in 64-bit integer arithmetic. Tables of good parameters for such generators must be made available. Thirdly, RNGs are somewhat like cars: a single model and single size for the entire world is not the most satisfactory solution. Some people want a fast and relatively small RNG, while others prefer a bigger and more robust one, with longer period and good equidistribution properties in larger dimensions. Naively, one could think that an RNG with period length near 2
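As a concrete instance of a CMRG, the sketch below implements the widely used MRG32k3a recurrences (two order-3 linear recursions combined modulo the larger modulus). The constants are the published L'Ecuyer values, but this pure-Python integer version is illustrative only, not the paper's optimized floating-point implementation:

```python
M1 = 4294967087  # 2**32 - 209
M2 = 4294944443  # 2**32 - 22853

class MRG32k3a:
    """Combined multiple recursive generator: two order-3 recurrences
    modulo M1 and M2, combined into one U(0,1) output per step."""

    def __init__(self, seed=12345):
        # Each component's three state words must not all be zero.
        self.s1 = [seed, seed, seed]
        self.s2 = [seed, seed, seed]

    def next(self):
        # x1[n] = (1403580*x1[n-2] - 810728*x1[n-3]) mod M1
        p1 = (1403580 * self.s1[1] - 810728 * self.s1[0]) % M1
        self.s1 = [self.s1[1], self.s1[2], p1]
        # x2[n] = (527612*x2[n-1] - 1370589*x2[n-3]) mod M2
        p2 = (527612 * self.s2[2] - 1370589 * self.s2[0]) % M2
        self.s2 = [self.s2[1], self.s2[2], p2]
        # Combine the components and map into (0,1).
        z = (p1 - p2) % M1
        return z / (M1 + 1) if z > 0 else M1 / (M1 + 1)

rng = MRG32k3a()
u = rng.next()
```

Combining two recurrences with distinct moduli gives a period near 2^191 and better equidistribution than either component alone, which is precisely the design trade-off the abstract discusses.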
Statistical Reconstruction And Analysis Of Autoregressive Signals In Impulsive Noise
, 1998
Abstract
Cited by 48 (16 self)
Modelling and reconstruction methods are presented for noise reduction of autocorrelated signals in non-Gaussian, impulsive noise environments. A Bayesian probabilistic framework is adopted, and Markov chain Monte Carlo methods are developed for detection and correction of impulses. Individual noise sources are modelled as Gaussian with unknown scale (variance), allowing for robustness to "heavy-tailed" impulse distributions, while the underlying signal is modelled as autoregressive (AR). Results are presented for both artificial and real data from voice and music recordings, and comparisons are made with existing techniques. The new techniques are found to give improved detection and elimination of impulses in adverse noise conditions at the expense of some extra computational complexity.
The Asymptotic Efficiency Of Simulation Estimators
 Operations Research
, 1992
Abstract
Cited by 43 (14 self)
A decision-theoretic framework is proposed for evaluating the efficiency of simulation estimators. The framework includes the cost of obtaining the estimate as well as the cost of acting based on the estimate. The cost of obtaining the estimate and the estimate itself are represented as realizations of jointly distributed stochastic processes. In this context, the efficiency of a simulation estimator based on a given computational budget is defined as the reciprocal of the risk (the overall expected cost). This framework is appealing philosophically, but it is often difficult to apply in practice (e.g., to compare the efficiency of two different estimators) because only rarely can the efficiency associated with a given computational budget be calculated. However, a useful practical framework emerges in a large sample context when we consider the limiting behavior as the computational budget increases. A limit theorem established for this model supports and extends a fairly well known e...
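For IID replications, the large-sample criterion this line of work leads to is work-normalized: an estimator with variance σ² and expected cost t̄ per replication has efficiency proportional to 1/(σ²·t̄). A hedged sketch comparing crude and antithetic estimators of E[e^U], U ~ U(0,1) (the unit-cost model here is an assumption for illustration):

```python
import math
import random
import statistics

def efficiency(samples, cost_per_sample):
    """Work-normalized efficiency: 1 / (sample variance * cost per
    sample).  Higher is better for a fixed computational budget."""
    return 1.0 / (statistics.variance(samples) * cost_per_sample)

rng = random.Random(7)
n = 10000
# Crude Monte Carlo: one exp() per uniform; call its cost 1 unit.
crude = [math.exp(rng.random()) for _ in range(n)]
# Antithetic pairs: two exp() per uniform, so cost 2 units, but the
# strong negative correlation shrinks variance far more than cost grows.
anti = [(math.exp(v) + math.exp(1.0 - v)) / 2.0
        for v in (rng.random() for _ in range(n))]

eff_crude = efficiency(crude, cost_per_sample=1.0)
eff_anti = efficiency(anti, cost_per_sample=2.0)
```

Dividing by cost is what makes the comparison fair: a variance-reduction scheme only wins if its variance falls faster than its per-sample work rises.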
Push-to-peer video-on-demand system: Design and evaluation
 In UMass Computer Science Technical Report 2006–59
, 2006
"... Number: CRPRL2006110001 ..."