Results 1–10 of 32
Non-Uniform Random Variate Generation
, 1986
Abstract

Cited by 716 (21 self)
This is a survey of the main methods in non-uniform random variate generation, highlighting recent research on the subject. Classical paradigms such as inversion, rejection, guide tables, and transformations are reviewed. We provide information on the expected time complexity of various algorithms, before addressing modern topics such as indirectly specified distributions, random processes, and Markov chain methods.
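Two of the classical paradigms named in this abstract can be sketched in a few lines of stdlib Python. This is a minimal illustration under my own choice of target distributions, not code from the survey: inversion via the exponential quantile function, and rejection sampling of the half-normal using Exponential(1) proposals (a textbook pairing; the acceptance probability below is exp(-(x-1)^2/2)).

```python
import math
import random

def exponential_by_inversion(rate, u=None):
    """Inversion: if U ~ Uniform(0,1), then -ln(1-U)/rate ~ Exponential(rate)."""
    if u is None:
        u = random.random()
    return -math.log1p(-u) / rate

def halfnormal_by_rejection():
    """Rejection: sample |Z| for Z ~ N(0,1) using Exponential(1) proposals.
    The ratio of target to (scaled) proposal density is exp(-(x-1)^2/2)."""
    while True:
        x = -math.log(random.random())      # Exponential(1) proposal
        if random.random() <= math.exp(-0.5 * (x - 1.0) ** 2):
            return x                        # accept
```

For a fixed uniform, inversion is deterministic: `exponential_by_inversion(2.0, 0.5)` returns ln(2)/2.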
Recent Advances In Randomized Quasi-Monte Carlo Methods
Abstract

Cited by 60 (13 self)
We survey some of the recent developments on quasi-Monte Carlo (QMC) methods, which, in their basic form, are a deterministic counterpart to the Monte Carlo (MC) method. Our main focus is the applicability of these methods to practical problems that involve the estimation of a high-dimensional integral. We review several QMC constructions and different randomizations that have been proposed to provide unbiased estimators and for error estimation. Randomizing QMC methods allows us to view them as variance reduction techniques. New and old results on this topic are used to explain how these methods can improve over the MC method in practice. We also discuss how this methodology can be coupled with clever transformations of the integrand in order to reduce the variance further. Additional topics included in this survey are the description of figures of merit used to measure the quality of the constructions underlying these methods, and other related techniques for multidimensional integration.
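The randomization idea can be made concrete with a randomly shifted rank-1 lattice rule. This is a generic sketch with hypothetical parameters (the point count 257 and generating vector (1, 110) are my own, not from the survey): each independent shift gives an unbiased estimate of the integral, and their spread gives an error estimate.

```python
import random

def rqmc_estimate(f, n, gens, m, rng):
    """Randomized QMC: apply m independent uniform shifts (mod 1) to a rank-1
    lattice rule {(i * gens mod n) / n}.  Returns the mean of the m unbiased
    estimates and the empirical variance of that mean."""
    estimates = []
    for _ in range(m):
        shift = [rng.random() for _ in gens]
        total = 0.0
        for i in range(n):
            x = [((i * g % n) / n + s) % 1.0 for g, s in zip(gens, shift)]
            total += f(x)
        estimates.append(total / n)
    mean = sum(estimates) / m
    var = sum((e - mean) ** 2 for e in estimates) / (m * (m - 1))
    return mean, var

# Example: integrate f(x, y) = x * y over [0,1]^2 (exact value 1/4).
est, err2 = rqmc_estimate(lambda x: x[0] * x[1], 257, (1, 110), 10, random.Random(42))
```

Treating the m shifted estimates as i.i.d. replicates is exactly what lets randomized QMC be analyzed as a variance reduction technique.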
Fast Numerical Methods for Stochastic Computations: A Review
, 2009
Abstract

Cited by 16 (1 self)
This paper presents a review of the current state-of-the-art of numerical methods for stochastic computations. The focus is on efficient high-order methods suitable for practical applications, with a particular emphasis on those based on generalized polynomial chaos (gPC) methodology. The framework of gPC is reviewed, along with its Galerkin and collocation approaches for solving stochastic equations. Properties of these methods are summarized using results from the literature. This paper also attempts to present the gPC-based methods in a unified framework based on an extension of the classical spectral methods into multidimensional random spaces.
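The core idea of gPC, representing a random quantity as a series in polynomials orthogonal with respect to the input distribution, can be illustrated with the classical Hermite-chaos expansion of a lognormal. This is a textbook example, not taken from the paper: for Z ~ N(0,1), exp(Z) = e^(1/2) * sum_k He_k(Z)/k!, and moments are read directly off the coefficients.

```python
import math

def hermite_He(k, z):
    """Probabilists' Hermite polynomials: He_0 = 1, He_1 = z,
    He_{j+1}(z) = z*He_j(z) - j*He_{j-1}(z)."""
    h0, h1 = 1.0, z
    if k == 0:
        return h0
    for j in range(1, k):
        h0, h1 = h1, z * h1 - j * h0
    return h1

# Known Hermite-chaos expansion of Y = exp(Z), Z ~ N(0,1):
#   exp(Z) = e^{1/2} * sum_{k>=0} He_k(Z) / k!,  i.e. coefficients c_k = e^{1/2} / k!.
def gpc_eval(z, K):
    """Evaluate the gPC series for exp(z), truncated at order K."""
    return sum(math.exp(0.5) / math.factorial(k) * hermite_He(k, z)
               for k in range(K + 1))

# Moments come straight from the coefficients of the orthogonal expansion:
#   E[Y] = c_0,   Var[Y] = sum_{k>=1} c_k^2 * k!
mean = math.exp(0.5)
var = sum((math.exp(0.5) / math.factorial(k)) ** 2 * math.factorial(k)
          for k in range(1, 20))
```

Both statistics recover the lognormal moments, mean e^(1/2) and variance e(e-1), without any sampling; this is what makes gPC representations attractive for stochastic computations.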
On rates of convergence for stochastic optimization problems under non-I.I.D. sampling
, 2006
Abstract

Cited by 12 (1 self)
In this paper we discuss the issue of solving stochastic optimization problems by means of sample average approximations. Our focus is on rates of convergence of estimators of optimal solutions and optimal values with respect to the sample size. This is a well-studied problem in the case where the samples are independent and identically distributed (i.e., when standard Monte Carlo is used); here, we study the case where that assumption is dropped. Broadly speaking, our results show that, under appropriate assumptions, the rates of convergence for pointwise estimators under a sampling scheme carry over to the optimization case, in the sense that convergence of approximating optimal solutions and optimal values to their true counterparts has the same rates as in pointwise estimation. Our motivation for the study arises from two types of sampling methods that have been widely used in the statistics literature. One is Latin Hypercube Sampling (LHS), a stratified sampling method originally proposed in the seventies by McKay, Beckman, and Conover (1979). The other is the class of quasi-Monte Carlo (QMC) methods, which have become popular especially after the work of Niederreiter (1992). The advantage of such methods is that they typically yield pointwise estimators which not only have lower variance than standard Monte Carlo but also possess better rates of convergence. Thus, it is important to study the use of these techniques in sampling-based optimization. The novelty of our work arises from the fact that, while there has been some work on the use of variance reduction techniques and QMC methods in stochastic optimization, none of the existing work, to the best of our knowledge, has provided a theoretical study on the effect of these techniques on rates of convergence for the optimization problem. We present numerical results for some two-stage stochastic programs from the literature to illustrate the discussed ideas.
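LHS itself is simple enough to sketch in stdlib Python (the helper name is mine, not from the paper): each coordinate gets exactly one point per stratum [j/n, (j+1)/n), and the strata are paired across coordinates by independent random permutations.

```python
import random

def latin_hypercube(n, d, rng):
    """One Latin Hypercube Sample of n points in [0,1)^d: for each of the d
    coordinates, place one point uniformly in each of the n equal strata,
    then randomly permute the strata so coordinates are paired at random."""
    cols = []
    for _ in range(d):
        perm = list(range(n))
        rng.shuffle(perm)
        cols.append([(p + rng.random()) / n for p in perm])
    # Transpose the per-coordinate columns into a list of n points.
    return [[cols[k][i] for k in range(d)] for i in range(n)]
```

The defining property, and the source of the variance reduction for additive integrands, is that every one-dimensional projection is perfectly stratified.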
Randomized Polynomial Lattice Rules For Multivariate Integration And Simulation
 SIAM JOURNAL ON SCIENTIFIC COMPUTING
, 2001
Abstract

Cited by 10 (2 self)
Lattice rules are among the best methods to estimate integrals in a large number of dimensions. They are part of the quasi-Monte Carlo set of tools. A new class of lattice rules, defined in a space of polynomials with coefficients in a finite field, is introduced in this paper, and a theoretical framework for these polynomial lattice rules is developed. A randomized version is studied, implementations and criteria for selecting the parameters are discussed, and examples of its use as a variance reduction tool in stochastic simulation are provided. Certain types of digital net constructions, as well as point sets constructed by taking all vectors of successive output values produced by a Tausworthe random number generator, turn out to be special cases of this method.
SPLITTING FOR RARE-EVENT SIMULATION
, 2006
Abstract

Cited by 9 (1 self)
Splitting and importance sampling are the two primary techniques to make important rare events happen more frequently in a simulation, and obtain an unbiased estimator with much smaller variance than the standard Monte Carlo estimator. Importance sampling has been discussed and studied in several articles presented at the Winter Simulation Conference in the past. A smaller number of WSC articles have examined splitting. In this paper, we review the splitting technique and discuss some of its strengths and limitations from the practical viewpoint. We also introduce improvements in the implementation of the multilevel splitting technique. This is done in a setting where we want to estimate the probability of reaching B before reaching (or returning to) A when starting from a fixed state x0 ∉ B, where A and B are two disjoint subsets of the state space and B is very rarely attained. This problem has several practical applications.
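A bare-bones fixed-factor multilevel splitting sketch for a toy instance of this setting: a biased random walk with A = {0} and B = {reach level 10}. All names, levels, and parameters below are illustrative choices of mine, not the implementation discussed in the paper.

```python
import random

def run_between(state, lo, hi, p, rng):
    """Biased walk: +1 w.p. p, -1 otherwise; run until it hits lo or hi."""
    while lo < state < hi:
        state += 1 if rng.random() < p else -1
    return state

def splitting_estimate(p, levels, n0, split, rng):
    """Estimate P(reach levels[-1] before 0 | start at levels[0]) by multilevel
    splitting: each chain that reaches an intermediate level is cloned `split`
    times; the estimator is (final survivors) / (n0 * split^(#intermediate))."""
    chains = [levels[0]] * n0
    weight = 1.0 / n0
    for t in levels[1:]:
        survivors = sum(1 for c in chains if run_between(c, 0, t, p, rng) == t)
        if t == levels[-1]:
            return survivors * weight
        chains = [t] * (survivors * split)   # clone each survivor `split` times
        weight /= split
    return 0.0

est = splitting_estimate(0.3, [1, 4, 7, 10], 5000, 5, random.Random(7))
```

For this walk the exact answer is the gambler's-ruin probability (1 - q)/(1 - q^10) with q = 0.7/0.3, about 2.8e-4, so hitting B directly by crude Monte Carlo would already be expensive.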
Inverting the symmetrical beta distribution
 ACM Trans. Math. Software. Forthcoming
, 2004
Abstract

Cited by 8 (0 self)
We propose a fast algorithm for computing the inverse symmetrical beta distribution. Four series (two around x = 0 and two around x = 1/2) are used to approximate the distribution function and its inverse is found via Newton’s method. This algorithm can be used to generate beta random variates by inversion and is much faster than currently available general inversion methods for the beta distribution. It turns out to be very useful for generating gamma processes efficiently via bridge sampling.
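The general pattern, evaluate the distribution function and its density and polish the inverse with Newton's method, can be sketched for the one symmetric beta whose CDF is elementary, Beta(2,2). This stands in for the series approximations of the paper; the function names are mine.

```python
def beta22_cdf(x):
    """Beta(2,2) distribution function: F(x) = 3x^2 - 2x^3 on [0,1]."""
    return 3.0 * x * x - 2.0 * x ** 3

def beta22_pdf(x):
    """Beta(2,2) density: f(x) = 6x(1-x)."""
    return 6.0 * x * (1.0 - x)

def invert_cdf_newton(u, cdf, pdf, x0=0.5, tol=1e-13, max_iter=100):
    """Solve cdf(x) = u by Newton's method, x <- x - (cdf(x) - u) / pdf(x),
    clamping iterates into (0,1) where the density is positive."""
    x = x0
    for _ in range(max_iter):
        step = (cdf(x) - u) / pdf(x)
        x = min(max(x - step, 1e-12), 1.0 - 1e-12)
        if abs(step) < tol:
            break
    return x
```

Feeding `invert_cdf_newton` a Uniform(0,1) variate then generates a Beta(2,2) variate by inversion, which is exactly the structure-preserving property (monotone in the uniform) that makes inversion attractive for bridge sampling.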
Panel: Strategic Directions In Simulation Research
, 1999
Abstract

Cited by 7 (0 self)
We consider the future directions of simulation research.

1 INTRODUCTION

To mark the 50th anniversary of the Association for Computing Machinery, Volume 28, Number 4 of ACM Computing Surveys was released, entitled "Strategic Directions in Computing Research." Notable (to attendees of the Winter Simulation Conference, at least) among the topics not covered was computer simulation. One may reasonably ask why this is so. Are there no unanswered questions remaining in computer simulation? Is computer simulation an unimportant topic? Does simulation not have relevance as a computing discipline? Should it rather be considered solely in terms of operations research, statistics or mathematics? Arguably computer simulation is quite relevant to technological advance in many arenas. Modeling and simulation are playing key roles within industry, academia and the government. The papers appearing in these Proceedings bear significant witness to that fact. The study of computer simulation as a compu...
Polynomial Integration Lattices
Abstract

Cited by 6 (1 self)
Lattice rules are quasi-Monte Carlo methods for estimating large-dimensional integrals over the unit hypercube. In this paper, after briefly reviewing key ideas of quasi-Monte Carlo methods, we give an overview of recent results, generalize them, and provide several new results, for lattice rules defined in spaces of polynomials and of formal series with coefficients in a finite ring. We discuss basic properties, implementations, a randomized version, and quality criteria (i.e., measures of uniformity) for selecting the parameters. Two types of polynomial lattice rules are examined: dimension-wise lattices and resolution-wise lattices. These rules turn out to be special cases of digital net constructions, which we reinterpret as yet another type of lattice in a space of formal series. Our development underlines the connections between integration lattices and digital nets.
Variance and Discrepancy with Alternative Scramblings
, 2002
Abstract

Cited by 5 (1 self)
This paper analyzes some schemes for reducing the computational burden of digital scrambling. Some such schemes have been shown not to affect the mean squared L² discrepancy. This paper shows that some discrepancy-preserving alternative scrambles can change the variance in scrambled net quadrature. Even the rate of convergence can be adversely affected by alternative scramblings. Finally, some alternatives reduce the computational burden and can also be shown to improve the rate of convergence for the variance, at least in dimension 1.
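One of the cheapest alternatives of this kind, a single random digital (XOR) shift of a base-2 point set, can be sketched as follows. This is a generic illustration, not one of the specific scramblings analyzed in the paper: the shift preserves the dyadic stratification of the point set while making every individual point uniform on [0,1).

```python
import random

BITS = 32

def bit_reverse(i, bits=BITS):
    """Base-2 radical inverse of i as a `bits`-bit integer (van der Corput)."""
    x = 0
    for _ in range(bits):
        x = (x << 1) | (i & 1)
        i >>= 1
    return x

def digitally_shifted_vdc(n, rng):
    """First n van der Corput points, XORed digit-wise with one random mask:
    a random digital shift, far cheaper than full nested (Owen) scrambling."""
    r = rng.getrandbits(BITS)
    return [(bit_reverse(i) ^ r) / float(1 << BITS) for i in range(n)]
```

Because XOR permutes the dyadic intervals at every resolution, the first 2^k shifted points still contain exactly one point per interval [j/2^k, (j+1)/2^k); averaging over independent masks then yields unbiased estimates with an empirical error measure.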