Results 1–10 of 53
Monte Carlo Statistical Methods, 1998
Cited by 931 (23 self)
This paper is also the originator of the Markov Chain Monte Carlo methods developed in the following chapters. The potential of these two simultaneous innovations was discovered much later by statisticians (Hastings 1970; Geman and Geman 1984) than by physicists (see also Kirkpatrick et al. 1983).
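The innovation credited here is the sampling scheme of Metropolis et al. and its generalization by Hastings (1970). A minimal random-walk Metropolis sketch gives the flavor; the standard-normal target and the proposal scale are illustrative choices, not anything from the book:

```python
import math
import random

def metropolis_normal(n_samples, proposal_scale=2.0, seed=0):
    """Random-walk Metropolis sampler targeting the standard normal.

    The unnormalized log-density is -x^2/2; with a symmetric Gaussian
    proposal, the Hastings correction ratio equals 1 and the acceptance
    probability reduces to min(1, pi(y)/pi(x))."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        y = x + rng.gauss(0.0, proposal_scale)       # propose a move
        if math.log(rng.random()) < (x * x - y * y) / 2.0:
            x = y                                    # accept; else stay at x
        samples.append(x)
    return samples

samples = metropolis_normal(50_000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

The repeated values produced on rejection are part of the chain; dropping them would bias the sample.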
Non-Uniform Random Variate Generation, 1986
Cited by 646 (21 self)
Abstract. This is a survey of the main methods in nonuniform random variate generation, and highlights recent research on the subject. Classical paradigms such as inversion, rejection, guide tables, and transformations are reviewed. We provide information on the expected time complexity of various algorithms, before addressing modern topics such as indirectly specified distributions, random processes, and Markov chain methods.
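Of the classical paradigms the survey reviews, inversion is the simplest: apply the inverse CDF to a uniform deviate. A sketch for the exponential distribution, chosen because its inverse CDF is in closed form (the rate value is arbitrary):

```python
import math
import random

def exponential_inversion(rate, n, seed=42):
    """Generate Exp(rate) variates by inversion: X = -ln(1 - U) / rate,
    where U ~ Uniform(0, 1) and -ln(1 - u)/rate is the inverse CDF."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / rate for _ in range(n)]

xs = exponential_inversion(rate=2.0, n=100_000)
sample_mean = sum(xs) / len(xs)  # should approach 1/rate = 0.5
```

Inversion uses exactly one uniform per variate, which is why the survey treats it as the baseline the other paradigms are measured against.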
Generating Beta Variates Via Patchwork Rejection, Computing, 1992
Cited by 9 (1 self)
A new algorithm for sampling from beta(p, q) distributions with parameters p > 1, q > 1 is developed. It is based on a method by Minh [9] which improves acceptance-rejection sampling in the main part of the distribution. Additionally, transformed uniform deviates can often be accepted immediately, so that far fewer than two uniforms are needed per beta variate, on average. The remaining acceptance tests are enhanced by 'squeezes'. Experiments covering a wide range of pairs (p, q) showed improvements in speed over competing algorithms in most cases.
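For contrast, here is the plain acceptance-rejection scheme that patchwork rejection refines: a flat envelope scaled by the density maximum at the mode, which lies in the interior precisely because p > 1 and q > 1. This is the baseline method, not the patchwork algorithm itself:

```python
import random

def beta_rejection(p, q, n, seed=1):
    """Sample beta(p, q), p > 1, q > 1, by acceptance-rejection from a
    uniform envelope, using the unnormalized density
    f(x) = x^(p-1) * (1-x)^(q-1), maximized at the mode."""
    rng = random.Random(seed)
    mode = (p - 1.0) / (p + q - 2.0)
    f = lambda x: x ** (p - 1.0) * (1.0 - x) ** (q - 1.0)
    fmax = f(mode)
    out = []
    while len(out) < n:
        x = rng.random()                 # candidate from Uniform(0, 1)
        if rng.random() * fmax <= f(x):  # accept with probability f(x)/fmax
            out.append(x)
    return out

xs = beta_rejection(2.0, 3.0, 50_000)
sample_mean = sum(xs) / len(xs)  # should approach p/(p+q) = 0.4
```

The acceptance rate here is B(p, q)/fmax, which patchwork rejection and squeezes improve by rearranging the density and avoiding most evaluations of f.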
Sampling from Archimedean copulas, Quantitative Finance, 2004
Cited by 6 (0 self)
Abstract. We develop sampling algorithms for multivariate Archimedean copulas. For exchangeable copulas, where there is only one generating function, we first analyse the distribution of the copula itself, deriving a number of integral representations and a generating function representation. One of the integral representations is related, by a form of convolution, to the distribution whose Laplace transform yields the copula generating function. In the infinite-dimensional limit there is a direct connection between the distribution of the copula value and the inverse Laplace transform. Armed with these results, we present three sampling algorithms, all of which entail drawing from a one-dimensional distribution and then scaling the result to create random deviates distributed according to the copula. We implement and compare the various methods. For more general cases, in which an N-dimensional Archimedean copula is given by N − 1 nested generating functions, we present algorithms in which each new variate is drawn conditional only on the value of the copula of the previously drawn variates. We also discuss the use of composite nested and exchangeable copulas for modelling random variates with a natural hierarchical structure, such as ratings and sectors for obligors in credit baskets.
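The draw-one-dimensional-then-scale structure described above is shared by the classical Marshall–Olkin algorithm for exchangeable Archimedean copulas, sketched here for the Clayton family; the family, the parameter, and the gamma mixing distribution are illustrative choices, not taken from the paper:

```python
import numpy as np

def clayton_copula_sample(theta, d, n, seed=0):
    """Marshall-Olkin sampling for the Clayton copula.

    The Clayton generator inverse psi(t) = (1 + t)^(-1/theta) is the
    Laplace transform of a Gamma(1/theta, 1) distribution, so drawing
    V ~ Gamma(1/theta) and setting U_i = psi(E_i / V) with E_i ~ Exp(1)
    yields a d-vector with uniform marginals and Clayton dependence."""
    rng = np.random.default_rng(seed)
    V = rng.gamma(shape=1.0 / theta, scale=1.0, size=n)  # mixing variable
    E = rng.exponential(size=(n, d))                     # iid Exp(1)
    return (1.0 + E / V[:, None]) ** (-1.0 / theta)

U = clayton_copula_sample(theta=2.0, d=2, n=20_000)
corr = float(np.corrcoef(U[:, 0], U[:, 1])[0, 1])  # positive dependence
```

The one-dimensional draw is V; conditioning on it makes the components independent, which is what the scaling step exploits.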
Supporting views in data stream management systems, ACM Transactions on Database Systems (TODS), 2010
Cited by 6 (0 self)
In relational database management systems, views supplement basic query constructs to cope with the demand for “higher-level” views of data. Moreover, in traditional query optimization, answering a query using a set of existing materialized views can yield a more efficient query execution plan. Due to their effectiveness, views are attractive to data stream management systems. In order to support views over streams, a data stream management system should employ a closed (or composable) continuous query language. A closed query language is a language in which query inputs and outputs are interpreted in the same way, hence allowing query composition.
This article introduces the Synchronized SQL (or SyncSQL) query language that defines a data stream as a sequence of modify operations against a relation. SyncSQL enables query composition through the unified interpretation of query inputs and outputs. An important issue in continuous queries over data streams is the frequency with which the answer is refreshed and the conditions that trigger the refresh. Coarser periodic refresh requirements are typically expressed as sliding windows. In this article, the sliding window approach is generalized by introducing the synchronization principle that empowers SyncSQL with a formal mechanism to express queries with arbitrary refresh conditions. After introducing the semantics and syntax, we lay the algebraic foundation for SyncSQL and propose a query-matching algorithm for deciding containment of SyncSQL expressions. Then, the article introduces the NileSyncSQL prototype to support SyncSQL queries. NileSyncSQL employs a pipelined incremental evaluation paradigm in which the query pipeline consists of a set of differential operators. A cost model is developed to estimate the cost of SyncSQL query execution pipelines and to choose the best execution plan from a set of different plans for the same query. An experimental study is conducted to evaluate the performance of NileSyncSQL. The experimental results illustrate the effectiveness of NileSyncSQL and the significant performance gains when views are enabled in data stream management systems.
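The composability idea, that every query consumes and produces a stream of modify operations against a relation, can be illustrated with a toy differential operator. This is a standalone sketch of the general principle, not SyncSQL's actual operators or their Nile-based implementation:

```python
def diff_count(ops):
    """Toy differential COUNT operator: consumes a stream of modify
    operations ('+' insert / '-' delete, with a row) and emits modify
    operations maintaining a one-row result relation. Because its output
    is again a stream of modify operations, it can feed another operator,
    which is the closure property that enables views over streams."""
    count = 0
    for action, row in ops:
        old = count
        count += 1 if action == '+' else -1
        # emit the change to the result relation as delete-then-insert
        yield ('-', ('count', old))
        yield ('+', ('count', count))

stream = [('+', 'a'), ('+', 'b'), ('-', 'a'), ('+', 'c')]
out = list(diff_count(stream))
final_count = out[-1][1][1]   # count carried by the last insert
```

Chaining such operators gives a pipelined incremental plan: each stage touches only the deltas, never the full relation.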
Sampling From Discrete And Continuous Distributions With CRand, In Simulation and …, 1992
Cited by 5 (1 self)
CRAND is a system of Turbo C routines and functions intended for use on microcomputers. It contains up-to-date random number generators for more than thirty univariate distributions. For some important distributions the user has the choice between extremely fast but rather complicated methods and somewhat slower but also much simpler procedures. Menu-driven demo programs allow the user to test and analyze the generators with regard to speed and quality of the output.
An adaptive algorithm for simulation of stochastic reaction-diffusion processes, J. Comput. Phys., 2010
Cited by 5 (1 self)
We propose an adaptive hybrid method suitable for stochastic simulation of diffusion-dominated reaction-diffusion processes. For such systems, simulation of the diffusion requires the predominant part of the computing time. In order to reduce the computational work, the diffusion in parts of the domain is treated macroscopically, in other parts with the tau-leap method, and in the remaining parts with Gillespie’s stochastic simulation algorithm (SSA) as implemented in the next subvolume method (NSM). The chemical reactions are handled by the SSA everywhere in the computational domain. A trajectory of the process is advanced in time by an operator splitting technique and the time steps are chosen adaptively. The spatial adaptation is based on estimates of the errors in the tau-leap method and the macroscopic diffusion. The accuracy and efficiency of the method are demonstrated in examples from molecular biology where the domain is discretized by unstructured meshes.
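Gillespie's SSA, which the hybrid scheme applies to the reactions, advances the state by drawing an exponential waiting time from the total propensity and firing a reaction chosen in proportion to its propensity. A minimal sketch for the single decay reaction A → ∅, an illustrative system rather than one from the paper:

```python
import math
import random

def ssa_decay(n0, c, t_end, seed=7):
    """Gillespie direct-method SSA for the reaction A -> 0 with rate
    constant c. With a single channel the propensity is a = c * nA,
    the waiting time is Exp(a), and no channel-selection step is needed."""
    rng = random.Random(seed)
    t, nA = 0.0, n0
    trajectory = [(t, nA)]
    while nA > 0:
        a = c * nA                                 # total propensity
        t += -math.log(1.0 - rng.random()) / a     # exponential waiting time
        if t > t_end:
            break
        nA -= 1                                    # fire the decay reaction
        trajectory.append((t, nA))
    return trajectory

traj = ssa_decay(n0=100, c=0.5, t_end=1e9)
```

With several reaction channels one would also draw a second uniform to pick the channel with probability a_j / a, which is the step the tau-leap method batches to save time.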
The Double CFTP method, 2010
Cited by 2 (0 self)
Abstract. We consider the problem of the exact simulation of random variables Z that satisfy the distributional identity Z =_L V Y + (1 − V)Z, where V ∈ [0, 1] and Y are independent, and =_L denotes equality in distribution. Equivalently, Z is the limit of a Markov chain driven by that map. We give an algorithm that can be automated under the condition that we have a source capable of generating independent copies of Y, and that V has a density that can be evaluated in a black-box format. The method uses a doubling trick for inducing coalescence in coupling from the past. Applications include exact samplers for many Dirichlet means, some two-parameter Poisson–Dirichlet means, and a host of other distributions related to occupation times of Bessel bridges that can be described by stochastic fixed-point equations. Keywords and phrases: random variate generation, perpetuities, coupling from the past, random partitions, stochastic recurrences, stochastic fixed-point equations, distribution theory, Markov chain Monte Carlo, simulation, expected time analysis, Bessel bridge, Poisson–Dirichlet, Dirichlet means.
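Forward iteration of the map Z_{n+1} = V_{n+1} Y_{n+1} + (1 − V_{n+1}) Z_n converges to the fixed-point law only in distribution and only approximately, which is precisely what motivates the exact coupling-from-the-past construction above. A sketch of the approximate forward chain, with the illustrative choices V ~ Uniform(0, 1) and Y ~ Exp(1), for which the stationary mean works out to E[Y] = 1:

```python
import math
import random

def perpetuity_chain(n, burn_in=1_000, seed=3):
    """Approximate sampling from Z =_L V*Y + (1-V)*Z by forward
    iteration, with V ~ Uniform(0,1) and Y ~ Exp(1) drawn fresh each
    step. This is only approximate: exact sampling requires a
    CFTP-style coalescence argument such as the doubling trick."""
    rng = random.Random(seed)
    z = 0.0
    samples = []
    for i in range(n + burn_in):
        v = rng.random()
        y = -math.log(1.0 - rng.random())   # Exp(1) by inversion
        z = v * y + (1.0 - v) * z           # one step of the map
        if i >= burn_in:
            samples.append(z)
    return samples

zs = perpetuity_chain(100_000)
z_mean = sum(zs) / len(zs)   # stationary mean: E[Z] = E[V]E[Y] + (1-E[V])E[Z] => E[Z] = 1
```

The mean identity follows from taking expectations on both sides of the fixed-point equation with V independent of (Y, Z).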
Flood control reservoir system design using stochastic programming, 1978
Cited by 2 (0 self)
Mathematically, a natural river system is a rooted directed tree in which the orientations of the edges coincide with the directions of the streamflows. Assume that in some of the river valleys it is possible to build reservoirs whose purpose is to retain the flood, say once a year. The problem is to find optimal reservoir capacities by minimizing total building cost, possibly plus a penalty, subject to a reliability-type constraint and to lower and upper bounds on the capacities. The solution of the resulting nonlinear programming problem is based on the supporting hyperplane method of Veinott, combined with simulation of multivariate probability distributions. Numerical illustrations are given.
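In the degenerate case of a single reservoir, the reliability-type constraint collapses to a quantile: the cheapest capacity x with P(Flood ≤ x) ≥ p is the p-quantile of the flood distribution, which simulation estimates directly. A toy sketch; the lognormal flood model, sample size, and reliability level are invented for illustration, and the actual problem couples many reservoirs on a tree and needs the supporting hyperplane method:

```python
import math
import random

def min_capacity(flood_samples, p):
    """Smallest capacity x with empirical P(Flood <= x) >= p:
    the ceil(p*n)-th order statistic of the simulated floods."""
    xs = sorted(flood_samples)
    return xs[math.ceil(p * len(xs)) - 1]

rng = random.Random(11)
# invented lognormal model for the annual peak flood volume
floods = [rng.lognormvariate(0.0, 0.5) for _ in range(50_000)]
x_star = min_capacity(floods, p=0.95)
reliability = sum(f <= x_star for f in floods) / len(floods)
```

With a monotone building cost, any capacity below this quantile violates the constraint and any capacity above it costs more, so the quantile is optimal.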
Reliability type inventory models based on stochastic programming, Mathematical Programming Study 9, 43–58, 1978
Cited by 1 (1 self)
The models discussed in the present paper are generalizations of the models introduced previously by A. Prékopa [6] and M. Ziermann [13]. In those papers the initial stock level of one basic commodity is determined, provided that the delivery and demand processes satisfy certain homogeneity (in time) assumptions if they are random. Here we deal with more than one basic commodity and drop the time-homogeneity assumption. Only the delivery processes are assumed to be random, and they are supposed to be stochastically independent. The first model discussed in this paper was already introduced in [9]. All these models are stochastic programming models, and algorithms rather than simple formulas are used to determine the initial stock levels. We have to solve nonlinear programming problems in which one of the constraints is probabilistic. The function and gradient values of the corresponding constraining function are determined by simulation. A numerical example is detailed.