Results 1-10 of 37
Recent Advances in Randomized Quasi-Monte Carlo Methods
Abstract
Cited by 60 (13 self)
We survey some of the recent developments on quasi-Monte Carlo (QMC) methods, which, in their basic form, are a deterministic counterpart to the Monte Carlo (MC) method. Our main focus is the applicability of these methods to practical problems that involve the estimation of a high-dimensional integral. We review several QMC constructions and different randomizations that have been proposed to provide unbiased estimators and for error estimation. Randomizing QMC methods allows us to view them as variance reduction techniques. New and old results on this topic are used to explain how these methods can improve over the MC method in practice. We also discuss how this methodology can be coupled with clever transformations of the integrand in order to reduce the variance further. Additional topics included in this survey are the description of figures of merit used to measure the quality of the constructions underlying these methods, and other related techniques for multidimensional integration.
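As a concrete illustration of the randomization idea surveyed above, the sketch below applies a random shift to a rank-1 lattice rule: each independent shift yields an unbiased estimate of the integral, and the spread across shifts gives an error estimate. The Fibonacci generating vector and the toy integrand are illustrative choices, not taken from the survey.

```python
import numpy as np

def shifted_lattice_estimate(f, z, n, dim, n_shifts=10, rng=None):
    """Randomly shifted rank-1 lattice rule: each shift gives an
    unbiased estimate of the integral over [0,1)^dim; the spread
    across shifts yields a standard-error estimate."""
    rng = np.random.default_rng(rng)
    k = np.arange(n)[:, None]                 # point indices 0..n-1
    base = (k * z[None, :] / n) % 1.0         # deterministic lattice points
    estimates = []
    for _ in range(n_shifts):
        shift = rng.random(dim)
        pts = (base + shift) % 1.0            # random shift, modulo 1
        estimates.append(f(pts).mean())
    estimates = np.array(estimates)
    return estimates.mean(), estimates.std(ddof=1) / np.sqrt(n_shifts)

# Toy example: integral of x1*x2 over [0,1)^2 is exactly 1/4.
# n=377 with z=(1,233) is the classical 2-D Fibonacci lattice (illustrative).
est, err = shifted_lattice_estimate(lambda x: x.prod(axis=1),
                                    np.array([1, 233]), n=377, dim=2, rng=0)
```

Because every shifted point set is itself uniformly distributed, the estimator stays unbiased while retaining the lattice's low discrepancy, which is what lets randomized QMC act as a variance reduction technique.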
A Hilbert space embedding for distributions
In Algorithmic Learning Theory: 18th International Conference, 2007
Abstract
Cited by 57 (28 self)
Abstract. We describe a technique for comparing distributions without the need for density estimation as an intermediate step. Our approach relies on mapping the distributions into a reproducing kernel Hilbert space. Applications of this technique can be found in two-sample tests, which are used for determining whether two sets of observations arise from the same distribution, covariate shift correction, local learning, measures of independence, and density estimation. Kernel methods are widely used in supervised learning [1, 2, 3, 4]; however, they are much less established in the areas of testing, estimation, and analysis of probability distributions, where information-theoretic approaches [5, 6] have long been dominant. Recent examples include [7] in the context of construction of graphical models, [8] in the context of feature extraction, and [9] in the context of independent component analysis. These methods have by and large a common issue: to compute quantities such as the mutual information, entropy, or Kullback-Leibler divergence, we require sophisticated space partitioning and/or
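The embedding approach behind the two-sample tests mentioned in this abstract can be sketched with the (biased) empirical maximum mean discrepancy: the RKHS distance between the kernel mean embeddings of two samples. The Gaussian kernel and its bandwidth below are illustrative assumptions, not prescriptions from the paper.

```python
import numpy as np

def mmd2_biased(X, Y, sigma=1.0):
    """Biased empirical MMD^2 between samples X and Y with a Gaussian
    RBF kernel: squared distance between the kernel mean embeddings
    of the two empirical distributions."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
        return np.exp(-d2 / (2 * sigma ** 2))
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

rng = np.random.default_rng(0)
# Two samples from the same distribution -> MMD^2 near zero;
# a mean-shifted sample -> MMD^2 clearly larger.
same = mmd2_biased(rng.normal(size=(200, 2)), rng.normal(size=(200, 2)))
diff = mmd2_biased(rng.normal(size=(200, 2)), rng.normal(3.0, 1.0, size=(200, 2)))
```

No density estimate is ever formed; the statistic depends only on kernel evaluations, which is the point of the embedding technique.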
Extensible Lattice Sequences for Quasi-Monte Carlo Quadrature
SIAM Journal on Scientific Computing, 1999
Abstract
Cited by 35 (6 self)
Integration lattices are one of the main types of low discrepancy sets used in quasi-Monte Carlo methods. However, they have the disadvantage of being of fixed size. This article describes the construction of an infinite sequence of points, the first b^m of which form a lattice for any nonnegative integer m. Thus, if the quadrature error using an initial lattice is too large, the lattice can be extended without discarding the original points. Generating vectors for extensible lattices are found by minimizing a loss function based on some measure of discrepancy or nonuniformity of the lattice. The spectral test used for finding pseudorandom number generators is one important example of such a discrepancy. The performance of the extensible lattices proposed here is compared to that of other methods for some practical quadrature problems.
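The property described, that the first b^m points of one infinite sequence form a lattice for every m, can be sketched by driving a rank-1 generating vector with the base-b radical inverse. The generating vector below is a toy illustration, not one of the optimized vectors found in the paper.

```python
import numpy as np

def radical_inverse(k, b=2):
    """van der Corput radical inverse of the integer k in base b."""
    x, f = 0.0, 1.0 / b
    while k > 0:
        x += (k % b) * f
        k //= b
        f /= b
    return x

def extensible_lattice(n, z, b=2):
    """First n points of an extensible lattice sequence: for every m,
    the first b**m points coincide with the rank-1 lattice
    { (j * z / b**m) mod 1 : j = 0, ..., b**m - 1 }."""
    phi = np.array([radical_inverse(k, b) for k in range(n)])
    return (phi[:, None] * np.asarray(z)[None, :]) % 1.0

# Illustrative (toy) generating vector with odd components in base 2.
pts = extensible_lattice(16, z=[1, 5])
```

Doubling the sample simply appends the next b^m points; the original evaluations are never discarded, which is exactly the extensibility the abstract describes.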
On Resampling Algorithms for Particle Filters
Nonlinear Statistical Signal Processing Workshop, 2006
Abstract
Cited by 20 (3 self)
In this paper a comparison is made between four frequently encountered resampling algorithms for particle filters. A theoretical framework is introduced to be able to understand and explain the differences between the resampling algorithms. This facilitates a comparison of the algorithms with respect to their resampling quality and computational complexity. Using extensive Monte Carlo simulations the theoretical results are verified. It is found that systematic resampling is favourable, both in terms of resampling quality and computational complexity.
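Systematic resampling, the algorithm this paper finds favourable, can be sketched in a few lines: a single uniform draw positions N evenly spaced pointers over the cumulative weight function, and each pointer selects one ancestor index.

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Systematic resampling: one uniform draw u places the pointers
    u/N, (u+1)/N, ..., (u+N-1)/N over the CDF of the weights; each
    pointer picks the particle whose CDF segment it falls into."""
    rng = np.random.default_rng(rng)
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumsum = np.cumsum(weights)
    cumsum[-1] = 1.0                      # guard against floating-point round-off
    return np.searchsorted(cumsum, positions)

w = np.array([0.1, 0.2, 0.3, 0.4])       # normalized particle weights
idx = systematic_resample(w, rng=0)      # ancestor index for each new particle
```

Only one random number is drawn regardless of N, which is why systematic resampling is cheap, and each particle's offspring count differs from N times its weight by less than one.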
Deterministic Design for Neural Network Learning: An Approach Based on Discrepancy
Abstract
Cited by 9 (5 self)
The general problem of reconstructing an unknown function from a finite collection of samples is considered in the case where the position of each input vector in the training set is not fixed beforehand but is part of the learning process. In particular, the consistency of the Empirical Risk Minimization (ERM) principle is analyzed when the points in the input space are generated by employing a purely deterministic algorithm (deterministic learning). When the
Quasi-Monte Carlo algorithms for unbounded, weighted integration problems
Journal of Complexity, 2004
Abstract
Cited by 7 (4 self)
In this article we investigate quasi-Monte Carlo methods for multidimensional improper integrals with respect to a measure other than the uniform distribution. Additionally, the integrand is allowed to be unbounded at the lower boundary of the integration domain. We establish convergence of the quasi-Monte Carlo estimator to the value of the improper integral under conditions involving both the integrand and the sequence used. Furthermore, we suggest a modification of an approach proposed by Hlawka and Mück for the creation of low-discrepancy sequences with regard to a given density, which are suited for singular integrands. Key words: quasi-Monte Carlo integration, weighted integration, non-uniformly distributed low-discrepancy sequences. This paper is devoted to quasi-Monte Carlo (QMC) techniques for weighted integration problems of the form
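A minimal sketch of weighted QMC, related in spirit to (but simpler than) the Hlawka and Mück style constructions this paper modifies: push a uniform low-discrepancy sequence through the inverse CDF of the target density. The exponential target and the sample size below are illustrative assumptions.

```python
import numpy as np

def van_der_corput(n, b=2):
    """First n points of the base-b van der Corput sequence."""
    pts = np.empty(n)
    for k in range(n):
        x, f, kk = 0.0, 1.0 / b, k
        while kk > 0:
            x += (kk % b) * f
            kk //= b
            f /= b
        pts[k] = x
    return pts

# Weighted QMC via the inverse CDF: for the Exp(1) density,
# F^{-1}(u) = -log(1 - u), so uniform low-discrepancy points map to
# exponentially distributed low-discrepancy points.
u = van_der_corput(4096)
x = -np.log1p(-u)                  # exponentially distributed QMC points
estimate = np.mean(x)              # approximates E[X] = 1 for X ~ Exp(1)
```

Note that F^{-1} blows up as u approaches 1, which is the kind of singular behaviour whose interaction with the sequence the paper's convergence conditions are designed to control.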
Variance and Discrepancy with Alternative Scramblings, 2002
Abstract
Cited by 5 (1 self)
This paper analyzes some schemes for reducing the computational burden of digital scrambling. Some such schemes have been shown not to affect the mean squared L² discrepancy. This paper shows that some discrepancy-preserving alternative scrambles can change the variance in scrambled net quadrature. Even the rate of convergence can be adversely affected by alternative scramblings. Finally, some alternatives reduce the computational burden and can also be shown to improve the rate of convergence for the variance, at least in dimension 1.
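One of the computationally cheap alternatives to full nested scrambling is the random digital (XOR) shift, sketched below for a base-2 point set; whether it matches any particular scheme analyzed in the paper is an assumption here. A digital shift randomizes the points while preserving their digital equidistribution.

```python
import numpy as np

def digital_shift(points, n_bits=32, rng=None):
    """Random digital (XOR) shift in base 2: XOR the fixed-point binary
    digits of every point with one random digit vector per coordinate.
    Far cheaper than nested scrambling, and it preserves the
    equidistribution structure of a base-2 net."""
    rng = np.random.default_rng(rng)
    ints = (points * (1 << n_bits)).astype(np.uint64)   # fixed-point digits
    shift = rng.integers(0, 1 << n_bits, size=points.shape[-1], dtype=np.uint64)
    return (ints ^ shift) / float(1 << n_bits)          # XOR each coordinate

# 1-D toy base-2 net: the first 256 van der Corput points (8-bit reversal).
n = 256
pts = np.array([int(bin(k)[2:].zfill(8)[::-1], 2) / 256 for k in range(n)])[:, None]
shifted = digital_shift(pts, rng=0)
```

Because XOR has no carries, the leading digits of the shifted points are a permutation of the original leading digits, so each of the 256 dyadic cells of width 1/256 still contains exactly one point, while the point locations themselves are randomized.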
Pricing options using lattice rules, 2005
Abstract
Cited by 5 (2 self)
There are many examples of option contracts in which the payoff depends on several stochastic variables. These options can often be priced by the valuation of multidimensional integrals. Quasi-Monte Carlo methods are an effective numerical tool for this task. We show that, when the dimensions of the problem are small (say, less than 10), a special type of quasi-Monte Carlo known as the lattice rule method is very efficient. We provide an overview of lattice rules, and we show how to implement this method and demonstrate its efficiency relative to standard Monte Carlo and classical quasi-Monte Carlo. To maximize the efficiency gains, we show how to exploit the regularity of the integrand through a periodization technique. We demonstrate the superior efficiency of the method both in the estimation of prices and in the estimation of partial derivatives of these prices (the so-called Greeks). In particular, this approach provides good estimates of the second derivative (the gamma) of the price, in contrast to traditional Monte Carlo methods, which normally yield poor estimates. Although this method is not new, it appears that the advantages of lattice rules in the context of insurance and finance applications have not been fully appreciated in the literature.
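The lattice-rule-plus-periodization recipe can be sketched as follows: a rank-1 lattice rule combined with the tent (baker's) transform, one common periodization that maps a smooth integrand to a periodic one without changing its integral. The Fibonacci generating vector and the smooth test integrand are illustrative, not the specific choices made in the paper.

```python
import numpy as np

def lattice_rule(f, z, n, periodize=True):
    """Rank-1 lattice rule on [0,1)^d. The tent (baker's) transform
    phi(x) = 1 - |2x - 1| maps uniform points to uniform points while
    making the effective integrand periodic, which is what lets the
    lattice rule reach its faster convergence rate on smooth f."""
    z = np.asarray(z)
    x = (np.arange(n)[:, None] * z[None, :] / n) % 1.0   # lattice points
    if periodize:
        x = 1.0 - np.abs(2.0 * x - 1.0)                  # tent transform
    return f(x).mean()

# Smooth toy integrand with a known integral: exp(x1 + x2) over
# [0,1]^2 integrates to (e - 1)^2.
f = lambda x: np.exp(x.sum(axis=1))
approx = lattice_rule(f, z=[1, 233], n=377)   # Fibonacci lattice (illustrative)
```

In a pricing application f would be the (transformed) payoff; differentiating the same equal-weight sum with respect to model parameters is what yields the smooth Greek estimates the abstract mentions.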
Practical, fast Monte Carlo statistical static timing analysis: why and how
International Conference on Computer Aided Design
Abstract
Cited by 4 (0 self)
Statistical static timing analysis (SSTA) has emerged as an essential tool for nanoscale designs. Monte Carlo methods are universally employed to validate the accuracy of the approximations made in all SSTA tools, but Monte Carlo itself is never employed as a strategy for practical SSTA. It is widely believed to be “too slow”, despite an uncomfortable lack of rigorous studies to support this belief. We offer the first large-scale study to refute this belief. We synthesize recent results from fast quasi-Monte Carlo (QMC) deterministic sampling and efficient Karhunen-Loève expansion (KLE) models of spatial correlation to show that Monte Carlo SSTA need not be slow. Indeed, we show that, for the ISCAS89 circuits, a few hundred well-chosen sample points can achieve errors within 5%, with no assumptions on gate models, wire models, or the core STA engine, and with runtimes of less than 90 s.