Results 1–10 of 32
Quasi-Randomized Path Planning
 In Proc. IEEE Int’l Conf. on Robotics and Automation
, 2001
Abstract

Cited by 67 (10 self)
We propose the use of quasi-random sampling techniques for path planning in high-dimensional configuration spaces. Following similar trends from related numerical computation fields, we show several advantages offered by these techniques in comparison to random sampling. Our ideas are evaluated in the context of the probabilistic roadmap (PRM) framework. Two quasi-random variants of PRM-based planners are proposed: 1) a classical PRM with quasi-random sampling, and 2) a quasi-random Lazy PRM. Both have been implemented, and are shown through experiments to offer some performance advantages in comparison to their randomized counterparts.

1 Introduction
Over two decades of path planning research have led to two primary trends. In the 1980s, deterministic approaches provided both elegant, complete algorithms for solving the problem, and also useful approximate or incomplete algorithms. The curse of dimensionality due to high-dimensional configuration spaces motivated researchers from the 199...
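The quasi-random sampling this abstract advocates can be illustrated with a Halton sequence, one standard low-discrepancy construction. This is only a sketch: the paper's exact generator and its integration into a PRM planner are not reproduced here.

```python
def halton(index, base):
    """Return the index-th element of the van der Corput sequence in the given base."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def halton_point(index, bases):
    """One quasi-random point in [0,1)^d, using one pairwise-coprime base per dimension."""
    return [halton(index, b) for b in bases]

# First few 2-D Halton points (bases 2 and 3); a planner would scale these
# to the configuration-space bounds before collision checking.
samples = [halton_point(i, [2, 3]) for i in range(1, 5)]
```

Unlike pseudorandom samples, consecutive Halton points fill the space with provably low discrepancy, which is the property the paper exploits in place of random PRM sampling.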
From Discrepancy to Declustering: Near-optimal multidimensional declustering strategies for range queries (Extended Abstract)
, 2001
Abstract

Cited by 21 (2 self)
Declustering schemes allocate data blocks among multiple disks to enable parallel retrieval. Given a declustering scheme D, its response time with respect to a query Q, rt(Q), is defined to be the maximum number of disk blocks of the query stored by the scheme on any one of the disks. If |Q| is the number of data blocks in Q and M is the number of disks, then rt(Q) is at least ⌈|Q|/M⌉. One way to evaluate the performance of D with respect to a set of queries Q is to measure its additive error: the maximum difference between rt(Q) and ⌈|Q|/M⌉ over all range queries Q ∈ Q. In this paper, we consider the problem of designing declustering schemes for uniform multidimensional data arranged in a d-dimensional grid so that their additive errors with respect to range queries are as small as possible. It has been shown that such declustering schemes will have an additive error of Ω(log M) when d = 2 and Ω(log^((d−1)/2) M) when d > 2 with respect to range queries. Asymptotically optimal declustering schemes exist for 2-dimensional data. For data in larger dimensions, however, the best bound for additive errors is O(M^(d−1)), which is extremely large. In this paper, we propose two declustering schemes based on low-discrepancy points in d dimensions. When d is fixed, both schemes have an additive error of O(log^(d−1) M) with respect to range queries provided certain conditions are satisfied: the first scheme requires d ≥ 3 and M to be a power of a prime where the prime is at least d, while the second scheme requires the size of the data to grow within some polynomial of M, with no restriction on
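The definitions of rt(Q) and additive error can be made concrete with a toy evaluation. The round-robin allocation below is purely illustrative and is not one of the paper's proposed low-discrepancy schemes.

```python
from math import ceil

def response_time(blocks, disk_of, M):
    """rt(Q): maximum number of query blocks assigned to any single disk."""
    counts = [0] * M
    for b in blocks:
        counts[disk_of(b)] += 1
    return max(counts)

# Toy 1-D round-robin declustering over M = 4 disks.
M = 4
disk_of = lambda b: b % M
query = list(range(3, 10))            # a "range query" of 7 consecutive blocks
rt = response_time(query, disk_of, M)
ideal = ceil(len(query) / M)          # the lower bound ceil(|Q|/M)
additive_error = rt - ideal           # 0 for this query under round-robin
```

In one dimension round-robin is optimal for contiguous ranges; the difficulty the paper addresses is that no comparably simple allocation achieves small additive error for range queries over a d-dimensional grid.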
A randomized quasi-Monte Carlo simulation method for Markov chains
 Operations Research
, 2007
Abstract

Cited by 20 (8 self)
We introduce and study a randomized quasi-Monte Carlo method for estimating the state distribution at each step of a Markov chain. The number of steps in the chain can be random and unbounded. The method simulates n copies of the chain in parallel, using a (d + 1)-dimensional highly uniform point set of cardinality n, randomized independently at each step, where d is the number of uniform random numbers required at each transition of the Markov chain. This technique is effective in particular for obtaining a low-variance unbiased estimator of the expected total cost up to some random stopping time, when state-dependent costs are paid at each step. It is generally more effective when the state space has a natural order related to the cost function. We provide numerical illustrations where the variance reduction with respect to standard Monte Carlo is substantial. The variance can be reduced by factors of several thousands in some cases. We prove bounds on the convergence rate of the worst-case error and variance for special situations. In line with what is typically observed in randomized quasi-Monte Carlo contexts, our empirical results indicate much better convergence than what these bounds guarantee.
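A much-simplified sketch of the parallel-chains idea, assuming d = 1 and substituting a randomly shifted regular grid for the paper's highly uniform point set; sorting the chains by state before assigning points reflects the ordering the abstract says the method benefits from:

```python
import random

def array_rqmc_step(states, transition):
    """One step of a simplified array-RQMC update (one uniform per transition).
    Chains are sorted by state, and the chain of rank j consumes the j-th
    point of a randomly shifted regular grid -- a stand-in for the paper's
    randomized highly uniform point set."""
    n = len(states)
    shift = random.random()                        # fresh randomization each step
    points = [((j + 0.5) / n + shift) % 1.0 for j in range(n)]
    order = sorted(range(n), key=lambda j: states[j])
    new_states = list(states)
    for rank, j in enumerate(order):
        new_states[j] = transition(states[j], points[rank])
    return new_states

# Example chain: random walk X_{k+1} = X_k + (U - 0.5).
walk = lambda x, u: x + (u - 0.5)
states = [0.0] * 16
for _ in range(10):
    states = array_rqmc_step(states, walk)
```

Because each chain sees a well-spread point rather than an independent uniform, the empirical distribution of the n states tracks the true step-k distribution more closely than independent replications would.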
Asymptotically optimal declustering schemes for range queries
 in 8th International Conference on Database Theory, Lecture Notes In Computer Science
, 2001
Abstract

Cited by 11 (2 self)
... Ω(log^((d−1)/2) M) for d-dimensional schemes and to Ω(log M) for 2-dimensional schemes, thus proving that the ...
A weighted error metric and optimization method for antialiasing patterns
 Computer Graphics Forum (Eurographics)
, 2006
Abstract

Cited by 9 (1 self)
Displaying a synthetic image on a computer display requires determining the colors of individual pixels. To avoid aliasing, multiple samples of the image can be taken per pixel, after which the color of a pixel may be computed as a weighted sum of the samples. The positions and weights of the samples play a major role in the resulting image quality, especially in real-time applications, where usually only a handful of samples can be afforded per pixel. This paper presents a new error metric and an optimization method for antialiasing patterns used in image reconstruction. The metric is based on comparing the pattern against a given reference reconstruction filter in the spatial domain, and it takes into account psychovisually measured angle-specific acuities for sharp features. Categories and Subject Descriptors (according to ACM CCS): I.3.3 [Computer Graphics]: Picture/Image Generation – Antialiasing
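The weighted-sum reconstruction described above can be sketched directly. The box filter here is only a placeholder; the paper's contribution is precisely the metric and optimizer for choosing better positions and weights.

```python
def reconstruct_pixel(samples, weights):
    """Pixel color as a weighted sum of per-pixel samples.
    samples: list of (r, g, b) tuples; weights: matching filter weights,
    assumed already normalized to sum to 1."""
    return tuple(sum(w * s[c] for w, s in zip(weights, samples))
                 for c in range(3))

# Four samples combined with a simple box filter (equal weights).
samples = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0),
           (0.0, 0.0, 1.0), (1.0, 1.0, 1.0)]
weights = [0.25] * 4
color = reconstruct_pixel(samples, weights)
```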
Control variates for quasi-Monte Carlo
, 2003
Abstract

Cited by 8 (3 self)
Quasi-Monte Carlo (QMC) methods have begun to displace ordinary Monte Carlo (MC) methods in many practical problems. It is natural and obvious to combine QMC methods with traditional variance reduction techniques used in MC sampling, such as control variates. There can,
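A minimal sketch of combining a control variate with a QMC point set, using a midpoint lattice as a stand-in QMC rule and g(u) = u as the control; all of these choices are illustrative, not the paper's:

```python
from math import exp

def cv_estimate(points, f, g, g_mean, beta):
    """Control-variate estimator averaged over a point set (MC or QMC):
    mean of f(u) - beta * (g(u) - E[g])."""
    return sum(f(u) - beta * (g(u) - g_mean) for u in points) / len(points)

# Estimate E[exp(U)] = e - 1 with the control g(u) = u, whose mean 1/2
# is known exactly, over a 128-point midpoint lattice in [0, 1).
points = [(i + 0.5) / 128 for i in range(128)]
est = cv_estimate(points, exp, lambda u: u, 0.5, beta=1.0)
```

The subtlety the abstract hints at is that the optimal coefficient beta under QMC generally differs from the variance-minimizing beta of plain MC, since QMC error is not driven by variance alone.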
Low-Discrepancy Curves and Efficient Coverage of Space
 Workshop on Algorithmic Foundations of Robotics VII
, 2006
Abstract

Cited by 7 (1 self)
We introduce the notion of low-discrepancy curves and use it to solve the problem of optimally covering space. In doing so, we extend the notion of low-discrepancy sequences in such a way that sufficiently smooth curves with low-discrepancy properties can be defined and generated. Based on a class of curves that cover the unit square in an efficient way, we define induced low-discrepancy curves in Riemannian spaces. This allows us to efficiently cover an arbitrarily chosen abstract surface that admits a diffeomorphism to the unit square. We demonstrate the application of these ideas by presenting concrete examples of low-discrepancy curves on some surfaces that are of interest in robotics.
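One elementary example of a curve that covers the unit square increasingly uniformly is a torus line with irrational slope; this is only a stand-in to convey the idea, not one of the paper's constructions:

```python
from math import sqrt

ALPHA = (sqrt(5) - 1) / 2   # irrational slope: the line never closes on the torus

def curve_point(t):
    """Point at parameter t on the torus line t -> (t mod 1, ALPHA*t mod 1).
    By Weyl equidistribution, samples along this curve fill the unit
    square ever more uniformly as t grows."""
    return (t % 1.0, (ALPHA * t) % 1.0)

pts = [curve_point(t / 10) for t in range(50)]   # 50 samples along the curve
```

A robot sweeping a surface diffeomorphic to the unit square could follow the pullback of such a curve, which is the kind of coverage application the abstract mentions.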
Shape Fitting on Point Sets with Probability Distributions
Abstract

Cited by 6 (4 self)
We consider problems on data sets where each data point has uncertainty described by an individual probability distribution. We develop several frameworks and algorithms for calculating statistics on these uncertain data sets. Our examples focus on geometric shape fitting problems. We prove approximation guarantees for the algorithms with respect to the full probability distributions. We then empirically demonstrate that our algorithms are simple and practical, solving for the constant hidden by asymptotic analysis so that a user can reliably trade speed and size for accuracy.
Variance and Discrepancy with Alternative Scramblings
, 2002
Abstract

Cited by 5 (1 self)
This paper analyzes some schemes for reducing the computational burden of digital scrambling. Some such schemes have been shown not to affect the mean squared L² discrepancy. This paper shows that some discrepancypreserving alternative scrambles can change the variance in scrambled net quadrature. Even the rate of convergence can be adversely affected by alternative scramblings. Finally, some alternatives reduce the computational burden and can also be shown to improve the rate of convergence for the variance, at least in dimension 1.
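A random digital (XOR) shift in base 2 is one of the computationally cheap alternatives to full nested scrambling that this line of work studies; a minimal sketch:

```python
import random

def digital_shift(points, bits=16):
    """Apply one random base-2 digital shift: XOR the first `bits` binary
    digits of every point with the same random bit pattern. This is far
    cheaper than nested (Owen-style) scrambling, and it preserves the net
    structure of the point set, but -- as the abstract notes for some
    alternatives -- it need not preserve the variance behavior."""
    shift = random.getrandbits(bits)
    scale = 1 << bits
    return [((int(u * scale) ^ shift) / scale) for u in points]

random.seed(1)
base = [i / 8 for i in range(8)]   # a trivial 8-point net in base 2
shifted = digital_shift(base)      # still 8 distinct points in [0, 1)
```

Since XOR with a fixed pattern is a bijection on the digit strings, the shifted points remain distinct and retain the original equidistribution structure, which is why digital shifts are a popular low-cost randomization.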