Discrepancy as a quality measure for sample distributions. In Computer Graphics Forum (1991)

by P. Shirley
Venue: Proceedings of Eurographics
Results 1 - 10 of 77

Fast hierarchical importance sampling with blue noise properties

by Victor Ostromoukhov, Charles Donohue, Pierre-Marc Jodoin - ACM Transactions on Graphics, 2004
"... This paper presents a novel method for efficiently generating a good sampling pattern given an importance density over a 2D domain. A Penrose tiling is hierarchically subdivided creating a sufficiently large number of sample points. These points are numbered using the Fibonacci number system, and th ..."
Abstract - Cited by 102 (8 self) - Add to MetaCart
This paper presents a novel method for efficiently generating a good sampling pattern given an importance density over a 2D domain. A Penrose tiling is hierarchically subdivided creating a sufficiently large number of sample points. These points are numbered using the Fibonacci number system, and these numbers are used to threshold the samples against the local value of the importance density. Pre-computed correction vectors, obtained using relaxation, are used to improve the spectral characteristics of the sampling pattern. The technique is deterministic and very fast; the sampling time grows linearly with the required number of samples. We illustrate our technique with importance-based environment mapping, but the technique is versatile enough to be used in a large variety of computer graphics applications, such as light transport calculations, digital halftoning, geometry processing, and various rendering techniques.
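The thresholding idea in this abstract can be illustrated with a much-simplified sketch: candidate points on a jittered grid are kept only where a per-point threshold falls below the local importance density. The Penrose subdivision, Fibonacci ranking, and correction vectors of the actual method are not reproduced here; the importance function and grid resolution below are illustrative assumptions.

```python
import math
import random

def importance(x, y):
    # Hypothetical importance density on [0, 1)^2: a bright blob in the centre.
    return math.exp(-8.0 * ((x - 0.5) ** 2 + (y - 0.5) ** 2))

def thresholded_samples(grid_res=64, seed=1):
    """Keep a jittered-grid candidate only where a random threshold falls
    below the local importance value. This is a crude stand-in for the
    paper's deterministic Fibonacci-rank thresholding on a Penrose tiling."""
    rng = random.Random(seed)
    kept = []
    for i in range(grid_res):
        for j in range(grid_res):
            x = (i + rng.random()) / grid_res
            y = (j + rng.random()) / grid_res
            if rng.random() < importance(x, y):
                kept.append((x, y))
    return kept

print(len(thresholded_samples()), "samples kept")
```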

Monte Carlo Techniques for Direct Lighting Calculations

by Peter Shirley, Changyaw Wang, Kurt Zimmermann - ACM Transactions on Graphics , 1996
"... In a distribution ray tracer, the crucial part of the direct lighting calculation is the sampling strategy for shadow ray testing. Monte Carlo integration with importance sampling is used to carry out this calculation. Importance sampling involves the design of integrand-specific probability density ..."
Abstract - Cited by 93 (9 self) - Add to MetaCart
In a distribution ray tracer, the crucial part of the direct lighting calculation is the sampling strategy for shadow ray testing. Monte Carlo integration with importance sampling is used to carry out this calculation. Importance sampling involves the design of integrand-specific probability density functions which are used to generate sample points for the numerical quadrature. Probability density functions are presented that aid in the direct lighting calculation from luminaires of various simple shapes. A method for defining a probability density function over a set of luminaires is presented that allows the direct lighting calculation to be carried out with one sample, regardless of the number of luminaires. CR Categories and Subject Descriptors: G.1.4 [Mathematical Computing]: Quadrature and Numerical Differentiation; I.3.0 [Computer Graphics]: General; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism. Additional Key Words and Phrases: direct lighting, importanc...

Citation Context

...chers replace the ξi with more evenly distributed (quasi-random) samples (e.g. [6, 24, 31]). This approach can be shown to be sound by analyzing decreasing error in terms of some discrepancy measure [50, 48, 24, 34] rather than in terms of variance. However, in practice it is often convenient to develop a sampling strategy using variance analysis on random samples, and then to turn around and use non-random, but...
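The one-sample strategy over a set of luminaires described in the abstract above can be sketched as follows, assuming a hypothetical Luminaire interface with estimate_contribution and sample_radiance methods (not the paper's actual API): choose one luminaire with probability proportional to its estimated contribution, sample it, and divide by that probability so the estimator stays unbiased.

```python
import random
from dataclasses import dataclass

@dataclass
class Luminaire:
    # Hypothetical stand-in: a constant estimate and a constant sampled value.
    power: float
    def estimate_contribution(self, shading_point):
        return self.power          # crude estimate (ignores geometry)
    def sample_radiance(self, shading_point):
        return self.power          # a real renderer would trace a shadow ray here

def sample_direct_light(luminaires, shading_point, rng=random):
    """One-sample direct lighting: pick a single luminaire with probability
    proportional to its estimated contribution, then divide by that
    probability so the estimator stays unbiased for any number of lights."""
    weights = [lum.estimate_contribution(shading_point) for lum in luminaires]
    total = sum(weights)
    if total == 0.0:
        return 0.0
    u, acc = rng.random() * total, 0.0
    for lum, w in zip(luminaires, weights):
        acc += w
        if u <= acc:
            return lum.sample_radiance(shading_point) / (w / total)
    return 0.0

lights = [Luminaire(1.0), Luminaire(4.0)]
print(sample_direct_light(lights, shading_point=(0.0, 0.0, 0.0)))
```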

A Realistic Camera Model for Computer Graphics

by Craig Kolb, Don Mitchell, Pat Hanrahan
"... Most recent rendering research has concentrated on two subproblems: modeling the reflection of light from materials, and calculating the direct and indirect illumination from light sources and other surfaces. Another key component of a rendering system is the camera model. Unfortunately, current cam ..."
Abstract - Cited by 89 (1 self) - Add to MetaCart
Most recent rendering research has concentrated on two subproblems: modeling the reflection of light from materials, and calculating the direct and indirect illumination from light sources and other surfaces. Another key component of a rendering system is the camera model. Unfortunately, current camera models are not geometrically or radiometrically correct and thus are not sufficient for synthesizing images from physically-based rendering programs. In this paper we describe a physically-based camera model for computer graphics. More precisely, a physically-based camera model accurately computes the irradiance on the film given the incoming radiance from the scene. In our model a camera is described as a lens system and film backplane. The lens system consists of a sequence of simple lens elements, stops and apertures. The camera simulation module computes the irradiance on the backplane from the scene radiances using distributed ray tracing. This is accomplished by a detailed simulation of the geometry of ray paths through the lens system, and by sampling the lens system such that the radiometry is computed accurately and efficiently. Because even the most complicated lenses have a relatively small number of elements, the simulation only increases the total rendering time slightly.
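A full lens-system simulation as in the paper traces rays through every lens element; the sketch below uses only a thin-lens approximation to illustrate the general idea of sampling the aperture per pixel. The camera orientation, focal distance, and aperture radius are illustrative assumptions, not the authors' model.

```python
import math
import random

def thin_lens_ray(pixel_dir, focal_distance, aperture_radius, rng=random):
    """Thin-lens depth-of-field sketch (a strong simplification of the paper's
    full lens-system simulation). The camera sits at the origin looking down
    +z; `pixel_dir` holds the x/y slopes of the ideal pinhole ray for a pixel."""
    px, py = pixel_dir
    # Point on the plane of focus hit by the pinhole ray through this pixel.
    focus = (px * focal_distance, py * focal_distance, focal_distance)
    # Uniformly sample a point on the circular lens aperture.
    r = aperture_radius * math.sqrt(rng.random())
    theta = 2.0 * math.pi * rng.random()
    origin = (r * math.cos(theta), r * math.sin(theta), 0.0)
    direction = tuple(f - o for f, o in zip(focus, origin))
    return origin, direction

print(thin_lens_ray((0.1, -0.2), focal_distance=5.0, aperture_radius=0.05))
```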

Quasi-Randomized Path Planning

by Michael S. Branicky, Steven M. LaValle, Kari Olson, Libo Yang - In Proc. IEEE Int'l Conf. on Robotics and Automation, 2001
"... We propose the use of quasi-random sampling techniques for path planning in high-dimensional conguration spaces. Following similar trends from related numerical computation elds, we show several advantages oered by these techniques in comparison to random sampling. Our ideas are evaluated in the con ..."
Abstract - Cited by 74 (8 self) - Add to MetaCart
We propose the use of quasi-random sampling techniques for path planning in high-dimensional configuration spaces. Following similar trends from related numerical computation fields, we show several advantages offered by these techniques in comparison to random sampling. Our ideas are evaluated in the context of the probabilistic roadmap (PRM) framework. Two quasi-random variants of PRM-based planners are proposed: 1) a classical PRM with quasi-random sampling, and 2) a quasi-random Lazy-PRM. Both have been implemented, and are shown through experiments to offer some performance advantages in comparison to their randomized counterparts. 1 Introduction Over two decades of path planning research have led to two primary trends. In the 1980s, deterministic approaches provided both elegant, complete algorithms for solving the problem, and also useful approximate or incomplete algorithms. The curse of dimensionality due to high-dimensional configuration spaces motivated researchers from the 199...

Citation Context

... superior to random sampling. Quasi-random sampling ideas have improved computational methods in many areas, including integration [25], optimization [21], image processing [8], and computer graphics [24]. It is therefore natural to ask: Can quasi-random sampling ideas also improve path planning methods designed for high degrees of freedom? Is randomization really necessary? ...
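A minimal sketch of the kind of quasi-random point set such a planner can substitute for uniform random configurations is a Halton sequence, built from radical inverses in pairwise coprime bases. The six bases below are an illustrative choice for a 6-DOF configuration space; the planner itself is not shown.

```python
def radical_inverse(index, base):
    """Van der Corput radical inverse of `index` in the given base."""
    inv, f = 0.0, 1.0 / base
    while index > 0:
        inv += (index % base) * f
        index //= base
        f /= base
    return inv

def halton_point(index, bases=(2, 3, 5, 7, 11, 13)):
    """One Halton sample in [0, 1)^d; six coprime bases stand in for a
    6-DOF configuration space."""
    return tuple(radical_inverse(index, b) for b in bases)

# The first 100 quasi-random configurations a PRM-style planner could test.
configurations = [halton_point(i) for i in range(1, 101)]
print(configurations[0])
```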

Recursive Wang tiles for real-time blue noise

by Johannes Kopf, Daniel Cohen-Or, Oliver Deussen, Dani Lischinski - Proc. SIGGRAPH, 2006
"... ..."
Abstract - Cited by 58 (5 self) - Add to MetaCart
Abstract not found

Well-Spaced Points for Numerical Methods

by Dafna Talmor, Guy Blelloch, Alan M. Frieze, Noel J. Walkington, Shang-Hua Teng, 1997
"... mesh generation, mesh coarsening, multigrid Abstract A numerical method for the solution of a partial differential equation (PDE) requires the following steps: (1) discretizing the domain (mesh generation); (2) using an approximation method and the mesh to transform the problem into a linear system; ..."
Abstract - Cited by 49 (2 self) - Add to MetaCart
Keywords: mesh generation, mesh coarsening, multigrid. A numerical method for the solution of a partial differential equation (PDE) requires the following steps: (1) discretizing the domain (mesh generation); (2) using an approximation method and the mesh to transform the problem into a linear system; (3) solving the linear system. The approximation error and convergence of the numerical method depend on the geometric quality of the mesh, which in turn depends on the size and shape of its elements. For example, the shape quality of a triangular mesh is measured by its elements' aspect ratio. In this work, we shift the focus to the geometric properties of the nodes, rather than the elements, of well-shaped meshes. We introduce the concept of well-spaced points and their spacing functions, and show that these enable the development of simple and efficient algorithms for the different stages of the numerical solution of PDEs. We first apply well-spaced point sets and their accompanying technology to mesh coarsening, a crucial step in the multigrid solution of a PDE. A good aspect-ratio coarsening sequence of an unstructured mesh M0 is a sequence of good aspect-ratio meshes M1, ..., Mk such that Mi is an approximation of Mi-1 containing fewer nodes and elements. We present a new approach to coarsening that guarantees the sequence is also of optimal size and width up to a constant factor, the first coarsening method that provides these guarantees. We also present experimental results, based on an implementation of our approach, that substantiate the theoretical claims.

Citation Context

...eting a point if it falls within some other disk. The generalization we use, restricting the disk sizes to behave as a slowly changing function, is very different in spirit. 1.1.2 Discrepancy Shirley [68] was the first to introduce discrepancy as a quality measure of samples for the rendering equation. The concept of discrepancy is from the field of numerical integration, where the problem of identify...
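One common way to make the abstract's notion of element shape quality concrete is the circumradius-to-inradius aspect ratio of a triangle; the formula below is an illustrative choice and not necessarily the exact measure used in the paper.

```python
import math

def triangle_aspect_ratio(a, b, c):
    """Circumradius / inradius of the triangle a, b, c: one common
    shape-quality measure (equals 2 for an equilateral triangle)."""
    la, lb, lc = math.dist(b, c), math.dist(a, c), math.dist(a, b)
    s = 0.5 * (la + lb + lc)                                        # semi-perimeter
    area = math.sqrt(max(s * (s - la) * (s - lb) * (s - lc), 0.0))  # Heron's formula
    if area == 0.0:
        return float("inf")                                         # degenerate element
    return (la * lb * lc / (4.0 * area)) / (area / s)               # R / r

print(triangle_aspect_ratio((0, 0), (1, 0), (0.5, math.sqrt(3) / 2)))  # ~2.0
```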

Quasi-Monte Carlo Radiosity

by Alexander Keller , 1996
"... The problem of global illumination in computer graphics is described by a second kind Fredholm integral equation. Due to the complexity of this equation, Monte Carlo methods provide an interesting tool for approximating solutions to this transport equation. For the case of the radiosity equation, w ..."
Abstract - Cited by 40 (2 self) - Add to MetaCart
The problem of global illumination in computer graphics is described by a second kind Fredholm integral equation. Due to the complexity of this equation, Monte Carlo methods provide an interesting tool for approximating solutions to this transport equation. For the case of the radiosity equation, we present the deterministic method of quasi-random walks. This method very efficiently uses low discrepancy sequences for integrating the Neumann series and consistently outperforms stochastic techniques. The method of quasi-random walks also is applicable to transport problems in settings other than computer graphics.
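A toy illustration of why low-discrepancy sequences help: the same one-dimensional integral estimated with pseudorandom samples and with a van der Corput sequence. The integrand and sample count are illustrative assumptions; the paper applies the idea to quasi-random walks over the Neumann series of the radiosity equation, not to this toy problem.

```python
import math
import random

def van_der_corput(index, base=2):
    """Low-discrepancy 1-D sequence: digit-reversed fractions in `base`."""
    inv, f = 0.0, 1.0 / base
    while index > 0:
        inv += (index % base) * f
        index //= base
        f /= base
    return inv

f = lambda x: math.sin(math.pi * x)   # toy integrand; exact integral is 2/pi
n = 1024
mc = sum(f(random.random()) for _ in range(n)) / n
qmc = sum(f(van_der_corput(i)) for i in range(1, n + 1)) / n
print("exact", 2 / math.pi, "MC", mc, "QMC", qmc)
```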

Computing the maximum bichromatic discrepancy, with applications to computer graphics and machine learning

by David P. Dobkin, Dimitrios Gunopulos, Wolfgang Maass - J. Computer and System Sciences, 1996
"... Computing the maximum bichromatic discrepancy is an interesting theoretical problem with important applications in computational learning theory, computational geometry and computer graphics. In this paper we give algorithms to compute the maximum bichromatic discrepancy for simple geometric ranges, ..."
Abstract - Cited by 36 (6 self) - Add to MetaCart
Computing the maximum bichromatic discrepancy is an interesting theoretical problem with important applications in computational learning theory, computational geometry and computer graphics. In this paper we give algorithms to compute the maximum bichromatic discrepancy for simple geometric ranges, including rectangles and halfspaces. In addition, we give extensions to other discrepancy problems.

Citation Context

... be reduced with supersampling. One promising approach is the application of the theory of discrepancy or irregularities of distribution, introduced by [BC], and applied to computer graphics first by [S] and [N92]. The discrepancy theory focuses on the problem of approximating one measure (typically a continuous one) with another (typically a discrete one). Its main application is in Quasi Monte-Car...
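To pin down the quantity the paper computes, here is a brute-force sketch of the maximum bichromatic discrepancy over axis-aligned rectangles: the largest red/blue imbalance inside any rectangle. It enumerates only rectangles whose sides pass through point coordinates and does not reproduce the paper's faster algorithms; the example points are made up.

```python
from itertools import combinations_with_replacement

def max_bichromatic_discrepancy(red, blue):
    """Brute-force maximum bichromatic discrepancy over axis-aligned
    rectangles: the largest imbalance |#red - #blue| inside any rectangle.
    It suffices to try rectangles whose sides pass through point coordinates."""
    xs = sorted({p[0] for p in red + blue})
    ys = sorted({p[1] for p in red + blue})
    best = 0
    for x0, x1 in combinations_with_replacement(xs, 2):
        for y0, y1 in combinations_with_replacement(ys, 2):
            def inside(p):
                return x0 <= p[0] <= x1 and y0 <= p[1] <= y1
            best = max(best, abs(sum(map(inside, red)) - sum(map(inside, blue))))
    return best

red = [(0.1, 0.1), (0.2, 0.15)]
blue = [(0.8, 0.9)]
print(max_bichromatic_discrepancy(red, blue))   # 2: a box around the red pair
```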

Computing the discrepancy with applications to supersampling patterns

by David P. Dobkin, David Eppstein, Don P. Mitchell, 1996
"... Patterns used for supersampling in graphics have been analyzed from statistical and signal-processing viewpoints. We present an analysis based on a type of isotropic discrepancy—how good patterns are at estimating the area in a region of defined type. We present algorithms for computing discrepancy ..."
Abstract - Cited by 32 (4 self) - Add to MetaCart
Patterns used for supersampling in graphics have been analyzed from statistical and signal-processing viewpoints. We present an analysis based on a type of isotropic discrepancy: how good patterns are at estimating the area in a region of defined type. We present algorithms for computing discrepancy relative to regions that are defined by rectangles, halfplanes, and higher-dimensional figures. Experimental evidence shows that popular supersampling patterns have discrepancies with better asymptotic behavior than random sampling, which is not inconsistent with theoretical bounds on discrepancy.
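For boxes anchored at the origin, the discrepancy compares the fraction of samples inside [0, u) x [0, v) with the box area u*v. The brute-force sketch below evaluates this at corners taken from the sample coordinates; it is meant only to fix the definition and is far slower than the algorithms in the paper. The grid example is an illustrative choice.

```python
def star_discrepancy_2d(points):
    """Brute-force bound on the 2-D star discrepancy: for candidate corners
    (u, v) drawn from the sample coordinates (plus 1.0), compare the fraction
    of points inside the anchored box [0, u) x [0, v) with its area u * v.
    Checking both open and closed counts gives a tight lower bound on the
    supremum; exact algorithms are more involved."""
    n = len(points)
    xs = sorted({p[0] for p in points} | {1.0})
    ys = sorted({p[1] for p in points} | {1.0})
    worst = 0.0
    for u in xs:
        for v in ys:
            area = u * v
            open_count = sum(1 for x, y in points if x < u and y < v)
            closed_count = sum(1 for x, y in points if x <= u and y <= v)
            worst = max(worst,
                        abs(open_count / n - area),
                        abs(closed_count / n - area))
    return worst

# A regular grid typically scores lower than the same number of random points.
grid = [((i + 0.5) / 4, (j + 0.5) / 4) for i in range(4) for j in range(4)]
print(star_discrepancy_2d(grid))
```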

Global Ray-bundle Tracing with Hardware Acceleration

by Laszlo Szirmay-Kalos, Werner Purgathofer - In Rendering Techniques '98, 1998
"... The paper presents a single-pass, view-dependent method to solve the general rendering equation, using a combined finite element and random walk approach. Applying finite element techniques, the surfaces are decomposed into planar patches that are assumed to have position independent, but not dir ..."
Abstract - Cited by 31 (7 self) - Add to MetaCart
The paper presents a single-pass, view-dependent method to solve the general rendering equation, using a combined finite element and random walk approach. Applying finite element techniques, the surfaces are decomposed into planar patches that are assumed to have position independent, but not direction independent radiance. The direction dependent radiance function is then computed by random walk using bundles of parallel rays. In a single step of the walk, the radiance transfer is evaluated exploiting the hardware z-buffer of workstations, making the calculation fast. The proposed method is particularly efficient for scenes including not very specular materials illuminated by large area lightsources or sky-light. In order to increase the speed for difficult lighting situations, walks can be selected according to their importance. The importance can be explored adaptively by the Metropolis sampling method.

Citation Context

...lo random walk algorithm, called distributed ray-tracing, was proposed by Cook et al. [5], which spawned a set of variations, including path tracing [9], light-tracing [7], Monte-Carlo radiosity [20][14][17], and two-pass methods which combine radiosity and ray-tracing [29]. The problem of naive generation of walks is that the majority of the paths do not contribute to the image at all, and their...
