Results 1–10 of 215
Polygonization of Implicit Surfaces
1988
Abstract
Cited by 376 (3 self)
This paper discusses a numerical technique that approximates an implicit surface with a polygonal representation. The implicit function is adaptively sampled as it is surrounded by a spatial partitioning. The partitioning is represented by an octree, which may either converge to the surface or track it. A piecewise polygonal representation is derived from the octree.
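The adaptive partitioning described above can be sketched in miniature: recursively subdivide a bounding cube, keeping only cells whose sampled values suggest the surface passes through them. The unit-sphere implicit function, the extra center sample, and the fixed subdivision depth are illustrative choices for this sketch, not the paper's actual method.

```python
def f(x, y, z):
    return x * x + y * y + z * z - 1.0  # unit sphere: f = 0 on the surface

def sample_points(cube):
    (x0, y0, z0), size = cube
    pts = [(x0 + dx * size, y0 + dy * size, z0 + dz * size)
           for dx in (0, 1) for dy in (0, 1) for dz in (0, 1)]
    pts.append((x0 + size / 2, y0 + size / 2, z0 + size / 2))  # center sample
    return pts

def may_cross(cube):
    vals = [f(*p) for p in sample_points(cube)]
    return min(vals) <= 0.0 <= max(vals)  # sign change: surface may cross cell

def subdivide(cube, depth, leaves):
    if not may_cross(cube):
        return              # entirely inside or outside (as far as sampling shows)
    if depth == 0:
        leaves.append(cube)  # leaf cell straddling the surface: polygonize next
        return
    (x0, y0, z0), size = cube
    h = size / 2.0
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                subdivide(((x0 + dx * h, y0 + dy * h, z0 + dz * h), h),
                          depth - 1, leaves)

leaves = []
subdivide(((-1.5, -1.5, -1.5), 3.0), 3, leaves)
```

The surviving leaf cells converge onto the sphere; a polygonizer would then derive triangles from the function values at their corners.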
A rapid hierarchical radiosity algorithm
Computer Graphics, 1991
Abstract
Cited by 369 (11 self)
This paper presents a rapid hierarchical radiosity algorithm for illuminating scenes containing large polygonal patches. The algorithm constructs a hierarchical representation of the form factor matrix by adaptively subdividing patches into subpatches according to a user-supplied error bound. The algorithm guarantees that all form factors are calculated to the same precision, removing many common image artifacts due to inaccurate form factors. More importantly, the algorithm decomposes the form factor matrix into at most O(n) blocks (where n is the number of elements). Previous radiosity algorithms represented the element-to-element transport interactions with n² form factors. Visibility algorithms are given that work well with this approach. Standard techniques for shooting and gathering can be used with the hierarchical representation to solve for equilibrium radiosities, but we also discuss using a brightness-weighted error criterion, in conjunction with multigridding, to even more rapidly progressively refine the image.
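A toy sketch of the adaptive subdivision idea, assuming two parallel 1D "patches" (segments) and a crude midpoint form-factor estimate standing in for the paper's error oracle: a patch pair is linked as soon as the estimated transport falls below the error bound, so interactions are recorded at the coarsest adequate level of the hierarchy.

```python
def form_factor_estimate(a, b, dist):
    # crude estimate: grows with patch sizes, falls with squared distance
    return ((a[1] - a[0]) * (b[1] - b[0])) / (dist * dist)

def refine(a, b, dist, eps, links):
    if form_factor_estimate(a, b, dist) <= eps:
        links.append((a, b))  # one block of the form factor matrix
        return
    # subdivide the larger patch and refine against its children
    if (a[1] - a[0]) >= (b[1] - b[0]):
        mid = 0.5 * (a[0] + a[1])
        refine((a[0], mid), b, dist, eps, links)
        refine((mid, a[1]), b, dist, eps, links)
    else:
        mid = 0.5 * (b[0] + b[1])
        refine(a, (b[0], mid), dist, eps, links)
        refine(a, (mid, b[1]), dist, eps, links)

links = []
refine((0.0, 1.0), (0.0, 1.0), 1.0, 0.3, links)
```

With the error bound 0.3 the recursion stops at half-patches and records 4 links rather than refining all the way down to element level.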
Efficient ray tracing of volume data
ACM Transactions on Graphics, 1990
Abstract
Cited by 327 (4 self)
Volume rendering is a technique for visualizing sampled scalar or vector fields of three spatial dimensions without fitting geometric primitives to the data. A subset of these techniques generates images by computing 2D projections of a colored semitransparent volume, where the color and opacity at each point are derived from the data using local operators. Since all voxels participate in the generation of each image, rendering time grows linearly with the size of the dataset. This paper presents a front-to-back image-order volume-rendering algorithm and discusses two techniques for improving its performance. The first technique employs a pyramid of binary volumes to encode spatial coherence present in the data, and the second technique uses an opacity threshold to adaptively terminate ray tracing. Although the actual time saved depends on the data, speedups of an order of magnitude have been observed for datasets of useful size and complexity. Examples from two applications are given: medical imaging and molecular graphics.
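The opacity-threshold technique can be sketched with a single ray: composite samples front to back and stop as soon as accumulated opacity is nearly full, since further samples can barely change the pixel. The voxel values and the 0.95 threshold below are illustrative.

```python
def composite_ray(samples, opacity_threshold=0.95):
    """samples: (color, opacity) pairs ordered front to back along one ray."""
    color, alpha = 0.0, 0.0
    steps = 0
    for c, a in samples:
        color += (1.0 - alpha) * a * c  # standard front-to-back compositing
        alpha += (1.0 - alpha) * a
        steps += 1
        if alpha >= opacity_threshold:  # ray is nearly opaque: terminate early
            break
    return color, alpha, steps

# mostly opaque material early along the ray ends traversal before the last sample
ray = [(1.0, 0.6), (0.5, 0.8), (0.2, 0.5), (0.9, 0.9)]
color, alpha, steps = composite_ray(ray)
```

Here the ray terminates after three of the four samples; over a whole volume, such early termination is one source of the order-of-magnitude speedups reported above.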
The Accumulation Buffer: Hardware Support for High-Quality Rendering
1990
Abstract
Cited by 151 (3 self)
This paper describes a system architecture that supports real-time generation of complex images, efficient generation of extremely high-quality images, and a smooth trade-off between the two. Based on the paradigm of integration, the architecture extends a state-of-the-art rendering system with an additional high-precision image buffer. This additional buffer, called the Accumulation Buffer, is used to integrate images that are rendered into the framebuffer. While originally conceived as a solution to the problem of aliasing, the Accumulation Buffer provides a general solution to the problems of motion blur and depth-of-field as well. Because the architecture is a direct extension of current workstation rendering technology, we begin by discussing the performance and quality characteristics of that technology. The problem of spatial aliasing is then discussed, and the Accumulation Buffer is shown to be a desirable solution. Finally the generality of the Accumulation Buffer is explored, concentrating on its application to the problems of motion blur, depth-of-field, and soft shadows.
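The integration paradigm can be sketched as rendering the scene several times with subpixel jitter and averaging the frames in a high-precision buffer. The toy one-dimensional `render` function below (a hard edge across four pixels) stands in for the hardware rasterizer and is not from the paper.

```python
def render(jitter_x):
    """Toy four-pixel 'frame': 1.0 where x + jitter < 0.5, else 0.0 (a hard edge)."""
    return [1.0 if (x + jitter_x) < 0.5 else 0.0 for x in (0.0, 0.25, 0.5, 0.75)]

def accumulate(jitters):
    acc = [0.0, 0.0, 0.0, 0.0]      # high-precision accumulation buffer
    for j in jitters:
        frame = render(j)            # one jittered rendering pass
        acc = [a + f for a, f in zip(acc, frame)]
    n = len(jitters)
    return [a / n for a in acc]      # averaged (antialiased) image

image = accumulate([0.0, 0.125, 0.25, 0.375])
```

The averaged edge pixel takes an intermediate value instead of snapping to 0 or 1, which is exactly the antialiasing effect; substituting time offsets or lens positions for the jitter yields motion blur or depth-of-field by the same accumulation.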
A survey of shadow algorithms
IEEE Computer Graphics and Applications, 1990
Abstract
Cited by 126 (3 self)
Essential to realistic and visually appealing images, shadows are difficult to compute in most display environments. This survey characterizes the various types of shadows. It also describes most existing shadow algorithms and discusses their complexities, advantages, and shortcomings. We examine hard shadows, soft shadows, shadows of transparent objects, and shadows for complex modeling primitives. For each type, we examine shadow algorithms within various rendering techniques. This survey attempts to provide readers with enough background and insight on the various methods to allow them to choose the algorithm best suited to their needs. We also hope that our analysis will help identify the areas that need more research and point to possible solutions. A shadow, a region of relative darkness within an illuminated region, occurs when an object totally or partially occludes the light. A transparent object does not necessarily attenuate the light it occludes. In fact, it can concentrate light. However, as is traditional in image synthesis, we will consider a region to be in
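The hard-shadow case can be sketched with a single shadow ray: a point is in shadow of a point light when the ray from the point toward the light is blocked before reaching it. The sphere occluder and the ray-sphere intersection test are illustrative choices for this sketch.

```python
def hits_sphere(origin, direction, center, radius, max_t):
    """Solve |o + t*d - c|^2 = r^2 for t in (0, max_t); direction must be unit length."""
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return False                 # ray misses the sphere entirely
    t = -b - disc ** 0.5             # nearest intersection distance
    return 0.0 < t < max_t           # occluder lies between point and light

def in_hard_shadow(point, light, occluder_center, occluder_radius):
    d = [l - p for l, p in zip(light, point)]
    dist = sum(c * c for c in d) ** 0.5
    direction = [c / dist for c in d]
    return hits_sphere(point, direction, occluder_center, occluder_radius, dist)

# a unit sphere at z = 2 blocks the light at z = 4 from the origin, but not
# from a point off to the side
shadowed = in_hard_shadow([0.0, 0.0, 0.0], [0.0, 0.0, 4.0], [0.0, 0.0, 2.0], 1.0)
lit = in_hard_shadow([3.0, 0.0, 0.0], [0.0, 0.0, 4.0], [0.0, 0.0, 2.0], 1.0)
```

Soft shadows, by contrast, require integrating such visibility queries over an area light rather than a single point.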
A language for shading and lighting calculations
Computer Graphics (SIGGRAPH ’90 Proceedings), 1990
Abstract
Cited by 108 (6 self)
A shading language provides a means to extend the shading and lighting formulae used by a rendering system. This paper discusses the design of a new shading language based on previous work of Cook and Perlin. This language has various types of shaders for light sources and surface reflectances, point and color data types, control flow constructs that support the casting of outgoing and the integration of incident light, a clearly specified interface to the rendering system using global state variables, and a host of useful built-in functions. The design issues and their impact on the implementation are also discussed. CR Categories: I.3.3 [Computer Graphics]: Picture/Image Generation: Display algorithms; I.3.5 [Computer Graphics]
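The split between light-source shaders and surface shaders can be sketched in miniature, with a surface shader integrating the incident light supplied by light shaders. The plain-function encoding and all names below are illustrative; they are not the paper's language syntax.

```python
def point_light(position, intensity):
    """Light shader: reports direction and intensity of light arriving at a point."""
    def light(surface_point):
        d = [p - s for p, s in zip(position, surface_point)]
        norm = sum(c * c for c in d) ** 0.5
        return [c / norm for c in d], intensity
    return light

def diffuse_surface(kd, normal):
    """Surface shader: integrates incident light with a Lambertian response."""
    def shade(surface_point, lights):
        total = 0.0
        for light in lights:  # loop plays the role of integrating incident light
            direction, intensity = light(surface_point)
            cos_t = max(0.0, sum(n * d for n, d in zip(normal, direction)))
            total += kd * intensity * cos_t
        return total
    return shade

shade = diffuse_surface(kd=0.8, normal=[0.0, 0.0, 1.0])
value = shade([0.0, 0.0, 0.0], [point_light([0.0, 0.0, 2.0], 1.0)])
```

The key design point mirrored here is that the surface shader never needs to know what kinds of lights exist, only how to query the light they deliver.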
Monte Carlo Techniques for Direct Lighting Calculations
ACM Transactions on Graphics, 1996
Abstract
Cited by 88 (9 self)
In a distribution ray tracer, the crucial part of the direct lighting calculation is the sampling strategy for shadow ray testing. Monte Carlo integration with importance sampling is used to carry out this calculation. Importance sampling involves the design of integrand-specific probability density functions which are used to generate sample points for the numerical quadrature. Probability density functions are presented that aid in the direct lighting calculation from luminaires of various simple shapes. A method for defining a probability density function over a set of luminaires is presented that allows the direct lighting calculation to be carried out with one sample, regardless of the number of luminaires. CR Categories and Subject Descriptors: G.1.4 [Mathematical Computing]: Quadrature and Numerical Differentiation; I.3.0 [Computer Graphics]: General; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism. Additional Key Words and Phrases: direct lighting, importanc...
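The one-sample strategy over a set of luminaires can be sketched directly: pick one luminaire with probability proportional to an assumed importance weight, then divide its contribution by that probability, which keeps the estimator unbiased regardless of the number of luminaires. The weights and contributions below are toy values, not from the paper.

```python
def direct_light_one_sample(luminaires, u):
    """luminaires: (weight, contribution) pairs; u is a uniform sample in [0, 1).
    Chooses one luminaire with probability weight/total, returning an
    unbiased single-sample estimate of the summed contributions."""
    total_w = sum(w for w, _ in luminaires)
    threshold = u * total_w
    acc = 0.0
    for w, contribution in luminaires:
        acc += w
        if threshold < acc:
            return contribution / (w / total_w)  # divide by selection probability
    # floating-point edge case: fall back to the last luminaire
    w, contribution = luminaires[-1]
    return contribution / (w / total_w)

estimate = direct_light_one_sample([(1.0, 2.0), (3.0, 6.0)], 0.5)
```

In this example the weights happen to be proportional to the true contributions, so the estimate equals the exact sum (8.0) for every value of u, illustrating why careful probability density design reduces variance.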
The Irradiance Volume
1996
Abstract
Cited by 81 (8 self)
This thesis presents a volumetric representation for the global illumination within a space based on the radiometric quantity irradiance. We call this representation the irradiance volume. Although irradiance is traditionally computed only for surfaces, its definition can be naturally extended to all points and directions in space. The irradiance volume supports the reconstruction of believable approximations to the illumination in situations that overwhelm traditional global illumination algorithms. A theoretical basis for the irradiance volume is discussed and the methods and issues involved with building the volume are described. The irradiance volume method is tested within several situations in which the use of traditional global illumination methods is impractical, and is shown to provide good performance.
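Reconstruction between stored samples can be sketched as trilinear interpolation over one grid cell of irradiance values. This is one plausible reconstruction scheme for such a volume, and the corner values below are illustrative.

```python
def trilerp(corner_vals, fx, fy, fz):
    """corner_vals[ix][iy][iz] with ix, iy, iz in {0, 1};
    fx, fy, fz are fractional positions in [0, 1] within the cell."""
    def lerp(a, b, t):
        return a + (b - a) * t
    # interpolate along x on each of the four cell edges
    c00 = lerp(corner_vals[0][0][0], corner_vals[1][0][0], fx)
    c10 = lerp(corner_vals[0][1][0], corner_vals[1][1][0], fx)
    c01 = lerp(corner_vals[0][0][1], corner_vals[1][0][1], fx)
    c11 = lerp(corner_vals[0][1][1], corner_vals[1][1][1], fx)
    # then along y, then along z
    c0 = lerp(c00, c10, fy)
    c1 = lerp(c01, c11, fy)
    return lerp(c0, c1, fz)

# irradiance 0 on one face of the cell, 8 on the opposite face
cell = [[[0.0, 0.0], [0.0, 0.0]], [[8.0, 8.0], [8.0, 8.0]]]
center = trilerp(cell, 0.5, 0.5, 0.5)
```

A shader can then query believable illumination at any free-space point without invoking a full global illumination solution.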
Spatially nonuniform scaling functions for high contrast images
In Proceedings of Graphics Interface ’93, 1993
Abstract
Cited by 80 (6 self)
An algorithm is presented that scales the pixel intensities of a computer-generated greyscale image so that they are all displayable on a standard CRT. This scaling is spatially nonuniform over the image in that different pixels with the same intensity in the original image may have different intensities in the resulting image. The goal of this scaling transformation is to produce an image on the CRT that perceptually mimics the calculated image, while staying within the physical limitations of the CRT. CR Categories and Subject Descriptors: I.3.0 [Computer Graphics]:
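The non-uniformity can be sketched in one dimension: dividing each pixel by a local average means two pixels with the same input intensity can receive different display values. The specific scaling below (halved local mean, clamped to 1.0) is an illustrative stand-in for the paper's scaling function.

```python
def local_average(img, i, radius=1):
    """Mean of the pixel and its neighbors within `radius` (clipped at borders)."""
    lo, hi = max(0, i - radius), min(len(img), i + radius + 1)
    return sum(img[lo:hi]) / (hi - lo)

def scale_nonuniform(img):
    # each pixel is scaled relative to its own neighborhood, then clamped
    # to the displayable range [0, 1]
    return [min(1.0, p / (2.0 * local_average(img, i)))
            for i, p in enumerate(img)]

# the pixels with value 4.0 land in different neighborhoods, so they map
# to different display intensities
img = [4.0, 4.0, 4.0, 100.0, 4.0]
out = scale_nonuniform(img)
```

The pixel adjacent to the bright spike is compressed far more than the identical-valued pixel at the far end, which is exactly the spatially non-uniform behavior described above.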
Fast hierarchical importance sampling with blue noise properties
ACM Transactions on Graphics, 2004
Abstract
Cited by 76 (8 self)
This paper presents a novel method for efficiently generating a good sampling pattern given an importance density over a 2D domain. A Penrose tiling is hierarchically subdivided creating a sufficiently large number of sample points. These points are numbered using the Fibonacci number system, and these numbers are used to threshold the samples against the local value of the importance density. Precomputed correction vectors, obtained using relaxation, are used to improve the spectral characteristics of the sampling pattern. The technique is deterministic and very fast; the sampling time grows linearly with the required number of samples. We illustrate our technique with importance-based environment mapping, but the technique is versatile enough to be used in a large variety of computer graphics applications, such as light transport calculations, digital halftoning, geometry processing, and various rendering techniques.
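The thresholding step can be sketched with deterministic ranks: each candidate point carries a rank in [0, 1) (here a golden-ratio sequence standing in for the Fibonacci-number-system ranks) and survives where its rank is below the local normalized importance density. The candidate layout and the density function are illustrative, not the paper's Penrose-tiling construction.

```python
def density(x, y):
    return x  # toy importance density: grows to the right, max value 1

def candidates(n):
    """Deterministic candidate points, each with a well-distributed rank."""
    phi = (5 ** 0.5 - 1) / 2  # golden-ratio sequence for ranks and y coordinates
    for i in range(n):
        x = (i + 0.5) / n
        y = (i * phi) % 1.0
        rank = ((i + 1) * phi) % 1.0
        yield x, y, rank

def threshold_samples(n):
    # keep a candidate where its rank falls below the local importance
    return [(x, y) for x, y, rank in candidates(n) if rank < density(x, y)]

samples = threshold_samples(200)
```

Because the ranks are evenly distributed, the surviving sample count in any region is roughly proportional to the integrated density there: the right half of the domain receives far more samples than the left.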