Results 1-10 of 23
Discontinuity Meshing for Radiosity
 Third Eurographics Workshop on Rendering
, 1992
Abstract

Cited by 90 (2 self)
The radiosity method is the most popular algorithm for simulating interreflection of light between diffuse surfaces. Most existing radiosity algorithms employ simple meshes and piecewise constant approximations, thereby constraining the radiosity function to be constant across each polygonal element. Much more accurate simulations are possible if linear, quadratic, or higher degree approximations are used. In order to realize the potential accuracy of higher-degree approximations, however, it is necessary for the radiosity mesh to resolve discontinuities such as shadow edges in the radiosity function. A discontinuity meshing algorithm is presented that places mesh boundaries directly along discontinuities. Such algorithms offer the potential of faster, more accurate simulations. Results are shown for three-dimensional scenes.
Keywords: global illumination, diffuse interreflection, adaptive mesh, shadow.
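As a hedged illustration of the core idea only (not the paper's 3D algorithm), the sketch below works in a flatland-style setup: it projects the endpoints of a linear light source through an occluder edge onto a flat receiver to locate shadow discontinuities, then merges those points into a uniform 1D mesh. All function names and the 2D geometry are illustrative assumptions.

```python
# Illustrative sketch only: locate shadow discontinuities on a receiver
# (the line y = 0) cast by an occluder edge under a linear light source,
# and insert them as extra mesh vertices. Not the paper's 3D algorithm.

def shadow_discontinuities(light_xs, light_h, occ_x, occ_h):
    """x-coordinates on the receiver where shadow boundaries fall."""
    # Similar triangles: the ray from (lx, light_h) through (occ_x, occ_h)
    # hits y = 0 at occ_x + (occ_x - lx) * occ_h / (light_h - occ_h).
    return sorted(occ_x + (occ_x - lx) * occ_h / (light_h - occ_h)
                  for lx in light_xs)

def discontinuity_mesh(x_min, x_max, base_divs, disc_points):
    """Uniform mesh vertices plus in-range discontinuity points, sorted."""
    verts = {x_min + (x_max - x_min) * i / base_divs
             for i in range(base_divs + 1)}
    verts |= {p for p in disc_points if x_min < p < x_max}
    return sorted(verts)

# Light spanning x in [-1, 1] at height 4, occluder edge at (0, 2):
d = shadow_discontinuities((-1.0, 1.0), 4.0, 0.0, 2.0)   # [-1.0, 1.0]
mesh = discontinuity_mesh(-4.0, 4.0, 4, d)
```

A mesh built this way has element boundaries exactly at the shadow edges, which is the property that lets higher-degree elements pay off.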
The Irradiance Volume
, 1996
Abstract

Cited by 81 (8 self)
This thesis presents a volumetric representation for the global illumination within a space based on the radiometric quantity irradiance. We call this representation the irradiance volume. Although irradiance is traditionally computed only for surfaces, its definition can be naturally extended to all points and directions in space. The irradiance volume supports the reconstruction of believable approximations to the illumination in situations that overwhelm traditional global illumination algorithms. A theoretical basis for the irradiance volume is discussed, and the methods and issues involved in building the volume are described. The irradiance volume method is tested in several situations in which the use of traditional global illumination methods is impractical, and is shown to provide good performance.
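The actual irradiance volume stores directional irradiance distributions at its sample points; as a much-simplified sketch of the volumetric-lookup idea (a single scalar per lattice point, all names assumed), trilinear interpolation can reconstruct an estimate at any query point inside a cell:

```python
# Simplified sketch: each lattice point holds one scalar irradiance value
# (the real irradiance volume stores directional distributions), and queries
# inside a cell are answered by trilinear interpolation.

def trilerp(grid, x, y, z):
    """Interpolate grid[i][j][k] samples at a point strictly inside a cell."""
    x0, y0, z0 = int(x), int(y), int(z)
    fx, fy, fz = x - x0, y - y0, z - z0
    g = lambda i, j, k: grid[x0 + i][y0 + j][z0 + k]
    # Interpolate along x, then y, then z.
    c00 = g(0, 0, 0) * (1 - fx) + g(1, 0, 0) * fx
    c10 = g(0, 1, 0) * (1 - fx) + g(1, 1, 0) * fx
    c01 = g(0, 0, 1) * (1 - fx) + g(1, 0, 1) * fx
    c11 = g(0, 1, 1) * (1 - fx) + g(1, 1, 1) * fx
    c0 = c00 * (1 - fy) + c10 * fy
    c1 = c01 * (1 - fy) + c11 * fy
    return c0 * (1 - fz) + c1 * fz

# One 2x2x2 cell: irradiance 0 on the z = 0 face, 1 on the z = 1 face.
cell = [[[0.0, 1.0], [0.0, 1.0]], [[0.0, 1.0], [0.0, 1.0]]]
e = trilerp(cell, 0.5, 0.5, 0.5)   # 0.5
```

The point of such a structure is that lookups cost a handful of multiplies regardless of scene complexity, which is what makes the approach viable where full global illumination is impractical.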
Physically Based Lighting Calculations for Computer Graphics
, 1991
Abstract

Cited by 69 (12 self)
Realistic image generation is presented in a theoretical formulation that builds on previous work on the rendering equation. Previous and new solution techniques for global illumination are discussed in the context of this formulation. The basic ...
Radioptimization - Goal Based Rendering
 In Computer Graphics Proceedings, Annual Conference Series
, 1993
Abstract

Cited by 43 (0 self)
This paper presents a method for designing the illumination in an environment using optimization techniques applied to a radiosity-based image synthesis system. An optimization of lighting parameters is performed based on user-specified constraints and objectives for the illumination of the environment. The system solves for the "best" possible settings for light source emissivities, element reflectivities, and spot light directionality parameters so that the design goals, such as to minimize energy or to give the room an impression of privacy, are met. The system absorbs much of the burden of searching the design space, allowing the user to focus on the goals of the illumination design rather than the intricate details of a complete lighting specification. A software implementation is described and some results of using the system are reported.
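The paper's optimizer and objectives are more elaborate; as a hedged toy sketch of the underlying idea, the code below assumes the achieved illumination responds linearly to the light emissivities through a known matrix M (an assumption, not the paper's model) and fits nonnegative emissivities to target illumination levels by projected gradient descent.

```python
# Toy sketch of goal-based lighting design (not the paper's solver):
# assume illumination responds linearly to emissivities e via a matrix M,
# and minimize ||M e - target||^2 subject to e >= 0 by projected gradient
# descent. M, lr, and all names are illustrative assumptions.

def optimize_emissivities(M, target, lr=0.1, steps=2000):
    m, n = len(M), len(M[0])
    e = [0.0] * n
    for _ in range(steps):
        # Residual between achieved and desired illumination levels.
        r = [sum(M[i][j] * e[j] for j in range(n)) - target[i]
             for i in range(m)]
        for j in range(n):
            grad = 2.0 * sum(M[i][j] * r[i] for i in range(m))
            e[j] = max(0.0, e[j] - lr * grad)   # project onto e >= 0
    return e

# Two lights, two surfaces; each light mostly drives one surface:
M = [[1.0, 0.0], [0.0, 2.0]]
e = optimize_emissivities(M, [1.0, 1.0])   # converges near [1.0, 0.5]
```

This captures the division of labor the abstract describes: the user states targets, and a numerical search sets the lighting parameters.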
Radiosity in Flatland
 Computer Graphics Forum
, 1992
Abstract

Cited by 30 (2 self)
The radiosity method for the simulation of interreflection of light between diffuse surfaces is such a common image synthesis technique that its derivation is worthy of study. We here examine the radiosity method in a two-dimensional, flatland world. It is shown that the radiosity method is a simple finite element method for the solution of the integral equation governing global illumination. These two-dimensional studies help explain the radiosity method in general and suggest a number of improvements to existing algorithms. In particular, radiosity solutions can be improved using a priori discontinuity meshing, placing mesh boundaries on discontinuities such as shadow edges. When discontinuity meshing is used along with piecewise-linear approximations instead of the current piecewise-constant approximations, the accuracy of radiosity simulations can be greatly increased.
Keywords: integral equation, adaptive mesh, finite element method, discontinuity, shadow, global illumination, di...
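With piecewise-constant elements, the finite element view reduces the governing integral equation to the familiar linear system B = E + rho * F * B. A minimal sketch (hand-made two-patch form factors; names illustrative) solves that system by Jacobi iteration:

```python
# Minimal sketch of radiosity as a finite element system: with
# piecewise-constant elements the integral equation discretizes to
# B = E + rho * F * B, solved here by Jacobi iteration. The two-patch
# form factors below are hand-made for illustration.

def solve_radiosity(E, rho, F, iters=100):
    n = len(E)
    B = list(E)
    for _ in range(iters):
        B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(n))
             for i in range(n)]
    return B

# Two parallel patches that see only each other (F[i][j] = 1 for i != j):
B = solve_radiosity(E=[1.0, 0.0], rho=[0.5, 0.5],
                    F=[[0.0, 1.0], [1.0, 0.0]])
# Fixed point: B0 = 1 + 0.5*B1 and B1 = 0.5*B0, i.e. B = [4/3, 2/3]
```

Discontinuity meshing changes only how the elements (and hence F) are laid out; the solve itself is unchanged.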
A multiresolution relational data model
 In Proc. 18th Int. Conf. Very Large Data Bases
, 1992
Abstract

Cited by 23 (0 self)
The use of data at different levels of information content is essential to the performance of multimedia, scientific, and other large databases because it can significantly decrease I/O and communication costs.
Time Complexity of Monte Carlo Radiosity
, 1992
Abstract

Cited by 21 (4 self)
The time complexity of Monte Carlo radiosity is discussed, and a proof is given that the expected number of rays required to produce a statistical radiosity solution below a specified variance for N zones is O(N). A satisfactory solution is defined to be one in which the variance of radiance estimates for each zone is below a predefined threshold. The proof assumes that the radiance is bounded, and the area ratio of the largest to smallest zone is bounded.
1 Introduction. In a radiosity (zonal) program, the surfaces in the environment are broken into N zones, z_i, and the radiance, L_i, of each zone is calculated [6, 7]. In the most straightforward radiosity method, all N^2 relationships (form factors) are explicitly calculated, so the time complexity of the program is at least O(N^2). One of the first schemes to lower the radiosity calculation time was to group the N zones into p patches, and transfer power from patches to zones (elements) [4]. Still, the computation time wil...
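A hedged sketch of the stopping rule the abstract defines: for one zone, keep drawing samples until the estimated variance of the running mean falls below a threshold. The estimator and all names are illustrative, not taken from the paper.

```python
# Sketch of per-zone Monte Carlo estimation with a variance-based stopping
# rule: sample until the estimated variance of the running mean drops below
# a threshold. The sample source and names are illustrative assumptions.

def estimate_until_converged(sample, threshold, max_n=1_000_000):
    """Return (mean, n) once Var[mean] ~ s^2 / n falls below threshold."""
    n, s, s2 = 0, 0.0, 0.0
    while n < max_n:
        x = sample()
        n += 1
        s += x
        s2 += x * x
        if n >= 2:
            # Unbiased sample variance of the data, divided by n.
            var_of_mean = (s2 - s * s / n) / (n - 1) / n
            if var_of_mean < threshold:
                break
    return s / n, n
```

Halving the threshold roughly doubles a zone's ray count; bounding this per-zone cost across N zones of bounded radiance is the kind of argument the paper's O(N) result formalizes.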
Radiosity and Relaxation Methods - Progressive Refinement is Southwell Relaxation
 Department of Computer Science, Princeton University
, 1993
Abstract

Cited by 17 (2 self)
The radiosity method for realistic image synthesis has been described in the computer graphics literature since 1984. This paper discusses the various algorithms which have been developed for solving the radiosity problem and places them in the context of the literature on solving systems of linear equations. The progressive radiosity method developed in 1988 is shown to be equivalent to a numerical technique known as Southwell iteration. A proof of convergence for this method when used for the radiosity problem is presented in the appendix. A new overshooting (similar to over-relaxation) method is developed as a means of accelerating the convergence of the iterative radiosity methods.
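The Southwell connection is visible in the algorithm's structure: each step relaxes the equation with the largest residual, i.e., shoots the unshot radiosity of the patch holding the most unshot energy. A minimal sketch, assuming equal patch areas (so form factors need no area weighting) and illustrative names:

```python
# Minimal progressive-refinement (Southwell-style) sketch: repeatedly shoot
# the unshot radiosity of the patch with the largest residual. Equal patch
# areas are assumed so F[j][i] alone relates patches; names illustrative.

def progressive_radiosity(E, rho, F, eps=1e-9, max_shots=10_000):
    n = len(E)
    B = list(E)        # current radiosity estimates
    dB = list(E)       # unshot (residual) radiosity per patch
    for _ in range(max_shots):
        i = max(range(n), key=lambda k: dB[k])   # largest residual: Southwell
        if dB[i] < eps:
            break
        shot, dB[i] = dB[i], 0.0
        for j in range(n):
            delta = rho[j] * F[j][i] * shot      # radiosity gathered by j
            B[j] += delta
            dB[j] += delta
    return B

# Two patches facing each other:
B = progressive_radiosity(E=[1.0, 0.0], rho=[0.3, 0.6],
                          F=[[0.0, 1.0], [1.0, 0.0]])
# Converges toward B0 = 1/(1 - 0.18) and B1 = 0.6 * B0
```

Because every shot zeroes the largest residual, the whole-solution estimate improves monotonically, which is why progressive refinement yields usable images long before full convergence.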
Multidimensional illumination functions for visualisation of complex 3d environments
 The Journal of Visualisation and Computer Animation
, 1990
Abstract

Cited by 7 (1 self)
This paper presents a new view-independent, energy equilibrium method for determining the light distributed in a complex 3D environment consisting of surfaces with general reflectance properties. The method does not depend on discretisation of directions or discretisation of surfaces to differential elements. Hence, it is a significant improvement over the earlier complete view-independent method, which is computationally intractable for complex environments, or the hybrid methods, which include an extended view-dependent ray tracing second pass. The new method is based on an efficient data structure of order O(N^2) named the spherical cover. The spherical cover elegantly captures the complex multidimensional directional nature of light distributed over surfaces. Subdivision techniques based on range estimation of various parameters using interval-arithmetic-like methods are next described for efficiently computing the spherical cover for a given 3D environment. Using the spherical cover, light is progressively propagated through the environment until energy equilibrium is reached. Complexity analysis of the propagation step is carried out to show that the method is computationally tractable. The paper also includes a comprehensive review of earlier rendering techniques viewed from the point of capturing the multidimensional nature of light distribution over surfaces.
Quality Image Metrics for Synthetic Images based on Perceptual Color Differences
 IEEE Transactions on Image Processing
, 2002
Abstract

Cited by 5 (1 self)
Due to the improvement of image rendering processes, and the increasing importance of quantitative comparisons among synthetic color images, it is essential to define perceptually based metrics that make it possible to objectively assess the visual quality of digital simulations. In response to this need, this paper proposes a new methodology for the determination of an objective image quality metric, and gives an answer to this problem through three metrics. This methodology is based on the LLAB color space for perception of color in complex images, a recent modification of the CIELAB 1976 color space. The first metric proposed is a pixel-by-pixel metric which introduces a local distance map between two images. The second metric associates a single global value with a pair of images. Finally, the third metric uses a recursive subdivision of the images to obtain an adaptive distance map, rougher but less expensive to compute than the first method.
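As a hedged sketch of the first two metrics only (a per-pixel distance map, then a single global value), with a plain Euclidean distance standing in for the LLAB color-difference formula, which this sketch does not model:

```python
import math

# Sketch of the first two metrics in spirit only: a per-pixel distance map
# and one global score. Euclidean distance between color triples stands in
# for the LLAB color-difference formula, which is not modeled here.

def distance_map(img_a, img_b):
    """Per-pixel color distance between two images of equal size."""
    return [[math.dist(pa, pb) for pa, pb in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

def global_metric(dmap):
    """Collapse a distance map to one number: the mean per-pixel distance."""
    values = [d for row in dmap for d in row]
    return sum(values) / len(values)

a = [[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]]
b = [[(0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]]
dmap = distance_map(a, b)      # [[0.0, 1.0]]
score = global_metric(dmap)    # 0.5
```

The local map shows where two renderings differ; the global value makes whole-image comparisons sortable, which is what quantitative comparison of rendering algorithms needs.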