Results 1 - 4 of 4
Scan Converting Spirals
Proc. of WSCG 2002
Abstract

Cited by 1 (1 self)
Scan conversion of Archimedes' spiral (a straight line in polar coordinates) is investigated. It is shown that an exact algorithm requires transcendental functions and, thus, cannot have a fast and exact integer implementation. Piecewise polynomial approximations are discussed and a simple algorithm based on piecewise circular approximation is derived. Variations of the algorithm allow other types of spirals to be scan converted.
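The abstract's central claim can be seen in even the most direct approach: a minimal sketch (not the paper's algorithm) that rasterizes r = a·theta by stepping the angle necessarily calls cos and sin, exactly the transcendental functions the paper shows cannot be avoided by an exact method. The paper's piecewise circular approximation replaces these calls; the step count and parameters here are illustrative choices.

```python
import math

def scan_convert_spiral(a, turns, steps_per_turn=200):
    """Naively scan convert the Archimedes spiral r = a * theta.

    Relies on transcendental functions (cos, sin), illustrating why an
    exact algorithm cannot have a fast, purely integer implementation.
    """
    pixels = []
    n = turns * steps_per_turn
    for i in range(n + 1):
        theta = 2.0 * math.pi * i / steps_per_turn
        r = a * theta
        # Round the continuous point to the nearest pixel.
        x = round(r * math.cos(theta))
        y = round(r * math.sin(theta))
        if not pixels or pixels[-1] != (x, y):
            pixels.append((x, y))
    return pixels
```

A piecewise circular approximation would instead emit short circular arcs (rasterizable with integer midpoint-style algorithms) whose radii track r = a·theta, avoiding per-pixel trigonometry.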
Robust Adaptive Polygonal Approximation of Implicit Curves
Abstract

Cited by 1 (0 self)
We present an algorithm for computing a robust adaptive polygonal approximation of an implicit curve in the plane. The approximation is adapted to the geometry of the curve because the length of the edges varies with the curvature of the curve. Robustness is achieved by combining interval arithmetic and automatic differentiation.
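The robustness the abstract describes comes from interval arithmetic's guarantee: if f evaluated over a box yields an interval excluding zero, the implicit curve f = 0 provably misses that box. A minimal sketch of that exclusion test, with a toy interval type (a hypothetical illustration, not the paper's implementation, which also uses automatic differentiation):

```python
class Interval:
    """Toy interval type supporting +, -, * (hypothetical sketch)."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

def box_may_contain_curve(f, x, y):
    """False only when interval evaluation proves f != 0 on the box."""
    r = f(x, y)
    return r.lo <= 0.0 <= r.hi

# Example implicit curve: the unit circle, f(x, y) = x^2 + y^2 - 1.
circle = lambda x, y: x * x + y * y - Interval(1.0, 1.0)
```

An adaptive polygonizer recursively subdivides boxes, discarding those the test rejects and refining further where the curve may pass, so edge lengths naturally adapt to where the curve actually lies.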
Interval Arithmetic, Affine Arithmetic, Taylor Series Methods: Why, What Next?
, 2003
Abstract
In interval computations, the range of each intermediate result r is described by an interval r. To decrease excess interval width, we can keep some information on how r depends on the input x = (x_1, ..., x_n). There are several successful methods of approximating this dependence; in these methods, the dependence is approximated by linear functions (affine arithmetic) or by general polynomials (Taylor series methods). Why linear functions and polynomials? What other classes can we try? These questions are answered in this paper.
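The excess width this abstract targets is easiest to see on x - x: plain interval arithmetic forgets that both operands are the same variable and returns [1, 3] - [1, 3] = [-2, 2], while affine arithmetic tracks the linear dependence and cancels it exactly. A toy affine form (a hypothetical sketch, keeping only the linear noise terms) makes this concrete:

```python
class Affine:
    """Toy affine form x0 + sum(x_i * eps_i), eps_i in [-1, 1].

    A hypothetical sketch of affine arithmetic's dependence tracking;
    only subtraction is implemented here.
    """
    _next_symbol = [0]

    def __init__(self, lo=None, hi=None, center=0.0, terms=None):
        if terms is None:
            # Fresh input variable: gets its own noise symbol eps_i.
            self.center = (lo + hi) / 2.0
            self.terms = {Affine._next_symbol[0]: (hi - lo) / 2.0}
            Affine._next_symbol[0] += 1
        else:
            self.center, self.terms = center, terms

    def __sub__(self, other):
        # Shared noise symbols cancel term by term.
        t = dict(self.terms)
        for k, v in other.terms.items():
            t[k] = t.get(k, 0.0) - v
        return Affine(center=self.center - other.center, terms=t)

    def range(self):
        rad = sum(abs(v) for v in self.terms.values())
        return (self.center - rad, self.center + rad)

x = Affine(1.0, 3.0)
print((x - x).range())  # (0.0, 0.0): the dependence cancels exactly
```

Taylor series methods generalize the same idea, approximating the dependence by higher-degree polynomials in the input variables rather than by linear forms.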
VISUAL EQUIVALENCE: A NEW STANDARD OF IMAGE FIDELITY FOR COMPUTER GRAPHICS
, 2008
Abstract
Determining the visual fidelity of an image is a fundamental problem in computer graphics. When is an image good enough; i.e., when does it convey a convincing representation of a scene? Most graphics algorithms either aim to compute a physically accurate solution matching the real world, or they leave judgments of fidelity entirely up to the end user. The former is often computationally intractable, and the latter is ad hoc since it cannot be generalized or predicted. In this dissertation, we chart a new course between these two approaches. We propose visual equivalence, a new standard of image fidelity that focuses on what is visually important to the observer: the appearance of the scene, consisting of impressions of shapes, materials, and lighting. Under visual equivalence, an image with noticeable, pixel-by-pixel differences from a perfect reference can still be a high-fidelity representation of the same scene, provided it conveys the same impression of scene appearance. This appearance-preserving standard is, to our knowledge, the first approach to image fidelity that permits judgments of this kind. We present an end-to-end psychophysical and algorithmic investigation of visual equivalence ...