Results 1–10 of 18
The Lumigraph
In Proceedings of SIGGRAPH 96, 1996
Abstract

Cited by 1034 (43 self)
This paper discusses a new method for capturing the complete appearance of both synthetic and real-world objects and scenes, representing this information, and then using this representation to render images of the object from new camera positions. Unlike the shape capture process traditionally used in computer vision and the rendering process traditionally used in computer graphics, our approach does not rely on geometric representations. Instead we sample and reconstruct a 4D function, which we call a Lumigraph. The Lumigraph is a subset of the complete plenoptic function that describes the flow of light at all positions in all directions. With the Lumigraph, new images of the object can be generated very quickly, independent of the geometric or illumination complexity of the scene or object. The paper discusses a complete working system including the capture of samples, the construction of the Lumigraph, and the subsequent rendering of images from this new representation.
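The rendering operation this abstract describes, generating a new image directly from the sampled 4D function, amounts to interpolating the Lumigraph at continuous ray coordinates. A minimal sketch of such a lookup, assuming a two-plane (u, v, s, t) parameterization and a hypothetical sample grid `L` (the grid size and values below are made up for illustration):

```python
import numpy as np

# Hypothetical 4D sample grid L[u, v, s, t]: radiance on a two-plane
# (uv camera plane, st focal plane) light field parameterization.
rng = np.random.default_rng(0)
L = rng.random((8, 8, 8, 8))

def sample_lumigraph(L, u, v, s, t):
    """Quadrilinearly interpolate the 4D light field at a continuous
    (u, v, s, t) ray coordinate. Rendering a new view is one such
    lookup per pixel, independent of scene complexity."""
    U, V, S, T = L.shape
    u0, v0, s0, t0 = (int(np.floor(x)) for x in (u, v, s, t))
    fu, fv, fs, ft = u - u0, v - v0, s - s0, t - t0
    out = 0.0
    for du in (0, 1):
        for dv in (0, 1):
            for ds in (0, 1):
                for dt in (0, 1):
                    w = ((fu if du else 1 - fu) * (fv if dv else 1 - fv)
                         * (fs if ds else 1 - fs) * (ft if dt else 1 - ft))
                    out += w * L[min(u0 + du, U - 1), min(v0 + dv, V - 1),
                                 min(s0 + ds, S - 1), min(t0 + dt, T - 1)]
    return out

val = sample_lumigraph(L, 2.5, 3.0, 1.25, 4.75)
```

At integer coordinates the lookup reduces to the stored sample, which is why the cost per output pixel is constant regardless of scene geometry.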
Scattered Data Interpolation with Multilevel Splines
IEEE Transactions on Visualization and Computer Graphics, 1997
Abstract

Cited by 158 (10 self)
This paper describes a fast algorithm for scattered data interpolation and approximation. Multilevel B-splines are introduced to compute a C²-continuous surface through a set of irregularly spaced points. The algorithm makes use of a coarse-to-fine hierarchy of control lattices to generate a sequence of bicubic B-spline functions whose sum approaches the desired interpolation function. Large performance gains are realized by using B-spline refinement to reduce the sum of these functions into one equivalent B-spline function. Experimental results demonstrate that high-fidelity reconstruction is possible from a selected set of sparse and irregular samples.
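The coarse-to-fine residual structure is easy to sketch. The fragment below is a hedged stand-in, not the paper's algorithm: it replaces the bicubic B-spline fit at each level with a simple cell-average lattice and piecewise-constant evaluation, but keeps the key idea that each finer level is fitted to the residual left by the coarser ones:

```python
import numpy as np

rng = np.random.default_rng(1)
xy = rng.random((200, 2))                       # irregular sample sites in [0,1]^2
z = np.sin(2 * np.pi * xy[:, 0]) * np.cos(2 * np.pi * xy[:, 1])

def fit_level(xy, r, n):
    """Average the residuals r into an n x n lattice (a crude stand-in
    for the paper's B-spline control-lattice fit at one level)."""
    idx = np.minimum((xy * n).astype(int), n - 1)
    lat = np.zeros((n, n))
    cnt = np.zeros((n, n))
    np.add.at(lat, (idx[:, 0], idx[:, 1]), r)
    np.add.at(cnt, (idx[:, 0], idx[:, 1]), 1)
    return lat / np.maximum(cnt, 1)

def eval_level(lat, xy):
    """Piecewise-constant evaluation of one lattice at the sites."""
    n = lat.shape[0]
    idx = np.minimum((xy * n).astype(int), n - 1)
    return lat[idx[:, 0], idx[:, 1]]

levels, residual = [], z.copy()
for n in (2, 4, 8, 16, 32):                     # coarse-to-fine hierarchy
    lat = fit_level(xy, residual, n)
    levels.append(lat)
    residual = residual - eval_level(lat, xy)   # each level fits what is left

approx = sum(eval_level(lat, xy) for lat in levels)
```

The summed hierarchy approximates the data increasingly closely; the paper's B-spline refinement step further collapses this sum into a single equivalent spline.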
A Tensor Framework for Multidimensional Signal Processing
Linköping University, Sweden, 1994
Abstract

Cited by 66 (8 self)
About the cover: The figure on the cover shows a visualization of a symmetric tensor in three dimensions, G = λ1 ê1 ê1ᵀ + λ2 ê2 ê2ᵀ + λ3 ê3 ê3ᵀ. The object in the figure is the sum of a spear, a plate, and a sphere. The spear describes the principal direction of the tensor, λ1 ê1 ê1ᵀ, where the length is proportional to the largest eigenvalue, λ1. The plate describes the plane spanned by the eigenvectors corresponding to the two largest eigenvalues, λ2(ê1 ê1ᵀ + ê2 ê2ᵀ). The sphere, with a radius proportional to the smallest eigenvalue, shows how isotropic the tensor is, λ3(ê1 ê1ᵀ + ê2 ê2ᵀ + ê3 ê3ᵀ). The visualization is done using AVS [WWW94]. I am very grateful to Johan Wiklund for implementing the tensor viewer module used.

This thesis deals with filtering of multidimensional signals. A large part of the thesis is devoted to a novel filtering method termed "Normalized convolution". The method performs local expansion of a signal in a chosen filter basis which ...
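In its simplest (zeroth-order, locally constant model) form, normalized convolution divides a certainty-weighted smoothing of the signal by the same smoothing of the certainty map, so missing samples contribute nothing and known samples are not diluted by them. A small hedged sketch on a synthetic sparsely known image (the 3×3 applicability kernel and 30% sample density are illustrative choices, not from the thesis):

```python
import numpy as np

rng = np.random.default_rng(2)
img = rng.random((32, 32))
cert = (rng.random((32, 32)) < 0.3).astype(float)   # ~30% of pixels known
observed = img * cert                                # unknown pixels read as 0

def conv3(x, k):
    """Naive 'same' 2D convolution with a 3x3 kernel, zero padding."""
    p = np.pad(x, 1)
    out = np.zeros_like(x)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + x.shape[0], j:j + x.shape[1]]
    return out

a = np.outer([1.0, 2.0, 1.0], [1.0, 2.0, 1.0])      # applicability function

# Zeroth-order normalized convolution: (a * c.f) / (a * c).
num = conv3(observed, a)
den = conv3(cert, a)
recon = np.where(den > 0, num / np.maximum(den, 1e-12), 0.0)
```

Each reconstructed pixel is a convex combination of nearby known values, so the output stays within the range of the input samples.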
Nonuniform Image Reconstruction Using Multilevel Surface Interpolation
Proc. IEEE Int. Conf. on Image Processing (ICIP'97), 1997
Abstract

Cited by 4 (0 self)
This paper describes a fast algorithm for nonuniform image reconstruction. A multiresolution approach is formulated to compute a C²-continuous surface through a set of irregularly spaced samples. The algorithm makes use of a coarse-to-fine hierarchy of control lattices to generate a sequence of surfaces whose sum approaches the desired interpolating surface. Experimental results demonstrate that high-fidelity reconstruction is possible from a selected set of sparse and irregular samples.

1. INTRODUCTION

Nonuniform image reconstruction refers to the problem of fitting a smooth function through a nonuniform, or scattered, distribution of image samples. This subject is closely related to the general problem of scattered data interpolation, which is of practical importance in many science and engineering fields, where data is often measured or generated at sparse and irregular positions. The goal of interpolation is to reconstruct an underlying function (e.g., surface) th...
Continuous normalized convolution
in Proceedings of the International Conference on Multimedia and Expo (ICME'02), 2002
Abstract

Cited by 3 (2 self)
The problem of signal estimation for sparsely and irregularly sampled signals is dealt with using continuous normalized convolution. Image values at real-valued positions are estimated using integration of signals and certainties over a neighbourhood, employing a local model of both the signal and the used discrete filters. The result of the approach is that an output sample close to signals with high certainty is interpolated using a small neighbourhood. An output sample close to signals with low certainty is spatially predicted from signals in a large neighbourhood.
Fractal Modeling of Natural Terrain: Analysis and Surface Reconstruction with Range Data
Graphical Models and Image Processing, 1996
Abstract

Cited by 2 (0 self)
In this paper we address two issues in modeling natural terrain using fractal geometry: estimation of fractal dimension, and fractal surface reconstruction. For estimation of fractal dimension, we extend the fractal Brownian function approach to accommodate irregularly sampled data, and we develop methods for segmenting sets of points exhibiting self-similarity over only certain scales. For fractal surface reconstruction, we extend Szeliski's regularization with fractal priors to use a temperature parameter that depends on fractal dimension. We demonstrate both estimation and reconstruction with noisy range imagery of natural terrain.
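The fractal Brownian function approach mentioned here estimates the Hurst exponent H (and from it the fractal dimension) from how mean absolute increments grow with lag, E|z(x+d) − z(x)| ∝ d^H. A minimal sketch on a synthetic, regularly sampled Brownian profile (the paper's extension handles irregular sampling; this illustrative version does not):

```python
import numpy as np

# Synthetic terrain profile: ordinary Brownian motion has Hurst
# exponent H = 0.5, i.e. fractal dimension D = 2 - H = 1.5 for a curve.
rng = np.random.default_rng(3)
z = np.cumsum(rng.standard_normal(1 << 16))

lags = np.array([1, 2, 4, 8, 16, 32])
# Mean absolute increment at each lag: E|z(x+d) - z(x)| ~ d**H.
inc = np.array([np.mean(np.abs(z[d:] - z[:-d])) for d in lags])

H = np.polyfit(np.log(lags), np.log(inc), 1)[0]   # slope of log-log fit
D = 2.0 - H                                       # dimension of the profile
```

For a terrain surface rather than a profile, the corresponding relation is D = 3 − H; restricting the fit to a subrange of lags is what lets the paper handle self-similarity over only certain scales.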
Efficient image reconstruction for point-based and line-based rendering
2008
Abstract

Cited by 2 (0 self)
We address the problem of an efficient image-space reconstruction of adaptively sampled scenes in the context of point-based and line-based graphics. The image-space reconstruction offers an advantageous time complexity compared to surface splatting techniques and, in fact, our improved GPU implementation performs significantly better than splatting implementations for large point-based models. We discuss the integration of elliptical Gaussian weights for enhanced image quality and generalize the image-space reconstruction to line segments. Furthermore, we present solutions for the efficient combination of points, lines, and polygons in a single image.
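Gaussian-weighted image-space reconstruction can be sketched as accumulating weighted point contributions and normalizing per pixel. The fragment below is a hedged, isotropic stand-in for the elliptical Gaussian weights the paper discusses; the point positions, intensities, and footprint size are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
H = W = 64
pts = rng.random((300, 2)) * [W, H]    # projected point positions (x, y)
col = rng.random(300)                  # per-point intensity

acc = np.zeros((H, W))
wsum = np.zeros((H, W))
yy, xx = np.mgrid[0:H, 0:W]
sigma = 1.5                            # isotropic footprint (illustrative)

for (px, py), c in zip(pts, col):
    w = np.exp(-((xx - px) ** 2 + (yy - py) ** 2) / (2 * sigma ** 2))
    acc += w * c                       # accumulate weighted intensity
    wsum += w                          # and the weights themselves

# Per-pixel normalization: each covered pixel becomes a convex
# combination of the intensities of nearby points.
image = np.where(wsum > 1e-6, acc / np.maximum(wsum, 1e-6), 0.0)
```

The normalization is what makes the result independent of local sample density, which is the property that lets this run in image space rather than per splat.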
A. Pentland. Fractal-Based Description of Natural Scenes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1992
The Light Field Oracle
2002
Abstract

Cited by 1 (0 self)
We present the light field oracle, a novel mathematical concept for the acquisition, processing and representation of light fields. We first compute a hierarchical representation from a set of sparse image samples using a combination of wavelet transform and scattered data interpolation. The light field oracle then progressively acquires image data and selectively refines this initial representation. By comparing the actual input image to the corresponding reconstruction from the wavelet pyramid, the oracle dynamically decides whether the new sample is needed and, if necessary, inserts it into the representation. Our incremental update scheme exploits the spatial localization of wavelets and allows for highly efficient image decomposition. Likewise, image reconstruction for rendering is computed locally in the wavelet domain and does not require a global inverse transform. The wavelet hierarchy along with fast decomposition and rendering operators constitutes a powerful mathematical framework also amenable to compression.
NAIVE - Network Aware Internet Video Encoding
1999
Abstract

Cited by 1 (0 self)
The distribution of digital video content over computer networks has become commonplace. Unfortunately, most digital video encoding standards do not degrade gracefully in the face of packet losses, which often occur in a bursty fashion. We propose a new video encoding system that scales well with respect to the network's performance and degrades gracefully under packet loss. Our encoder sends packets that consist of a small random subset of pixels distributed throughout a video frame. The receiver places samples in their proper location (through a previously agreed ordering), and applies a reconstruction algorithm on the received samples to produce an image. Each of the packets is independent, and does not depend on the successful transmission of any other packets. Also, each packet contains information that is distributed over the entire image. We also apply spatial and temporal optimization to achieve better compression.
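The packetization scheme in this abstract can be sketched directly: both ends derive the same random pixel ordering from an agreed seed, each packet carries an image-wide subset of pixel values, and lost packets simply leave holes for the reconstruction step to fill. A toy sketch (the mean-fill at the end is a crude placeholder for the paper's reconstruction algorithm, and the frame, seed, and packet count are made up):

```python
import numpy as np

rng = np.random.default_rng(5)
frame = rng.integers(0, 256, size=(16, 16)).astype(float)

# Shared ordering: sender and receiver derive the same pixel
# permutation from an agreed seed, so packets need not carry
# coordinates.
order = np.random.default_rng(seed=42).permutation(frame.size)
packets = np.array_split(order, 8)     # each packet spans the frame

# Simulate bursty loss: two consecutive packets dropped.
received = packets[:3] + packets[5:]

# Receiver: place surviving samples by the agreed ordering, then fill
# holes (mean fill stands in for the real reconstruction algorithm).
recon = np.full(frame.size, np.nan)
for p in received:
    recon[p] = frame.reshape(-1)[p]
fill = np.nanmean(recon)
recon = np.where(np.isnan(recon), fill, recon).reshape(frame.shape)
```

Because every packet samples the whole frame, any subset of packets yields a coarse but spatially complete image, which is the graceful-degradation property the abstract claims.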