Results 1–10 of 12
Progressive Encoding of Complex Isosurfaces
, 2003
Abstract

Cited by 32 (3 self)
Some of the largest and most intricate surfaces result from isosurface extraction of volume data produced by 3D imaging modalities and scientific simulations. Such surfaces often possess both complicated geometry and topology (i.e., many connected components and high genus). Because of their sheer size, efficient compression algorithms, in particular progressive encodings, are critical when working with these surfaces. Most standard mesh compression algorithms have been designed to deal with generally smooth surfaces of low topological complexity. Much better results can be achieved with algorithms that are specifically designed for isosurfaces arising from volumetric datasets.
Volume warping for adaptive isosurface extraction
In IEEE Visualization, 2002
Simplicial Isosurface Compression
 Proc. Vision, Modeling, and Visualization Conf
, 2004
Abstract

Cited by 8 (0 self)
In this work, we introduce a new algorithm for direct and progressive encoding of isosurfaces extracted from volumetric data. A binary multi-triangulation is used to represent and adapt the 3D scalar grid. This simplicial scheme provides geometrical and topological control over the decoded isosurface. The resulting algorithm is an efficient and flexible isosurface compression scheme.
Progressive encoding and compression of surfaces generated from point cloud data
 In preparation
Abstract

Cited by 1 (1 self)
We present a new algorithm for compressing surfaces created from oriented points, sampled using a laser range scanner or created from polygonal surfaces. We first use the input data to build an octree whose nodes contain planes constructed as the least-squares fit of the data within each node. Then, given an error threshold, we prune this octree to remove redundant data while avoiding the topological changes created by merging disjoint linear pieces. From this octree representation, we provide a progressive encoding technique that encodes the octree structure as well as the plane equations. We encode the planes using distances to three points and a single bit. To decode these planes, we solve a constrained optimization problem that has a closed-form solution. We then reconstruct the surface from this representation by implicitizing the discontinuous linear pieces at the leaves of the octree and taking a level set of this implicit representation. Our tests show that the proposed method compresses surfaces with higher accuracy and smaller file sizes than other methods.
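The per-node plane-fitting step this abstract describes (a least-squares plane for the points inside an octree node) can be sketched as follows; the function name and the use of NumPy's SVD are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (n, d) with n . x + d = 0.

    The normal is the direction of least variance of the centered
    points, i.e. the right singular vector with the smallest
    singular value.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # SVD of the centered points; the last row of vt is the
    # least-variance direction, which serves as the plane normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    d = -normal.dot(centroid)
    return normal, d
```

In an octree-based coder along these lines, this fit would be run once per node over the samples that fall inside the node's cell.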
unknown title
Abstract
Abstract—The main objective of this paper is to provide an efficient tool for delineating brain tumors in three-dimensional magnetic resonance images and to set up compression-and-transmission schemes to distribute the result to a remote doctor. To achieve this goal, we use a level-set approach to delineate brain tumors in three dimensions. We then introduce a new compression and transmission plan for 3D brain structures based on mesh simplification, adapted to the specific needs of telemedicine and to the restricted capacity of wireless network communication. We present here the main stages of our system, and preliminary results that are very encouraging for clinical practice. Keywords—Medical imaging, level sets, compression, mesh simplification, telemedicine, wireless transmission.
unknown title
Abstract
A new progressive lossless 3D triangular mesh encoder is proposed in this work, which can encode any 3D triangular mesh with an arbitrary topological structure. Given a mesh, the quantized 3D vertices are first partitioned into an octree (OT) structure, which is then traversed from the root gradually down to the leaves. During the traversal, each 3D cell in the tree front is subdivided into eight child cells. For each cell subdivision, both local geometry and connectivity changes are encoded, where the connectivity coding is guided by the geometry coding. Furthermore, prioritized cell subdivision is performed in the tree front to provide better rate-distortion (R-D) performance. Experiments show that the proposed mesh coder outperforms the kd-tree algorithm in both geometry and connectivity coding efficiency. For the geometry coding part, the range of improvement is typically around 10%–20%, but may go up to 50%–60% for meshes with highly regular geometry data and/or tight clustering of vertices.
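The first stage this abstract describes, partitioning quantized vertices into an octree and emitting an occupancy code per cell subdivision, might be sketched like this (a simplified illustration under assumed names; the actual coder also handles connectivity and entropy coding):

```python
def octree_occupancy(vertices, depth):
    """Partition quantized integer vertices into an octree and emit,
    per cell subdivision, an 8-bit mask of non-empty child cells.

    vertices: iterable of (x, y, z) ints in [0, 2**depth).
    Returns the breadth-first list of occupancy masks (root first).
    """
    cells = {(): list(vertices)}  # path of child indices -> points in cell
    masks = []
    for level in range(depth):
        shift = depth - 1 - level  # coordinate bit selecting the child
        next_cells = {}
        for path in sorted(cells):
            children = [[] for _ in range(8)]
            for x, y, z in cells[path]:
                # Child index from the current bit of each coordinate.
                idx = ((((x >> shift) & 1) << 2)
                       | (((y >> shift) & 1) << 1)
                       | ((z >> shift) & 1))
                children[idx].append((x, y, z))
            mask = 0
            for i, pts in enumerate(children):
                if pts:
                    mask |= 1 << i
                    next_cells[path + (i,)] = pts
            masks.append(mask)
        cells = next_cells
    return masks
```

Decoding such a stream reproduces the cell subdivisions in the same order, which is what makes the representation progressive: truncating the mask list yields a coarser but valid approximation of the vertex set.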
Eurographics Symposium on Rendering (2007), Jan Kautz and Sumanta Pattanaik (Editors): Compressed Random-Access Trees for Spatially Coherent Data
Abstract
Adaptive multiresolution hierarchies are highly efficient at representing spatially coherent graphics data. We introduce a framework for compressing such adaptive hierarchies using a compact randomly-accessible tree structure. Prior schemes have explored compressed trees, but nearly all involve entropy coding of a sequential traversal, thus preventing the fine-grain random queries required by rendering algorithms. Instead, we use fixed-rate encoding for both the tree topology and its data. Key elements include the replacement of pointers by local offsets, a forested mipmap structure, vector quantization of inter-level residuals, and efficient coding of partially defined data. Both the offsets and codebook indices are stored as byte records for easy parsing by either CPU or GPU shaders. We show that continuous mipmapping over an adaptive tree is more efficient using primal subdivision than traditional dual subdivision. Finally, we demonstrate efficient compression of many data types including light maps, alpha mattes, distance fields, and HDR images. Figure 1: Coherent data stored in a compact randomly accessible adaptive hierarchy with efficient mipmap filtering (light map: 1.0 bits/pixel; alpha matte: 0.4 bits/pixel; distance field: 0.07 bits/pixel; HDR image: 5.0 bits/pixel).
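The key idea of replacing child pointers by local offsets can be illustrated with a small sketch (hypothetical names and record layout; the paper's actual byte-record encoding differs):

```python
def flatten_with_offsets(root):
    """Serialize a tree into a flat array, replacing child pointers by a
    local offset: each record is (value, offset_to_first_child, n_children),
    with the offset relative to the node's own index (0 marks a leaf).

    root: nested tuples of the form (value, [children...]).
    """
    nodes = []      # output records
    queue = [root]  # breadth-first order keeps siblings contiguous
    while queue:
        next_queue = []
        base = len(nodes) + len(queue)  # index where this level's children start
        for value, children in queue:
            my_index = len(nodes)
            offset = (base + len(next_queue)) - my_index if children else 0
            nodes.append((value, offset, len(children)))
            next_queue.extend(children)
        queue = next_queue
    return nodes

def child(nodes, i, k):
    """Follow the local offset of node i to the index of its k-th child."""
    value, offset, n = nodes[i]
    assert 0 <= k < n
    return i + offset + k
```

Because each record only needs a small relative offset rather than a full pointer, records stay fixed-size and byte-aligned, which is what allows random access from either a CPU or a GPU shader.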
Fast Rendering of Large Encoded Isosurfaces from Uniform Grid Datasets. Ángel del Río, Jan Fischer, Dirk Bartz, Wolfgang Straßer
, 2005
Abstract
Standard algorithms for the extraction of isosurfaces from volume data (e.g., the Marching Cubes algorithm) are notorious for producing a large number of small triangles. Unfortunately, simplification is not a viable option in many application fields, so a large number of triangles must be transferred to the GPU. The resulting massive data load of transferring polygonal data from main memory down to the GPU is a major bottleneck in traditional polygonal isosurface rendering.