Progressive Compression for Lossless Transmission of Triangle Meshes
, 2001
Abstract

Cited by 99 (4 self)
Lossless transmission of 3D meshes is a very challenging and timely problem for many applications, ranging from collaborative design to engineering. Additionally, frequent delays in transmissions call for progressive transmission in order for the end user to receive useful successive refinements of the final mesh. In this paper, we present a novel, fully progressive encoding approach for lossless transmission of triangle meshes with a very fine granularity. A new valence-driven decimating conquest, combined with patch tiling and an original strategic retriangulation, is used to maintain the regularity of valence. We demonstrate that this technique leads to good mesh quality, near-optimal connectivity encoding, and therefore a good rate-distortion ratio throughout the transmission. We also improve upon previous lossless geometry encoding by decorrelating the normal and tangential components of the surface. For typical meshes, our method compresses connectivity down to less than 3.7 bits per vertex, 40% better on average than the best methods previously reported [5, 18]; we further reduce the usual geometry bit rates by 20% on average by exploiting the smoothness of meshes. Concretely, our technique can reduce an ASCII VRML 3D model down to 1.7% of its size for a 10-bit quantization (2.3% for a 12-bit quantization) while providing a very progressive reconstruction.
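The 10-bit and 12-bit figures refer to uniformly quantizing vertex coordinates inside the model's bounding box before entropy coding. A minimal sketch of that standard preprocessing step (the function names and error-bound check here are illustrative, not the paper's code):

```python
import numpy as np

def quantize(vertices, bits=10):
    """Map each coordinate to an integer grid inside the axis-aligned
    bounding box; the usual pre-step before lossless entropy coding."""
    v = np.asarray(vertices, dtype=float)
    lo, hi = v.min(axis=0), v.max(axis=0)
    span = np.maximum(hi - lo, 1e-12)          # guard degenerate axes
    levels = (1 << bits) - 1                   # 2**bits - 1 grid steps
    q = np.round((v - lo) / span * levels).astype(np.int64)
    return q, lo, span / levels                # ints plus dequantization info

def dequantize(q, lo, step):
    return q * step + lo

pts = np.array([[0.0, 0.0, 0.0], [1.0, 2.0, 3.0], [0.5, 1.0, 1.5]])
q, lo, step = quantize(pts, bits=10)
rec = dequantize(q, lo, step)
# the per-axis error of 10-bit quantization is at most half a grid step
assert np.all(np.abs(rec - pts) <= step / 2 + 1e-12)
```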
Out-of-Core Compression for Gigantic Polygon Meshes
, 2003
Abstract

Cited by 81 (23 self)
Polygonal models acquired with emerging 3D scanning technology or from large-scale CAD applications easily reach sizes of several gigabytes and do not fit in the address space of common 32-bit desktop PCs. In this paper we propose an out-of-core mesh compression technique that converts such gigantic meshes into a streamable, highly compressed representation. During decompression only a small portion of the mesh needs to be kept in memory at any time. As full connectivity information is available along the decompression boundaries, this provides seamless mesh access for incremental in-core processing on gigantic meshes. Decompression speeds are CPU-limited and exceed one million vertices and two million triangles per second on a 1.8 GHz Athlon processor.
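The boundary-only memory footprint can be illustrated with a toy reference-counting scheme: a vertex stays resident only until its last incident triangle has streamed past. The function below is a sketch of that general idea (the `ref_counts` input stands in for finalization information that a real streaming format would encode explicitly), not the paper's format:

```python
from collections import defaultdict

def peak_active_vertices(triangles, ref_counts):
    """Stream triangles one at a time; a vertex stays in memory only
    until its last referencing triangle has been seen. Returns the
    peak size of the in-core 'decompression boundary'."""
    remaining = dict(ref_counts)
    active, peak = set(), 0
    for tri in triangles:
        active.update(tri)
        peak = max(peak, len(active))
        for v in tri:
            remaining[v] -= 1
            if remaining[v] == 0:          # vertex finalized: evict it
                active.discard(v)
    return peak

# a tiny triangle strip over 5 vertices
tris = [(0, 1, 2), (1, 2, 3), (2, 3, 4)]
counts = defaultdict(int)
for t in tris:
    for v in t:
        counts[v] += 1                     # reference count per vertex
peak = peak_active_vertices(tris, counts)  # only a moving front is resident
```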
Progressive lossless compression of arbitrary simplicial complexes
 ACM Trans. Graphics (Proc. ACM SIGGRAPH 2002)
, 2002
Abstract

Cited by 76 (0 self)
Efficient algorithms for compressing geometric data have been widely developed in recent years, but they are mainly designed for closed polyhedral surfaces which are manifold or “nearly manifold”. We propose here a progressive geometry compression scheme which can handle manifold models as well as “triangle soups” and 3D tetrahedral meshes. The method is lossless when the decompression is complete, which is extremely important in domains such as medical imaging or finite element analysis. While most existing methods enumerate the vertices of the mesh in an order depending on the connectivity, we use a kd-tree technique [8] which does not depend on the connectivity. Then we compute a compatible sequence of meshes which can be encoded using edge expansion [14] and vertex split [24]. The main contributions of this paper are: the idea of using the kd-tree encoding of the geometry to drive the construction of a sequence of meshes, an improved coding of the edge expansion and vertex split since the vertices to split are implicitly defined, a prediction scheme which reduces the code for simplices incident to the split vertex, and a new generalization of the edge expansion operation to tetrahedral meshes.
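The kd-tree coder cited as [8] transmits, at each cell subdivision, how many of the cell's points fall in one half-cell, spending about ceil(log2(n+1)) bits per split and no bits at all on vertex order. A rough bit-counting sketch of that idea (midpoint splits and the recursion depth cap are assumptions, not the exact coder):

```python
import math

def kdtree_bits(points, depth=0, max_depth=30):
    """Count the bits a kd-tree geometry coder spends: each cell split
    transmits how many of the cell's n points fall in the lower half,
    costing ceil(log2(n+1)) bits, then recurses into both halves."""
    n = len(points)
    if n <= 1 or depth >= max_depth:
        return 0
    axis = depth % 3                           # cycle through x, y, z
    coords = [p[axis] for p in points]
    mid = (min(coords) + max(coords)) / 2.0    # midpoint cell split
    left = [p for p in points if p[axis] < mid]
    right = [p for p in points if p[axis] >= mid]
    bits = math.ceil(math.log2(n + 1))         # transmit len(left)
    return (bits + kdtree_bits(left, depth + 1, max_depth)
                 + kdtree_bits(right, depth + 1, max_depth))

# four corners of a unit square: 3 bits for the first split (5 possible
# counts), then 2 bits for each of the two 2-point half-cells
total = kdtree_bits([(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)])
```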
High-Pass Quantization for Mesh Encoding
, 2003
Abstract

Cited by 53 (9 self)
Any quantization introduces errors. An important question is how to suppress their visual effect. In this paper we present a new quantization method for the geometry of 3D meshes, which enables aggressive quantization without significant loss of visual quality. Conventionally, quantization is applied directly to the 3-space coordinates. This form of quantization introduces high-frequency errors into the model. Since high-frequency errors modify the appearance of the surface, they are highly noticeable, and commonly, this form of quantization must be done conservatively to preserve the precision of the coordinates. Our method first multiplies the coordinates by the Laplacian matrix of the mesh and quantizes the transformed coordinates, which we call “δ-coordinates”. We show that the high-frequency quantization errors in the δ-coordinates are transformed into low-frequency errors when the quantized δ-coordinates are transformed back into standard Cartesian coordinates. These low-frequency errors in the model are much less noticeable than the high-frequency errors. We call our strategy high-pass quantization, to emphasize the fact that it tends to concentrate the quantization error at the low-frequency end of the spectrum. To allow some control over the shape and magnitude of the low-frequency quantization errors, we extend the Laplacian matrix by adding a number of spatial constraints. This enables us to tailor the quantization process to specific visual requirements, and to strongly quantize the δ-coordinates.
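The pipeline (multiply by the Laplacian, quantize the δ-coordinates, then solve back subject to a few spatial constraints) can be sketched on a toy one-dimensional "mesh". The path graph, endpoint anchors, and step size below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

# Toy 'mesh': a path of 5 vertices (edges i -- i+1); a real encoder
# uses the full vertex adjacency of the 3D mesh.
n = 5
x = np.linspace(0.0, 1.0, n) ** 2            # smooth 1-D geometry
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A               # combinatorial Laplacian

delta = L @ x                                # the delta-coordinates
step = 0.05                                  # deliberately coarse step
delta_q = np.round(delta / step) * step      # quantize in the delta domain

# The Laplacian is singular, so decoding needs spatial constraints;
# here the two endpoint vertices are anchored at their exact positions.
M = L.copy()
M[0], M[-1] = 0.0, 0.0
M[0, 0] = M[-1, -1] = 1.0
rhs = delta_q.copy()
rhs[0], rhs[-1] = x[0], x[-1]
x_rec = np.linalg.solve(M, rhs)

# the reconstruction error is small and smooth (low-frequency),
# despite the coarse quantization of the delta-coordinates
assert np.max(np.abs(x_rec - x)) < 0.1
```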
Near-Optimal Connectivity Encoding of 2-Manifold Polygon Meshes
, 2002
Abstract

Cited by 52 (7 self)
... this paper we introduce a connectivity encoding method which extends these ideas to 2-manifold meshes consisting of faces with arbitrary degree. The encoding algorithm exploits duality by applying valence enumeration to both the primal and dual mesh in a symmetric fashion. It generates two sequences of symbols, vertex valences and face degrees, and encodes them separately using two context-based arithmetic coders. This allows us to exploit vertex and/or face regularity if present. When the mesh exhibits perfect face regularity (e.g., a pure triangle or quad mesh) and/or perfect vertex regularity (valence six or four, respectively), the corresponding bit rate vanishes to zero asymptotically. For triangle meshes, our technique is equivalent to earlier valence-driven approaches. We report compression results for a corpus of standard meshes. In all cases we are able to show coding gains over earlier coders, sometimes as large as 50%. Remarkably, we even slightly gain over coders specialized to triangle or quad meshes. A theoretical analysis reveals that our approach is near-optimal as we achieve the Tutte entropy bound for arbitrary planar graphs of 2 bits per edge in the worst case.
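The claim that the bit rate vanishes for perfectly regular meshes follows from the entropy of the valence sequence: a constant sequence has zero empirical entropy, which an arithmetic coder approaches. A small order-0 illustration (the context modelling of the actual coder is omitted for brevity):

```python
import math
from collections import Counter

def empirical_entropy(symbols):
    """Bits per symbol that an ideal order-0 arithmetic coder
    approaches: H = -sum p * log2(p) over the empirical distribution."""
    counts, n = Counter(symbols), len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

regular = [6] * 1000                         # all-valence-6 interior vertices
irregular = [4, 5, 6, 6, 6, 7, 6, 8, 5, 6] * 100

assert empirical_entropy(regular) == 0.0     # rate vanishes for regular meshes
assert empirical_entropy(irregular) > 0.0    # irregularity costs bits
```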
Compressing Polygon Mesh Geometry with Parallelogram Prediction
 IEEE VISUALIZATION
, 2002
Abstract

Cited by 49 (13 self)
In this paper we present a generalization of the geometry coder by Touma and Gotsman [34] to polygon meshes. We let the polygon information dictate where to apply the parallelogram rule that they use to predict vertex positions. Since polygons tend to be fairly planar and fairly convex, it is beneficial to make predictions within a polygon rather than across polygons. This, for example, avoids poor predictions due to a crease angle between polygons. Up to 90 percent of the vertices can be predicted this way. Our strategy improves geometry compression by 10 to 40 percent depending on (a) how polygonal the mesh is and (b) on the quality (planarity/convexity) of the polygons.
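The parallelogram rule itself is simple: given three consecutive known vertices a, b, c, the next vertex is predicted as a + c - b, and only the residual is coded. A minimal sketch of the rule applied within a single polygon (the paper's contribution is deciding where to apply it, which is not modelled here):

```python
import numpy as np

def parallelogram_predict(a, b, c):
    """Predict the next vertex d of a polygon a-b-c-d by completing
    the parallelogram spanned by the three known vertices."""
    return a + c - b

# a planar quad that is an exact parallelogram: the prediction is
# perfect, so the residual the coder must transmit is zero
a = np.array([0.0, 0.0, 0.0])
b = np.array([1.0, 0.0, 0.0])
c = np.array([1.0, 1.0, 0.0])
d = np.array([0.0, 1.0, 0.0])
residual = d - parallelogram_predict(a, b, c)
assert np.allclose(residual, 0.0)
```

For less planar or less convex polygons the residual grows, which is exactly the dependence on polygon quality the abstract reports.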
Compressing Polygon Mesh Connectivity with Degree Duality Prediction
, 2002
Abstract

Cited by 39 (13 self)
In this paper we present a coder for polygon mesh connectivity that delivers the best connectivity compression rates for polygon meshes reported so far. Our coder is an extension of the vertex-based coder for triangle mesh connectivity by Touma and Gotsman [26]. We code polygonal connectivity as a sequence of face and vertex degrees and exploit the correlation between them for mutual predictive compression. Because low-degree vertices are likely to be surrounded by high-degree faces and vice versa, we predict vertex degrees based on neighboring face degrees and face degrees based on neighboring vertex degrees.
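The mutual prediction can be mimicked with a per-context adaptive frequency model, keyed here by a hypothetical neighboring face degree. This toy model only sketches the general context-coding idea behind degree duality, not the paper's exact predictor:

```python
import math
from collections import defaultdict

ALPHABET = range(3, 10)                      # plausible vertex degrees

class AdaptiveModel:
    """Per-context frequency model with +1 smoothing; a symbol's code
    length approximates -log2 of its modelled probability, which is
    what a context-based arithmetic coder achieves."""
    def __init__(self):
        self.counts = defaultdict(lambda: {d: 1 for d in ALPHABET})

    def code_length(self, context, symbol):
        ctx = self.counts[context]
        return -math.log2(ctx[symbol] / sum(ctx.values()))

    def update(self, context, symbol):
        self.counts[context][symbol] += 1

# degree duality: low-degree vertices tend to sit among high-degree
# faces and vice versa, so neighboring face degrees are an informative
# context for the vertex degree (and symmetrically for face degrees)
pairs = [(5, 3), (5, 3), (5, 3), (3, 6), (3, 6), (5, 3), (3, 6)]
model, bits = AdaptiveModel(), 0.0
for face_ctx, vertex_deg in pairs:
    bits += model.code_length(face_ctx, vertex_deg)
    model.update(face_ctx, vertex_deg)

# the adaptive context model beats a flat code over the degree alphabet
assert bits < len(pairs) * math.log2(len(ALPHABET))
```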
Simplification and Compression of 3D Meshes
 In Proceedings of the European Summer School on Principles of Multiresolution in Geometric Modelling (PRIMUS)
, 1998
Abstract

Cited by 35 (6 self)
We survey recent developments in compact representations of 3D mesh data. This includes: methods to reduce the complexity of meshes by simplification, thereby reducing the number of vertices and faces in the mesh; methods to resample the geometry in order to optimize the vertex distribution; methods to compactly represent the connectivity data (the graph structure defined by the edges) of the mesh; and methods to compactly represent the geometry data (the vertex coordinates) of the mesh.