Results 1–10 of 97
Anisotropic Polygonal Remeshing
"... In this paper, we propose a novel polygonal remeshing technique that exploits a key aspect of surfaces: the intrinsic anisotropy of natural or manmade geometry. In particular, we use curvature directions to drive the remeshing process, mimicking the lines that artists themselves would use when cre ..."
Abstract

Cited by 175 (17 self)
In this paper, we propose a novel polygonal remeshing technique that exploits a key aspect of surfaces: the intrinsic anisotropy of natural or man-made geometry. In particular, we use curvature directions to drive the remeshing process, mimicking the lines that artists themselves would use when creating 3D models from scratch. After extracting and smoothing the curvature tensor field of an input genus-0 surface patch, lines of minimum and maximum curvature are used to determine appropriate edges for the remeshed version in anisotropic regions, while spherical regions are simply point-sampled, since there is no natural direction of symmetry locally. As a result, our technique generates polygon meshes mainly composed of quads in anisotropic regions and of triangles in spherical regions. Our approach provides the flexibility to produce meshes ranging from isotropic to anisotropic, from coarse to dense, and from uniform to curvature-adapted.
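As a sketch of the curvature information this method relies on: the curvature tensor at a point, expressed as a symmetric 2×2 matrix in the tangent plane, yields the principal curvatures and their orthogonal directions through its eigen-decomposition. The helper below is our own illustration (the function name and tensor-entry interface are assumptions, not the paper's API):

```python
import math

def principal_directions(t11, t12, t22):
    """Eigen-decomposition of a symmetric 2x2 curvature tensor
    [[t11, t12], [t12, t22]] in the tangent plane. Returns
    ((kmax, dmax), (kmin, dmin)): the principal curvatures with their
    unit directions -- the fields that drive anisotropic remeshing.
    """
    mean = (t11 + t22) / 2.0
    diff = (t11 - t22) / 2.0
    r = math.hypot(diff, t12)          # radius of the Mohr circle
    kmax, kmin = mean + r, mean - r    # eigenvalues
    # Angle of the eigenvector belonging to kmax: tan(2*theta) = 2*t12/(t11-t22).
    theta = 0.5 * math.atan2(2.0 * t12, t11 - t22)
    dmax = (math.cos(theta), math.sin(theta))
    dmin = (-dmax[1], dmax[0])         # orthogonal direction
    return (kmax, dmax), (kmin, dmin)
```

Lines of minimum and maximum curvature are then obtained by integrating these two direction fields across the surface.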
Geometric Speed-Up Techniques for Finding Shortest Paths in Large Sparse Graphs
, 2003
"... In this paper, we consider Dijkstra's algorithm for the single source single target shortest paths problem in large sparse graphs. The goal is to reduce the response time for online queries by using precomputed information. For the result of the preprocessing, we admit at most linear space. ..."
Abstract

Cited by 51 (14 self)
In this paper, we consider Dijkstra's algorithm for the single-source single-target shortest-path problem in large sparse graphs. The goal is to reduce the response time for online queries by using precomputed information, for which we admit at most linear space. We assume that a layout of the graph is given. From this layout, the preprocessing determines for each edge a geometric object containing all nodes that can be reached on a shortest path starting with that edge. Based on these geometric objects, the search space for online computation can be reduced significantly. We present an extensive experimental study comparing the impact of different types of objects. The test data we use are traffic networks, the typical field of application for this scenario.
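The pruning strategy can be sketched as follows. The adjacency-list interface and the choice of axis-aligned bounding boxes as the geometric containers are our assumptions; the paper compares several container shapes:

```python
import heapq

def dijkstra_with_containers(graph, pos, containers, source, target):
    """Single-source single-target Dijkstra that skips any edge whose
    precomputed geometric container does not contain the target.

    graph:       dict node -> list of (neighbor, weight)
    pos:         dict node -> (x, y) layout coordinates
    containers:  dict (u, v) -> (xmin, ymin, xmax, ymax), bounding box of
                 all nodes reachable on a shortest path starting with (u, v)
    """
    tx, ty = pos[target]
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph[u]:
            xmin, ymin, xmax, ymax = containers[(u, v)]
            # Prune: no shortest path through (u, v) can reach the target.
            if not (xmin <= tx <= xmax and ymin <= ty <= ymax):
                continue
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")
```

The containers are built once per graph (e.g. by a full shortest-path tree computation from each node) and then amortized over many queries.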
Concepts: Linguistic support for generic programming in C++
 SIGPLAN Notices
, 2006
"... Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, ..."
Abstract

Cited by 43 (10 self)
 Add to MetaCart
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
Polygon Decomposition for Efficient Construction of Minkowski Sums
, 2000
"... Several algorithms for computing the Minkowski sum of two polygons in the plane begin by decomposing each polygon into convex subpolygons. We examine different methods for decomposing polygons by their suitability for efficient construction of Minkowski sums. We study and experiment with various ..."
Abstract

Cited by 42 (8 self)
Several algorithms for computing the Minkowski sum of two polygons in the plane begin by decomposing each polygon into convex subpolygons. We examine different methods for decomposing polygons by their suitability for efficient construction of Minkowski sums. We study and experiment with various well-known decompositions as well as with several new decomposition schemes. We report on our experiments with various decompositions and different input polygons. Among our findings are that, in general: (i) triangulations are too costly; (ii) what constitutes a good decomposition for one of the input polygons depends on the other input polygon, so we develop a procedure for simultaneously decomposing the two polygons such that a "mixed" objective function is minimized; (iii) there are optimal decomposition algorithms that significantly expedite the Minkowski-sum computation, but the decomposition itself is expensive to compute; in such cases, simple heuristics that approximate the optimal decomposition perform very well.
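All of these decomposition schemes reduce the problem to summing pairs of convex pieces. A minimal sketch of that convex–convex primitive (our own illustrative code, not the paper's; a sort by edge angle stands in for the classic linear-time merge):

```python
import math

def minkowski_sum_convex(P, Q):
    """Minkowski sum of two convex polygons, each a CCW list of (x, y)
    vertices. Edge vectors of a convex CCW polygon, walked from its
    bottom-most (then left-most) vertex, are sorted by polar angle in
    [0, 2*pi), so the sum is obtained by interleaving the two edge
    sequences by angle and walking them from the summed start vertices.
    """
    def rotate_to_bottom_left(poly):
        i = min(range(len(poly)), key=lambda k: (poly[k][1], poly[k][0]))
        return poly[i:] + poly[:i]

    def edge_vectors(poly):
        n = len(poly)
        return [(poly[(i + 1) % n][0] - poly[i][0],
                 poly[(i + 1) % n][1] - poly[i][1]) for i in range(n)]

    def angle(e):
        a = math.atan2(e[1], e[0])
        return a if a >= 0 else a + 2 * math.pi  # normalize to [0, 2*pi)

    P = rotate_to_bottom_left(P)
    Q = rotate_to_bottom_left(Q)
    merged = sorted(edge_vectors(P) + edge_vectors(Q), key=angle)

    # Walk the merged edge fan from the sum of the two start vertices.
    x, y = P[0][0] + Q[0][0], P[0][1] + Q[0][1]
    result = []
    for dx, dy in merged:
        result.append((x, y))
        x, y = x + dx, y + dy
    return result
```

With a decomposition in hand, the full sum is the union of the pairwise sums of the convex pieces, which is where the choice of decomposition dominates the running time.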
Classroom examples of robustness problems in geometric computations
 In Proc. 12th European Symposium on Algorithms, volume 3221 of Lecture Notes Comput. Sci
, 2004
"... ..."
Exacus: Efficient and exact algorithms for curves and surfaces
 In ESA, volume 3669 of LNCS
, 2005
"... We present the first release of the EXACUS C++ libraries. We aim for systematic support of nonlinear geometry in software libraries. Our goals are efficiency, correctness, completeness, clarity of the design, modularity, flexibility, and ease of use. We present the generic design and structure of ..."
Abstract

Cited by 33 (13 self)
We present the first release of the EXACUS C++ libraries. We aim for systematic support of nonlinear geometry in software libraries. Our goals are efficiency, correctness, completeness, clarity of the design, modularity, flexibility, and ease of use. We present the generic design and structure of the libraries, which currently compute arrangements of curves and curve segments of low algebraic degree, and boolean operations on polygons bounded by such segments.
Isotropic Surface Remeshing
, 2003
"... This paper proposes a new method for isotropic remeshing of triangulated surface meshes. Given a triangulated surface mesh to be resampled and a userspecified density function defined over it, we first distribute the desired number of samples by generalizing error diffusion, commonly used in image ..."
Abstract

Cited by 32 (3 self)
This paper proposes a new method for isotropic remeshing of triangulated surface meshes. Given a triangulated surface mesh to be resampled and a user-specified density function defined over it, we first distribute the desired number of samples by generalizing error diffusion, commonly used in image halftoning, to work directly on mesh triangles and feature edges. We then use the resulting sampling as an initial configuration for building a weighted centroidal Voronoi tessellation in a conformal parameter space, where the specified density function is used for weighting. We finally create the mesh by lifting the corresponding constrained Delaunay triangulation from parameter space. A precise control over the sampling is obtained through a flexible design of the density function, the latter being possibly low-pass filtered to obtain a smoother gradation. We demonstrate the versatility of our approach through various remeshing examples.
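The centroidal Voronoi relaxation at the heart of this pipeline can be sketched in 2D. This is a plain, unweighted Lloyd iteration over discrete samples (our own simplification, standing in for the density-weighted tessellation the paper computes in a conformal parameter space):

```python
def lloyd_cvt(sites, samples, iterations=10):
    """Approximate centroidal Voronoi relaxation in 2D. Each iteration
    assigns every sample point to its nearest site, then moves each site
    to the centroid of its assigned samples. The dense sample set stands
    in for exact Voronoi cells; a density function would enter here as
    per-sample weights.
    """
    sites = list(sites)
    for _ in range(iterations):
        sums = [[0.0, 0.0, 0] for _ in sites]
        for sx, sy in samples:
            # Nearest-site assignment (brute force for clarity).
            k = min(range(len(sites)),
                    key=lambda i: (sites[i][0] - sx) ** 2
                                + (sites[i][1] - sy) ** 2)
            sums[k][0] += sx
            sums[k][1] += sy
            sums[k][2] += 1
        # Move each site to the centroid of its cell (keep empty cells put).
        sites = [(s[0] / s[2], s[1] / s[2]) if s[2] else sites[i]
                 for i, s in enumerate(sums)]
    return sites
```

Under a constant density this converges toward an isotropic, evenly spaced site distribution, which is exactly the behavior the remesher exploits.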
A computational basis for conic arcs and boolean operations on conic polygons
 In Proc. 10th European Symposium on Algorithms
, 2002
"... Abstract. We give an exact geometry kernel for conic arcs, algorithms for exact computation with lowdegree algebraic numbers, and an algorithm for computing the arrangement of conic arcs that immediately leads to a realization of regularized boolean operations on conic polygons. A conic polygon, or ..."
Abstract

Cited by 31 (15 self)
We give an exact geometry kernel for conic arcs, algorithms for exact computation with low-degree algebraic numbers, and an algorithm for computing the arrangement of conic arcs that immediately leads to a realization of regularized boolean operations on conic polygons. A conic polygon, or polygon for short, is anything that can be obtained from linear or conic halfspaces (= the set of points where a linear or quadratic function is non-negative) by regularized boolean operations. The algorithm and its implementation are complete (they can handle all cases), exact (they give the mathematically correct result), and efficient (they can handle inputs with several hundred primitives).
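A taste of the exact low-degree algebraic-number arithmetic such a kernel needs: deciding the sign of a + b·√c using only rational arithmetic, with no floating point. This helper is a hypothetical illustration (real kernels handle general comparisons of degree-2 roots):

```python
from fractions import Fraction

def sign_sqrt_expr(a, b, c):
    """Exact sign of a + b*sqrt(c) for rational a, b (fractions.Fraction)
    and integer c >= 0. Returns -1, 0, or 1, computed without any
    floating-point rounding.
    """
    if b == 0 or c == 0:
        return (a > 0) - (a < 0)
    if a == 0:
        return (b > 0) - (b < 0)
    sign_a = 1 if a > 0 else -1
    sign_b = 1 if b > 0 else -1
    if sign_a == sign_b:              # both terms pull the same way
        return sign_a
    # Opposite signs: the larger of a^2 and b^2 * c decides.
    lhs, rhs = a * a, b * b * c
    if lhs == rhs:
        return 0
    return sign_a if lhs > rhs else sign_b
```

Predicates like "does this conic arc pass left of that intersection point?" bottom out in exactly such sign computations, which is what makes the completeness and exactness claims achievable.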
An adaptable and extensible geometry kernel
 In Proc. Workshop on Algorithm Engineering
, 2001
"... ii ..."
Upright orientation of man-made objects
 ACM Trans. Graphics
, 2008
"... Figure 1: Left: A manmade model with unnatural orientation. Middle: Six orientations obtained by aligning the model into a canonical coordinate frame using Principal Component Analysis. Right: Our method automatically detects the upright orientation of the model from its geometry alone. Humans usua ..."
Abstract

Cited by 26 (8 self)
Figure 1: Left: A man-made model with unnatural orientation. Middle: Six orientations obtained by aligning the model into a canonical coordinate frame using Principal Component Analysis. Right: Our method automatically detects the upright orientation of the model from its geometry alone.

Humans usually associate an upright orientation with objects, placing them in the way they are most commonly seen in our surroundings. While it is an open challenge to recover the functionality of a shape from its geometry alone, this paper shows that it is often possible to infer its upright orientation by analyzing its geometry. Our key idea is to reduce the two-dimensional (spherical) orientation space to a small set of orientation candidates using functionality-related geometric properties of the object, and then determine the best orientation using an assessment function of several functional geometric attributes defined with respect to each candidate. Specifically, we focus on obtaining the upright orientation for man-made objects that typically stand on some flat surface (ground, floor, table, etc.), which include the vast majority of objects in our everyday surroundings. For these types of models, orientation candidates can be defined according to static equilibrium. For each candidate, we introduce a set of discriminative attributes linking shape to function. We learn an assessment function of these attributes from a training set using a combination of a Random Forest classifier and a Support Vector Machine classifier. Experiments demonstrate that our method generalizes well and achieves about 90% prediction accuracy both for a 10-fold cross-validation over the training set and for a validation with an independent test set.
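A 2D toy version of the static-equilibrium candidate test makes the idea concrete: a convex shape can rest on an edge only if its center of mass projects onto that edge. This sketch is our own illustration, not the paper's 3D formulation (which works with supporting faces of the convex hull):

```python
def stable_resting_edges(poly):
    """2D analogue of the static-equilibrium candidate test: a convex
    polygon (CCW vertex list) can rest on edge (i, i+1) iff its centroid
    projects strictly inside that edge segment. Returns the stable edge
    indices -- the candidate 'up' orientations in 2D.
    """
    n = len(poly)
    # Polygon centroid via the shoelace-weighted formula.
    a2 = cx = cy = 0.0
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        w = x0 * y1 - x1 * y0
        a2 += w
        cx += (x0 + x1) * w
        cy += (y0 + y1) * w
    cx /= 3.0 * a2
    cy /= 3.0 * a2

    stable = []
    for i in range(n):
        (x0, y0), (x1, y1) = poly[i], poly[(i + 1) % n]
        ex, ey = x1 - x0, y1 - y0
        # Parameter of the centroid's orthogonal projection onto the edge.
        t = ((cx - x0) * ex + (cy - y0) * ey) / (ex * ex + ey * ey)
        if 0.0 < t < 1.0:
            stable.append(i)
    return stable
```

In 3D the same test over hull faces yields the small candidate set that the learned assessment function then ranks.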