Results 1–10 of 53
Survey of Polygonal Surface Simplification Algorithms
, 1997
Abstract

Cited by 192 (3 self)
This paper surveys methods for simplifying and approximating polygonal surfaces. A polygonal surface is a piecewise-linear surface in 3D defined by a set of polygons.
Discrete Geometric Shapes: Matching, Interpolation, and Approximation: A Survey
 Handbook of Computational Geometry
, 1996
Abstract

Cited by 126 (10 self)
In this survey we consider geometric techniques which have been used to measure the similarity or distance between shapes, as well as to approximate shapes, or interpolate between shapes. Shape is a modality which plays a key role in many disciplines, ranging from computer vision to molecular biology. We focus on algorithmic techniques based on computational geometry that have been developed for shape matching, simplification, and morphing.

1 Introduction

The matching and analysis of geometric patterns and shapes is of importance in various application areas, in particular in computer vision and pattern recognition, but also in other disciplines concerned with the form of objects, such as cartography, molecular biology, and computer animation. The general situation is that we are given two objects A, B and want to know how much they resemble each other. Usually one of the objects may undergo certain transformations like translations, rotations or scalings in order to be matched with th...
Rendering Effective Route Maps: Improving Usability Through Generalization
Abstract

Cited by 114 (6 self)
Route maps, which depict a path from one location to another, have emerged as one of the most popular applications on the Web. Current computer-generated route maps, however, are often very difficult to use. In this paper we present a set of cartographic generalization techniques specifically designed to improve the usability of route maps. Our generalization techniques are based both on cognitive psychology research studying how route maps are used and on an analysis of the generalizations commonly found in hand-drawn route maps. We describe algorithmic implementations of these generalization techniques within BeeLine, a real-time system for automatically designing and rendering route maps. We show that BeeLine produces route maps that are much more usable than those produced by current computer-based route map rendering systems. Feedback from over 1100 users indicates that over 99% believe BeeLine maps are preferable to using standard computer-generated route maps alone.
Hierarchical Triangulation for Multiresolution Surface Description
 ACM Transactions on Graphics
, 1995
Abstract

Cited by 88 (16 self)
A new hierarchical triangle-based model for representing surfaces over sampled data is proposed, which is based on the subdivision of the surface domain into nested triangulations, called a Hierarchical Triangulation (HT). The model allows compression of spatial data and representation of a surface at successively finer degrees of resolution. An HT is a collection of triangulations organized in a tree, where each node, except for the root, is a triangulation refining a face belonging to its parent in the hierarchy. We present a topological model for representing an HT, and algorithms for its construction and for the extraction of a triangulation at a given degree of resolution. The surface model, called a Hierarchical Triangulated Surface (HTS), is obtained by associating data values with the vertices of triangles, and defining suitable functions that describe the surface over each triangular patch. We consider an application of a piecewise-linear version of the HTS to interpolate topo...
Video Summarization by Curve Simplification
 ACM MULTIMEDIA
, 1998
Abstract

Cited by 77 (6 self)
A video sequence can be represented as a trajectory curve in a high-dimensional feature space. This video curve can be analyzed by tools similar to those developed for planar curves. In particular, the classic binary curve splitting algorithm has been found to be a useful tool for video analysis. With a splitting condition that checks the dimensionality of the curve segment being split, the video curve can be recursively simplified and represented as a tree structure, and the frames that are found to be junctions between curve segments at different levels of the tree can be used as keyframes to summarize the video sequences at different levels of detail. These keyframes can be combined in various spatial and temporal configurations for browsing purposes. We describe a simple video player that displays the keyframes sequentially and lets the user change the summarization level on the fly with a slider. We also describe an approach to automatically selecting a summarization level that pr...
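The binary curve splitting recursion this abstract refers to can be sketched for a planar polyline. The following is a minimal Douglas-Peucker-style sketch, assuming a Euclidean point-to-segment distance and a fixed tolerance `eps`; the paper itself splits curves in a high-dimensional feature space using a dimensionality-based splitting condition rather than this distance test.

```python
import math

def point_segment_dist(p, a, b):
    """Euclidean distance from point p to the segment ab."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Clamp the projection parameter to stay on the segment.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def split_curve(points, eps):
    """Binary curve splitting: if the farthest interior point deviates more
    than eps from the chord, split there and recurse on both halves."""
    if len(points) < 3:
        return list(points)
    a, b = points[0], points[-1]
    idx, dmax = 0, -1.0
    for i in range(1, len(points) - 1):
        d = point_segment_dist(points[i], a, b)
        if d > dmax:
            idx, dmax = i, d
    if dmax <= eps:
        return [a, b]                       # chord approximates the subchain
    left = split_curve(points[:idx + 1], eps)
    right = split_curve(points[idx:], eps)
    return left[:-1] + right                # avoid duplicating the split point
```

The split points retained at shallow recursion depths correspond to the coarse-level junctions the abstract uses as keyframes; deeper splits refine the summary.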
Approximating Polygons and Subdivisions with Minimum-Link Paths
, 1991
Abstract

Cited by 61 (11 self)
We study several variations on one basic approach to the task of simplifying a plane polygon or subdivision: fatten the given object and construct an approximation inside the fattened region. We investigate fattening by convolving the segments or vertices with disks, and attempt to approximate objects with the minimum number of line segments, or close to the minimum, using efficient greedy algorithms. We give variants with linear-time or O(n log n) algorithms for approximating polygonal chains of n segments. We also show that approximating subdivisions, and approximating with chains that have no self-intersections, are NP-hard.
Efficient Algorithms for Approximating Polygonal Chains
Abstract

Cited by 39 (2 self)
We consider the problem of approximating a polygonal chain C by another polygonal chain C′ whose vertices are constrained to be a subset of the set of vertices of C. The goal is to minimize the number of vertices needed in the approximation C′. Based on a framework introduced by Imai and Iri [25], we define an error criterion for measuring the quality of an approximation. We consider two problems. (1) Given a polygonal chain C and a parameter ε ≥ 0, compute an approximation of C, among all approximations whose error is at most ε, that has the smallest number of vertices. We present an O(n^{4/3+δ})-time algorithm to solve this problem, for any δ > 0; the constant of proportionality in the running time depends on δ. (2) Given a polygonal chain C and an integer k, compute an approximation of C with at most k vertices whose error is the smallest among all approximations with at most k vertices. We present a simple randomized algorithm, with expected running time O(n^{4/3+δ}), to solve this problem.
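The Imai–Iri framework mentioned above can be sketched in its simple cubic-time form: build a "shortcut graph" whose edges are the chain segments valid under the error criterion, then take a fewest-edge path. This is a toy illustration, not the paper's O(n^{4/3+δ}) algorithm, and the tolerance test against intermediate vertices is just one common error criterion, assumed here for concreteness.

```python
import math
from collections import deque

def seg_dist(p, a, b):
    """Euclidean distance from point p to the segment ab."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def min_vertex_simplification(points, eps):
    """Imai-Iri style min-# simplification (problem (1) above, brute force)."""
    n = len(points)
    # shortcut[i] lists j > i such that segment (i, j) stays within eps of
    # every intermediate vertex of the original chain.
    shortcut = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if all(seg_dist(points[k], points[i], points[j]) <= eps
                   for k in range(i + 1, j)):
                shortcut[i].append(j)
    # BFS from vertex 0 minimizes the number of shortcut segments used,
    # which minimizes the number of vertices in the approximation.
    prev = {0: None}
    q = deque([0])
    while q:
        u = q.popleft()
        if u == n - 1:
            break
        for v in shortcut[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    path, v = [], n - 1
    while v is not None:
        path.append(v)
        v = prev[v]
    return [points[i] for i in reversed(path)]
```

Consecutive vertices always form a valid shortcut, so the target is always reachable; the work in the literature goes into computing the shortcut graph (or avoiding it) faster than this quadratic enumeration.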
Approximate convex decomposition of polyhedra
 In Proc. of ACM Symposium on Solid and Physical Modeling
, 2005
Abstract

Cited by 29 (2 self)
Decomposition is a technique commonly used to partition complex models into simpler components. While decomposition into convex components results in pieces that are easy to process, such decompositions can be costly to construct and can result in representations with an unmanageable number of components. In this paper, we explore an alternative partitioning strategy that decomposes a given model into “approximately convex” pieces that may provide similar benefits as convex components, while the resulting decomposition is both significantly smaller (typically by orders of magnitude) and can be computed more efficiently. Indeed, for many applications, an approximate convex decomposition (ACD) can more accurately represent the important structural features of the model by providing a mechanism for ignoring less significant features, such as surface texture. We describe a technique for computing ACDs of three-dimensional polyhedral solids and surfaces of arbitrary genus. We provide results illustrating that our approach results in high-quality decompositions with very few components, and applications showing that comparable or better results can be obtained using ACD decompositions in place of exact convex decompositions (ECD) that are several orders of magnitude larger.

Figure 1: The approximate convex decompositions (ACD) of the Armadillo and the David models consist of a small number of nearly convex components that characterize the important features of the models better than the exact convex decompositions (ECD), which have orders of magnitude more components. The Armadillo (500K edges, 12.1MB) has a solid ACD with 98 components (14.2MB) that was computed in 232 seconds, while the solid ECD has more than 726,240 components (20+ GB) and could not be completed because disk space was exhausted after nearly 4 hours of computation. The David (750K edges, 18MB) has a surface ACD with 66 components (18.1MB), while the surface ECD has 85,132 components (20.1MB).
A New Approach to Subdivision Simplification
, 1995
Abstract

Cited by 24 (0 self)
The line simplification problem is an old and well-studied problem in cartography. Although there are several algorithms to compute a simplification, there seem to be no algorithms that perform line simplification in the context of other geographical objects. This paper presents a nearly quadratic time algorithm for the following line simplification problem: Given a polygonal line, a set of extra points, and a real ε > 0, compute a simplification that guarantees (i) a maximum error ε, (ii) that the extra points remain on the same side of the simplified chain as of the original chain, and (iii) that the simplified chain has no self-intersections. The algorithm is applied as the main subroutine for subdivision simplification.

1 Introduction

The line simplification problem is a well-studied problem in various disciplines, including geographic information systems [Buttenfield '85, Cromley '88, Douglas & Peucker '73, Hershberger & Snoeyink '92, Li & Openshaw '92, McMaster '87], digital...
Discovery of Convoys in Trajectory Databases
Abstract

Cited by 21 (2 self)
As mobile devices with positioning capabilities continue to proliferate, data management for so-called trajectory databases that capture the historical movements of populations of moving objects becomes important. This paper considers the querying of such databases for convoys, a convoy being a group of objects that have traveled together for some time. More specifically, this paper formalizes the concept of a convoy query using density-based notions, in order to capture groups of arbitrary extents and shapes. Convoy discovery is relevant for real-life applications in throughput planning of trucks and carpooling of vehicles. Although there has been extensive research on trajectories in the literature, none of it can be applied to correctly retrieve exact convoy result sets. Motivated by this, we develop three efficient algorithms for convoy discovery that adopt the well-known filter-refinement framework. In the filter step, we apply line-simplification techniques to the trajectories and establish distance bounds between the simplified trajectories. This permits efficient convoy discovery over the simplified trajectories without missing any actual convoys. In the refinement step, the candidate convoys are further processed to obtain the actual convoys. Our comprehensive empirical study offers insight into the properties of the paper’s proposals and demonstrates that the proposals are effective and efficient on real-world trajectory data.
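The core convoy notion (a group of at least m objects staying density-connected for at least k consecutive snapshots) can be illustrated with a toy coherent-moving-cluster sketch. This is not the paper's filter-refinement algorithms: positions are simplified to 1D, density connectivity is reduced to chaining points within distance e, and all names and parameters here are assumptions for illustration only.

```python
def clusters(positions, e):
    """Connected groups of objects whose neighbors lie within distance e
    (a density grouping simplified to 1D positions for illustration)."""
    ids = sorted(positions, key=lambda o: positions[o])
    comps, cur = [], [ids[0]]
    for o in ids[1:]:
        if positions[o] - positions[cur[-1]] <= e:
            cur.append(o)
        else:
            comps.append(set(cur))
            cur = [o]
    comps.append(set(cur))
    return comps

def convoys(snapshots, m, k, e):
    """Extend candidate groups through consecutive snapshots; report groups
    of >= m objects that stayed density-connected for >= k snapshots.
    (No maximality/duplicate handling -- a sketch, not the paper's method.)"""
    result, candidates = [], []   # candidates: (object set, lifetime so far)
    for snap in snapshots:
        comps = [c for c in clusters(snap, e) if len(c) >= m]
        nxt = []
        for c in comps:
            extended = False
            for group, life in candidates:
                inter = group & c
                if len(inter) >= m:       # group survives into this snapshot
                    nxt.append((inter, life + 1))
                    extended = True
            if not extended:
                nxt.append((c, 1))        # start a fresh candidate
        candidates = nxt
        for group, life in candidates:
            if life >= k:
                result.append((frozenset(group), life))
    return result
```

The paper's filter step would run this kind of discovery over line-simplified trajectories with distance bounds, so that no true convoy is pruned before refinement.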