Results 1-10 of 17
Filling Gaps in the Boundary of a Polyhedron
Computer Aided Geometric Design, 1993
Abstract

Cited by 38 (4 self)
In this paper we present an algorithm for detecting and repairing defects in the boundary of a polyhedron. These defects, usually caused by problems in CAD software, consist of small gaps bounded by edges that are incident to only one polyhedron face. The algorithm uses a partial curve matching technique for matching parts of the defects, and an optimal triangulation of 3D polygons for resolving the unmatched parts. It is also shown that finding a consistent set of partial curve matches with maximum score, a subproblem related to our repairing process, is NP-hard. Experimental results on several polyhedra are presented.
Keywords: CAD, polyhedra, gap filling, curve matching, geometric hashing, triangulation.
1 Introduction
The problem studied in this paper is the detection and repair of "gaps" in the boundary of a polyhedron. This problem usually appears in polyhedral approximations of CAD objects, whose boundaries are described using curved entities of higher leve...
Concurrent Computation of Attribute Filters on Shared Memory Parallel Machines
2008
Abstract

Cited by 10 (0 self)
Morphological attribute filters have not previously been parallelized, mainly because they are both global and non-separable. We propose a parallel algorithm that achieves efficient parallelism for a large class of attribute filters, including attribute openings, closings, thinnings, and thickenings, based on Salembier's Max-trees and Min-trees. The image or volume is first partitioned into multiple slices. We then compute the Max-tree of each slice using any sequential Max-tree algorithm. Subsequently, the Max-trees of the slices can be merged to obtain the Max-tree of the image. A C implementation yielded good speedups on both a 16-processor MIPS 14000 parallel machine and a dual-core Opteron-based machine. It is shown that the speedup of the parallel algorithm is a direct measure of the gain with respect to the sequential algorithm used. Furthermore, the concurrent algorithm shows a speed gain of up to 72 percent on a single-core processor due to reduced cache thrashing.
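The slice-partition-then-merge strategy this abstract describes generalizes to any per-slice computation whose results combine associatively. The following minimal Python sketch is not a Max-tree implementation; it substitutes a toy per-slice reduction (columnwise maxima), with all names hypothetical, purely to illustrate the pattern of computing slice results concurrently and then merging them:

```python
from concurrent.futures import ThreadPoolExecutor

def process_slice(slice_rows):
    # Stand-in for "compute the Max-tree of one slice":
    # here we just take the per-slice maximum of each column.
    return [max(col) for col in zip(*slice_rows)]

def merge(result_a, result_b):
    # Stand-in for merging two slice results: combine columnwise.
    return [max(a, b) for a, b in zip(result_a, result_b)]

def parallel_filter(image, n_slices=4):
    # Partition the image (a list of rows) into contiguous slices.
    step = max(1, len(image) // n_slices)
    slices = [image[i:i + step] for i in range(0, len(image), step)]
    # Process all slices concurrently, then fold the partial results.
    with ThreadPoolExecutor(max_workers=n_slices) as pool:
        partial = list(pool.map(process_slice, slices))
    out = partial[0]
    for p in partial[1:]:
        out = merge(out, p)
    return out

image = [[1, 5, 3],
         [7, 2, 9],
         [4, 8, 6],
         [0, 1, 2]]
print(parallel_filter(image, n_slices=2))  # columnwise maxima: [7, 8, 9]
```

The key property the real algorithm relies on is the same one this sketch uses: the per-slice results can be merged in any order to give the answer for the whole image.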
Nardelli: Distributed searching of k-dimensional data with almost constant costs
ADBIS 2000, Prague, Lecture Notes in Computer Science, 2000
Abstract

Cited by 7 (1 self)
Abstract. In this paper we consider the dictionary problem in the scalable distributed data structure paradigm introduced by Litwin, Neimat, and Schneider, and analyze costs for inserts and exact searches in an amortized framework. We show that, both in the 1-dimensional and the k-dimensional case, inserts and exact searches have an amortized almost constant cost, namely O(log_{1+A} n) messages, where n is the total number of servers of the structure, b is the capacity of each server, and A = b/2. Considering that A is a large value in real applications, in the order of thousands, we can assume a constant cost in real distributed structures. Only worst-case analysis has been previously considered, and the almost constant cost in the amortized analysis of the general k-dimensional case appears very promising in light of the well-known difficulties in proving optimal worst-case bounds for k dimensions.
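Reading the garbled cost formula as O(log_{1+A} n) with A = b/2 (an assumption about the extraction-damaged text), a quick calculation shows why the bound is "almost constant": with server capacity b = 2000, so A = 1000, even a billion-server structure costs only about three messages.

```python
import math

def amortized_messages(n_servers, b):
    # O(log_{1+A} n) with A = b/2, per the cost bound quoted above.
    A = b / 2
    return math.log(n_servers, 1 + A)

# The cost grows by only ~1 message per 1000x increase in n.
for n in (10**3, 10**6, 10**9):
    print(n, round(amortized_messages(n, b=2000), 2))
```

This is the sense in which the abstract treats the cost as constant for real deployments: the base of the logarithm is so large that n would have to grow by orders of magnitude to add a single message.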
DiscFinder: A Data-Intensive Scalable Cluster Finder for Astrophysics
Abstract

Cited by 6 (4 self)
DiscFinder is a scalable approach for identifying large-scale astronomical structures, such as galaxy clusters, in massive observational and simulation astrophysics datasets. It is designed to operate on datasets with tens of billions of astronomical objects, even when the dataset is much larger than the aggregate memory of the compute cluster used for the processing.
A Straightforward Saturation-Based Decision Procedure for Hybrid Logic
Abstract

Cited by 3 (3 self)
In this paper we present a saturation-based decision procedure for basic hybrid logic extended with the universal modality. Termination of the procedure is guaranteed by constraints that are conceptually simpler than the loop-checks commonly used with related tableau-based decision methods, in that they do not rely on the order in which new formulas are introduced. At the same time, our constraints allow us to limit the worst-case asymptotic complexity of the procedure more tightly than seems possible for methods using conventional loop-checks. The procedure is based on Hardt and Smolka's higher-order formulation of hybrid logic [10].
The slice algorithm for irreducible decomposition of monomial ideals
 Journal of Symbolic Computation
Abstract

Cited by 3 (0 self)
Abstract. Irreducible decomposition of monomial ideals has an increasing number of applications from biology to pure math. This paper presents the Slice Algorithm for computing irreducible decompositions, Alexander duals and socles of monomial ideals. The paper includes experiments showing good performance in practice.
Efficient Simplification of Bisimulation Formulas
In Proceedings of the Workshop on Tools and Algorithms for the Construction and Analysis of Systems, pages 111-132. LNCS 1019, 1995
Abstract

Cited by 1 (0 self)
The problem of checking or optimally simplifying bisimulation formulas is likely to be computationally very hard. We take a different view of the problem: we set out to define a very fast algorithm, and then see what we can obtain. Sometimes our algorithm can simplify a formula perfectly, sometimes it cannot. However, the algorithm is extremely fast and can therefore be added to formula-based bisimulation model checkers at practically no cost. When the formula can be simplified by our algorithm, this can have a dramatic positive effect on the better, but also more time-consuming, theorem provers which will finish the job.
1 Introduction
The need for validity checking or optimal simplification of first-order bisimulation formulas has arisen from recent work on symbolic bisimulation checking of value-passing calculi [4, 9, 15]. The NP-completeness of checking satisfiability of propositional formulas [3] implies that validity checking of that class of formulas is coNP-complete. Addit...
Recipes for Baking Black Forest Databases: Building and Querying Black Hole Merger Trees from Cosmological Simulations
2011
Abstract

Cited by 1 (1 self)
an allocation of advanced computing resources supported by the NSF and the TeraGrid Advanced Support Program. The computations were performed on Kraken at the National Institute for Computational Sciences (NICS).
Backtracking
Abstract
Contents
1 Introduction
2 Models of computation
3 The Set Union Problem
4 The Worst-Case Time Complexity of a Single Operation
5 The Set Union Problem with Deunions
6 Split and the Set Union Problem on Intervals
7 The Set Union Problem with Unlimited Backtracking

1 Introduction
An equivalence relation R on a finite set S is a binary relation that is reflexive, symmetric, and transitive. That is, for s, t, and u in S, we have that sRs; if sRt then tRs; and if sRt and tRu then sRu. Set S is partitioned by R into equivalence classes, where each class contains all and only the elements that obey R pairwise. Many computational problems involve representing, modifying, and tracking the evolution of equivalenc...
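The set-union operations this survey's contents revolve around can be sketched as follows. This is the standard union-by-rank and path-compression structure for maintaining equivalence classes, given for illustration only; it is not the deunion, interval, or unlimited-backtracking variants the contents list.

```python
class UnionFind:
    """Maintains the equivalence classes of 0..n-1 under union and find."""

    def __init__(self, n):
        self.parent = list(range(n))  # each element starts in its own class
        self.rank = [0] * n

    def find(self, x):
        # Path compression: point visited nodes closer to the root.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return  # already equivalent
        # Union by rank: attach the shallower tree under the deeper one.
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1

uf = UnionFind(6)
uf.union(0, 1)
uf.union(1, 2)            # transitivity: 0, 1, 2 now form one class
print(uf.find(0) == uf.find(2))  # True
print(uf.find(3) == uf.find(0))  # False: 3 is still in its own class
```

Two elements are equivalent exactly when find returns the same root for both, which is how the partition into classes described in the introduction is represented.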
Privacy-Preserving Publishing of Moving Objects Databases
2009
Abstract
Moving Objects Databases (MOD) have gained popularity as a subject for research due to the latest developments in positioning technologies and mobile networking. Analysis of mobility data can be used to discover and deliver knowledge that can enhance public welfare. For instance, a study of traffic patterns and congestion trends can reveal information that can be used to improve the routing and scheduling of public transit vehicles. To enable analysis of mobility data, a MOD must be published. However, publication of a MOD can pose a threat to the location privacy of users whose movement is recorded in the database. A user's location at one or more time points can be publicly available prior to the publication of the MOD. Based on this public knowledge, an attacker can potentially find the user's entire trajectory and learn his/her positions at other time points, which constitutes a privacy breach. This public knowledge is a user's quasi-identifier (QID), i.e. a set of attributes that can uniquely identify the user's trajectory in the published database. We argue that unlike in relational microdata, where all...