Results 1–6 of 6
Iterated Nearest Neighbors and Finding Minimal Polytopes
1994
Cited by 57 (6 self)
We introduce a new method for finding several types of optimal k-point sets, minimizing perimeter, diameter, circumradius, and related measures, by testing sets of the O(k) nearest neighbors to each point. We argue that this is better in a number of ways than previous algorithms, which were based on high order Voronoi diagrams. Our technique allows us for the first time to efficiently maintain minimal sets as new points are inserted, to generalize our algorithms to higher dimensions, to find minimal convex k-vertex polygons and polytopes, and to improve many previous results. We achieve many of our results via a new algorithm for finding rectilinear nearest neighbors in the plane in time O(n log n + kn). We also demonstrate a related technique for finding minimum area k-point sets in the plane, based on testing sets of nearest vertical neighbors to each line segment determined by a pair of points. A generalization of this technique also allows us to find minimum volume and boundary measure sets in arbitrary dimensions.
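The neighbor-testing idea in this abstract can be illustrated with a simplified sketch: treat each point together with its k−1 nearest neighbors as a candidate k-point set and keep the candidate of smallest diameter. This is a brute-force heuristic, not the paper's algorithm (which searches subsets of the O(k) nearest neighbors and obtains better bounds); all names here are illustrative.

```python
from itertools import combinations
from math import dist

def min_diameter_k_set(points, k):
    """Heuristic sketch: for each point p, use p plus its k-1 nearest
    neighbors as a candidate set; return the candidate of smallest diameter."""
    best_set, best_diam = None, float("inf")
    for p in points:
        # p is its own nearest neighbor, so the slice yields p plus k-1 others
        candidate = sorted(points, key=lambda q: dist(p, q))[:k]
        diam = max(dist(a, b) for a, b in combinations(candidate, 2))
        if diam < best_diam:
            best_set, best_diam = candidate, diam
    return best_set, best_diam

pts = [(0, 0), (1, 0), (0, 1), (10, 10), (10, 10.5), (10.5, 10)]
s, d = min_diameter_k_set(pts, 3)   # finds the tight cluster near (10, 10)
```

Each of the n candidate sets is scored in O(k²) time, so the sketch runs in O(n² log n + nk²); the point is only to show the "candidates come from nearest neighbors" structure.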
Computing the Smallest k-Enclosing Circle and Related Problems
1999
Cited by 10 (1 self)
We present an efficient algorithm for solving the "smallest k-enclosing circle" (kSC) problem: Given a set of n points in the plane and an integer k ≤ n, find the smallest disk containing k of the points. We present two solutions. When using O(nk) storage, the problem can be solved in time O(nk log² n). When only O(n log n) storage is allowed, the running time is O(nk log² n log(n/k)). This problem ...
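A useful baseline for this problem: the smallest k-enclosing disk is determined by two or three input points on its boundary, which gives a simple (if slow) O(n⁴) brute force to compare against. The sketch below is ours, not the paper's algorithm, and the function names are illustrative.

```python
from itertools import combinations
from math import dist

def circumcircle(a, b, c):
    """Center and radius of the circle through three points (None if collinear)."""
    ax, ay = a; bx, by = b; cx, cy = c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), dist((ux, uy), a)

def smallest_k_enclosing_disk(points, k):
    """Brute force: the optimal disk has 2 points on a diameter or 3 on its
    boundary, so enumerate all such candidate disks and check containment."""
    cands = []
    for a, b in combinations(points, 2):        # two points spanning a diameter
        cands.append((((a[0] + b[0]) / 2, (a[1] + b[1]) / 2), dist(a, b) / 2))
    for a, b, c in combinations(points, 3):     # three boundary points
        cc = circumcircle(a, b, c)
        if cc:
            cands.append(cc)
    best, eps = (float("inf"), None), 1e-9
    for center, r in cands:
        if sum(dist(center, p) <= r + eps for p in points) >= k:
            best = min(best, (r, center))
    return best  # (radius, center)

pts = [(0, 0), (2, 0), (1, 1), (10, 10)]
r, center = smallest_k_enclosing_disk(pts, 3)
```

On this input the optimal 3-enclosing disk has radius 1 centered at (1, 0), ignoring the outlier at (10, 10).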
Finding k-Closest-Pairs Efficiently for High Dimensional Data
2000
Cited by 7 (0 self)
We present a novel approach to report approximate as well as exact k-closest pairs for sets of high dimensional points, under the L_t metric, t = 1, …, ∞. The proposed algorithms are efficient and simple to implement. They all use multiple shifted copies of the data points, sorted according to their position along a space-filling curve, such as the Peano curve, in a way that allows us to make performance guarantees and without assuming that the dimensionality d is constant. The first algorithm computes an O(d^(1+1/t)) approximation to the k-th closest pair distance in O(d²n log + dk(d + log k)) time. Experimental results, obtained using various real data sets of varying dimensions, indicate that the approximation factor is much better in practice. In the second algorithm we use this approximation in order to find the exact k closest pairs in O(dM) additional time, where M is the number of points in certain short subsegments of the space-filling curve. The exact algorithm is ...
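The shifted space-filling-curve technique can be sketched for the k = 1 case: sort randomly shifted copies of the (integer) points in Z-order (used here as a stand-in for the Peano curve mentioned in the abstract) and compare each point only with the next few points in curve order. Parameter names (`shifts`, `window`) are illustrative, and this sketch carries none of the paper's formal guarantees.

```python
import random

def morton_key(p, bits=16):
    """Interleave the bits of non-negative integer coordinates (Z-order)."""
    key = 0
    for i in range(bits):
        for d, c in enumerate(p):
            key |= ((c >> i) & 1) << (i * len(p) + d)
    return key

def approx_closest_pair(points, shifts=3, window=4, span=1 << 15):
    """Sketch: for a few random shifts, sort the shifted points along the
    Z-order curve and compare each point with the next `window` points."""
    best = (float("inf"), None)
    for _ in range(shifts):
        off = random.randrange(span)
        order = sorted(points,
                       key=lambda p: morton_key(tuple(c + off for c in p)))
        for i in range(len(order) - 1):
            for j in range(i + 1, min(i + 1 + window, len(order))):
                d2 = sum((a - b) ** 2 for a, b in zip(order[i], order[j]))
                if d2 < best[0]:
                    best = (d2, (order[i], order[j]))
    return best  # (squared distance, pair)

d2, pair = approx_closest_pair([(0, 0), (100, 100), (1, 0), (50, 200)])
```

The shifting matters because a single curve can place genuinely close points far apart along it; random offsets make that unlikely to happen in every copy.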
Efficient and Accurate Nearest Neighbor and Closest Pair Search in High Dimensional Space
Cited by 4 (0 self)
Nearest neighbor (NN) search in high dimensional space is an important problem in many applications. From the database perspective, a good solution needs to have two properties: (i) it can be easily incorporated in a relational database, and (ii) its query cost should increase sublinearly with the dataset size, regardless of the data and query distributions. Locality sensitive hashing (LSH) is a well-known methodology fulfilling both requirements, but its current implementations either incur expensive space and query cost, or abandon its theoretical guarantee on the quality of query results. Motivated by this, we improve LSH by proposing an access method called the locality sensitive B-tree (LSB-tree) to enable fast, accurate, high-dimensional NN search in relational databases. The combination of several LSB-trees forms an LSB-forest that has strong quality guarantees, but improves dramatically the efficiency of the previous LSH implementation having the same guarantees. In practice, the LSB-tree itself is also an effective index, which consumes linear space, supports efficient updates, and provides accurate query results. In our experiments, the LSB-tree was faster than (i) iDistance (a famous technique for exact NN search) by two orders of magnitude, and (ii) MedRank (a recent approximate method with nontrivial quality guarantees) by one order of magnitude.
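The LSB-tree builds on classical locality sensitive hashing. A minimal sketch of plain Euclidean LSH (not the LSB-tree itself) is shown below: each hash projects a point onto a random direction and buckets it, and a query is answered by ranking the candidates that share a bucket with it in any table. Class and parameter names, and the hash-family settings, are our illustrative choices.

```python
import random
from math import dist

def make_hash(dim, w=4.0):
    """One LSH function for Euclidean distance: h(p) = floor((a.p + b) / w),
    with Gaussian direction a and a uniform offset b in [0, w)."""
    a = [random.gauss(0, 1) for _ in range(dim)]
    b = random.uniform(0, w)
    return lambda p: int((sum(x * y for x, y in zip(a, p)) + b) // w)

class LSHIndex:
    def __init__(self, points, tables=8, dim=2):
        self.points = points
        self.hashes = [make_hash(dim) for _ in range(tables)]
        self.tables = [dict() for _ in range(tables)]
        for p in points:
            for h, t in zip(self.hashes, self.tables):
                t.setdefault(h(p), []).append(p)

    def query(self, q):
        """Union the buckets q falls into across tables, rank candidates by
        exact distance; fall back to a full scan if no bucket matches."""
        cands = {p for h, t in zip(self.hashes, self.tables)
                 for p in t.get(h(q), [])}
        cands = cands or set(self.points)
        return min(cands, key=lambda p: dist(q, p))

idx = LSHIndex([(0, 0), (10, 10), (5, 5)])
nearest = idx.query((0, 0))
```

Nearby points collide in some table with high probability, so the candidate set stays small while usually containing the true neighbor; the paper's contribution is packaging this idea into a B-tree so a relational engine can run it with quality guarantees.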
Structures for the Dynamic ClosestPair Problem
We describe a new randomized data structure, the sparse partition, for solving the dynamic closest-pair problem. Using this data structure the closest pair of a set of n points in k-dimensional space, for any fixed k, can be found in constant time. If the points are chosen from a finite universe, and if the floor function is available at unit cost, then the data structure supports insertions into and deletions from the set in expected O(log n) time and requires expected O(n) space. Here, it is assumed that the updates are chosen by an adversary who does not know the random choices made by the data structure. The data structure can be modified to run in O(log² n) expected time per update in the algebraic decision tree model of computation. Even this version is more efficient than the currently best known deterministic algorithms for solving the problem for k > 1.
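The unit-cost floor assumption is what makes grid bucketing possible. A static sketch of that idea: hash each point to a grid cell of side `cell` and compare it only against points in the 3×3 neighborhood of its cell. This illustrates the grid/floor technique, not the sparse partition structure itself, and it finds the closest pair only when `cell` is at least the closest-pair distance.

```python
from math import dist, floor

def grid_closest_pair(points, cell):
    """Bucket 2-D points into a grid of side `cell` (one floor per coordinate)
    and compare each point only with points in the 3x3 surrounding cells."""
    grid = {}
    for p in points:
        key = (floor(p[0] / cell), floor(p[1] / cell))
        grid.setdefault(key, []).append(p)
    best = (float("inf"), None)
    for (cx, cy), bucket in grid.items():
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for q in grid.get((cx + dx, cy + dy), []):
                    for p in bucket:
                        if p is not q:
                            d = dist(p, q)
                            if d < best[0]:
                                best = (d, (p, q))
    return best  # (distance, pair)

pts = [(0.0, 0.0), (0.5, 0.0), (3.0, 3.0)]
d, pair = grid_closest_pair(pts, 1.0)
```

If the cell size equals the current closest-pair distance, each cell holds O(1) points, which is what lets randomized schemes in this family achieve their expected update bounds.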