Nearest Neighbor Queries
, 1995
Abstract

Cited by 594 (1 self)
A frequently encountered type of query in Geographic Information Systems is to find the k nearest neighbor objects to a given point in space. Processing such queries requires substantially different search algorithms than those for location or range queries. In this paper we present an efficient branch-and-bound R-tree traversal algorithm to find the nearest neighbor object to a point, and then generalize it to finding the k nearest neighbors. We also discuss metrics for an optimistic and a pessimistic search ordering strategy as well as for pruning. Finally, we present the results of several experiments obtained using the implementation of our algorithm and examine the behavior of the metrics and the scalability of the algorithm.
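A best-first branch-and-bound traversal of this kind can be sketched with a priority queue ordered by a lower-bound distance to each subtree's bounding box. The `Node` class and `mindist` below are illustrative stand-ins, not the paper's actual R-tree structures:

```python
# Minimal best-first k-nearest-neighbor search over an R-tree-like
# hierarchy: pop the entry with the smallest lower-bound distance,
# expand internal nodes, and collect leaf objects in ascending distance.
import heapq, itertools, math

class Node:
    def __init__(self, rect, children=None, point=None):
        self.rect = rect            # (xmin, ymin, xmax, ymax) bounding box
        self.children = children or []
        self.point = point          # set only for leaf entries

def mindist(q, rect):
    """Lower bound on the distance from query point q to any point in rect."""
    dx = max(rect[0] - q[0], 0, q[0] - rect[2])
    dy = max(rect[1] - q[1], 0, q[1] - rect[3])
    return math.hypot(dx, dy)

def knn(root, q, k):
    tie = itertools.count()         # tie-breaker so Nodes are never compared
    heap = [(mindist(q, root.rect), next(tie), root)]
    result = []
    while heap and len(result) < k:
        d, _, node = heapq.heappop(heap)
        if node.point is not None:              # leaf entry: an actual object
            result.append((d, node.point))
        else:                                   # internal node: expand it
            for c in node.children:
                key = (math.dist(q, c.point) if c.point is not None
                       else mindist(q, c.rect))
                heapq.heappush(heap, (key, next(tie), c))
    return result
```

Because a leaf is only reported when it reaches the top of the heap, every subtree whose `mindist` exceeds the k-th reported distance is pruned without being visited.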
Locally weighted learning
 ARTIFICIAL INTELLIGENCE REVIEW
, 1997
Abstract

Cited by 594 (53 self)
This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, interference between old and new data, implementing locally weighted learning efficiently, and applications of locally weighted learning. A companion paper surveys how locally weighted learning can be used in robot learning and control.
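A minimal one-dimensional instance of locally weighted linear regression looks like the following. The Gaussian kernel and bandwidth `h` are illustrative choices, not the survey's prescription:

```python
# Locally weighted linear regression in 1-D: weight each training point
# by its distance to the query, fit a weighted line, evaluate at the query.
import math

def lwlr_predict(xs, ys, x_query, h=1.0):
    """Fit a weighted line y = a + b*x around x_query and evaluate it there."""
    w = [math.exp(-((x - x_query) ** 2) / (2 * h * h)) for x in xs]
    # Solve the 2x2 weighted normal equations directly.
    sw   = sum(w)
    swx  = sum(wi * x for wi, x in zip(w, xs))
    swy  = sum(wi * y for wi, y in zip(w, ys))
    swxx = sum(wi * x * x for wi, x in zip(w, xs))
    swxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    det = sw * swxx - swx * swx
    b = (sw * swxy - swx * swy) / det
    a = (swy - b * swx) / sw
    return a + b * x_query
```

Each prediction refits the local model from scratch, which is the "lazy" aspect the survey refers to: no global model is ever stored.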
Data Structures and Algorithms for Nearest Neighbor Search in General Metric Spaces
, 1993
Abstract

Cited by 356 (5 self)
We consider the computational problem of finding nearest neighbors in general metric spaces. Of particular interest are spaces that may not be conveniently embedded or approximated in Euclidean space, or where the dimensionality of a Euclidean representation is very high. Also relevant are high-dimensional Euclidean settings in which the distribution of data is in some sense of lower dimension and embedded in the space. The vp-tree (vantage point tree) is introduced in several forms, together with associated algorithms, as an improved method for these difficult search problems. Tree construction executes in O(n log(n)) time, and search is under certain circumstances and in the limit, O(log(n)) expected time. The theoretical basis for this approach is developed and the results of several experiments are reported. In Euclidean cases, kd-tree performance is compared.
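The general scheme can be sketched compactly: each node stores a vantage point and a median radius splitting the remaining points into an inside and an outside ball, and search descends the nearer side first, visiting the far side only when the triangle inequality cannot rule it out. The first-element vantage-point choice and plain median split here are simplifications of the paper's construction:

```python
# A compact vp-tree for nearest-neighbor search under an arbitrary metric.
import statistics

class VPNode:
    def __init__(self, point, radius, inside, outside):
        self.point, self.radius = point, radius
        self.inside, self.outside = inside, outside

def build(points, dist):
    if not points:
        return None
    vp, rest = points[0], points[1:]
    if not rest:
        return VPNode(vp, 0.0, None, None)
    ds = [dist(vp, p) for p in rest]
    mu = statistics.median(ds)
    inside  = [p for p, d in zip(rest, ds) if d <= mu]
    outside = [p for p, d in zip(rest, ds) if d > mu]
    return VPNode(vp, mu, build(inside, dist), build(outside, dist))

def nearest(node, q, dist, best=None):
    if node is None:
        return best
    d = dist(node.point, q)
    if best is None or d < best[0]:
        best = (d, node.point)
    near, far = ((node.inside, node.outside) if d <= node.radius
                 else (node.outside, node.inside))
    best = nearest(near, q, dist, best)
    if best[0] > abs(d - node.radius):   # far side may still hold a closer point
        best = nearest(far, q, dist, best)
    return best
```

Only the metric `dist` is needed; no coordinates are ever inspected, which is what makes the structure applicable to general metric spaces.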
Geometric Range Searching and Its Relatives
 CONTEMPORARY MATHEMATICS
Abstract

Cited by 273 (41 self)
Preprocess a set S of points in d-dimensional space so that the points of S lying inside a query region R can be reported or counted quickly. We survey the known techniques and data structures for range searching and describe their application to other related searching problems.
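The simplest relative of these structures is one-dimensional range counting: after O(n log n) preprocessing (sorting), the number of points in a query interval is found in O(log n) time with two binary searches. This toy instance is mine, not the survey's:

```python
# 1-D range counting: sort once, then answer interval-count queries
# with two binary searches.
import bisect

class RangeCounter1D:
    def __init__(self, points):
        self.pts = sorted(points)            # O(n log n) preprocessing

    def count(self, lo, hi):
        """Number of points p with lo <= p <= hi, in O(log n) time."""
        return (bisect.bisect_right(self.pts, hi)
                - bisect.bisect_left(self.pts, lo))
```

Higher-dimensional range searching generalizes exactly this trade-off between preprocessing space and query time.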
Comparative Experiments on Disambiguating Word Senses: An Illustration of the Role of Bias in Machine Learning
, 1996
Abstract

Cited by 126 (2 self)
This paper describes an experimental comparison of seven different learning algorithms on the problem of learning to disambiguate the meaning of a word from context. The algorithms tested include statistical, neural-network, decision-tree, rule-based, and case-based classification techniques. The specific problem tested involves disambiguating six senses of the word "line" using the words in the current and preceding sentence as context. The statistical and neural-network methods perform the best on this particular problem and we discuss a potential reason for this observed difference. We also discuss the role of bias in machine learning and its importance in explaining performance differences observed on specific problems.
Consensus Surfaces for Modeling 3D Objects from Multiple Range Images
, 1998
Abstract

Cited by 97 (15 self)
In this paper, we present a robust method for creating a triangulated surface mesh from multiple range images. Our method merges a set of range images into a volumetric implicit-surface representation which is converted to a surface mesh using a variant of the marching-cubes algorithm. Unlike previous techniques based on implicit-surface representations, our method estimates the signed distance to the object surface by finding a consensus of locally coherent observations of the surface. We call this method the consensus-surface algorithm. This algorithm effectively eliminates many of the troublesome effects of noise and extraneous surface observations without sacrificing the accuracy of the resulting surface. We utilize octrees to represent volumetric implicit surfaces, effectively reducing the computation and memory requirements of the volumetric representation without sacrificing accuracy of the resulting surface. We present results which demonstrate that our consensus-surface algorithm can construct accurate geometric models from rather noisy input range data.
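The consensus idea at a single voxel can be caricatured as follows: among several noisy signed-distance observations, keep the largest locally coherent cluster (values within a tolerance of each other) and average it, discarding outliers. This is a deliberate simplification of the paper's consensus test, with `tol` an invented parameter:

```python
# Toy consensus of signed-distance observations at one voxel:
# average the largest cluster of mutually close values, drop the rest.
def consensus_signed_distance(observations, tol=0.05):
    best_cluster = []
    for center in observations:
        cluster = [d for d in observations if abs(d - center) <= tol]
        if len(cluster) > len(best_cluster):
            best_cluster = cluster
    if not best_cluster:
        return None
    return sum(best_cluster) / len(best_cluster)
```

A single outlier observation (e.g. from an extraneous surface behind the object) is simply outvoted rather than dragging the average.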
A Non-Hierarchical Procedure for Re-Synthesis of Complex Textures
 In WSCG ’2001 Conference proceedings
, 2001
Abstract

Cited by 65 (1 self)
A procedure is described for synthesizing an image with the same texture as a given input image.
Nearest neighbor classification from multiple feature subsets
 Intelligent Data Analysis
, 1999
Abstract

Cited by 38 (1 self)
Combining multiple classifiers is an effective technique for improving accuracy. There are many general combining algorithms, such as Bagging, Boosting, or Error Correcting Output Coding, that significantly improve classifiers like decision trees, rule learners, or neural networks. Unfortunately, these combining methods do not improve the nearest neighbor classifier. In this paper, we present MFS, a combining algorithm designed to improve the accuracy of the nearest neighbor (NN) classifier. MFS combines multiple NN classifiers each using only a random subset of features. The experimental results are encouraging: On 25 datasets from the UCI Repository, MFS significantly outperformed several standard NN variants and was competitive with boosted decision trees. In additional experiments, we show that MFS is robust to irrelevant features, and is able to reduce both bias and variance components of error.
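The core MFS idea can be sketched as a handful of 1-NN classifiers, each restricted to its own random subset of feature indices, combined by majority vote. The subset size, number of classifiers, and plain-vote rule below are illustrative choices, not the paper's tuned settings:

```python
# MFS sketch: vote over several 1-NN classifiers, each seeing only
# a random subset of the features.
import math, random
from collections import Counter

def nn_predict(train, query, feats):
    """1-NN over (vector, label) pairs, using only feature indices in feats."""
    def d(x):
        return math.dist([x[i] for i in feats], [query[i] for i in feats])
    _, label = min(train, key=lambda ex: d(ex[0]))
    return label

def mfs_predict(train, query, n_classifiers=5, subset_size=2, seed=0):
    rng = random.Random(seed)
    n_feats = len(train[0][0])
    votes = Counter()
    for _ in range(n_classifiers):
        feats = rng.sample(range(n_feats), subset_size)
        votes[nn_predict(train, query, feats)] += 1
    return votes.most_common(1)[0][0]
```

Because each member classifier ignores most features, an irrelevant feature corrupts only the members that happened to sample it, which is the robustness property the abstract reports.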
A Distance-Scan Algorithm for Spatial Access Structures
, 1994
Abstract

Cited by 34 (6 self)
In geographic information systems it is often useful to select an object located closest to a given point or to scan the objects with respect to their distance to a given point in ascending order. An example for a query of this type would be to retrieve ten hotels with at least three stars lying closest to the venue of a conference. Various subtypes of similar queries exist. On the other hand, research in geometric access structures has concentrated mainly on range queries. We present an efficient algorithm for closest and distance-scan queries of various kinds. Our algorithm is based on the nearest neighbour algorithm for kd-trees given by Friedman et al. [FBF77] and refined by Sproull [Spr91]. We adapt this algorithm to external access structures and extend it to process a broader class of queries. Furthermore we show that the algorithm can be applied to point objects as well as to non-point objects, and that it can be used with all spatial access structures using a hierarchical dir...
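The essence of a distance scan, as opposed to a fixed-k nearest-neighbor query, is laziness: objects are produced in ascending distance so the caller can filter (e.g. "at least three stars") and stop whenever enough matches have arrived, without choosing k in advance. The flat heap below stands in for the paper's traversal of a hierarchical access structure:

```python
# Distance scan as a lazy generator: yield (distance, object) pairs in
# ascending distance from q; the caller decides when to stop.
import heapq, math

def distance_scan(objects, q):
    """Each object is a (point, payload) pair."""
    heap = [(math.dist(pt, q), i, (pt, payload))
            for i, (pt, payload) in enumerate(objects)]
    heapq.heapify(heap)
    while heap:
        d, _, obj = heapq.heappop(heap)
        yield d, obj
```

For example, the closest hotel with at least three stars is simply the first filtered element of the scan:

```python
# hotels: ((x, y), stars) pairs -- hypothetical data for illustration
# first_match = next(obj for d, obj in distance_scan(hotels, venue)
#                    if obj[1] >= 3)
```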