Results 1 - 10 of 66,675

Kinetic and Dynamic Data Structures for Closest Pairs and All Nearest Neighbors

by Pankaj K. Agarwal, Haim Kaplan, Micha Sharir , 2008
"... We present simple, fully dynamic and kinetic data structures, which are variants of a dynamic two-dimensional range tree, for maintaining the closest pair and all nearest neighbors for a set of n moving points in the plane; insertions and deletions of points are also allowed. If no insertions or del ..."
Abstract - Cited by 7 (2 self) - Add to MetaCart
We present simple, fully dynamic and kinetic data structures, which are variants of a dynamic two-dimensional range tree, for maintaining the closest pair and all nearest neighbors for a set of n moving points in the plane; insertions and deletions of points are also allowed. If no insertions
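The paper's range-tree-based structure is not reproduced here; as a point of reference, a naive fully dynamic closest-pair maintenance that simply recomputes on every update can be sketched as below (hypothetical class and helper names; the O(n^2) recomputation is exactly the cost such kinetic/dynamic structures are designed to avoid).

    import math
    from itertools import combinations

    def closest_pair(points):
        # Brute-force closest pair: O(n^2), the baseline the paper improves on.
        best = (math.inf, None, None)
        for p, q in combinations(points, 2):
            d = math.hypot(p[0] - q[0], p[1] - q[1])
            if d < best[0]:
                best = (d, p, q)
        return best

    class NaiveDynamicClosestPair:
        # Maintains the closest pair under insertions/deletions by recomputing.
        def __init__(self):
            self.points = set()

        def insert(self, p):
            self.points.add(p)

        def delete(self, p):
            self.points.discard(p)

        def query(self):
            return closest_pair(list(self.points))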

When Is "Nearest Neighbor" Meaningful?

by Kevin Beyer, Jonathan Goldstein, Raghu Ramakrishnan, Uri Shaft - In Int. Conf. on Database Theory , 1999
"... . We explore the effect of dimensionality on the "nearest neighbor " problem. We show that under a broad set of conditions (much broader than independent and identically distributed dimensions), as dimensionality increases, the distance to the nearest data point approaches the distance ..."
Abstract - Cited by 402 (1 self) - Add to MetaCart
. We explore the effect of dimensionality on the "nearest neighbor " problem. We show that under a broad set of conditions (much broader than independent and identically distributed dimensions), as dimensionality increases, the distance to the nearest data point approaches
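The phenomenon the paper analyzes is easy to observe empirically. A minimal sketch, assuming i.i.d. uniform data (one of the simplest cases covered by the paper's broader conditions), compares the nearest and farthest distances from a random query as dimensionality grows:

    import numpy as np

    rng = np.random.default_rng(0)

    def nn_contrast(dim, n=1000):
        # Ratio of farthest to nearest distance from a random query to random data.
        # As dim grows this ratio tends toward 1, i.e. "nearest" loses contrast.
        data = rng.random((n, dim))
        query = rng.random(dim)
        d = np.linalg.norm(data - query, axis=1)
        return d.max() / d.min()

    for dim in (2, 10, 100, 1000):
        print(dim, round(nn_contrast(dim), 2))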

Data Structures and Algorithms for Nearest Neighbor Search in General Metric Spaces

by Peter N. Yianilos , 1993
"... We consider the computational problem of finding nearest neighbors in general metric spaces. Of particular interest are spaces that may not be conveniently embedded or approximated in Euclidian space, or where the dimensionality of a Euclidian representation is very high. Also relevant are high-dim ..."
Abstract - Cited by 356 (5 self) - Add to MetaCart
We consider the computational problem of finding nearest neighbors in general metric spaces. Of particular interest are spaces that may not be conveniently embedded or approximated in Euclidian space, or where the dimensionality of a Euclidian representation is very high. Also relevant are high
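The data structures the paper develops are not sketched here; they answer such queries using only a distance function, with no coordinate embedding. A minimal illustration of the problem setting, using edit distance on strings as a non-Euclidean metric and a linear scan as the baseline that metric-space indexes accelerate:

    def edit_distance(a, b):
        # Levenshtein distance: a metric on strings with no natural Euclidean embedding.
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
            prev = cur
        return prev[-1]

    def nearest_neighbor(query, items, dist=edit_distance):
        # Linear-scan nearest neighbor in a general metric space.
        return min(items, key=lambda x: dist(query, x))

    print(nearest_neighbor("kitten", ["sitting", "mitten", "kitchen", "smitten"]))
    # -> 'mitten'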

FastMap: A Fast Algorithm for Indexing, Data-Mining and Visualization of Traditional and Multimedia Datasets

by Christos Faloutsos, King-Ip (David) Lin , 1995
"... A very promising idea for fast searching in traditional and multimedia databases is to map objects into points in k-d space, using k feature-extraction functions, provided by a domain expert [25]. Thus, we can subsequently use highly fine-tuned spatial access methods (SAMs), to answer several types ..."
Abstract - Cited by 497 (23 self) - Add to MetaCart
types of queries, including the `Query By Example' type (which translates to a range query); the `all pairs' query (which translates to a spatial join [8]); the nearest-neighbor or best-match query, etc. However, designing feature extraction functions can be hard. It is relatively easier for a
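FastMap's own pivot-based projection is not reproduced here, but the pipeline it enables is straightforward to sketch: map each object to a k-dimensional feature vector, index the vectors with a spatial access method, and run range or best-match queries against the index. The feature functions below are made-up stand-ins for the domain expert's functions, and a KD-tree stands in for the SAM:

    import numpy as np
    from scipy.spatial import cKDTree

    # Hypothetical feature functions mapping an object (a string here) to k numbers;
    # FastMap instead derives such coordinates from pairwise distances.
    features = [len, lambda s: s.count("a"), lambda s: sum(map(ord, s)) % 97]

    objects = ["banana", "bandana", "apple", "grape", "papaya"]
    points = np.array([[f(o) for f in features] for o in objects], dtype=float)

    tree = cKDTree(points)                      # spatial access method (SAM)
    q = np.array([f("banana") for f in features], dtype=float)

    dist, idx = tree.query(q, k=2)              # nearest-neighbor / best-match query
    print([objects[i] for i in idx])

    hits = tree.query_ball_point(q, r=4.0)      # 'Query By Example' as a range query
    print([objects[i] for i in hits])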

Large margin methods for structured and interdependent output variables

by Ioannis Tsochantaridis, Thorsten Joachims, Thomas Hofmann, Yasemin Altun - JOURNAL OF MACHINE LEARNING RESEARCH , 2005
"... Learning general functional dependencies between arbitrary input and output spaces is one of the key challenges in computational intelligence. While recent progress in machine learning has mainly focused on designing flexible and powerful input representations, this paper addresses the complementary ..."
Abstract - Cited by 612 (12 self) - Add to MetaCart
the complementary issue of designing classification algorithms that can deal with more complex outputs, such as trees, sequences, or sets. More generally, we consider problems involving multiple dependent output variables, structured output spaces, and classification problems with class attributes. In order

Closest Point Search in Lattices

by Erik Agrell, Thomas Eriksson, Alexander Vardy, Kenneth Zeger - IEEE TRANS. INFORM. THEORY , 2000
"... In this semi-tutorial paper, a comprehensive survey of closest-point search methods for lattices without a regular structure is presented. The existing search strategies are described in a unified framework, and differences between them are elucidated. An efficient closest-point search algorithm, ba ..."
Abstract - Cited by 324 (2 self) - Add to MetaCart
In this semi-tutorial paper, a comprehensive survey of closest-point search methods for lattices without a regular structure is presented. The existing search strategies are described in a unified framework, and differences between them are elucidated. An efficient closest-point search algorithm
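The sphere-decoding-style algorithms the survey unifies are not reproduced here; as a simple point of comparison, Babai's rounding heuristic (a standard approximate closest-point baseline, not the paper's algorithm) fits in a few lines:

    import numpy as np

    def babai_rounding(B, y):
        # Approximate closest lattice point to y in the lattice generated by the
        # columns of B: round the coordinates of y expressed in the lattice basis.
        z = np.rint(np.linalg.solve(B, y))   # nearest integer coefficient vector
        return B @ z, z

    B = np.array([[2.0, 1.0],
                  [0.0, 3.0]])
    y = np.array([2.6, 3.4])
    point, coeffs = babai_rounding(B, y)
    print(point, coeffs)

Exact search methods of the kind the paper surveys enumerate candidate coefficient vectors inside a shrinking sphere instead of rounding once.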

The Elements of Statistical Learning -- Data Mining, Inference, and Prediction

by Trevor Hastie, Robert Tibshirani, Jerome Friedman
"... ..."
Abstract - Cited by 1320 (13 self) - Add to MetaCart
Abstract not found

Primitives for the manipulation of general subdivisions and the computation of Voronoi diagrams

by Leonidas Guibas, Jorge Stolfi - ACM Trans. Graph., 1985
"... The following problem is discussed: Given n points in the plane (the sites) and an arbitrary query point 4, find the site that is closest to q. This problem can be solved by constructing the Voronoi diagram of the given sites and then locating the query point in one of its regions. Two algorithms ar ..."
Abstract - Cited by 543 (11 self) - Add to MetaCart
The following problem is discussed: Given n points in the plane (the sites) and an arbitrary query point 4, find the site that is closest to q. This problem can be solved by constructing the Voronoi diagram of the given sites and then locating the query point in one of its regions. Two algorithms
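The paper's quad-edge structure and construction algorithms are not sketched here, but the query it targets is easy to state in code: locating q in a Voronoi region is, by definition, the same as finding the site nearest to q. A small sketch using library routines (not the paper's own primitives):

    import numpy as np
    from scipy.spatial import Voronoi, cKDTree

    rng = np.random.default_rng(1)
    sites = rng.random((20, 2))

    vor = Voronoi(sites)                 # the diagram the paper shows how to build
    tree = cKDTree(sites)

    q = np.array([0.5, 0.5])
    _, nearest_site = tree.query(q)      # locating q's Voronoi region is equivalent
    print(sites[nearest_site])           # to finding the site closest to q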

A Weighted Nearest Neighbor Algorithm for Learning with Symbolic Features

by Scott Cost, Steven Salzberg - Machine Learning , 1993
"... In the past, nearest neighbor algorithms for learning from examples have worked best in domains in which all features had numeric values. In such domains, the examples can be treated as points and distance metrics can use standard definitions. In symbolic domains, a more sophisticated treatment of t ..."
Abstract - Cited by 305 (3 self) - Add to MetaCart
In the past, nearest neighbor algorithms for learning from examples have worked best in domains in which all features had numeric values. In such domains, the examples can be treated as points and distance metrics can use standard definitions. In symbolic domains, a more sophisticated treatment
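The paper's weighted, value-difference-style metric is not reproduced here; the simpler overlap metric below only illustrates the basic setup of defining a distance on symbolic values and using it inside a nearest-neighbor rule. The toy feature names and data are invented for illustration:

    def overlap_distance(x, y):
        # Number of symbolic features on which two examples disagree.
        return sum(a != b for a, b in zip(x, y))

    def classify_1nn(query, training, dist=overlap_distance):
        # 1-nearest-neighbor over (symbolic feature tuple, label) pairs.
        features, label = min(training, key=lambda row: dist(query, row[0]))
        return label

    # Hypothetical toy data: (outlook, temperature, windy) -> play?
    train = [
        (("sunny", "hot", "no"), "no"),
        (("rain", "mild", "yes"), "no"),
        (("overcast", "mild", "no"), "yes"),
    ]
    print(classify_1nn(("sunny", "mild", "no"), train))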

Efficient Variants of the ICP Algorithm

by Szymon Rusinkiewicz, Marc Levoy - INTERNATIONAL CONFERENCE ON 3-D DIGITAL IMAGING AND MODELING , 2001
"... The ICP (Iterative Closest Point) algorithm is widely used for geometric alignment of three-dimensional models when an initial estimate of the relative pose is known. Many variants of ICP have been proposed, affecting all phases of the algorithm from the selection and matching of points to the minim ..."
Abstract - Cited by 702 (5 self) - Add to MetaCart
The ICP (Iterative Closest Point) algorithm is widely used for geometric alignment of three-dimensional models when an initial estimate of the relative pose is known. Many variants of ICP have been proposed, affecting all phases of the algorithm from the selection and matching of points
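The specific variants the paper benchmarks are not reproduced here; a minimal point-to-point ICP iteration (closest-point matching plus SVD-based rigid alignment, assuming a reasonable initial pose as the abstract notes) looks roughly like this:

    import numpy as np
    from scipy.spatial import cKDTree

    def best_rigid_transform(src, dst):
        # Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD).
        cs, cd = src.mean(0), dst.mean(0)
        H = (src - cs).T @ (dst - cd)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # avoid reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        return R, cd - R @ cs

    def icp(src, dst, iters=20):
        # Basic point-to-point ICP: match each source point to its closest target
        # point, solve for the rigid motion, apply it, and repeat.
        tree = cKDTree(dst)
        cur = src.copy()
        for _ in range(iters):
            _, idx = tree.query(cur)
            R, t = best_rigid_transform(cur, dst[idx])
            cur = cur @ R.T + t
        return cur

The variants studied in the paper change the pieces of this loop: how points are selected and matched, how pairs are weighted or rejected, and which error metric is minimized.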