Results 1–10 of 96,961
Learning Nearest-Neighbor Classifiers with Hyperkernels
"... We consider improving the performance of k-Nearest Neighbor classifiers. A regularized kNN is proposed to learn an optimal dissimilarity function to substitute the Euclidean metric. The learning process employs hyperkernels and shares a regularization framework similar to that of support vector machines (SVMs). ..."
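The core idea of replacing the Euclidean metric with a learned dissimilarity can be illustrated with a minimal sketch. The weighted dissimilarity below is a stand-in assumption for illustration, not the hyperkernel-learned function the paper describes:

```python
# Sketch: a k-NN classifier whose distance function is a parameter,
# so a learned dissimilarity can be substituted for the Euclidean metric.
from collections import Counter
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train_X, train_y, query, k=3, dissim=euclidean):
    # Rank training points by the supplied dissimilarity and take a majority vote.
    neighbors = sorted(zip(train_X, train_y), key=lambda p: dissim(p[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Hypothetical stand-in for a learned dissimilarity: a fixed
# per-coordinate weighting instead of the plain squared distance.
def weighted_dissim(a, b, w=(2.0, 0.5)):
    return sum(wi * (x - y) ** 2 for wi, x, y in zip(w, a, b))

X = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
y = ["a", "a", "b", "b"]
print(knn_predict(X, y, (0.2, 0.9), k=3))  # → b
```

Any dissimilarity with the same two-argument signature plugs in via the `dissim` parameter, which is the extension point a metric-learning method would fill.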
Validation of Nearest Neighbor Classifiers
1998
Cited by 3 (0 self)
"... We develop a probabilistic bound on the error rate of the nearest neighbor classifier formed from a set of labelled examples. The bound is computed using only the examples in the set. A subset of the examples is used as a validation set to bound the error rate of the classifier formed from the remaining ..."
Adapt Bagging to Nearest Neighbor Classifiers
Journal of Computer Science and Technology, 2004
Cited by 3 (1 self)
"... It is well-known that in order to build a strong ensemble, the component learners should have high diversity as well as high accuracy. If perturbing the training set can cause significant changes in the constructed component learners, then Bagging can effectively improve accuracy. However, for stable learners such as nearest neighbor classifiers, perturbing the training set can hardly produce diverse component learners, and therefore Bagging does not work well. This paper adapts Bagging to nearest neighbor classifiers by injecting randomness into the distance metrics. ..."
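The idea of diversifying a stable learner through randomized metrics, rather than bootstrap samples, can be sketched as follows. Details such as the per-feature weight distribution are assumptions for illustration, not taken from the paper:

```python
# Sketch (assumed details): an ensemble of 1-NN learners, each using its
# own randomly weighted Euclidean metric, combined by majority vote.
import random
from collections import Counter

def make_random_metric(dim, rng):
    # Random per-feature weights inject the diversity that bootstrap
    # resampling fails to produce for stable nearest neighbor learners.
    w = [rng.uniform(0.1, 2.0) for _ in range(dim)]
    return lambda a, b: sum(wi * (x - y) ** 2 for wi, x, y in zip(w, a, b))

def nn_predict(X, y, query, metric):
    i = min(range(len(X)), key=lambda j: metric(X[j], query))
    return y[i]

def ensemble_predict(X, y, query, n_learners=11, seed=0):
    rng = random.Random(seed)
    metrics = [make_random_metric(len(X[0]), rng) for _ in range(n_learners)]
    votes = Counter(nn_predict(X, y, query, m) for m in metrics)
    return votes.most_common(1)[0][0]
```

An odd `n_learners` avoids vote ties; the fixed seed only makes the sketch reproducible.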
Selective Sampling for Nearest Neighbor Classifiers
Machine Learning, 2004
Cited by 80 (3 self)
"... Most existing inductive learning algorithms work under the assumption that their training examples are already tagged. There are domains, however, where the tagging procedure requires significant computation resources or manual labor. In such cases, it may be beneficial for the learner to be active, intelligently selecting the examples for labeling with the goal of reducing the labeling cost. In this paper we present LSS, a lookahead algorithm for selective sampling of examples for nearest neighbor classifiers. The algorithm looks for the example with the highest utility, taking its effect ..."
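The general shape of selective sampling for nearest neighbor classifiers can be sketched with a much simpler heuristic than the paper's lookahead utility: request the label of the unlabeled example whose nearest labeled neighbors disagree the most. This is a hypothetical baseline, not the LSS algorithm itself:

```python
# Hypothetical disagreement-based selective sampling baseline (not LSS):
# pick the unlabeled point whose k nearest labeled neighbors are most
# split, i.e. whose label would be most informative to request.
from collections import Counter
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def disagreement(labeled, query, k=3):
    # Fraction of the k nearest labels that disagree with their majority.
    neighbors = sorted(labeled, key=lambda p: euclidean(p[0], query))[:k]
    votes = Counter(lbl for _, lbl in neighbors)
    return 1.0 - votes.most_common(1)[0][1] / len(neighbors)

def select_example(labeled, unlabeled, k=3):
    # The chosen example would then be sent for labeling.
    return max(unlabeled, key=lambda q: disagreement(labeled, q, k))
```

A lookahead method like LSS goes further by estimating how each candidate label, once obtained, would change the classifier's predictions.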
Large Margin Nearest Neighbor Classifiers
IEEE Transactions on Neural Networks, 2005
Cited by 13 (0 self)
"... The nearest neighbor technique is a simple and appealing approach to classification problems. It relies on the assumption of locally constant class-conditional probabilities. This assumption becomes invalid in high dimensions with a finite number of examples, due to the curse of dimensionality. ..."
Center-based Nearest Neighbor Classifier
2007
Cited by 6 (0 self)
"... In this paper, a novel center-based nearest neighbor (CNN) classifier is proposed to deal with pattern classification problems. Unlike the nearest feature line (NFL) method, CNN considers the line passing through a sample point with a known label and the center of that sample's class. This line is called ..."
Probabilistic Characterization of Nearest Neighbor Classifier
Cited by 5 (4 self)
"... The k-Nearest Neighbor classification algorithm (kNN) is one of the simplest yet most effective classification algorithms in use. It finds major applications in text categorization, outlier detection, handwritten character recognition, fraud detection, and other related areas. Though sound theoretical ..."
Fast Implementations of Nearest Neighbor Classifiers
"... Statistical classifiers for OCR have been widely investigated. Using Karhunen-Loève (KL) transforms of normalized binary images, it has been found that nonparametric classifiers work better than many commonly used neural networks [1]. Indeed, the simplicity and efficacy of the kNN method [2] has ..."
Reducing the Computational Demand of the Nearest Neighbor Classifier
Cited by 1 (0 self)
"... Nearest neighbor classifiers demand significant computational resources (time and memory). Editing of the reference set and feature selection are two different solutions to this problem. This paper shows experimental results from applying editing and selection methods individually, in a cascade, or ..."
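One classic form of reference-set editing is Hart's condensed nearest neighbor rule, which keeps only the examples the current prototype set misclassifies. The sketch below shows that procedure as an illustration of editing in general; it is not necessarily the specific method this paper evaluates:

```python
# Sketch of Hart-style condensing: grow a small prototype set by adding
# only the training points that the prototypes currently misclassify,
# shrinking the reference set the classifier must search at query time.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def condense(X, y):
    keep = [0]  # seed the prototype set with the first example
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            if i in keep:
                continue
            nearest = min(keep, key=lambda j: euclidean(X[j], X[i]))
            if y[nearest] != y[i]:  # misclassified -> promote to prototype
                keep.append(i)
                changed = True
    return [X[i] for i in keep], [y[i] for i in keep]
```

On well-separated classes this can reduce the reference set to a handful of prototypes while leaving 1-NN decisions on the training data unchanged.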