Results 1 - 10 of 2,526
Learning Nearest-Neighbor Classifiers with
"... We consider improving the performance of k-Nearest Neighbor classifiers. A reg-ularized kNN is proposed to learn an optimal dissimilarity function to substitute the Euclidean metric. The learning process employs hyperkernels and shares a similar regularization framework as support vector machines (S ..."
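The snippet above proposes replacing the Euclidean metric with a learned dissimilarity function. As a minimal sketch of the idea (not the paper's hyperkernel-based learning procedure), a kNN classifier can accept the dissimilarity as a pluggable function; the diagonal weighting below is a hand-picked stand-in for a learned one:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3, dissim=None):
    """Classify x by majority vote among its k nearest training points.

    `dissim` is any dissimilarity function d(a, b); the point of the
    paper is that a learned dissimilarity can replace the Euclidean
    default used here.
    """
    if dissim is None:
        dissim = lambda a, b: np.linalg.norm(a - b)  # Euclidean default
    d = np.array([dissim(xi, x) for xi in X_train])
    nearest = np.argsort(d)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]  # majority vote

# Toy data: two well-separated classes.
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [1.0, 1.0], [1.1, 0.9], [0.9, 1.1]])
y = np.array([0, 0, 0, 1, 1, 1])

# A hand-picked diagonal weighting stands in for a learned dissimilarity.
w = np.array([2.0, 0.5])
weighted = lambda a, b: np.sqrt(np.sum(w * (a - b) ** 2))

print(knn_predict(X, y, np.array([0.15, 0.1]), k=3))              # predicts 0
print(knn_predict(X, y, np.array([0.95, 1.0]), k=3, dissim=weighted))  # predicts 1
```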
Validation of Nearest Neighbor Classifiers
, 1998
"... We develop a probabilistic bound on the error rate of the nearest neighbor classifier formed from a set of labelled examples. The bound is computed using only the examples in the set. A subset of the examples is used as a validation set to bound the error rate of the classifier formed from the remai ..."
Cited by 3 (0 self)
Selective Sampling For Nearest Neighbor Classifiers
- MACHINE LEARNING
, 2004
"... Most existing inductive learning algorithms work under the assumption that their training examples are already tagged. There are domains, however, where the tagging procedure requires significant computation resources or manual labor. In such cases, it may be beneficial for the learner to be active, ..."
Abstract
-
Cited by 81 (3 self)
- Add to MetaCart
, intelligently selecting the examples for labeling with the goal of reducing the labeling cost. In this paper we present LSS---a lookahead algorithm for selective sampling of examples for nearest neighbor classifiers. The algorithm is looking for the example with the highest utility, taking its effect
Adapt Bagging to Nearest Neighbor Classifiers
- Journal of Computer Science and Technology
, 2004
"... It is well-known that in order to build a strong ensemble, the component learners should be with high diversity as well as high accuracy. If perturbing the training set can cause significant changes in the component learners constructed, then Bagging can effectively improve accuracy. However, for st ..."
Abstract
-
Cited by 5 (2 self)
- Add to MetaCart
, for stable learners such as nearest neighbor classifiers, perturbing the training set can hardly produce diverse component learners, therefore Bagging does not work well. This paper adapts Bagging to nearest neighbor classifiers through injecting randomness to distance metrics. In detail, in constructing
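The adaptation described above injects randomness into the distance metric rather than the training set. A rough sketch of that idea (the per-feature random weighting and vote count below are illustrative choices, not the paper's exact scheme) is an ensemble of 1-NN classifiers, each using its own randomly weighted metric:

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded for reproducibility

def nn_predict(X, y, x, weights):
    """1-NN under a diagonally weighted Euclidean metric."""
    d = np.sqrt((((X - x) ** 2) * weights).sum(axis=1))
    return y[np.argmin(d)]

def random_metric_ensemble_predict(X, y, x, n_components=15):
    """Majority vote of 1-NN components, each with a random diagonal
    metric weighting -- diversity comes from the metrics, since
    perturbing the training set barely changes a stable NN learner."""
    votes = []
    for _ in range(n_components):
        weights = rng.random(X.shape[1])  # random per-feature weights
        votes.append(nn_predict(X, y, x, weights))
    labels, counts = np.unique(votes, return_counts=True)
    return labels[np.argmax(counts)]

# Toy data: two well-separated classes.
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [1.0, 1.0], [1.1, 0.9], [0.9, 1.1]])
y = np.array([0, 0, 0, 1, 1, 1])

print(random_metric_ensemble_predict(X, y, np.array([0.05, 0.05])))  # predicts 0
print(random_metric_ensemble_predict(X, y, np.array([1.0, 1.0])))    # predicts 1
```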
Large margin nearest neighbor classifiers
- IEEE Transactions on Neural Networks
, 2005
"... Abstract—The nearest neighbor technique is a simple and appealing approach to addressing classification problems. It relies on the assumption of locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with a finite number of examples due to the curse of d ..."
Cited by 13 (0 self)
Center-based nearest neighbor classifier
, 2007
"... In this paper, a novel center-based nearest neighbor (CNN) classifier is proposed to deal with the pattern classification problems. Unlike nearest feature line (NFL) method, CNN considers the line passing through a sample point with known label and the center of the sample class. This line is called ..."
Cited by 8 (0 self)
Probabilistic Characterization of Nearest Neighbor Classifier
"... The k-Nearest Neighbor classification algorithm (kNN) is one of the most simple yet effective classification algorithms in use. It finds major applications in text categorization, outlier detection, handwritten character recognition, fraud detection and in other related areas. Though sound theoret ..."
Cited by 5 (4 self)
Fast Implementations of Nearest Neighbor Classifiers
"... Statistical classifiers for OCR have been widely investigated. Using Karhunen-Loève (KL) transforms of normalized binary images it has been found that the non-parametric classifiers work better than many commonly used neural networks [1]. Indeed the simplicity and efficacy of the KNN method [2] has ..."
Use of K-Nearest Neighbor Classifier for Intrusion Detection
, 2002
"... A new approach, based on the k-Nearest Neighbor (kNN) classifier, is used to classify program behavior as normal or intrusive. Program behavior, in turn, is represented by frequencies of system calls. Each system call is treated as a word and the collection of system calls over each program executio ..."
Cited by 46 (2 self)
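The text-categorization analogy above (each system call a word, each execution a document) can be sketched as follows; the system-call vocabulary, the similarity threshold, and the k-average decision rule here are illustrative assumptions, not the paper's parameters:

```python
import numpy as np
from collections import Counter

# Hypothetical system-call vocabulary for illustration.
SYSCALLS = ["open", "read", "write", "close", "exec", "socket"]

def freq_vector(trace):
    """Treat each system call as a word: a program execution becomes
    a frequency vector over the system-call vocabulary."""
    counts = Counter(trace)
    return np.array([counts[s] for s in SYSCALLS], dtype=float)

def cosine(a, b):
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    if na == 0 or nb == 0:
        return 0.0
    return float(np.dot(a, b) / (na * nb))

def classify(trace, normal_traces, k=3, threshold=0.8):
    """Label a trace 'normal' if its average cosine similarity to the k
    most similar known-normal traces clears a threshold."""
    v = freq_vector(trace)
    sims = sorted((cosine(v, freq_vector(t)) for t in normal_traces),
                  reverse=True)
    return "normal" if np.mean(sims[:k]) >= threshold else "intrusive"

normal_traces = [
    ["open", "read", "read", "write", "close"],
    ["open", "read", "write", "write", "close"],
    ["open", "read", "read", "read", "close"],
]
print(classify(["open", "read", "write", "close"], normal_traces))       # normal
print(classify(["exec", "socket", "socket", "exec", "socket"], normal_traces))  # intrusive
```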