
Results 1 - 10 of 2,526

Learning Nearest-Neighbor Classifiers with

by Hua Ouyang, Alexander Gray
"... We consider improving the performance of k-Nearest Neighbor classifiers. A regularized kNN is proposed to learn an optimal dissimilarity function to substitute the Euclidean metric. The learning process employs hyperkernels and shares a similar regularization framework as support vector machines (S ..."
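The method in this abstract replaces the Euclidean metric inside an otherwise standard kNN majority vote. A minimal sketch of that vote with a pluggable dissimilarity function (the data and names here are invented for illustration, not taken from the paper):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3, dissim=None):
    """Majority vote among the k training points with smallest dissimilarity to x."""
    if dissim is None:
        # Euclidean distance: the default a learned dissimilarity would replace
        dissim = lambda a, b: np.linalg.norm(a - b)
    d = np.array([dissim(x, xi) for xi in X_train])
    votes = y_train[np.argsort(d)[:k]]
    labels, counts = np.unique(votes, return_counts=True)
    return labels[np.argmax(counts)]

# Invented toy data: two small clusters
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.05, 0.1])))  # → 0
```

Any function of two points can be passed as `dissim`, which is the hook a learned dissimilarity would occupy.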

Validation of Nearest Neighbor Classifiers

by Eric Bax , 1998
"... We develop a probabilistic bound on the error rate of the nearest neighbor classifier formed from a set of labelled examples. The bound is computed using only the examples in the set. A subset of the examples is used as a validation set to bound the error rate of the classifier formed from the remai ..."
Cited by 3 (0 self)
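The holdout idea above can be illustrated with a validation-set error estimate plus a generic Hoeffding-style confidence term; this is a simplified stand-in, not Bax's actual bound construction, and all data here is invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def nn_classify(X_train, y_train, x):
    # 1-NN under Euclidean distance
    return y_train[np.argmin(np.linalg.norm(X_train - x, axis=1))]

# Invented labelled examples: two well-separated Gaussian blobs
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Hold out a validation subset; the classifier is formed from the rest
idx = rng.permutation(len(X))
val, rest = idx[:30], idx[30:]
errors = sum(nn_classify(X[rest], y[rest], X[i]) != y[i] for i in val)
err_rate = errors / len(val)

# Generic Hoeffding-style upper bound at confidence 1 - delta
delta = 0.05
bound = err_rate + np.sqrt(np.log(1 / delta) / (2 * len(val)))
print(f"validation error {err_rate:.2f}, 95% bound {bound:.2f}")
```

The key property, as in the abstract, is that the bound is computed using only the labelled examples themselves.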

Selective Sampling For Nearest Neighbor Classifiers

by Michael Lindenbaum, Shaul Markovitch, DMITRY RUSAKOV - MACHINE LEARNING , 2004
"... Most existing inductive learning algorithms work under the assumption that their training examples are already tagged. There are domains, however, where the tagging procedure requires significant computation resources or manual labor. In such cases, it may be beneficial for the learner to be active, intelligently selecting the examples for labeling with the goal of reducing the labeling cost. In this paper we present LSS, a lookahead algorithm for selective sampling of examples for nearest neighbor classifiers. The algorithm looks for the example with the highest utility, taking its effect ..."
Cited by 81 (3 self)
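LSS itself scores candidates with a lookahead utility; as a rough illustration of selective sampling for a nearest-neighbor learner, the sketch below substitutes a much simpler farthest-first heuristic (query the unlabelled point farthest from every labelled one). The pool, oracle, and all parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

def nn_label(X_lab, y_lab, x):
    return y_lab[np.argmin(np.linalg.norm(X_lab - x, axis=1))]

# Invented setting: a pool of unlabelled points and an oracle that
# returns the true label on request (the costly tagging procedure)
pool = rng.uniform(-1, 1, (200, 2))
oracle = lambda p: int(p[0] + p[1] > 0)

labelled = [pool[0], pool[1]]
labels = [oracle(pool[0]), oracle(pool[1])]
for _ in range(20):
    X_lab = np.array(labelled)
    # Query the pool point farthest from every labelled example
    gaps = np.min(np.linalg.norm(pool[:, None] - X_lab[None], axis=2), axis=1)
    i = int(np.argmax(gaps))
    labelled.append(pool[i])
    labels.append(oracle(pool[i]))

X_lab, y_lab = np.array(labelled), np.array(labels)
acc = np.mean([nn_label(X_lab, y_lab, p) == oracle(p) for p in pool])
print(f"1-NN accuracy on the pool after {len(labelled)} queries: {acc:.2f}")
```

The point of any such strategy is the same as in the paper: spend the labeling budget on informative examples rather than on a random sample.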

k nearest neighbor classifier

by Santosh S. Venkatesh , 1997
"... derivation of the finite-sample risk of the ..."

Adapt Bagging to Nearest Neighbor Classifiers

by Zhi-hua Zhou, Yang Yu - Journal of Computer Science and Technology , 2004
"... It is well-known that in order to build a strong ensemble, the component learners should be with high diversity as well as high accuracy. If perturbing the training set can cause significant changes in the component learners constructed, then Bagging can effectively improve accuracy. However, for stable learners such as nearest neighbor classifiers, perturbing the training set can hardly produce diverse component learners, therefore Bagging does not work well. This paper adapts Bagging to nearest neighbor classifiers through injecting randomness to distance metrics. In detail, in constructing ..."
Cited by 5 (2 self)
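The diversity-injection idea above, bootstrap resampling combined with a randomized distance metric, can be sketched as follows. Random diagonal feature weights stand in here for whatever metric perturbation the paper actually uses; the data and ensemble size are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

def weighted_nn(X, y, x, w):
    # 1-NN under a diagonally-weighted Euclidean metric
    d = np.sqrt(((X - x) ** 2 * w).sum(axis=1))
    return y[np.argmin(d)]

# Invented toy data: two clusters
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1],
              [3, 3], [3, 4], [4, 3], [4, 4]], dtype=float)
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

def bagged_predict(x, n_components=15):
    votes = []
    for _ in range(n_components):
        # Bootstrap sample plus random feature weights: perturbing the
        # metric is what makes the otherwise stable NN learners diverse
        idx = rng.integers(0, len(X), len(X))
        w = rng.uniform(0.1, 1.0, X.shape[1])
        votes.append(weighted_nn(X[idx], y[idx], x, w))
    return int(np.bincount(votes).argmax())

print(bagged_predict(np.array([0.5, 0.5])))
```

With plain Bagging each bootstrap NN would behave almost identically; the randomized metric is what gives the components something to disagree about.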

Large margin nearest neighbor classifiers

by Carlotta Domeniconi, Dimitrios Gunopulos, Jing Peng - IEEE Transactions on Neural Networks , 2005
"... Abstract—The nearest neighbor technique is a simple and appealing approach to addressing classification problems. It relies on the assumption of locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with a finite number of examples due to the curse of d ..."
Cited by 13 (0 self)

Center-based nearest neighbor classifier

by Qing-bin Gao, Zheng-zhi Wang , 2007
"... In this paper, a novel center-based nearest neighbor (CNN) classifier is proposed to deal with the pattern classification problems. Unlike nearest feature line (NFL) method, CNN considers the line passing through a sample point with known label and the center of the sample class. This line is called ..."
Cited by 8 (0 self)
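A simplified sketch of the center-based idea: classify a query by its distance to the lines through each training point and that point's class centroid. This uses the full infinite line; any segment restriction or other refinement the paper imposes is omitted, and the data is invented:

```python
import numpy as np

def point_to_line_dist(q, p, c):
    """Distance from query q to the line through sample p and center c."""
    d = c - p
    t = np.dot(q - p, d) / np.dot(d, d)
    return np.linalg.norm(q - (p + t * d))

# Invented toy data: two classes of two points each
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 4.0], [1.0, 4.0]])
y = np.array([0, 0, 1, 1])

def cnn_classify(q):
    best_label, best_dist = None, np.inf
    for label in np.unique(y):
        pts = X[y == label]
        center = pts.mean(axis=0)  # class centroid
        for p in pts:
            if np.allclose(p, center):
                dist = np.linalg.norm(q - p)  # degenerate line: fall back to point distance
            else:
                dist = point_to_line_dist(q, p, center)
            if dist < best_dist:
                best_dist, best_label = dist, int(label)
    return best_label

print(cnn_classify(np.array([0.5, 0.5])))  # → 0
```

Compared with nearest feature line, the lines here are anchored at class centers rather than drawn between arbitrary pairs of same-class samples.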

Probabilistic Characterization of Nearest Neighbor Classifier

by Amit Dhurandhar, Alin Dobra
"... The k-Nearest Neighbor classification algorithm (kNN) is one of the most simple yet effective classification algorithms in use. It finds major applications in text categorization, outlier detection, handwritten character recognition, fraud detection and in other related areas. Though sound theoret ..."
Cited by 5 (4 self)

Fast Implementations of Nearest Neighbor Classifiers

by unknown authors
"... Statistical classifiers for OCR have been widely investigated. Using Karhunen-Loève (KL) transforms of normalized binary images it has been found that the non-parametric classifiers work better than many commonly used neural networks [1]. Indeed the simplicity and efficacy of the KNN method [2] has ..."

Use of K-Nearest Neighbor Classifier for Intrusion Detection

by Yihua Liao, V. Rao Vemuri , 2002
"... A new approach, based on the k-Nearest Neighbor (kNN) classifier, is used to classify program behavior as normal or intrusive. Program behavior, in turn, is represented by frequencies of system calls. Each system call is treated as a word and the collection of system calls over each program executio ..."
Cited by 46 (2 self)
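The representation described above, system-call frequencies treated like word counts, pairs naturally with a cosine-similarity kNN. A toy sketch with invented traces and labels (the real system uses actual audit data, not these four examples):

```python
import numpy as np

# Invented "program executions": sequences of system calls with labels.
# Each execution becomes a frequency vector over the call vocabulary,
# mirroring the text-categorization view in the abstract.
traces = [
    (["open", "read", "read", "close"], "normal"),
    (["open", "read", "write", "close"], "normal"),
    (["exec", "exec", "chmod", "exec"], "intrusive"),
    (["chmod", "exec", "exec", "exec"], "intrusive"),
]
vocab = sorted({c for t, _ in traces for c in t})

def to_vec(trace):
    v = np.array([trace.count(c) for c in vocab], dtype=float)
    return v / np.linalg.norm(v)  # unit length, so dot product = cosine

X = np.array([to_vec(t) for t, _ in traces])
y = [lbl for _, lbl in traces]

def classify(trace, k=3):
    sims = X @ to_vec(trace)        # cosine similarity to each stored execution
    top = np.argsort(sims)[-k:]     # k most similar executions
    votes = [y[i] for i in top]
    return max(set(votes), key=votes.count)

print(classify(["open", "read", "close", "read"]))  # → normal
```

Because only call frequencies are kept, two executions with the same calls in different orders map to the same vector, which is exactly the simplification the text-categorization analogy buys.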

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University