Results 1–10 of 707,226
Use of K-Nearest Neighbor Classifier for Intrusion Detection, 2002
"... A new approach, based on the kNearest Neighbor (kNN) classifier, is used to classify program behavior as normal or intrusive. Program behavior, in turn, is represented by frequencies of system calls. Each system call is treated as a word and the collection of system calls over each program executio ..."
Abstract

Cited by 44 (2 self)
 Add to MetaCart
A new approach, based on the kNearest Neighbor (kNN) classifier, is used to classify program behavior as normal or intrusive. Program behavior, in turn, is represented by frequencies of system calls. Each system call is treated as a word and the collection of system calls over each program
k-Nearest Neighbour Classifiers, 2007
"... Abstract. Perhaps the most straightforward classifier in the arsenal or machine learning techniques is the Nearest Neighbour Classifier – classification is achieved by identifying the nearest neighbours to a query example and using those neighbours to determine the class of the query. This approach ..."
Abstract

Cited by 14 (0 self)
 Add to MetaCart
Abstract. Perhaps the most straightforward classifier in the arsenal or machine learning techniques is the Nearest Neighbour Classifier – classification is achieved by identifying the nearest neighbours to a query example and using those neighbours to determine the class of the query. This approach
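As a minimal illustration of the nearest-neighbour procedure this abstract describes (the toy dataset and the `knn_predict` helper below are invented for this sketch, not taken from the paper):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training examples under Euclidean distance."""
    # Sort training examples by distance to the query point.
    neighbours = sorted(train, key=lambda ex: math.dist(ex[0], query))[:k]
    # Majority vote over the neighbours' class labels.
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((5.0, 5.0), "b"), ((5.2, 4.9), "b")]
print(knn_predict(train, (0.2, 0.1), k=3))  # queries near the origin vote "a"
```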
k-Nearest Neighbors in Uncertain Graphs
"... Complex networks, such as biological, social, and communication networks, often entail uncertainty, and thus, can be modeled as probabilistic graphs. Similar to the problem of similarity search in standard graphs, a fundamental problem for probabilistic graphs is to efficiently answer knearest neig ..."
Abstract

Cited by 31 (4 self)
 Add to MetaCart
Complex networks, such as biological, social, and communication networks, often entail uncertainty, and thus, can be modeled as probabilistic graphs. Similar to the problem of similarity search in standard graphs, a fundamental problem for probabilistic graphs is to efficiently answer knearest
An Interval-Valued K-Nearest Neighbors Classifier
"... The KNearest Neighbors (kNN) classifier has become a wellknown, successful method for pattern classification tasks. In recent years, many enhancements to the original algorithm have been proposed. Fuzzy sets theory has been the basis of several proposed models towards the enhancement of the nea ..."
Abstract
 Add to MetaCart
of the nearest neighbors rule, being the Fuzzy KNearest Neighbors (FuzzyKNN) classifier the most notable procedure in the field. In this work we present a new approach to the nearest neighbor classifier based on the use of interval valued fuzzy sets. The use and implementation of interval values facilitates
Sparse Representation-Based Nearest Neighbor Classifiers for Hyperspectral Imagery
"... Abstract—In this letter, a sparse representationbased nearest neighbor (SRNN) classifier is proposed. Unlike the traditional knearest neighbor (NN) classifier that employs the Euclidean distance as similarity metric, the proposed SRNN considers sparse coefficients to determine the label of testing ..."
Abstract
 Add to MetaCart
Abstract—In this letter, a sparse representationbased nearest neighbor (SRNN) classifier is proposed. Unlike the traditional knearest neighbor (NN) classifier that employs the Euclidean distance as similarity metric, the proposed SRNN considers sparse coefficients to determine the label
Bayesian Network Classifiers, 1997
"... Recent work in supervised learning has shown that a surprisingly simple Bayesian classifier with strong assumptions of independence among features, called naive Bayes, is competitive with stateoftheart classifiers such as C4.5. This fact raises the question of whether a classifier with less restr ..."
Abstract

Cited by 788 (23 self)
 Add to MetaCart
restrictive assumptions can perform even better. In this paper we evaluate approaches for inducing classifiers from data, based on the theory of learning Bayesian networks. These networks are factored representations of probability distributions that generalize the naive Bayesian classifier and explicitly
Estimating Continuous Distributions in Bayesian Classifiers
In Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence, 1995
"... When modeling a probability distribution with a Bayesian network, we are faced with the problem of how to handle continuous variables. Most previous work has either solved the problem by discretizing, or assumed that the data are generated by a single Gaussian. In this paper we abandon the normality ..."
Abstract

Cited by 489 (2 self)
 Add to MetaCart
the normality assumption and instead use statistical methods for nonparametric density estimation. For a naive Bayesian classifier, we present experimental results on a variety of natural and artificial domains, comparing two methods of density estimation: assuming normality and modeling each conditional
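The "assuming normality" baseline that this abstract compares against can be sketched as a Gaussian naive Bayes classifier (a minimal illustration; the helper names and toy data are invented for this example, and the paper's nonparametric kernel estimator is not shown):

```python
import math
from collections import defaultdict

def fit_gaussian_nb(X, y):
    """Estimate per-class priors and per-feature mean/variance."""
    by_class = defaultdict(list)
    for xs, label in zip(X, y):
        by_class[label].append(xs)
    params = {}
    for label, rows in by_class.items():
        stats = []
        for col in zip(*rows):  # iterate over feature columns
            mu = sum(col) / len(col)
            var = sum((v - mu) ** 2 for v in col) / len(col) + 1e-9  # avoid zero variance
            stats.append((mu, var))
        params[label] = (len(rows) / len(X), stats)
    return params

def predict(params, xs):
    """Pick the class maximizing log prior + sum of log Gaussian densities."""
    def log_gauss(v, mu, var):
        return -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
    return max(
        params,
        key=lambda c: math.log(params[c][0])
        + sum(log_gauss(v, mu, var) for v, (mu, var) in zip(xs, params[c][1])),
    )

X = [[1.0], [1.2], [0.9], [5.0], [5.3], [4.8]]
y = ["low", "low", "low", "high", "high", "high"]
model = fit_gaussian_nb(X, y)
print(predict(model, [1.1]))
```

Replacing `log_gauss` with a kernel density estimate per feature is the kind of change the paper's comparison is about.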
Exploiting Generative Models in Discriminative Classifiers
In Advances in Neural Information Processing Systems 11, 1998
"... Generative probability models such as hidden Markov models provide a principled way of treating missing information and dealing with variable length sequences. On the other hand, discriminative methods such as support vector machines enable us to construct flexible decision boundaries and often resu ..."
Abstract

Cited by 538 (11 self)
 Add to MetaCart
result in classification performance superior to that of the model based approaches. An ideal classifier should combine these two complementary approaches. In this paper, we develop a natural way of achieving this combination by deriving kernel functions for use in discriminative methods such as support
Coordination of Groups of Mobile Autonomous Agents Using Nearest Neighbor Rules, 2002
"... In a recent Physical Review Letters paper, Vicsek et. al. propose a simple but compelling discretetime model of n autonomous agents fi.e., points or particlesg all moving in the plane with the same speed but with dierent headings. Each agent's heading is updated using a local rule based on ..."
Abstract

Cited by 1245 (60 self)
 Add to MetaCart
In a recent Physical Review Letters paper, Vicsek et. al. propose a simple but compelling discretetime model of n autonomous agents fi.e., points or particlesg all moving in the plane with the same speed but with dierent headings. Each agent's heading is updated using a local rule based
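A minimal sketch of a Vicsek-style nearest-neighbour heading update, as summarized in this abstract (the `update_headings` helper, the radius parameter, and the toy configuration are assumptions for illustration, not code from the paper):

```python
import math

def update_headings(positions, headings, radius):
    """One Vicsek-style step: each agent adopts the circular mean
    heading of all agents within `radius` of it (itself included)."""
    new = []
    for xi, yi in positions:
        sin_sum = cos_sum = 0.0
        for (xj, yj), theta in zip(positions, headings):
            if math.hypot(xi - xj, yi - yj) <= radius:
                sin_sum += math.sin(theta)
                cos_sum += math.cos(theta)
        # atan2 of the summed components gives the circular mean.
        new.append(math.atan2(sin_sum, cos_sum))
    return new

# Two nearby agents average their headings; the distant one keeps its own.
headings = update_headings(
    [(0, 0), (1, 0), (10, 10)], [0.0, math.pi / 2, math.pi], radius=2.0
)
```

Repeated application of this rule is what the paper analyzes: under suitable connectivity conditions, all headings converge to a common value.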
Optimal Multi-Step k-Nearest Neighbor Search, 1998
"... For an increasing number of modern database applications, efficient support of similarity search becomes an important task. Along with the complexity of the objects such as images, molecules and mechanical parts, also the complexity of the similarity models increases more and more. Whereas algorithm ..."
Abstract

Cited by 199 (23 self)
 Add to MetaCart
, and our investigations substantiate that the number of candidates which are produced in the filter step and exactly evaluated in the refinement step is a fundamental efficiency parameter. After revealing the strong performance shortcomings of the stateoftheart algorithm for knearest neighbor search
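The filter/refinement scheme this abstract refers to can be sketched as follows (a simplified illustration of the general multi-step idea, not the paper's algorithm: scan candidates in order of a cheap lower-bounding distance and stop once no remaining candidate can beat the k-th best exact distance; all names and data here are invented):

```python
import heapq

def multistep_knn(candidates, lower_bound, exact, k):
    """Filter-and-refine k-NN: visit candidates in increasing order of a
    cheap lower-bounding distance, refining each with the exact distance,
    and stop when the next lower bound exceeds the k-th best exact value."""
    best = []  # max-heap of (-exact_distance, object) holding the current top-k
    for obj in sorted(candidates, key=lower_bound):
        if len(best) == k and lower_bound(obj) > -best[0][0]:
            break  # no remaining candidate can improve the result
        heapq.heappush(best, (-exact(obj), obj))
        if len(best) > k:
            heapq.heappop(best)  # evict the current worst of the top-k
    return sorted((-d, o) for d, o in best)

# Toy example: objects are numbers, query point is 5, and half the exact
# distance serves as a valid (cheap) lower bound.
result = multistep_knn(
    [1, 4, 6, 9, 20], lambda x: abs(x - 5) * 0.5, lambda x: abs(x - 5), k=2
)
```

The key property, as the abstract notes, is how few candidates survive the filter step and need exact (expensive) evaluation.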