Results 1 – 5 of 5
Estimating the Support of a High-Dimensional Distribution
, 1999
Abstract

Cited by 501 (32 self)
Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a "simple" subset S of input space such that the probability that a test point drawn from P lies outside of S is bounded by some a priori specified value between 0 and 1. We propose a method to approach this problem by trying to estimate a function f which is positive on S and negative on the complement. The functional form of f is given by a kernel expansion in terms of a potentially small subset of the training data; it is regularized by controlling the length of the weight vector in an associated feature space. The expansion coefficients are found by solving a quadratic programming problem, which we do by carrying out sequential optimization over pairs of input patterns. We also provide a preliminary theoretical analysis of the statistical performance of our algorithm. The algorithm is a natural extension of the support vector algorithm to the case of unlabelled data.
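The kernel-expansion form of f described in this abstract can be sketched in plain Python with an RBF kernel. The support points, coefficients, and offset below are illustrative stand-ins for what the paper's quadratic program would actually produce, not a trained solution:

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

def decision_function(x, support_points, alphas, rho, gamma=0.5):
    """Kernel expansion f(x) = sum_i alpha_i * k(x_i, x) - rho.
    f(x) >= 0 means x falls inside the estimated support region S."""
    return sum(a * rbf_kernel(sv, x, gamma)
               for a, sv in zip(alphas, support_points)) - rho

# Illustrative values only: a real one-class SVM would obtain the
# coefficients and offset by solving the quadratic program.
support_points = [(0.0, 0.0), (1.0, 0.5), (0.5, 1.0)]
alphas = [0.4, 0.3, 0.3]
rho = 0.2

print(decision_function((0.5, 0.5), support_points, alphas, rho) >= 0)  # True: near the data
print(decision_function((5.0, 5.0), support_points, alphas, rho) >= 0)  # False: far away
```

Only the points with nonzero alpha (the support vectors) enter the expansion, which is why the paper can describe f via "a potentially small subset of the training data".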
Data domain description using support vectors
 Proceedings of the European Symposium on Artificial Neural Networks
, 1999
Abstract

Cited by 44 (6 self)
Abstract. This paper introduces a new method for data domain description, inspired by the Support Vector Machine by V. Vapnik, called the Support Vector Domain Description (SVDD). This method computes a sphere-shaped decision boundary with minimal volume around a set of objects. This data description can be used for novelty or outlier detection. It contains support vectors describing the sphere boundary and it has the possibility of obtaining higher-order boundary descriptions without much extra computational cost. By using different kernels this SVDD can obtain more flexible and more accurate data descriptions. The error of the first kind, the fraction of the training objects which will be rejected, can be estimated immediately from the description.
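The sphere test the SVDD abstract describes can be written purely in terms of kernel evaluations, since the squared feature-space distance to the centre a = Σᵢ αᵢ φ(xᵢ) expands to k(x,x) − 2 Σᵢ αᵢ k(xᵢ,x) + Σᵢⱼ αᵢ αⱼ k(xᵢ,xⱼ). The support vectors, coefficients, and radius below are illustrative placeholders, not values obtained from SVDD's optimization:

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq)

def sq_dist_to_center(x, support_points, alphas, gamma=0.5):
    """Squared feature-space distance ||phi(x) - a||^2 to the sphere
    centre a = sum_i alpha_i phi(x_i), computed with kernels only."""
    term1 = rbf_kernel(x, x, gamma)  # equals 1 for the RBF kernel
    term2 = sum(a * rbf_kernel(sv, x, gamma)
                for a, sv in zip(alphas, support_points))
    term3 = sum(ai * aj * rbf_kernel(si, sj, gamma)
                for ai, si in zip(alphas, support_points)
                for aj, sj in zip(alphas, support_points))
    return term1 - 2.0 * term2 + term3

def is_inside(x, support_points, alphas, radius_sq, gamma=0.5):
    """Accept x if it lies inside the sphere of squared radius radius_sq."""
    return sq_dist_to_center(x, support_points, alphas, gamma) <= radius_sq

# Illustrative support vectors and coefficients; SVDD would obtain
# them from its own quadratic program over the training objects.
support_points = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
alphas = [1 / 3, 1 / 3, 1 / 3]
radius_sq = 0.5

print(is_inside((0.3, 0.3), support_points, alphas, radius_sq))  # True: near the data
print(is_inside((5.0, 5.0), support_points, alphas, radius_sq))  # False: an outlier
```

Swapping rbf_kernel for another kernel changes the implicit feature map, which is how the abstract's "more flexible and more accurate data descriptions" arise without extra cost.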
Investigation of the Use of Neural Networks for Computerised Medical Image Analysis
, 1998
Abstract

Cited by 1 (0 self)
Advances in clinical medical imaging have brought about the routine production of vast numbers of medical images that need to be analysed. As a result an enormous amount of computer vision research effort has been targeted at achieving automated medical image analysis. This has proved to be an elusive goal in many cases. The complexity of the problems encountered has prompted considerable interest in the use of neural networks for such applications. However, many reports of such work have been unsatisfactory in that often only qualitative results are reported, or only few patient cases are used. This thesis presents a study of the use of neural networks and computer vision for medical image analysis which aims to quantitatively investigate and demonstrate the potential of neural networks in such an application. A medical image analysis problem was selected which would facilitate this. The problem chosen was the automatic detection of acoustic neuromas in MR images of the head. Since neural networks excel at...
Novelty Detection Model Selection Using Volume Estimation
, 2005
Abstract
In this paper, we present an approach to selecting models for novelty (outlier) detection. Our approach minimizes the risk of accepting outliers at a fixed normal rejection rate, under the assumption that the distribution of abnormal (outlier) data is uniformly distributed in some bounded region of the input space. This risk is minimized by selecting the model with the smallest volume acceptance region, using a randomized volume estimation algorithm. The volume estimation algorithm can estimate the volume of a body in high-dimensional space and scales polynomially in dimension with the number of calls to the model. We have performed extensive experiments which show that the combined model selection criteria are able to select not only the best models from a given model class, but also among all model classes.
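The idea of treating the model as a membership oracle and estimating the volume of its acceptance region can be illustrated with simple hit-or-miss Monte Carlo sampling in a bounding box. This is a toy stand-in, not the paper's algorithm: plain Monte Carlo degrades badly in high dimension, whereas the abstract refers to a randomized estimator that scales polynomially in dimension. The function and parameter names below are hypothetical:

```python
import random

def estimate_volume(accepts, bounds, n_samples=200_000, seed=0):
    """Hit-or-miss Monte Carlo estimate of the volume of the acceptance
    region of `accepts` inside the axis-aligned bounding box `bounds`
    (a list of (low, high) pairs, one per dimension).  Each sample costs
    exactly one call to the model, matching the abstract's cost measure."""
    rng = random.Random(seed)
    box_volume = 1.0
    for low, high in bounds:
        box_volume *= (high - low)
    hits = sum(
        1 for _ in range(n_samples)
        if accepts([rng.uniform(low, high) for low, high in bounds])
    )
    return box_volume * hits / n_samples

# Toy "model": accept points inside the unit disk (true area = pi).
unit_disk = lambda x: x[0] ** 2 + x[1] ** 2 <= 1.0
vol = estimate_volume(unit_disk, [(-1.0, 1.0), (-1.0, 1.0)])
print(vol)  # close to pi (3.14159...)
```

Under the paper's criterion, the model whose acceptance region has the smallest estimated volume, at the fixed normal rejection rate, would be preferred.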
ARTICLE (Communicated by Vladimir Vapnik): Estimating the Support of a High-Dimensional Distribution
Abstract
Suppose you are given some data set drawn from an underlying probability distribution P and you want to estimate a “simple” subset S of input space such that the probability that a test point drawn from P lies outside of S equals some a priori specified value between 0 and 1. We propose a method to approach this problem by trying to estimate a function f that is positive on S and negative on the complement. The functional form of f is given by a kernel expansion in terms of a potentially small subset of the training data; it is regularized by controlling the length of the weight vector in an associated feature space. The expansion coefficients are found by solving a quadratic programming problem, which we do by carrying out sequential optimization over pairs of input patterns. We also provide a theoretical analysis of the statistical performance of our algorithm. The algorithm is a natural extension of the support vector algorithm to the case of unlabeled data.