Fisher Discriminant Analysis With Kernels, 1999
Abstract
Cited by 312 (15 self)
A nonlinear classification technique based on Fisher's discriminant is proposed. The main ingredient is the kernel trick, which allows the efficient computation of the Fisher discriminant in feature space. The linear classification in feature space corresponds to a (powerful) nonlinear decision function in input space. Large-scale simulations demonstrate the competitiveness of our approach.
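The kernel trick described in this abstract can be illustrated with a minimal sketch. The data and kernel parameter below are made up, and the discriminant is deliberately simplified to class-mean kernel scores only, omitting the within-class scatter matrix that the full kernel Fisher discriminant inverts; the point is only that a linear rule in feature space yields a nonlinear decision function in input space:

```python
import math

def rbf(x, y, gamma=2.0):
    # Gaussian RBF kernel: inner product in an implicit feature space
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

# XOR-like toy data: not linearly separable in input space
pos = [(0.0, 0.0), (1.0, 1.0)]
neg = [(0.0, 1.0), (1.0, 0.0)]

def score(x):
    # Difference of mean kernel similarities to each class: a stripped-down
    # relative of KFD that keeps only the between-class-mean direction.
    return (sum(rbf(x, p) for p in pos) / len(pos)
            - sum(rbf(x, n) for n in neg) / len(neg))

preds = [score(x) > 0 for x in pos + neg]
print(preds)  # the two pos points score positive, the two neg points negative
```

Despite being linear in feature space, the resulting decision boundary separates the XOR pattern, which no linear rule in input space can do.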
A survey of kernel and spectral methods for clustering, 2008
Abstract
Cited by 45 (5 self)
Clustering algorithms are a useful tool to explore data structures and have been employed in many disciplines. The focus of this paper is the partitioning clustering problem, with a special interest in two recent approaches: kernel and spectral methods. The aim of this paper is to present a survey of kernel and spectral clustering methods, two approaches able to produce nonlinear separating hypersurfaces between clusters. The presented kernel clustering methods are the kernel versions of many classical clustering algorithms, e.g., K-means, SOM and neural gas. Spectral clustering arises from concepts in spectral graph theory, and the clustering problem is configured as a graph cut problem where an appropriate objective function has to be optimized. An explicit proof that these two seemingly different paradigms optimize the same objective is reported, showing that they share the same mathematical foundation. In addition, fuzzy kernel clustering methods are presented as extensions of the kernel K-means clustering algorithm.
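The graph-cut view of clustering mentioned in this abstract can be made concrete with a toy sketch. The similarity weights are invented, and RatioCut is just one of the standard cut objectives (the survey covers several); spectral methods relax the minimization of such objectives via graph Laplacian eigenvectors, but here we simply evaluate the objective for two candidate partitions:

```python
# Similarity graph over 6 points: two tight groups {0,1,2} and {3,4,5}
# joined by one weak edge (weight 0.1). Weights are symmetric, made up.
W = [[0, 1, 1, 0,   0, 0],
     [1, 0, 1, 0,   0, 0],
     [1, 1, 0, 0.1, 0, 0],
     [0, 0, 0.1, 0, 1, 1],
     [0, 0, 0,   1, 0, 1],
     [0, 0, 0,   1, 1, 0]]

def ratio_cut(A, B):
    # RatioCut objective: crossing weight, size-normalized on both sides
    cut = sum(W[i][j] for i in A for j in B)
    return cut / len(A) + cut / len(B)

good = ratio_cut({0, 1, 2}, {3, 4, 5})   # cuts only the weak bridge
bad  = ratio_cut({0, 1, 3}, {2, 4, 5})   # cuts through both tight groups
print(good, bad)
```

The partition respecting the two groups achieves a much smaller objective, which is exactly what spectral relaxations search for.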
Machine learning techniques for brain-computer interfaces
 Biomedical Engineering, 2004
Abstract
Cited by 15 (3 self)
This review discusses machine learning methods and their application to Brain-Computer Interfacing. A particular focus is placed on feature selection. We also point out common flaws when validating machine learning methods in the context of BCI. Finally, we provide a brief overview of the Berlin Brain-Computer Interface (BBCI).
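One common validation flaw of the kind this review points out is performing feature selection on the full dataset before cross-validation, which leaks test information into the model. A minimal sketch of the leakage-free protocol follows; the dataset and the selection criterion are made up and deliberately deterministic, and stand in for real EEG features:

```python
# Toy data: 8 samples, 5 features; feature 0 carries the label, the rest are constant.
X = [[lab, 0.0, 0.0, 0.0, 0.0] for lab in (0, 1, 0, 1, 0, 1, 0, 1)]
y = [0, 1, 0, 1, 0, 1, 0, 1]

def select_feature(Xtr, ytr):
    # Toy criterion: pick the feature whose thresholded value agrees
    # most often with the training labels.
    scores = [sum(1 for xi, yi in zip(Xtr, ytr) if (xi[f] > 0.5) == (yi == 1))
              for f in range(5)]
    return max(range(5), key=lambda f: scores[f])

accs, chosen = [], []
for k in range(4):                       # 4-fold cross-validation over 8 samples
    test_idx = {2 * k, 2 * k + 1}
    tr = [i for i in range(8) if i not in test_idx]
    Xtr, ytr = [X[i] for i in tr], [y[i] for i in tr]
    f = select_feature(Xtr, ytr)         # selection sees ONLY the training fold
    chosen.append(f)
    accs.append(sum(1 for i in test_idx
                    if (X[i][f] > 0.5) == (y[i] == 1)) / 2)
print(chosen, accs)
```

The key point is the nesting: `select_feature` is called inside each fold on training data only. Running selection once on all of `X` before splitting would make the fold accuracies optimistically biased.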
Noncommutative positive kernels and their matrix evaluations
 Proc. Amer. Math. Soc.
Abstract
Cited by 4 (1 self)
We show that a formal power series in 2N noncommuting indeterminates is a positive noncommutative kernel if and only if the kernel on N-tuples of matrices of any size obtained from this series by matrix substitution is positive. We present two versions of this result related to different classes of matrix substitutions. In the general case we consider substitutions of jointly nilpotent N-tuples of matrices, and thus the question of convergence does not arise. In the “convergent” case we consider substitutions of N-tuples of matrices from a neighborhood of zero where the series converges. Moreover, in the first case the result can be improved: the positivity of a noncommutative kernel is guaranteed by the positivity of its values on the diagonal, i.e., on pairs of coinciding jointly nilpotent N-tuples of matrices. In particular this yields an analogue of a recent result of Helton on noncommutative sums-of-squares representations for the class of hereditary noncommutative polynomials. We show by an example that the improved formulation does not apply in the “convergent” case.
Kernel Fisher Discriminant for Shape-based Classification in Epilepsy, 2008
Abstract
Cited by 3 (0 self)
In this paper, we present the application of the Kernel Fisher Discriminant in the statistical analysis of shape deformations that indicate the hemispheric location of an epileptic focus. The scans of two classes of patients with epilepsy, those with a right and those with a left medial temporal lobe focus (RATL and LATL), as validated by clinical consensus and subsequent surgery, were compared to a set of age- and sex-matched healthy volunteers using both volume- and shape-based features. Shape-based features are derived from the displacement field characterizing the non-rigid deformation between the left and right hippocampi of a control or a patient, as the case may be. Using the shape-based features, the results show a significant improvement in distinguishing between the controls and the rest (RATL and LATL) vis-à-vis volume-based features. Using a novel feature, namely the normalized histogram of the 3D displacement field, we also achieved significant improvement over the volume-based feature in classifying the patients as belonging to either of the two classes, LATL or RATL. It should be noted that automated identification of hemispherical foci of epilepsy has not been previously reported.
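A hedged sketch of how a normalized histogram of a 3D displacement field might be computed: the field values, bin count, and magnitude range below are invented, and the paper's actual feature extraction may bin differently (e.g., per component rather than by magnitude). Normalizing the histogram to sum to 1 makes fields with different numbers of voxels comparable:

```python
import math

# Hypothetical displacement field: one (u, v, w) vector per voxel
field = [(0.1, 0.0, 0.0), (0.0, 0.2, 0.0), (0.3, 0.3, 0.0), (1.0, 1.0, 1.0)]

def displacement_histogram(field, n_bins=4, max_mag=2.0):
    # histogram of displacement magnitudes over [0, max_mag)
    mags = [math.sqrt(u * u + v * v + w * w) for (u, v, w) in field]
    hist = [0] * n_bins
    for m in mags:
        b = min(int(m / max_mag * n_bins), n_bins - 1)  # clamp overflow to last bin
        hist[b] += 1
    return [h / len(mags) for h in hist]  # normalize: bins sum to 1

feat = displacement_histogram(field)
print(feat)
```

The resulting fixed-length vector can then be fed to a kernel classifier such as KFD regardless of the original volume size.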
Uniform Designs Limit Aliasing, 2000
Abstract
Cited by 3 (0 self)
This paper shows how uniform designs can reduce aliasing. The discrepancy is a quantitative measure of how uniformly design points are placed on an experimental domain. It is shown that in very general situations low-discrepancy designs limit aliasing. For the case of regular fractional factorial designs it is shown that minimum-discrepancy designs have maximum resolution and minimum aberration. Since the concept of discrepancy is more general than resolution or aberration, discrepancy can be used to generalize the definitions of resolution and aberration to other types of designs.
1. Introduction. There are different approaches to experimental design. If the form of the model relating the response to the factors is known, then optimal designs can be used to estimate the unknown parameters efficiently. However, in many cases one does not know the form of the model a priori. Rather, the model is selected based on regression diagnostics when analyzing the experimental data. Uniform designs (Wang and Fang, 1981; Fang and Wang, 1994; Bates et al., 1996) spread experimental points evenly over the domain. It is shown in this article that such an approach reduces the effect of aliasing, i.e., the extent to which terms not included in the model affect the estimates of terms included in the model. For fractional factorial designs, it is shown that uniform designs are equivalent to designs with minimum aberration. Suppose that an experiment has s factors and the design region is
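The aliasing that uniform designs aim to limit can be seen in a tiny standard example (an illustration of the general notion, not taken from the paper): in the regular 2^(3-1) half-fraction with defining relation I = ABC, the main-effect column A is identical to the interaction column BC, so their effects cannot be separated from the data:

```python
from itertools import product

full = list(product((-1, 1), repeat=3))                      # full 2^3 design
frac = [(a, b, c) for (a, b, c) in full if a * b * c == 1]   # half-fraction I = ABC

colA  = [a for (a, b, c) in frac]      # main-effect column A
colBC = [b * c for (a, b, c) in frac]  # two-factor interaction column BC
print(colA == colBC)                   # fully aliased: the columns coincide
```

Since a·b·c = 1 and a² = 1 on every run, a = b·c identically, which is the aliasing relation A = BC that resolution and aberration criteria quantify.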
A reproducing kernel condition for indeterminacy in the multidimensional moment problem
 Proc. Amer. Math. Soc., 2007
Abstract
Cited by 1 (0 self)
Using the smallest eigenvalues of Hankel forms associated with a multidimensional moment problem, we establish a condition equivalent to the existence of a reproducing kernel. This result is a multivariate analogue of Berg, Chen, and Ismail’s 2002 result. We also present a class of measures for which the existence of a reproducing kernel implies indeterminacy.
Sestieri of Venice, 2008
Abstract
We have investigated the space syntax of Venice by means of random walks. Random walks defined on an undirected graph establish a Euclidean space in which distances and angles between nodes acquire a clear statistical interpretation. The properties of nodes with respect to random walks allow partitioning the city canal network into disjoint divisions which may be identified with the traditional divisions of the city (sestieri).
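A small sketch of why random walks give node properties a statistical interpretation: on an undirected graph, the stationary probability of a node is proportional to its degree. This is a textbook fact about random walks, not the paper's specific construction, and the toy graph below is made up; we verify the stationarity equation pi P = pi directly:

```python
# Toy undirected graph as an adjacency list (assumed example)
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
deg = {v: len(ns) for v, ns in adj.items()}
two_m = sum(deg.values())                     # = 2 * number of edges
pi = {v: deg[v] / two_m for v in adj}         # candidate stationary distribution

# One step of the walk: P[u][v] = 1/deg(u) for each neighbor v of u
pi_next = {v: 0.0 for v in adj}
for u, ns in adj.items():
    for v in ns:
        pi_next[v] += pi[u] / deg[u]
print(all(abs(pi_next[v] - pi[v]) < 1e-12 for v in adj))
```

Because the degree-proportional distribution is invariant under the walk, long-run visit frequencies (and derived quantities such as commute distances) acquire the statistical meaning the abstract refers to.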
Connections between Uniformity and E(s²)-Optimality in 2-Level Supersaturated Designs
Abstract
Supersaturated experimental designs are often assessed by the E(s²) criterion, and some methods have been found for constructing E(s²)-optimal designs. Another criterion for assessing experimental designs is discrepancy, of which there are several different kinds. The discrepancy measures how much the empirical distribution of the design points deviates from the uniform distribution. Here it is shown that for 2-level supersaturated designs the E(s²) criterion is equivalent to one kind of discrepancy and shares the same optimal designs as other kinds of discrepancies.
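As a rough sketch of the E(s²) criterion: it is the average of the squared inner products over all pairs of factor columns of the ±1 design matrix. The small design below is made up for illustration; note that its last two columns are negatives of each other, a fully aliased pair, which is exactly the kind of near-dependence the criterion penalizes:

```python
from itertools import combinations

design = [                       # rows = runs, columns = 2-level factors (assumed)
    [ 1,  1,  1, -1],
    [ 1, -1, -1,  1],
    [-1,  1, -1,  1],
    [-1, -1,  1, -1],
]

def e_s2(X):
    # E(s^2) = mean of squared column inner products s_ij over all pairs i < j
    cols = list(zip(*X))
    s2 = [sum(a * b for a, b in zip(cols[i], cols[j])) ** 2
          for i, j in combinations(range(len(cols)), 2)]
    return sum(s2) / len(s2)

print(e_s2(design))   # dominated by the aliased pair, whose s_ij = -4
```

Five of the six column pairs are orthogonal (s = 0); the aliased pair contributes s² = 16, so E(s²) = 16/6. An E(s²)-optimal construction drives this average as low as the run size permits.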