Results 1–10 of 416
Generalized Discriminant Analysis Using a Kernel Approach
, 2000
Abstract

Cited by 336 (2 self)
We present a new method, Generalized Discriminant Analysis (GDA), for nonlinear discriminant analysis using a kernel function operator. The underlying theory is close to that of Support Vector Machines (SVM) insofar as the GDA method provides a mapping of the input vectors into a high-dimensional feature space. In the transformed space, linear properties make it easy to extend and generalize classical Linear Discriminant Analysis (LDA) to nonlinear discriminant analysis. The formulation reduces to an eigenvalue problem. By using different kernels, one can cover a wide class of nonlinearities. For both simulated data and alternate kernels, we give classification results as well as the shape of the separating function. The results are confirmed using real data for seed classification. 1. Introduction Linear discriminant analysis (LDA) is a traditional statistical method which has proven successful on classification problems [Fukunaga, 1990]. The p...
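The eigenvalue formulation has a compact two-class special case, kernel Fisher discriminant analysis, which illustrates the "LDA after a kernel feature map" idea. A minimal sketch, not the paper's multi-class formulation: the RBF kernel, `gamma`, and the ridge term `reg` are assumptions added here for numerical stability.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_fda_fit(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant: LDA carried out implicitly
    in the kernel-induced feature space (hypothetical helper names)."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    M, N = [], np.zeros((n, n))
    for c in (0, 1):
        idx = np.flatnonzero(y == c)
        Kc = K[:, idx]                       # columns for class c
        M.append(Kc.mean(axis=1))            # class mean in kernel coords
        nc = len(idx)
        H = np.eye(nc) - np.full((nc, nc), 1.0 / nc)
        N += Kc @ H @ Kc.T                   # within-class scatter
    # Leading discriminant direction; the ridge keeps N invertible.
    alpha = np.linalg.solve(N + reg * np.eye(n), M[0] - M[1])
    return alpha, X, gamma

def kernel_fda_project(model, Xnew):
    alpha, Xtr, gamma = model
    return rbf_kernel(Xnew, Xtr, gamma) @ alpha
```

In the two-class case the leading eigenvector of the generalized eigenproblem is proportional to N⁻¹(M₀ − M₁), so a single regularized solve replaces the explicit eigendecomposition.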
Data Association in Stochastic Mapping Using the Joint Compatibility Test
, 2001
Abstract

Cited by 253 (15 self)
In this paper, we address the problem of robust data association for simultaneous vehicle localization and map building. We show that the classical gated nearest-neighbor approach, which considers each matching between sensor observations and features independently, ignores the fact that measurement prediction errors are correlated. As a result, it readily accepts incorrect matchings when clutter or vehicle error increases. We propose a new measure of the joint compatibility of a set of pairings that successfully rejects spurious matchings. We show experimentally that this restrictive criterion can be used to efficiently search for the best solution to data association. Unlike the nearest-neighbor approach, this method provides a robust solution in complex situations, such as cluttered environments or when revisiting previously mapped regions.
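The gap between individual gating and the joint test can be sketched in a few lines. This shows only the compatibility test itself, not the branch-and-bound search over pairings that is built on it; the 95% chi-square gates are tabulated here to keep the sketch dependency-free.

```python
import numpy as np

# 95% chi-square critical values for small degrees of freedom.
CHI2_95 = {1: 3.841, 2: 5.991, 3: 7.815, 4: 9.488, 5: 11.070, 6: 12.592}

def individual_gate(innov, S):
    """Classical gated nearest-neighbor test: each pairing alone."""
    d2 = innov @ np.linalg.solve(S, innov)   # Mahalanobis distance
    return d2 <= CHI2_95[len(innov)]

def joint_compatibility(innovs, S_joint):
    """Joint test: stack the innovations and use their full joint
    covariance, so correlations between pairings are accounted for."""
    h = np.concatenate(innovs)
    d2 = h @ np.linalg.solve(S_joint, h)
    return d2 <= CHI2_95[len(h)]
```

Two pairings can each pass the individual gate, yet be jointly incompatible once the correlation of their prediction errors is included; that is exactly the failure mode of independent gating that the abstract describes.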
A robust and precise method for solving the permutation problem of frequency-domain blind source separation
 IEEE Trans. on Speech and Audio Processing 12
, 2004
Abstract

Cited by 115 (27 self)
This paper presents a robust and precise method for solving the permutation problem of frequency-domain blind source separation. It is based on two previous approaches: direction-of-arrival estimation and inter-frequency correlation. We discuss the advantages and disadvantages of the two approaches, and integrate them to exploit their respective advantages. We also present a closed-form formula to estimate the directions of source signals from a separating matrix obtained by ICA. Experimental results show that our method solved permutation problems almost perfectly in a situation where two sources were mixed in a room whose reverberation time was 300 ms.
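A minimal far-field version of such a closed-form DOA estimate for two sensors can be sketched as follows. The sign convention, sensor spacing `d`, and speed of sound are assumptions of this sketch, and the paper's exact formula may differ; the direction comes from the phase of the estimated mixing matrix A = W⁻¹.

```python
import numpy as np

C = 343.0  # assumed speed of sound (m/s)

def doa_from_separating_matrix(W, f, d):
    """Estimate source directions at frequency f from a 2x2 complex
    ICA separating matrix W for two sensors spaced d metres apart.
    Far-field model: arg(A[1,k]/A[0,k]) ~ -2*pi*f*d*cos(theta_k)/C,
    where A = W^{-1} estimates the mixing matrix."""
    A = np.linalg.inv(W)
    thetas = []
    for k in range(A.shape[1]):
        phase = np.angle(A[1, k] / A[0, k])
        cosv = np.clip(-phase * C / (2 * np.pi * f * d), -1.0, 1.0)
        thetas.append(np.degrees(np.arccos(cosv)))
    return np.array(thetas)
```

Sorting frequency bins by these angle estimates gives a first permutation alignment; the abstract's point is that this is then combined with inter-frequency correlation to fix the bins where DOA alone is ambiguous.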
Wavelet-based functional mixed models
 Journal of the Royal Statistical Society, Series B
, 2006
Abstract

Cited by 84 (16 self)
Summary. Increasingly, scientific studies yield functional data, in which the ideal units of observation are curves and the observed data consist of sets of curves that are sampled on a fine grid. We present new methodology that generalizes the linear mixed model to the functional mixed model framework, with model fitting done by using a Bayesian wavelet-based approach. This method is flexible, allowing functions of arbitrary form and the full range of fixed-effects structures and between-curve covariance structures that are available in the mixed model framework. It yields nonparametric estimates of the fixed- and random-effects functions as well as the various between-curve and within-curve covariance matrices. The functional fixed effects are adaptively regularized as a result of the nonlinear shrinkage prior that is imposed on the fixed effects' wavelet coefficients, and the random-effect functions experience a form of adaptive regularization because of the separately estimated variance components for each wavelet coefficient. Because we have posterior samples for all model quantities, we can perform pointwise or joint Bayesian inference or prediction on the quantities of the model. The adaptiveness of the method makes it especially appropriate for modelling irregular functional data that are characterized by numerous local features like peaks.
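The adaptive-regularization idea can be illustrated with a much cruder device: hand-rolled Haar wavelets and soft thresholding in place of the paper's Bayesian shrinkage prior. The threshold `t` and the Haar basis are assumptions of this sketch; the point is only that nonlinear shrinkage of wavelet coefficients smooths noise while preserving local features.

```python
import numpy as np

def haar(x):
    """Orthonormal Haar transform of a length-2^k signal:
    list of detail coefficient arrays, coarsest approximation last."""
    coeffs, a = [], np.asarray(x, dtype=float)
    while len(a) > 1:
        d = (a[0::2] - a[1::2]) / np.sqrt(2)
        a = (a[0::2] + a[1::2]) / np.sqrt(2)
        coeffs.append(d)
    coeffs.append(a)
    return coeffs

def ihaar(coeffs):
    """Inverse of haar()."""
    a = coeffs[-1]
    for d in reversed(coeffs[:-1]):
        out = np.empty(2 * len(a))
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

def wavelet_smooth(y, t):
    """Soft-threshold the detail coefficients: a crude stand-in for
    the adaptive shrinkage priors of the functional mixed model."""
    coeffs = haar(y)
    shrunk = [np.sign(d) * np.maximum(np.abs(d) - t, 0.0)
              for d in coeffs[:-1]] + [coeffs[-1]]
    return ihaar(shrunk)
```

Because the threshold acts coefficient by coefficient, a sharp jump (a large coefficient) survives while small noise coefficients are zeroed, which is the "adaptive to peaks" behavior the abstract emphasizes.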
Graphical models and automatic speech recognition
 Mathematical Foundations of Speech and Language Processing
, 2003
Abstract

Cited by 77 (15 self)
Graphical models provide a promising paradigm for studying both existing and novel techniques for automatic speech recognition. This paper first provides a brief overview of graphical models and their uses as statistical models. It is then shown that the statistical assumptions behind many pattern recognition techniques commonly used as part of a speech recognition system can be described by a graph – this includes Gaussian distributions, mixture models, decision trees, factor analysis, principal component analysis, linear discriminant analysis, and hidden Markov models. Moreover, this paper shows that many advanced models for speech recognition and language processing can also be simply described by a graph, including many at the acoustic, pronunciation, and language-modeling levels. A number of speech recognition techniques born directly out of the graphical-models paradigm are also surveyed. Additionally, this paper includes a novel graphical analysis of why derivative (or delta) features improve hidden Markov model-based speech recognition by improving structural discriminability. It also includes an example where a graph can be used to represent language model smoothing constraints. As will be seen, the space of models describable by a graph is quite large. A thorough exploration of this space should yield techniques that ultimately will supersede the hidden Markov model.
DETERMINISTIC EQUIVALENTS FOR CERTAIN FUNCTIONALS OF LARGE RANDOM MATRICES
, 2007
Abstract

Cited by 74 (20 self)
Consider an N × n random matrix Y_n = (Y^n_ij) whose entries are given by Y^n_ij = (σ_ij(n)/√n) X^n_ij, the X^n_ij being independent and identically distributed, centered with unit variance and satisfying a mild moment assumption. Consider now a deterministic N × n matrix A_n whose columns and rows are uniformly bounded in the Euclidean norm. Let Σ_n = Y_n + A_n. We prove in this article that there exists a deterministic N × N matrix-valued function T_n(z), analytic in C − R⁺, such that, almost surely, …
A generalization of blind source separation algorithms for convolutive mixtures based on second-order statistics
 IEEE Transactions on Speech and Audio Processing
, 2005
Decoupled Stochastic Mapping
, 2001
Abstract

Cited by 59 (8 self)
This paper describes decoupled stochastic mapping (DSM), a new computationally efficient approach to large-scale concurrent mapping and localization (CML). DSM reduces the computational burden of conventional stochastic mapping by dividing the environment into multiple overlapping submap regions, each with its own stochastic map. Two new approximation techniques are used to transfer vehicle state information from one submap to another, yielding a constant-time algorithm whose memory requirements scale linearly with the number of submaps. The approach is demonstrated via simulations and experiments. Simulation results are presented for an autonomous underwater vehicle (AUV) navigating in unknown environments with 110 and 1200 features, using simulated observations of point features by a forward-look sonar. Empirical tests are used to examine the consistency of the error bounds calculated by the different methods. Experimental results are also presented for an environment with 93 features, using sonar data obtained in a 3 by 9 by 1 m testing tank.