Results 1 - 10 of 253
Reproducing kernel Hilbert spaces for spike train analysis
In Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP-2008), Las Vegas, 2008
Abstract: "This paper introduces a generalized cross-correlation (GCC) measure for spike train analysis derived from reproducing kernel Hilbert space (RKHS) theory. An estimator for GCC is derived that does not depend on binning or on a specific kernel, and it operates directly and efficiently on spike times. For ..."
Cited by 5 (5 self)
Online Learning with Kernels
2003
Abstract: "Kernel-based algorithms such as support vector machines have achieved considerable success in various problems in the batch setting, where all of the training data is available in advance. Support vector machines combine the so-called kernel trick with the large-margin idea. There has been little use of these methods in an online setting suitable for real-time applications. In this paper we consider online learning in a reproducing kernel Hilbert space. By considering classical stochastic gradient descent within a feature space, and the use of some straightforward tricks, we develop simple ..."
Cited by 2831 (123 self)
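The stochastic-gradient idea sketched in this abstract can be illustrated with a minimal kernel online-learning routine in the spirit of this line of work, not the authors' exact algorithm; the squared loss, the Gaussian kernel, and all function names below are illustrative assumptions:

```python
import numpy as np

def gaussian_kernel(x, y, gamma=1.0):
    """Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def norma_online(stream, eta=0.5, lam=0.0, gamma=1.0):
    """Online kernel regression via stochastic gradient descent.

    The hypothesis lives in the RKHS: f(x) = sum_i alpha_i * k(x_i, x).
    Each incoming sample adds one expansion term, and the regularizer
    shrinks all previous coefficients by a constant factor.
    """
    centers, alphas = [], []
    for x, y in stream:
        # current prediction f(x) from the kernel expansion
        f_x = sum(a * gaussian_kernel(c, x, gamma)
                  for c, a in zip(centers, alphas))
        # shrink old coefficients (effect of the ||f||^2 regularizer)
        alphas = [(1 - eta * lam) * a for a in alphas]
        # new expansion term from the squared-loss gradient
        centers.append(x)
        alphas.append(eta * (y - f_x))

    def predict(x):
        return sum(a * gaussian_kernel(c, x, gamma)
                   for c, a in zip(centers, alphas))
    return predict
```

With lam > 0 the coefficients decay geometrically, so old terms can be truncated to keep the expansion (and per-step cost) bounded, which is what makes the method suitable for real-time use.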
Kernel independent component analysis
Journal of Machine Learning Research, 2002
Abstract: "We present a class of algorithms for independent component analysis (ICA) which use contrast functions based on canonical correlations in a reproducing kernel Hilbert space. On the one hand, we show that our contrast functions are related to mutual information and have desirable mathematical properties ..."
Cited by 464 (24 self)
Optimization in reproducing kernel Hilbert spaces of spike trains
In Computational Neuroscience, 2010
Abstract: "This paper presents a framework based on reproducing kernel Hilbert spaces (RKHS) for optimization with spike trains. To establish the RKHS for optimization, we start by introducing kernels for spike trains. It is shown that spike train kernels can be built from ideas of kernel methods, or from ..."
Cited by 2 (2 self)
A reproducing kernel Hilbert space framework for spike train signal processing
Neural Computation, 2009
Abstract: "This paper presents a general framework based on reproducing kernel Hilbert spaces (RKHS) to mathematically describe and manipulate spike trains. The main idea is the definition of inner products to allow spike train signal processing from basic principles while incorporating their statistical ..."
Cited by 23 (11 self)
AND REPRODUCING KERNEL HILBERT SPACES
Abstract: "I have been blessed with many wonderful people around me in this journey towards my doctoral degree. First and foremost, I would like to thank my adviser, Dr. Vern I. Paulsen, for all his contributions of time, ideas, insightful comments, and funding in this pursuit of mine. It was an honor and ... teachers at the Mathematical Sciences Foundation (MSF) and the University of Delhi for training me in various foundational areas of Mathematics. I extend my special thanks to Dr. Sanjeev Agrawal, Dr. Amber Habib, and Dr. ..."
Consistency of the group lasso and multiple kernel learning
Journal of Machine Learning Research, 2007
Abstract: "We consider the least-squares regression problem with regularization by a block ℓ1-norm, i.e., a sum of Euclidean norms over spaces of dimension larger than one. This problem, referred to as the group Lasso, extends the usual regularization by the ℓ1-norm, where all spaces have dimension one ... are replaced by functions and reproducing kernel Hilbert norms, the problem is usually referred to as multiple kernel learning and is commonly used for learning from heterogeneous data sources and for non-linear variable selection. Using tools from functional analysis, and in particular covariance operators ..."
Cited by 274 (33 self)
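The block ℓ1-norm penalty this abstract studies is what makes whole groups of coefficients vanish together. A minimal sketch of the corresponding proximal operator (block soft-thresholding), the core step of standard group-lasso solvers; the function name and the plain-NumPy setting are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def prox_group_l1(w, groups, tau):
    """Proximal operator of the block l1-norm (group lasso penalty).

    Each group's coefficient vector is shrunk toward zero by block
    soft-thresholding; groups whose Euclidean norm is at most tau are
    set exactly to zero, which yields group-wise sparsity.
    """
    out = np.array(w, dtype=float)          # work on a copy
    for idx in groups:                       # idx: list of indices in one group
        block = out[idx]
        norm = np.linalg.norm(block)
        if norm <= tau:
            out[idx] = 0.0                   # entire group eliminated
        else:
            out[idx] = (1.0 - tau / norm) * block
    return out
```

For example, with groups [[0, 1], [2]] and tau = 1.0, the vector [3, 4, 0.1] keeps its first group (norm 5, scaled by 0.8) while the small second group is zeroed out entirely.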
Frame analysis and approximation in reproducing kernel Hilbert spaces
2008
Abstract: "We consider frames F in a given Hilbert space, and we show that every F may be obtained in a constructive way from a reproducing kernel and an orthonormal basis in an ambient Hilbert space. The construction is operator-theoretic, building on a geometric formula for the analysis operator ..."
Cited by 3 (2 self)
Bayesian Learning in Reproducing Kernel Hilbert Spaces
Machine Learning, 1999
Abstract: "Support Vector Machines find the hypothesis that corresponds to the centre of the largest hypersphere that can be placed inside version space, i.e. the space of all consistent hypotheses given a training set. The boundaries of version space touched by this hypersphere define the support vectors. An ..."
Cited by 21 (13 self)
Sparseness of support vector machines
2003
Abstract: "Support vector machines (SVMs) construct decision functions that are linear combinations of kernel evaluations on the training set. The samples with non-vanishing coefficients are called support vectors. In this work we establish lower (asymptotic) bounds on the number of support vectors. On our way ... in the associated reproducing kernel Hilbert space."
Cited by 271 (35 self)