## Time-frequency learning machines (2007)

### Download Links

- [www.cedric-richard.fr]
- DBLP

Venue: IEEE Transactions on Signal Processing

Citations: 6 (5 self)

### BibTeX

```bibtex
@ARTICLE{Honeine07time-frequencylearning,
  author  = {Paul Honeine and Cédric Richard and Patrick Flandrin},
  title   = {Time-frequency learning machines},
  journal = {IEEE Transactions on Signal Processing},
  year    = {2007},
  volume  = {55},
  number  = {7},
  pages   = {3930--3936}
}
```

### Abstract

Over the last decade, the theory of reproducing kernels has made a major breakthrough in the field of pattern recognition. It has led to new algorithms, with improved performance and lower computational cost, for non-linear analysis in high dimensional feature spaces. Our paper is a further contribution which extends the framework of the so-called kernel learning machines to time-frequency analysis, showing that some specific reproducing kernels allow these algorithms to operate in the time-frequency domain. This link offers new perspectives in the field of non-stationary signal analysis, which can benefit from the developments of pattern recognition and Statistical Learning Theory.
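
The "specific reproducing kernels" mentioned in the abstract can be illustrated concretely. For the Wigner distribution, Moyal's formula states that the inner product of the Wigner distributions of two signals equals the squared modulus of the signals' inner product, so a Wigner-based kernel can be evaluated without ever forming a distribution. A minimal Python sketch (the function name and test signals are illustrative, not taken from the paper), checking the symmetry and positive semi-definiteness a kernel needs to serve in kernel learning machines:

```python
import numpy as np

def wigner_kernel(x, y):
    """Hypothetical Wigner-based kernel: by Moyal's formula, the inner
    product <W_x, W_y> of the Wigner distributions of x and y equals
    |<x, y>|^2, so it can be computed from the signals directly."""
    return np.abs(np.vdot(x, y)) ** 2

# Gram matrix over a few random complex test signals.
rng = np.random.default_rng(0)
signals = rng.standard_normal((4, 64)) + 1j * rng.standard_normal((4, 64))
K = np.array([[wigner_kernel(a, b) for b in signals] for a in signals])

# A valid reproducing kernel must yield a symmetric positive
# semi-definite Gram matrix.
assert np.allclose(K, K.T)
assert np.min(np.linalg.eigvalsh(K)) > -1e-9
```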

### Citations

9811 | Statistical Learning Theory - Vapnik - 1998

Citation Context: ...principle is commonly called the kernel trick. Kernel-based algorithms are computationally very efficient, and generally have their generalization performance guaranteed by Statistical Learning Theory [11,12]. With the exception of [13, 14], there are very few works combining kernel learning machines and time-frequency analysis, although the interest in pattern recognition based on time-frequency represen...

1403 | A training algorithm for optimal margin classifiers - Boser - 1992

Citation Context: ...work of Aronszajn [6], pattern recognition based on reproducing kernel Hilbert spaces (RKHS) has gained wide popularity. The most prominent recent developments include support vector machines (SVM) [7], kernel principal component analysis (KPCA) [8], kernel Fisher discriminant analysis (KFDA) [9], and its generalization to multiclass problems, kernel generalized discriminant analysis (KGDA) [10]. A...

1135 | Nonlinear component analysis as a kernel eigenvalue problem - Schölkopf, Smola, et al. - 1998

Citation Context: ...ed on reproducing kernel Hilbert spaces (RKHS) has gained wide popularity. The most prominent recent developments include support vector machines (SVM) [7], kernel principal component analysis (KPCA) [8], kernel Fisher discriminant analysis (KFDA) [9], and its generalization to multiclass problems, kernel generalized discriminant analysis (KGDA) [10]. A key property behind such algorithms is that the...

832 | Theory of reproducing kernels - Aronszajn - 1950

Citation Context: ...d. As an example, distributions dedicated to signal analysis are studied in [1,2,3] whereas optimal distributions for signal detection are considered in [4,5]. Since the pioneering work of Aronszajn [6], pattern recognition based on reproducing kernel Hilbert spaces (RKHS) has gained wide popularity. The most prominent recent developments include support vector machines (SVM) [7], kernel principal c...

355 | Fisher discriminant analysis with kernels - Mika, Rätsch, et al. - 1999

Citation Context: ...as gained wide popularity. The most prominent recent developments include support vector machines (SVM) [7], kernel principal component analysis (KPCA) [8], kernel Fisher discriminant analysis (KFDA) [9], and its generalization to multiclass problems, kernel generalized discriminant analysis (KGDA) [10]. A key property behind such algorithms is that they can be expressed in terms of inner products on...

339 | Kernel independent component analysis - Bach, Jordan

Citation Context: ...for non-stationary signal analysis through unsupervised and supervised learning problems. Time-frequency learning machines can also be used in many other applications, such as blind source separation [23] and filtering [24], where kernel-based methods have proved their efficiency. In ongoing studies, we are investigating kernel-based methodologies that could be advantageously used to solve recurrent p...

293 | Some results on Tchebycheffian spline functions - Kimeldorf, Wahba - 1971

Citation Context: ...l learning machines are statistical learning algorithms that take advantage of the geometric and regularizing properties of RKHS, which are established by the kernel trick and the representer theorem [15,16]. In this section, we briefly introduce these concepts through an example. A. Example of kernel-based method: the KPCA algorithm Problems commonly encountered in machine learning start with a training...
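
The representer theorem invoked in this context can be made concrete with kernel ridge regression, the textbook case: the regularized-risk minimizer in the RKHS is a finite kernel expansion f(·) = Σᵢ αᵢ κ(xᵢ, ·), with α = (K + λI)⁻¹y for the squared loss. A short sketch under illustrative assumptions (Gaussian kernel and toy data, not the paper's setting):

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Toy 1-D regression data (illustrative only, not from the paper).
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

# Representer theorem: the regularized-risk minimizer in the RKHS is a
# finite expansion f(.) = sum_i alpha_i k(x_i, .); for the squared loss,
# alpha solves (K + lam * I) alpha = y.
lam = 1e-2
K = gaussian_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

def f(Xnew):
    """Evaluate the kernel expansion at new inputs."""
    return gaussian_kernel(Xnew, X) @ alpha

# The expansion should fit the noisy sine far better than predicting zero.
assert np.mean((f(X) - y) ** 2) < np.mean(y ** 2)
```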

237 | Generalized discriminant analysis using a kernel approach - Baudat, Anouar - 2000

234 | On the mathematical foundations of learning - Cucker, Smale - 2001

Citation Context: ...principle is commonly called the kernel trick. Kernel-based algorithms are computationally very efficient, and generally have their generalization performance guaranteed by Statistical Learning Theory [11,12]. With the exception of [13, 14], there are very few works combining kernel learning machines and time-frequency analysis, although the interest in pattern recognition based on time-frequency represen...

146 | A generalized representer theorem - Schölkopf, Herbrich, et al. - 2001

111 | Support vector method for novelty detection - Schölkopf, Williamson, et al. - 2000

Citation Context: ...er and will be addressed in the future. D. Signal classification with time-frequency learning machines The last ten years have seen an explosion of research in supervised [9, 10, 20] and unsupervised [21] classification techniques based on kernels; see [22] for a recent survey. These include SVM, which map data into a high dimensional...

101 | Improving the readability of time-frequency and time-scale representations by reassignment methods - Auger, Flandrin - 1995

Citation Context: ...time-frequency distributions, parametric or otherwise, in which optimal solutions for a given signal or task can be selected. As an example, distributions dedicated to signal analysis are studied in [1,2,3] whereas optimal distributions for signal detection are considered in [4,5]. Since the pioneering work of Aronszajn [6], pattern recognition based on reproducing kernel Hilbert spaces (RKHS) has gain...

96 | Time-Frequency/Time-Scale Analysis - Flandrin - 1999

Citation Context: ...hines: general principles The reason for time-frequency analysis is to give a mathematical core to the intuitive concept of time-varying Fourier spectrum for non-stationary signals. For a survey, see [17] and references therein. Most of the parametric distributions of current interest belong to the Cohen class, which has proven useful in identifying non-stationarities in signals produced by a host of ...

33 | A time-frequency formulation of optimum detection - Flandrin - 1988

Citation Context: ...tions for a given signal or task can be selected. As an example, distributions dedicated to signal analysis are studied in [1,2,3] whereas optimal distributions for signal detection are considered in [4,5]. Since the pioneering work of Aronszajn [6], pattern recognition based on reproducing kernel Hilbert spaces (RKHS) has gained wide popularity. The most prominent recent developments include support ...

33 | Optimal detection using bilinear time-frequency and time-scale representations - Sayeed, Jones - 1995

Citation Context: ...tions for a given signal or task can be selected. As an example, distributions dedicated to signal analysis are studied in [1,2,3] whereas optimal distributions for signal detection are considered in [4,5]. Since the pioneering work of Aronszajn [6], pattern recognition based on reproducing kernel Hilbert spaces (RKHS) has gained wide popularity. The most prominent recent developments include support ...

29 | An adaptive optimal-kernel time-frequency representation - Baraniuk, Jones - 1995

Citation Context: ...time-frequency distributions, parametric or otherwise, in which optimal solutions for a given signal or task can be selected. As an example, distributions dedicated to signal analysis are studied in [1,2,3] whereas optimal distributions for signal detection are considered in [4,5]. Since the pioneering work of Aronszajn [6], pattern recognition based on reproducing kernel Hilbert spaces (RKHS) has gain...

16 | Optimised support vector machines for nonstationary signal classification - Davy, Gretton, et al.

Citation Context: ...e kernel trick. Kernel-based algorithms are computationally very efficient, and generally have their generalization performance guaranteed by Statistical Learning Theory [11,12]. With the exception of [13, 14], there are very few works combining kernel learning machines and time-frequency analysis, although the interest in pattern recognition based on time-frequency representations remains strong. In [13],...

11 | A primer on kernel methods - Vert, Tsuda, et al. - 2004

Citation Context: ...ssification with time-frequency learning machines The last ten years have seen an explosion of research in supervised [9, 10, 20] and unsupervised [21] classification techniques based on kernels; see [22] for a recent survey. These include SVM, which map data into a high dimensional... [Fig. 3 caption: Discriminant information extracted by (a) Wigner-based KFDA and (b) Wigner-bas...]

10 | Kernel recursive least squares - Engel, Mannor, et al.

Citation Context: ...signal analysis through unsupervised and supervised learning problems. Time-frequency learning machines can also be used in many other applications, such as blind source separation [23] and filtering [24], where kernel-based methods have proved their efficiency. In ongoing studies, we are investigating kernel-based methodologies that could be advantageously used to solve recurrent problems in the fiel...

7 | Decomposing ERP time-frequency energy using PCA, Clinical Neurophysiology 116 - Bernat, Williams, et al.

6 | Joint Recursive Implementation of Time-Frequency Representations and their Modified Version by the Reassignment Method, Signal Processing - Richard, Lengellé - 1997

Citation Context: ...panned by the eigendistributions Φ1 and Φ2. Table I, the Wigner-based KPCA algorithm (instructions and complexity): 1. Compute the Wigner distribution of each one of the n signals, O(d² log d) per signal [18]; 2. Compute the Gram matrix KW(i,j) = κW(xi,xj), O(dn²); 3. Perform eigendecomposition of KW − 1nKW − KW1n + 1nKW1n, O(n³); 4. Compute and normalize the eigendistributions Φk, O(d²n) per eigendistrib...
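
The Table I pipeline in this context can be sketched end-to-end. In this sketch (illustrative, not the authors' implementation) step 1 is folded into step 2: by Moyal's formula the Wigner-based kernel reduces to |⟨xᵢ, xⱼ⟩|², so the Gram matrix is formed without explicit Wigner distributions; 1n denotes the n×n matrix whose entries all equal 1/n:

```python
import numpy as np

# Toy complex signals standing in for the n training signals.
rng = np.random.default_rng(2)
n, d = 8, 32
X = rng.standard_normal((n, d)) + 1j * rng.standard_normal((n, d))

# Step 2 (Table I): Gram matrix KW(i,j) = kappa_W(x_i, x_j).
# Assumed shortcut: by Moyal's formula the Wigner-based kernel equals
# |<x_i, x_j>|^2, so no Wigner distribution is computed here.
K = np.abs(X.conj() @ X.T) ** 2

# Step 3: eigendecomposition of the centered matrix
# KW - 1n KW - KW 1n + 1n KW 1n, with 1n full of 1/n entries.
ones_n = np.full((n, n), 1.0 / n)
Kc = K - ones_n @ K - K @ ones_n + ones_n @ K @ ones_n
eigvals, eigvecs = np.linalg.eigh(Kc)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # descending order

# Step 4: normalize the expansion coefficients of each eigendistribution
# so it has unit norm in the RKHS: alpha_k / sqrt(lambda_k).
alphas = np.array([eigvecs[:, k] / np.sqrt(eigvals[k])
                   for k in range(3) if eigvals[k] > 1e-12])

# Centering makes every row of Kc sum to (numerically) zero.
assert np.allclose(Kc.sum(axis=1), 0, atol=1e-6)
```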

5 | A Generalized Representer Theorem, NeuroCOLT, Royal Holloway - Schölkopf, Herbrich, et al. - 2000

Citation Context: ...l learning machines are statistical learning algorithms that take advantage of the geometric and regularizing properties of RKHS, which are established by the kernel trick and the representer theorem [15,16]. In this section, we briefly introduce these concepts through an example. A. Example of kernel-based method: the KPCA algorithm Problems commonly encountered in machine learning start with a training...

4 | Optimal selection of time-frequency representations for signal classification: A kernel-target alignment approach - Honeine, Richard, et al.

Citation Context: ...lve recurrent problems in the field of non-stationary signal analysis. For instance, we have recently proposed a method for selecting time-frequency distributions appropriate for given learning tasks [25]. It is based on a criterion that has recently emerged from the machine learning literature: the kernel-target alignment. Further work may contribute to strengthening these connections with the most rece...

2 | Adaptive diffusion of time-frequency and time-scale representations: A review - Gosme, Richard, et al. - 2005

Citation Context: ...time-frequency distributions, parametric or otherwise, in which optimal solutions for a given signal or task can be selected. As an example, distributions dedicated to signal analysis are studied in [1,2,3] whereas optimal distributions for signal detection are considered in [4,5]. Since the pioneering work of Aronszajn [6], pattern recognition based on reproducing kernel Hilbert spaces (RKHS) has gain...

2 | An improved training algorithm for nonlinear kernel discriminants - Abdallah, Richard, et al. - 2004

Citation Context: ...beyond the scope of this paper and will be addressed in the future. D. Signal classification with time-frequency learning machines The last ten years have seen an explosion of research in supervised [9, 10, 20] and unsupervised [21] classification techniques based on kernels; see [22] for a recent survey. These include SVM, which map data into a high dimensional...

1 | Nonparametric regression with wavelet kernels - Rakotomamonjy, Mary, et al.

Citation Context: ...e kernel trick. Kernel-based algorithms are computationally very efficient, and generally have their generalization performance guaranteed by Statistical Learning Theory [11,12]. With the exception of [13, 14], there are very few works combining kernel learning machines and time-frequency analysis, although the interest in pattern recognition based on time-frequency representations remains strong. In [13],...

1 | Discrete time and frequency Wigner-Ville distribution: Moyal's formula and aliasing - Chassande-Mottin, Pai - 2005

Citation Context: ...the time-frequency domain. Wigner-based KPCA was performed to determine the eigendistributions Φk. Their calculation was based on the discrete-time discrete-frequency Wigner distribution introduced in [19] since it satisfies most of the properties of its continuous counterpart (5), in particular unitarity. The first eigendistribution Φ1 represented in Figure 1(a) shows that significant information has been...

1 | Time-Frequency Signal Analysis and Applications - Boashash, Ed - 2003

1 | The singular value decomposition of the Wigner distribution and its applications - Marinovitch - 1997

1 | Probabilistic classifiers and time-scale representations: Application to the monitoring of a tramway guiding system - Mamar, Chainais, et al.