Results 1–10 of 13
Denoising Source Separation
Abstract

Cited by 49 (7 self)
A new algorithmic framework called denoising source separation (DSS) is introduced. The main benefit of this framework is that it allows for easy development of new source separation algorithms which are optimised for specific problems. In this framework, source separation algorithms are constructed around denoising procedures. The resulting algorithms can range from almost blind to highly specialised source separation algorithms. Both simple linear and more complex nonlinear or adaptive denoising schemes are considered. Some existing independent component analysis algorithms are reinterpreted within the DSS framework and new, robust blind source separation algorithms are suggested. Although DSS algorithms need not be explicitly based on objective functions, there is often an implicit objective function that is optimised. The exact relation between the denoising procedure and the objective function is derived and a useful approximation of the objective function is presented. In the experimental section, various DSS schemes are applied extensively to artificial data, to real magnetoencephalograms and to simulated CDMA mobile network signals. Finally, various extensions to the proposed DSS algorithms are considered. These include nonlinear observation mappings, hierarchical models and overcomplete, nonorthogonal feature spaces. With these extensions, DSS appears to have relevance to many existing models of neural information processing.
The Cocktail Party Problem
, 2005
Abstract

Cited by 47 (0 self)
This review presents an overview of a challenging problem in auditory perception, the cocktail party phenomenon, the delineation of which goes back to a classic paper by Cherry in 1953. In this review, we address the following issues: (1) human auditory scene analysis, which is a general process carried out by the auditory system of a human listener; (2) insight into auditory perception, which is derived from Marr’s vision theory; (3) computational auditory scene analysis, which focuses on specific approaches aimed at solving the machine cocktail party problem; (4) active audition, the proposal for which is motivated by analogy with active vision, and (5) discussion of brain theory and independent component analysis, on the one hand, and correlative neural firing, on the other.
Variational and stochastic inference for Bayesian source separation
, 2007
Abstract

Cited by 10 (3 self)
We tackle the general linear instantaneous model (possibly underdetermined and noisy), where we model the source prior with a Student t distribution. The conjugate-exponential characterisation of the t distribution as an infinite mixture of scaled Gaussians enables us to do efficient inference. We study two well-known inference methods, the Gibbs sampler and variational Bayes, for Bayesian source separation. We derive both techniques as local message-passing algorithms to highlight their algorithmic similarities and to contrast their different convergence characteristics and computational requirements. Our simulation results suggest that typical posterior distributions in source separation have multiple local maxima. We therefore propose a hybrid approach in which we explore the state space with a Gibbs sampler and then switch to a deterministic algorithm. This approach seems able to combine the speed of the variational approach with the robustness of the Gibbs sampler.
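The scale-mixture representation this abstract relies on — a Student t variable as a zero-mean Gaussian whose precision is drawn from a Gamma(ν/2, rate ν/2) distribution — can be checked numerically. This sketch is illustrative, not taken from the paper; the degrees of freedom and sample size are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
nu = 3.0        # degrees of freedom (arbitrary choice)
n = 200_000

# Scale-mixture construction: precision from a Gamma, then a Gaussian.
# NumPy's gamma() takes a scale parameter, so rate nu/2 becomes scale 2/nu.
lam = rng.gamma(shape=nu / 2, scale=2 / nu, size=n)
s_mix = rng.standard_normal(n) / np.sqrt(lam)

# Direct Student t samples for comparison.
s_t = rng.standard_t(nu, size=n)

# The two samples should match in distribution; compare a robust statistic.
print(np.median(np.abs(s_mix)), np.median(np.abs(s_t)))  # the two medians agree closely
```

The median of |s| is used instead of the variance because for ν = 3 the fourth moment of the t distribution does not exist, so sample variances converge slowly.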
A novel framework for imputation of missing values in databases
 IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans
, 2007
Autoregressive independent process analysis without combinatorial efforts
 PATTERN ANAL APPLIC
, 2009
Separation Theorem for Independent Subspace Analysis and its Consequences
Abstract
Independent component analysis (ICA), the theory of mixed, independent, non-Gaussian sources, has a central role in signal processing, computer vision and pattern recognition. One of the most fundamental conjectures of this research field is that independent subspace analysis (ISA), the extension of the ICA problem in which groups of sources are independent, can be solved by traditional ICA followed by grouping of the ICA components. The conjecture, called the ISA separation principle, (i) has recently been rigorously proven for some distribution types, (ii) forms the basis of the state-of-the-art ISA solvers, (iii) enables one to estimate the unknown number and the dimensions of the sources efficiently, and (iv) can be extended to generalizations of the ISA task, such as different linear, controlled, post-nonlinear, complex-valued and partially observed problems, as well as to problems dealing with nonparametric source dynamics. Here, we review the advances in this field.
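A toy sketch of the second stage of the separation principle — grouping ICA-style components by the dependence of their amplitudes — can be given under assumed synthetic data. Components inside one subspace are modelled as uncorrelated signals sharing a common amplitude envelope; the envelope model, the 0.2 threshold and all names are illustrative assumptions, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 5000

# Toy "ICA output": two 2-D subspaces. Inside each subspace the components
# are uncorrelated but share a common amplitude envelope, so they are
# dependent; across subspaces everything is independent.
def subspace_pair(rng, T):
    envelope = rng.exponential(size=T) + 0.1
    return envelope * rng.standard_normal(T), envelope * rng.standard_normal(T)

y = np.array(subspace_pair(rng, T) + subspace_pair(rng, T))  # shape (4, T)

# Grouping step: correlate component amplitudes and threshold. Raw
# correlations vanish within a subspace, but amplitude correlations do not.
amp = np.abs(y)
C = np.abs(np.corrcoef(amp))
groups = C > 0.2          # adjacency matrix of the dependence graph
print(groups.astype(int))  # two 2x2 blocks on the diagonal
```

The block-diagonal pattern recovers the two subspaces, illustrating why a clustering pass over ICA components can suffice once the separation principle holds.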
Some Imputation Methods to Treat Missing Values in Knowledge Discovery in Data warehouse
Abstract
One major problem in the data cleaning and data reduction steps of the KDD process is the presence of missing values in attributes. Many analysis tasks have to deal with missing values, and several treatments have been developed to estimate them. One of the most common methods for replacing missing values is mean imputation. In this paper we suggest a new imputation method that combines factor-type and compromised imputation using a two-phase sampling scheme, and we use it to impute the missing values of a target attribute in a data warehouse. Our simulation study shows that the resulting estimator of the mean is more efficient than the alternatives.
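The baseline mean-imputation method the abstract mentions can be sketched as follows; the toy attribute column is an illustrative assumption:

```python
import numpy as np

# A toy attribute column with missing values encoded as NaN.
x = np.array([4.0, np.nan, 6.0, 8.0, np.nan, 2.0])

# Mean imputation: replace each missing entry with the mean of the
# observed entries (nanmean ignores the NaNs).
mean = np.nanmean(x)
x_imputed = np.where(np.isnan(x), mean, x)
print(x_imputed)  # missing entries become 5.0
```

Mean imputation preserves the sample mean but shrinks the variance of the attribute, which is one motivation for the more elaborate estimators the paper studies.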
Denoising Source Separation (Journal of Machine Learning Research, submitted 03/04; revised)
Blind Separation of Nonlinear Mixtures by Variational Bayesian Learning
Abstract
Blind separation of sources from nonlinear mixtures is a challenging and often ill-posed problem. We present three methods for solving it: an improved nonlinear factor analysis (NFA) method using a multilayer perceptron (MLP) network to model the nonlinearity, a hierarchical NFA (HNFA) method suitable for larger problems, and a post-nonlinear NFA (PNFA) method for more restricted post-nonlinear mixtures. The methods are based on variational Bayesian learning, which provides the needed regularisation and allows for easy handling of missing data. While the basic methods are incapable of recovering the correct rotation of the source space, they can discover the underlying nonlinear manifold and allow reconstruction of the original sources using standard linear independent component analysis (ICA) techniques. Key words: Bayesian learning, blind source separation, nonlinear mixtures, post-nonlinear mixtures, variational Bayes