Results 11 – 20 of 60
Unified inference for variational Bayesian linear Gaussian state-space models
 In Proceedings of NIPS 2006
Abstract

Cited by 14 (6 self)
Abstract. Linear Gaussian State-Space Models are widely used and a Bayesian treatment of parameters is therefore of considerable interest. The approximate Variational Bayesian method applied to these models is an attractive approach, used successfully in applications ranging from acoustics to bioinformatics. The most challenging aspect of implementing the method is in performing inference on the hidden state sequence of the model. We show how to convert the inference problem so that standard and stable Kalman Filtering/Smoothing recursions from the literature may be applied. This is in contrast to previously published approaches based on Belief Propagation. Our framework both simplifies and unifies the inference problem, so that future applications may be easily developed. We demonstrate the elegance of the approach on Bayesian temporal ICA, with an application to finding independent components in noisy EEG signals. (IDIAP-RR 06-50)

1 Linear Gaussian State-Space Models

Linear Gaussian State-Space Models (LGSSMs) are fundamental in time-series analysis [1, 2, 3]. In these models the observations $v_{1:T}$ are generated from an underlying dynamical system on $h_{1:T}$ according to

$$v_t = B h_t + \eta_t^v, \quad \eta_t^v \sim N(0_V, \Sigma_V); \qquad h_t = A h_{t-1} + \eta_t^h, \quad \eta_t^h \sim N(0_H, \Sigma_H),$$

where $N(\mu, \Sigma)$ denotes a Gaussian with mean $\mu$ and covariance $\Sigma$, and $0_X$ denotes an $X$-dimensional zero vector. The observation $v_t$ has dimension $V$ and the hidden state $h_t$ dimension $H$. Probabilistically, the LGSSM is defined by:

$$p(v_{1:T}, h_{1:T} \mid \Theta) = p(v_1 \mid h_1)\, p(h_1) \prod_{t=2}^{T} p(v_t \mid h_t)\, p(h_t \mid h_{t-1}).$$
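The standard Kalman filtering recursions that the abstract refers to can be sketched for this generative model. The dimensions, parameter values and random seed below are illustrative choices, not taken from the paper, and the sketch shows plain filtering rather than the paper's variational conversion:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and parameters (not from the paper).
H, V, T = 2, 3, 50
A = 0.9 * np.eye(H)               # transition matrix
B = rng.standard_normal((V, H))   # emission matrix
Sigma_H = 0.1 * np.eye(H)         # state noise covariance
Sigma_V = 0.5 * np.eye(V)         # observation noise covariance

# Simulate h_t = A h_{t-1} + eta_h and v_t = B h_t + eta_v.
h = np.zeros((T, H))
v = np.zeros((T, V))
h[0] = rng.multivariate_normal(np.zeros(H), np.eye(H))
v[0] = B @ h[0] + rng.multivariate_normal(np.zeros(V), Sigma_V)
for t in range(1, T):
    h[t] = A @ h[t - 1] + rng.multivariate_normal(np.zeros(H), Sigma_H)
    v[t] = B @ h[t] + rng.multivariate_normal(np.zeros(V), Sigma_V)

# Kalman filtering recursions: p(h_t | v_{1:t}) = N(mu_t, P_t).
mu = np.zeros(H)
P = np.eye(H)
filtered = []
for t in range(T):
    if t > 0:  # predict step
        mu = A @ mu
        P = A @ P @ A.T + Sigma_H
    # update step with observation v_t
    S = B @ P @ B.T + Sigma_V        # innovation covariance
    K = P @ B.T @ np.linalg.inv(S)   # Kalman gain
    mu = mu + K @ (v[t] - B @ mu)
    P = P - K @ B @ P
    filtered.append(mu.copy())

filtered = np.array(filtered)  # filtered state means, one row per time step
```

The same recursions, run backwards as a smoother, give the marginals the variational method needs; the paper's point is that the variational problem can be rewritten so that exactly these textbook recursions apply unchanged.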
Learning nonlinear state-space models for control
, 2005
Abstract

Cited by 13 (7 self)
Abstract — This paper studies the learning of nonlinear state-space models for a control task, which has some advantages over traditional methods. Variational Bayesian learning provides a framework where uncertainty is explicitly taken into account and system identification can be combined with model-predictive control. Three different control schemes are used. One of them, optimistic inference control, is a novel method based directly on the probabilistic modelling. Simulations with a cart-pole swing-up task confirm that the latent state space provides a representation that is easier to predict and control than the original observation space.
Dynamical Factor Analysis Of Rhythmic Magnetoencephalographic Activity
 in Proc. Int. Conf. on Independent Component Analysis and Signal Separation (ICA2001)
, 2001
Abstract

Cited by 12 (6 self)
Dynamical factor analysis (DFA) is a generative dynamical algorithm with a linear mapping from the factors to the observations and a nonlinear mapping of the factor dynamics; the latter is modeled by a multilayer perceptron. Ensemble learning is used to estimate the DFA model in an unsupervised manner. The performance of DFA has been tested on a set of artificially generated noisy modulated sinusoids. Furthermore, we have applied it to magnetoencephalographic data containing bursts of oscillatory brain activity. This paper shows that DFA can correctly estimate the underlying factors in both data sets.
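The generative structure described above (a linear factor-to-observation map and MLP factor dynamics) can be sketched as follows. All sizes, weights and noise levels are hypothetical illustrations, and no learning (ensemble or otherwise) is performed:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes (hypothetical, not from the paper).
n_factors, n_obs, T = 2, 5, 100
W_out = rng.standard_normal((n_obs, n_factors))  # linear factor -> observation map

# One-hidden-layer MLP for the factor dynamics: s_t = f(s_{t-1}) + process noise.
W1 = rng.standard_normal((10, n_factors))
W2 = 0.5 * rng.standard_normal((n_factors, 10))

def mlp_dynamics(s):
    """Nonlinear transition of the factors, modeled by a small MLP."""
    return W2 @ np.tanh(W1 @ s)

s = np.zeros((T, n_factors))   # latent factor trajectories
x = np.zeros((T, n_obs))       # observations
s[0] = rng.standard_normal(n_factors)
for t in range(1, T):
    s[t] = mlp_dynamics(s[t - 1]) + 0.05 * rng.standard_normal(n_factors)
for t in range(T):
    x[t] = W_out @ s[t] + 0.1 * rng.standard_normal(n_obs)
```

In the paper the inverse problem is solved: only `x` is observed, and ensemble learning recovers the factors `s` together with the mappings.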
Building Blocks For Variational Bayesian Learning Of Latent Variable Models
 JOURNAL OF MACHINE LEARNING RESEARCH
, 2006
Abstract

Cited by 11 (8 self)
We introduce standardised building blocks designed to be used with variational Bayesian learning. The blocks include Gaussian variables, summation, multiplication, nonlinearity, and delay. A large variety of latent variable models can be constructed from these blocks, including variance models and nonlinear modelling, which are lacking from most existing variational systems. The introduced blocks are designed to fit together and to yield efficient update rules. Practical implementation of various models is easy thanks to an associated software package which derives the learning formulas automatically once a specific model structure has been fixed. Variational Bayesian learning provides a cost function which is used both for updating the variables of the model and for optimising the model structure. All the computations can be carried out locally, resulting in linear computational complexity.
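The idea of composing a model from standardised blocks can be illustrated with a minimal forward-sampling sketch. The classes and functions below are hypothetical stand-ins for several of the named block types (Gaussian variable, summation, multiplication, nonlinearity), not the package's actual interface, and they omit the variational update machinery entirely:

```python
import numpy as np

rng = np.random.default_rng(2)

class Gaussian:
    """A Gaussian variable block; here it only supports forward sampling."""
    def __init__(self, mean, var):
        self.mean, self.var = mean, var
    def sample(self):
        return rng.normal(self.mean, np.sqrt(self.var))

def summation(a, b):        # summation block
    return a + b

def multiplication(a, b):   # multiplication block
    return a * b

def nonlinearity(a):        # nonlinearity block (tanh as an example)
    return np.tanh(a)

# Compose the blocks into a tiny latent-variable model:
#   y = tanh(s) * w + noise,  with s, w and the noise all Gaussian.
s = Gaussian(0.0, 1.0).sample()
w = Gaussian(1.0, 0.5).sample()
noise = Gaussian(0.0, 0.01).sample()
y = summation(multiplication(nonlinearity(s), w), noise)
```

In the framework itself each block additionally carries local variational update rules, which is what makes the composed models learnable with linear complexity.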
Missing Values in Hierarchical Nonlinear Factor Analysis
 In Proc. of the Int. Conf. on Artificial Neural Networks and Neural Information Processing  ICANN/ICONIP 2003
, 2003
Abstract

Cited by 10 (6 self)
The properties of hierarchical nonlinear factor analysis (HNFA), recently introduced by Valpola and others [3], are studied by reconstructing missing values. The variational Bayesian learning algorithm for HNFA has linear computational complexity and is able to infer the structure of the model in addition to estimating the parameters. To compare HNFA with other methods, we continued the experiments of [1] on speech spectrograms, comparing nonlinear factor analysis (NFA) with linear factor analysis (FA) and with the self-organising map. Experiments suggest that HNFA lies between FA and NFA in handling nonlinear problems. Furthermore, HNFA gives better reconstructions than FA and is more reliable than NFA.
Natural Conjugate Gradient in Variational Inference
Abstract

Cited by 10 (6 self)
... in machine learning often adapt a parametric probability distribution to optimize a given objective function. This view is especially useful when applying variational Bayes (VB) to models outside the conjugate-exponential family, for which variational Bayesian expectation maximization (VB EM) algorithms are not easily available and gradient-based methods are often used as alternatives. Traditional natural gradient methods use the Riemannian structure (or geometry) of the predictive distribution to speed up maximum likelihood estimation. We propose using the geometry of the variational approximating distribution instead to speed up a conjugate gradient method for variational learning and inference. The computational overhead is small due to the simplicity of the approximating distribution. Experiments with real-world speech data show significant ...
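The contrast between the Euclidean and the natural gradient can be made concrete on a toy variational problem. Here q = N(mu, v) is fitted to minimize KL(q || N(0, 1)); the Fisher information of q in the parameters (mu, v) is diag(1/v, 1/(2v^2)), so the natural gradient simply rescales the Euclidean one. All values below are illustrative, and the sketch uses plain natural-gradient descent rather than the paper's conjugate gradient method:

```python
import numpy as np

# Toy objective: KL( N(mu, v) || N(0, 1) ) = 0.5 * (v + mu**2 - 1 - log v).
def euclidean_grad(mu, v):
    return np.array([mu, 0.5 * (1.0 - 1.0 / v)])

# Fisher information of N(mu, v) in (mu, v) is diag(1/v, 1/(2 v**2));
# the natural gradient multiplies the Euclidean gradient by its inverse.
def natural_grad(mu, v):
    g = euclidean_grad(mu, v)
    return np.array([v * g[0], 2.0 * v**2 * g[1]])

mu, v = 2.0, 0.5   # initial variational parameters (illustrative)
lr = 0.1
for _ in range(200):
    d_mu, d_v = natural_grad(mu, v)
    mu -= lr * d_mu
    v -= lr * d_v
# (mu, v) converges to the KL minimizer (0, 1)
```

Because the metric comes from the simple approximating distribution q rather than the predictive distribution, the rescaling factors are available in closed form, which is the source of the small overhead the abstract mentions.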
Semi-blind source separation of climate data detects El Niño as the component with the highest interannual variability
 in Proceedings of International Joint Conference on Neural Networks (IJCNN’2005)
, 2005
Abstract

Cited by 9 (5 self)
Detecting Process State Changes By Nonlinear Blind Source Separation
, 2001
Abstract

Cited by 9 (5 self)
A variant of nonlinear blind source separation, the Nonlinear Dynamic Factor Analysis (NDFA) model, is based on noisy nonlinear mixtures of state variables which are controlled by nonlinear system dynamics. The problem setting is blind because the state variables, the nonlinear mixing model, and the nonlinear dynamics are all unknown. As a special problem we consider the ability of NDFA to detect abrupt changes in the process dynamics. It is shown experimentally that NDFA is highly accurate and outperforms several standard change detection methods in this task.
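One of the standard change detection baselines such a method would be compared against is a CUSUM detector on prediction residuals. The sketch below is a generic illustration on synthetic data, not the paper's experiment; the process, change point, and detector settings are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic scalar process with an abrupt change in its dynamics at t = 100:
# the AR(1) coefficient jumps from 0.5 to 0.95 (all values illustrative).
T, change_point = 200, 100
x = np.zeros(T)
for t in range(1, T):
    a = 0.5 if t < change_point else 0.95
    x[t] = a * x[t - 1] + rng.standard_normal()

# Squared one-step prediction residuals under the pre-change model x_t ~ 0.5 x_{t-1}.
residuals = (x[1:] - 0.5 * x[:-1]) ** 2
drift = residuals[:change_point - 1].mean()  # calibrate on the pre-change segment

# One-sided CUSUM: accumulate residual excess and alarm when it crosses a threshold.
threshold, slack = 25.0, 0.5
s, alarm = 0.0, None
for t, r in enumerate(residuals, start=1):
    s = max(0.0, s + r - drift - slack)
    if alarm is None and s > threshold:
        alarm = t  # time index of the first alarm
```

A model-based detector such as NDFA instead monitors how well its learned nonlinear dynamics continue to explain the data, which is what gives it an edge over residual-threshold baselines of this kind.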
Adaptive BCI based on variational Bayesian Kalman filtering: an empirical evaluation
Abstract

Cited by 9 (2 self)
This paper proposes the use of variational Kalman filtering as an inference technique for adaptive classification in a brain-computer interface (BCI). The proposed algorithm translates EEG segments adaptively into probabilities of cognitive states. It thus allows for nonstationarities in the joint process over cognitive state and generated EEG which may occur during a consecutive number of trials. Nonstationarities may have technical reasons (e.g. changes in impedance between scalp and electrodes) or be caused by learning effects in subjects. We compare the performance of the proposed method against an equivalent static classifier by estimating the generalization accuracy and the bit rate of the BCI. Using data from two studies with healthy subjects, we conclude that adaptive classification significantly improves BCI performance. Averaging over all subjects that participated in the respective study, we obtain, depending on the cognitive task pairing, an increase in both generalization accuracy and bit rate of up to 8%. We may thus conclude that adaptive inference can make a significant contribution to the quest of increasing the bit rates and robustness of current BCI technology. This is especially true since the proposed algorithm can be applied in real time.
Bayes Blocks: An implementation of the variational Bayesian building blocks framework
 In Proceedings of the 21st Conference on Uncertainty in Artificial Intelligence, UAI 2005
, 2005
Abstract

Cited by 7 (5 self)
A software library for constructing and learning probabilistic models is presented. The library offers a set of building blocks from which a large variety of static and dynamic models can be built. These include hierarchical models for the variances of other variables and many nonlinear models. The underlying variational Bayesian machinery, which provides fast and robust estimation but is mathematically rather involved, is almost completely hidden from the user, making the library very easy to use. The building blocks include Gaussian, rectified Gaussian and mixture-of-Gaussians variables and computational nodes which can be combined rather freely.