Results 1–6 of 6
Bayesian Modeling with Gaussian Processes using the GPstuff Toolbox, 2014
Abstract
Cited by 12 (1 self)
Gaussian processes (GPs) are powerful tools for probabilistic modeling. They can be used to define prior distributions over latent functions in hierarchical Bayesian models. The prior over functions is defined implicitly by the mean and covariance function, which determine the smoothness and variability of the function. Inference can then be conducted directly in function space by evaluating or approximating the posterior process. Despite their attractive theoretical properties, GPs pose practical challenges in their implementation. GPstuff is a versatile collection of computational tools for GP models, compatible with MATLAB and Octave on Linux and Windows. It includes, among other things, various inference methods, sparse approximations, and tools for model assessment. In this work, we review these tools and demonstrate the use of GPstuff in several models.
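GPstuff itself is a MATLAB/Octave toolbox; as a language-agnostic sketch of the inference-in-function-space idea the abstract describes, the following minimal NumPy example (not from the paper — all data and parameter values are illustrative) computes a GP regression posterior under a squared-exponential prior:

```python
import numpy as np

def se_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Toy data: near-noiseless observations of a smooth latent function
rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 20)
y = np.sin(x) + 1e-3 * rng.standard_normal(x.size)
noise_var = 1e-6

# GP posterior at test inputs: mean = K_*^T (K + sigma^2 I)^{-1} y
K = se_kernel(x, x) + noise_var * np.eye(x.size)
xs = np.linspace(0.0, 5.0, 101)
Ks = se_kernel(x, xs)
alpha = np.linalg.solve(K, y)
post_mean = Ks.T @ alpha
post_cov = se_kernel(xs, xs) - Ks.T @ np.linalg.solve(K, Ks)
```

With near-zero observation noise the posterior mean interpolates the training targets, which is an easy sanity check on an implementation like this.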
Batch Continuous-Time Trajectory Estimation as Exactly Sparse Gaussian Process Regression
Abstract
Cited by 2 (0 self)
Abstract—In this paper, we revisit batch state estimation through the lens of Gaussian process (GP) regression. We consider continuous-discrete estimation problems wherein a trajectory is viewed as a one-dimensional GP, with time as the independent variable. Our continuous-time prior can be defined by any linear, time-varying stochastic differential equation driven by white noise; this allows the possibility of smoothing our trajectory estimates using a variety of vehicle dynamics models (e.g., ‘constant-velocity’). We show that this class of prior results in an inverse kernel matrix (i.e., covariance matrix between all pairs of measurement times) that is exactly sparse (block-tridiagonal) and that this can be exploited to carry out GP regression (and interpolation) very efficiently. Though the prior is continuous, we consider measurements to occur at discrete times. When the measurement model is also linear, this GP approach is equivalent to classical, discrete-time smoothing (at the measurement times). When the measurement model is nonlinear, we iterate over the whole trajectory (as is common in vision and robotics) to maximize accuracy. We test the approach experimentally on a simultaneous trajectory estimation and mapping problem using a mobile robot dataset.
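The sparse-inverse-kernel property the abstract relies on can be seen in a scalar special case. This sketch (not from the paper; times and sizes are illustrative) uses a Wiener-process prior, whose kernel is K(t_i, t_j) = min(t_i, t_j); because the process is Markovian, its precision matrix is exactly tridiagonal — the 1-D analogue of the block-tridiagonal structure exploited for efficient trajectory estimation:

```python
import numpy as np

# Measurement times and the Wiener-process kernel matrix
t = np.arange(1.0, 8.0)                    # times 1..7
K = np.minimum(t[:, None], t[None, :])     # K_ij = min(t_i, t_j)

# The inverse kernel (precision) matrix of a Markov prior is tridiagonal
P = np.linalg.inv(K)

# Entries more than one position off the diagonal vanish (up to round-off)
mask = np.abs(np.subtract.outer(np.arange(t.size), np.arange(t.size))) > 1
print(np.max(np.abs(P[mask])))             # numerically zero
```

A tridiagonal (or block-tridiagonal) precision is what lets the smoothing problem be solved in time linear in the number of measurement times instead of cubic.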
GAUSSIAN QUADRATURES FOR STATE SPACE APPROXIMATION OF SCALE MIXTURES OF SQUARED EXPONENTIAL COVARIANCE FUNCTIONS
Abstract
Cited by 1 (1 self)
Stationary one-dimensional Gaussian process models in machine learning can be reformulated as state space equations. This reduces the cubic computational complexity of the naive full GP solution to linear with respect to the number of training data points. For infinitely differentiable covariance functions the representation is an approximation. In this paper, we study a class of covariance functions that can be represented as a scale mixture of squared exponentials. We show how the generalized Gauss–Laguerre quadrature rule can be employed in a state space approximation in this class. The explicit form of the rational quadratic covariance function approximation is written out, and we demonstrate the results in a regression and log-Gaussian Cox process study. Index Terms — Gaussian process, state space model, rational quadratic covariance function, Gaussian quadrature
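The scale-mixture idea can be checked numerically. The rational quadratic kernel is an expectation of squared-exponential kernels: with s ~ Gamma(shape = α, rate = αℓ²) over the inverse squared lengthscale, k_RQ(r) = E[exp(−s r²/2)] = (1 + r²/(2αℓ²))^(−α). A generalized Gauss–Laguerre rule (weight x^(α−1) e^(−x)) turns that expectation into a finite sum of SE terms, which is the mechanism behind the paper's state space approximation. This sketch is not the paper's construction; parameter values and node count are illustrative:

```python
import numpy as np
from scipy.special import roots_genlaguerre, gammaln

alpha, ell, n_nodes = 2.0, 1.0, 30
rate = alpha * ell**2                      # Gamma rate for the mixing density
r = np.linspace(0.0, 2.0, 41)              # input distances

# Exact rational quadratic covariance
k_rq = (1.0 + r**2 / (2.0 * alpha * ell**2)) ** (-alpha)

# Generalized Gauss-Laguerre nodes/weights for weight x^(alpha-1) e^{-x};
# substituting s = x / rate gives E[g(s)] ~ sum_i w_i g(x_i/rate) / Gamma(alpha)
x, w = roots_genlaguerre(n_nodes, alpha - 1.0)
k_mix = sum(wi * np.exp(-(xi / rate) * r**2 / 2.0) for xi, wi in zip(x, w))
k_mix /= np.exp(gammaln(alpha))

print(np.max(np.abs(k_rq - k_mix)))        # small approximation error
```

Each quadrature node contributes one squared-exponential component, and each SE component in turn admits the (approximate) state space form, so the mixture yields a finite-dimensional state space model for the RQ kernel.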
Batch Continuous-Time Trajectory Estimation as Exactly Sparse Gaussian Process Regression, 2014
"... experiment overview sparse priors factor graphs GP regression short talk ..."
Gaussian Process Kernels for Popular State-Space Time Series Models
Abstract
Abstract—In this paper we investigate a link between state-space models and Gaussian processes (GPs) for time series modeling and forecasting. In particular, several widely used state-space models are transformed into continuous-time form and the corresponding Gaussian process kernels are derived. Experimental results demonstrate that the derived GP kernels are correct and appropriate for Gaussian process regression. An experiment with a real-world dataset shows that the modeling results are identical with state-space models and with the proposed GP kernels. The considered connection allows researchers to look at their models from a different angle and facilitates sharing ideas between these two different modeling approaches.
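The simplest instance of the state-space/GP-kernel link the abstract describes is the Ornstein–Uhlenbeck process. This sketch (not from the paper; parameter values are illustrative) shows that sampling the OU SDE at a fixed step gives a discrete AR(1) model whose stationary autocovariance coincides with the exponential (Matérn-1/2) GP kernel:

```python
import numpy as np

# OU SDE: dx = -lam * x dt + sqrt(q) dW  is the state-space form of the
# exponential GP kernel  k(tau) = (q / (2*lam)) * exp(-lam * |tau|).
# Sampling at step dt gives the AR(1) model  x[k+1] = a x[k] + eps  with
# a = exp(-lam * dt) and stationary autocovariance  rho(n) = sigma2 * a**n.
lam, q, dt = 0.7, 2.0, 0.5
sigma2 = q / (2.0 * lam)                 # stationary variance of the OU process
a = np.exp(-lam * dt)                    # AR(1) transition coefficient

n = np.arange(10)
k_gp = sigma2 * np.exp(-lam * n * dt)    # GP kernel evaluated at lags n*dt
rho_ar = sigma2 * a ** n                 # AR(1) autocovariance at the same lags

print(np.max(np.abs(k_gp - rho_ar)))     # agree up to round-off
```

At the sample times the two descriptions are the same covariance function, which is the sense in which a state-space model and a GP kernel are interchangeable views of one model.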