Results 1–10 of 21
Bayesian Modeling with Gaussian Processes using the GPstuff Toolbox
, 2014
Cited by 12 (1 self)
Abstract
Gaussian processes (GP) are powerful tools for probabilistic modeling purposes. They can be used to define prior distributions over latent functions in hierarchical Bayesian models. The prior over functions is defined implicitly by the mean and covariance function, which determine the smoothness and variability of the function. The inference can then be conducted directly in the function space by evaluating or approximating the posterior process. Despite their attractive theoretical properties, GPs present practical challenges in their implementation. GPstuff is a versatile collection of computational tools for GP models compatible with MATLAB and Octave on Linux and Windows. It includes, among others, various inference methods, sparse approximations and tools for model assessment. In this work, we review these tools and demonstrate the use of GPstuff in several models.
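GPstuff itself is a MATLAB/Octave toolbox; as a minimal, self-contained illustration of the exact GP regression such toolboxes implement, the following numpy sketch computes a posterior from a squared-exponential prior (function names, kernel choice, and parameter values are illustrative assumptions, not GPstuff's API):

```python
import numpy as np

def se_kernel(x1, x2, sigma2=1.0, ell=1.0):
    # Squared-exponential covariance: k(x, x') = sigma2 * exp(-(x - x')^2 / (2 ell^2)).
    d = x1[:, None] - x2[None, :]
    return sigma2 * np.exp(-0.5 * (d / ell) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-4, sigma2=1.0, ell=1.0):
    # Exact GP regression via a Cholesky factorization: O(N^3) in training size.
    K = se_kernel(x_train, x_train, sigma2, ell) + noise * np.eye(len(x_train))
    Ks = se_kernel(x_test, x_train, sigma2, ell)
    Kss = se_kernel(x_test, x_test, sigma2, ell)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ alpha                      # posterior mean
    v = np.linalg.solve(L, Ks.T)
    cov = Kss - v.T @ v                    # posterior covariance
    return mean, cov

x = np.linspace(0.0, 2.0 * np.pi, 20)
y = np.sin(x)
mean, cov = gp_posterior(x, y, x)
```

With small observation noise the posterior mean interpolates the training targets, which is a quick sanity check for any GP implementation.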
Infinite-Dimensional Kalman Filtering Approach to Spatio-Temporal Gaussian Process Regression
Cited by 8 (1 self)
Abstract
We show how spatio-temporal Gaussian process (GP) regression problems (or the equivalent Kriging problems) can be formulated as infinite-dimensional Kalman filtering and Rauch–Tung–Striebel (RTS) smoothing problems, and present a procedure for converting spatio-temporal covariance functions into infinite-dimensional stochastic differential equations (SDEs). The resulting infinite-dimensional SDEs belong to the class of stochastic pseudo-differential equations and can be numerically treated using the methods developed for deterministic counterparts of the equations. The scaling of the computational cost in the proposed approach is linear in the number of time steps, as opposed to the cubic scaling of the direct GP regression solution. We also show how separable covariance functions lead to a finite-dimensional Kalman filtering and RTS smoothing problem, present analytical and numerical examples, and discuss numerical methods for computing the solutions.
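The purely temporal, finite-dimensional special case of this idea can be sketched compactly: a Matérn-3/2 GP prior is equivalent to a two-state SDE, which can be discretized exactly and run through a Kalman filter at cost linear in the number of time steps. The numpy sketch below is a hedged illustration of that standard construction (parameter values are arbitrary):

```python
import numpy as np

def matern32_discrete(dt, ell=1.0, sigma2=1.0):
    # Exact discretization of the Matern-3/2 state-space model with
    # state [f(t), f'(t)] and lam = sqrt(3) / ell.
    lam = np.sqrt(3.0) / ell
    A = np.exp(-lam * dt) * np.array([[1.0 + lam * dt, dt],
                                      [-lam**2 * dt, 1.0 - lam * dt]])
    Pinf = np.diag([sigma2, lam**2 * sigma2])   # stationary state covariance
    Q = Pinf - A @ Pinf @ A.T                   # process noise accumulated over dt
    return A, Q, Pinf

def kalman_filter(ts, ys, R=0.1, ell=1.0, sigma2=1.0):
    # Sequential GP regression: one predict/update pair per time step,
    # so the total cost is O(N) instead of O(N^3).
    H = np.array([[1.0, 0.0]])                  # we observe f(t) only
    _, _, P = matern32_discrete(1.0, ell, sigma2)
    m = np.zeros(2)
    t_prev = ts[0]
    means, vars_ = [], []
    for t, y in zip(ts, ys):
        if t > t_prev:                          # predict forward to time t
            A, Q, _ = matern32_discrete(t - t_prev, ell, sigma2)
            m = A @ m
            P = A @ P @ A.T + Q
            t_prev = t
        S = H @ P @ H.T + R                     # innovation variance
        K = P @ H.T / S                         # Kalman gain
        m = m + (K * (y - H @ m)).ravel()
        P = P - K @ H @ P
        means.append(m[0]); vars_.append(P[0, 0])
    return np.array(means), np.array(vars_)
```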
State-Space Inference for Non-Linear Latent Force Models with Application to Satellite Orbit Prediction
Cited by 6 (2 self)
Abstract
Latent force models (LFMs) are flexible models that combine mechanistic modelling principles (i.e., physical models) with nonparametric data-driven components. Several key applications of LFMs need nonlinearities, which results in analytically intractable inference. In this work we show how nonlinear LFMs can be represented as nonlinear white-noise-driven state-space models and present an efficient nonlinear Kalman filtering and smoothing based method for approximate state and parameter inference. We illustrate the performance of the proposed methodology via two simulated examples, and apply it to a real-world problem of long-term prediction of GPS satellite orbits.
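As a minimal illustration of the nonlinear Kalman filtering such methods build on, the following sketches an extended Kalman filter for a scalar model; the dynamics, measurement model, and parameter values are invented for the example and are unrelated to the satellite-orbit application itself:

```python
import numpy as np

def ekf(ys, f, df, q=0.01, r=0.1, m0=0.0, p0=1.0):
    # Extended Kalman filter for the scalar model
    #   x_k = f(x_{k-1}) + w_k,  w_k ~ N(0, q)
    #   y_k = x_k + v_k,         v_k ~ N(0, r)
    # The nonlinear dynamics are linearized via the derivative df.
    m, p = m0, p0
    means, variances = [], []
    for y in ys:
        F = df(m)              # local linearization of the dynamics
        m = f(m)               # predicted mean
        p = F * p * F + q      # predicted variance
        g = p / (p + r)        # Kalman gain
        m = m + g * (y - m)    # measurement update
        p = (1.0 - g) * p
        means.append(m); variances.append(p)
    return np.array(means), np.array(variances)
```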
Explicit Link Between Periodic Covariance Functions and State Space Models
Cited by 6 (3 self)
Abstract
This paper shows how periodic covariance functions in Gaussian process regression can be reformulated as state space models, which can be solved with classical Kalman filtering theory. This reduces the problematic cubic complexity of Gaussian process regression in the number of time steps into linear time complexity. The representation is based on expanding periodic covariance functions into a series of stochastic resonators. The explicit representation of the canonical periodic covariance function is written out and the expansion is shown to uniformly converge to the exact covariance function with a known convergence rate. The framework is generalized to quasi-periodic covariance functions by introducing damping terms in the system, and applied to two sets of real data. The approach could be easily extended to non-stationary and spatio-temporal variants.
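The cosine-series expansion behind this construction can be checked numerically: the canonical periodic covariance expands into resonator terms whose coefficients are modified Bessel functions. A hedged numpy sketch (truncation level and hyperparameters chosen arbitrarily):

```python
import math
import numpy as np

def bessel_i(j, z, terms=40):
    # Modified Bessel function of the first kind, I_j(z), via its power series.
    return sum((z / 2.0) ** (2 * m + j) / (math.factorial(m) * math.factorial(m + j))
               for m in range(terms))

def periodic_kernel(tau, sigma2=1.0, ell=1.0, w0=2.0 * np.pi):
    # Canonical periodic covariance: sigma2 * exp(-2 sin^2(w0 tau / 2) / ell^2).
    return sigma2 * np.exp(-2.0 * np.sin(w0 * tau / 2.0) ** 2 / ell**2)

def periodic_series(tau, J=10, sigma2=1.0, ell=1.0, w0=2.0 * np.pi):
    # Truncated expansion k(tau) ~= sum_j q_j cos(j w0 tau); each cosine term
    # corresponds to one stochastic resonator in the state-space model.
    z = 1.0 / ell**2
    scale = sigma2 * np.exp(-z)
    k = scale * bessel_i(0, z)
    for j in range(1, J + 1):
        k = k + 2.0 * scale * bessel_i(j, z) * np.cos(j * w0 * tau)
    return k
```

Because the Bessel coefficients decay super-exponentially in j, even a ten-term truncation matches the exact covariance to high precision.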
Scaling Multidimensional Inference for Structured Gaussian Processes
, 1209
Cited by 6 (1 self)
Abstract
Exact Gaussian Process (GP) regression has O(N^3) runtime for data size N, making it intractable for large N. Many algorithms for improving GP scaling approximate the covariance with lower-rank matrices. Other work has exploited structure inherent in particular covariance functions, including GPs with implied Markov structure and equispaced inputs (both enable O(N) runtime). However, these GP advances have not been extended to the multidimensional input setting, despite the preponderance of multidimensional applications. This paper introduces and tests novel extensions of structured GPs to multidimensional inputs. We present new methods for additive GPs, showing a novel connection between the classic backfitting method and the Bayesian framework. To achieve an optimal accuracy-complexity tradeoff, we extend this model with a novel variant of projection pursuit regression. Our primary result – projection pursuit Gaussian Process Regression – shows orders-of-magnitude speedup while preserving high accuracy. The natural second and third steps include non-Gaussian observations and higher-dimensional equispaced grid methods. We introduce novel techniques to address both of these necessary directions. We thoroughly illustrate the power of these three advances on several datasets, achieving performance close to that of the naive full GP at orders of magnitude less cost. Index Terms—Gaussian Processes, Backfitting, Projection Pursuit Regression, Kronecker matrices.
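For equispaced multidimensional grids, the key trick referred to above is that a product covariance over a grid is a Kronecker product of small per-dimension Gram matrices, so linear solves never need the full matrix. A hedged numpy sketch of that Kronecker solve (kernel and sizes are arbitrary):

```python
import numpy as np

def kron_gp_solve(K1, K2, y, noise):
    # Solve (K1 kron K2 + noise * I) alpha = y using eigendecompositions
    # of the small factors only; the full n1*n2 x n1*n2 matrix is never formed.
    w1, Q1 = np.linalg.eigh(K1)
    w2, Q2 = np.linalg.eigh(K2)
    n1, n2 = len(w1), len(w2)
    Y = y.reshape(n1, n2)
    Z = Q1.T @ Y @ Q2                       # rotate into the joint eigenbasis
    Z = Z / (np.outer(w1, w2) + noise)      # diagonal solve
    return (Q1 @ Z @ Q2.T).ravel()          # rotate back

def se(x):
    # Squared-exponential Gram matrix over a 1-D grid.
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * d**2)

rng = np.random.default_rng(0)
x1, x2 = np.linspace(0.0, 1.0, 5), np.linspace(0.0, 1.0, 7)
K1, K2 = se(x1), se(x2)
y = rng.standard_normal(5 * 7)
alpha = kron_gp_solve(K1, K2, y, noise=0.1)
```

The eigendecomposition cost is that of the small factors, so the overall solve scales far better than the naive approach on the full grid.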
Sparse spatio-temporal Gaussian processes with general likelihoods
 In Proceedings of ICANN’11
, 2011
Cited by 4 (1 self)
Abstract
In this paper, we consider learning of spatio-temporal processes by formulating a Gaussian process model as a solution to an evolution-type stochastic partial differential equation. Our approach is based on converting the stochastic infinite-dimensional differential equation into a finite-dimensional linear time-invariant (LTI) stochastic differential equation (SDE) by discretizing the process spatially. The LTI SDE is time-discretized analytically, resulting in a state space model with linear-Gaussian dynamics. We use expectation propagation to perform approximate inference on non-Gaussian data, and show how to incorporate sparse approximations to further reduce the computational complexity. We briefly illustrate the proposed methodology with a simulation study and with a real-world modelling problem.
Sequential Inference for Latent Force Models
Cited by 4 (1 self)
Abstract
Latent force models (LFMs) are hybrid models combining mechanistic principles with nonparametric components. In this article, we shall show how LFMs can be equivalently formulated and solved using the state variable approach. We shall also show how the Gaussian process prior used in LFMs can be equivalently formulated as a linear state-space model driven by a white noise process, and how inference on the resulting model can be efficiently implemented using a Kalman filter and smoother. Then we shall show how the recently proposed switching LFM can be reformulated using the state variable approach, and how we can construct a probabilistic model for the switches by formulating a similar switching LFM as a switching linear dynamic system (SLDS). We illustrate the performance of the proposed methodology in simulated scenarios and apply it to inferring the switching points in GPS data collected from car movements in an urban environment.
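The filter-then-smooth pipeline the article relies on can be sketched in a few lines for the simplest possible state-space model, a scalar random walk; this is a hedged illustration with invented parameter values, not the paper's LFM:

```python
import numpy as np

def kalman_rts(ys, q=0.1, r=0.5, m0=0.0, p0=10.0):
    # Scalar random-walk model: x_k = x_{k-1} + w_k (w_k ~ N(0, q)),
    # observed as y_k = x_k + v_k (v_k ~ N(0, r)).
    # Forward Kalman filter followed by a Rauch-Tung-Striebel smoother.
    n = len(ys)
    mf = np.zeros(n); pf = np.zeros(n)      # filtered moments
    mp = np.zeros(n); pp = np.zeros(n)      # predicted moments
    m, p = m0, p0
    for k, y in enumerate(ys):
        if k > 0:
            p = p + q                       # predict (mean is unchanged)
        mp[k], pp[k] = m, p
        g = p / (p + r)                     # Kalman gain
        m = m + g * (y - m)                 # measurement update
        p = (1.0 - g) * p
        mf[k], pf[k] = m, p
    ms = mf.copy(); ps = pf.copy()          # backward RTS pass
    for k in range(n - 2, -1, -1):
        c = pf[k] / pp[k + 1]               # smoother gain
        ms[k] = mf[k] + c * (ms[k + 1] - mp[k + 1])
        ps[k] = pf[k] + c**2 * (ps[k + 1] - pp[k + 1])
    return ms, ps
```

The backward pass revises each filtered estimate using all later measurements, which is exactly what distinguishes smoothing from filtering.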
Efficient state-space inference of periodic latent force models
, 2013
Cited by 2 (1 self)
Abstract
Latent force models (LFMs) are principled approaches to incorporating solutions to differential equations within nonparametric inference methods. Unfortunately, the development and application of LFMs can be inhibited by their computational cost, especially when closed-form solutions for the LFM are unavailable, as is the case in many real-world problems where these latent forces exhibit periodic behaviour. Given this, we develop a new sparse representation of LFMs which considerably improves their computational efficiency, as well as broadening their applicability, in a principled way, to domains with periodic or near-periodic latent forces. Our approach uses a linear basis model to approximate one generative model for each periodic force. We assume that the latent forces are generated from Gaussian process priors and develop a linear basis model which fully expresses these priors. We apply our approach to model the thermal dynamics of domestic buildings and show that it is effective at predicting day-ahead temperatures within the homes. We also apply our approach within queueing theory, in which quasi-periodic arrival rates are modelled as latent forces. In both cases, we demonstrate that our approach can be implemented
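A generic stand-in for such a linear basis model is a truncated Fourier basis fitted by conjugate Bayesian linear regression; this sketch is only loosely analogous to the paper's construction (which derives the basis and its prior from the GP covariance), and all names and values below are assumptions:

```python
import numpy as np

def fourier_design(t, J, period=1.0):
    # Design matrix: constant term plus J cosine/sine pairs.
    w0 = 2.0 * np.pi / period
    cols = [np.ones_like(t)]
    for j in range(1, J + 1):
        cols.append(np.cos(j * w0 * t))
        cols.append(np.sin(j * w0 * t))
    return np.stack(cols, axis=1)

def blr_posterior_mean(Phi, y, prior_var=1.0, noise_var=0.01):
    # Posterior mean of the weights under a zero-mean Gaussian prior.
    A = Phi.T @ Phi / noise_var + np.eye(Phi.shape[1]) / prior_var
    return np.linalg.solve(A, Phi.T @ y / noise_var)

t = np.linspace(0.0, 2.0, 200)
y = np.cos(2.0 * np.pi * t) + 0.5 * np.sin(4.0 * np.pi * t)  # periodic "force"
Phi = fourier_design(t, J=3)
w = blr_posterior_mean(Phi, y)
fit = Phi @ w
```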
Batch Continuous-Time Trajectory Estimation as Exactly Sparse Gaussian Process Regression
Cited by 2 (0 self)
Abstract
In this paper, we revisit batch state estimation through the lens of Gaussian process (GP) regression. We consider continuous-discrete estimation problems wherein a trajectory is viewed as a one-dimensional GP, with time as the independent variable. Our continuous-time prior can be defined by any linear, time-varying stochastic differential equation driven by white noise; this allows the possibility of smoothing our trajectory estimates using a variety of vehicle dynamics models (e.g., ‘constant-velocity’). We show that this class of prior results in an inverse kernel matrix (i.e., covariance matrix between all pairs of measurement times) that is exactly sparse (block-tridiagonal) and that this can be exploited to carry out GP regression (and interpolation) very efficiently. Though the prior is continuous, we consider measurements to occur at discrete times. When the measurement model is also linear, this GP approach is equivalent to classical, discrete-time smoothing (at the measurement times). When the measurement model is nonlinear, we iterate over the whole trajectory (as is common in vision and robotics) to maximize accuracy. We test the approach experimentally on a simultaneous trajectory estimation and mapping problem using a mobile robot dataset.
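The "exactly sparse inverse kernel" property is easy to verify in the simplest Markovian case: for Brownian motion, K(t_i, t_j) = min(t_i, t_j), and the inverse Gram matrix is exactly tridiagonal, the one-dimensional analogue of the block-tridiagonal structure exploited here. A small numpy check (grid values are arbitrary):

```python
import numpy as np

# Brownian-motion covariance on a time grid; its Markov property makes
# the inverse Gram matrix exactly tridiagonal.
t = np.linspace(0.5, 5.0, 10)
K = np.minimum(t[:, None], t[None, :])
Kinv = np.linalg.inv(K)

# Subtract the tridiagonal band; everything left should be numerically zero.
off_band = Kinv.copy()
for d in (-1, 0, 1):
    off_band -= np.diag(np.diag(Kinv, d), d)
```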
Gaussian Quadratures for State Space Approximation of Scale Mixtures of Squared Exponential Covariance Functions
Cited by 1 (1 self)
Abstract
Stationary one-dimensional Gaussian process models in machine learning can be reformulated as state space equations. This reduces the cubic computational complexity of the naive full GP solution to linear with respect to the number of training data points. For infinitely differentiable covariance functions the representation is an approximation. In this paper, we study a class of covariance functions that can be represented as a scale mixture of squared exponentials. We show how the generalized Gauss–Laguerre quadrature rule can be employed in a state space approximation in this class. The explicit form of the rational quadratic covariance function approximation is written out, and we demonstrate the results in a regression and log-Gaussian Cox process study. Index Terms—Gaussian process, state space model, rational quadratic covariance function, Gaussian quadrature
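The scale-mixture identity can be checked numerically: the rational quadratic kernel is a gamma mixture of squared exponentials, and the mixture integral can be approximated by Gauss-Laguerre quadrature. The sketch below folds the t^(alpha-1) factor into the integrand and uses the plain (rather than generalized) rule, a simplification relative to the paper (all parameter values are arbitrary):

```python
import math
import numpy as np

def rq_kernel(r, sigma2=1.0, ell=1.0, alpha=2.0):
    # Rational quadratic covariance function.
    return sigma2 * (1.0 + r**2 / (2.0 * alpha * ell**2)) ** (-alpha)

def rq_as_se_mixture(r, n_nodes=50, sigma2=1.0, ell=1.0, alpha=2.0):
    # Scale-mixture identity:
    #   RQ(r) = sigma2 / Gamma(alpha) * int_0^inf t^(alpha-1) e^(-t)
    #             * exp(-t r^2 / (2 alpha ell^2)) dt,
    # approximated by Gauss-Laguerre quadrature; each node contributes
    # one squared-exponential component.
    nodes, weights = np.polynomial.laguerre.laggauss(n_nodes)
    r = np.asarray(r, dtype=float)
    k = np.zeros_like(r)
    for t, w in zip(nodes, weights):
        k = k + w * t ** (alpha - 1.0) * np.exp(-t * r**2 / (2.0 * alpha * ell**2))
    return sigma2 * k / math.gamma(alpha)
```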