Results 1–10 of 40
Gaussian processes for machine learning
 in: Adaptive Computation and Machine Learning
, 2006
Abstract

Cited by 280 (2 self)
Abstract. We give a basic introduction to Gaussian Process regression models. We focus on understanding the role of the stochastic process and how it is used to define a distribution over functions. We present the simple equations for incorporating training data and examine how to learn the hyperparameters using the marginal likelihood. We explain the practical advantages of Gaussian Processes and end with conclusions and a look at current trends in GP work. Supervised learning in the form of regression (for continuous outputs) and classification (for discrete outputs) is an important constituent of statistics and machine learning, either for the analysis of data sets or as a subgoal of a more complex problem. Traditionally, parametric models have been used for this purpose. These have a possible advantage in ease of interpretability, but for complex data sets simple parametric models may lack expressive power, and their more complex counterparts (such as feed-forward neural networks) may not be easy to work with.
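The "simple equations for incorporating training data" that this abstract refers to fit in a few lines. The following is a minimal pure-Python sketch, not the book's implementation: it assumes a squared-exponential covariance with fixed hyperparameters (rather than marginal-likelihood optimization) and hypothetical helper names.

```python
import math

def sq_exp(x1, x2, ell=1.0, sf=1.0):
    # squared-exponential covariance k(x, x') = sf^2 exp(-(x - x')^2 / (2 ell^2))
    return sf ** 2 * math.exp(-(x1 - x2) ** 2 / (2 * ell ** 2))

def solve(A, b):
    # Gaussian elimination with partial pivoting for a small dense system
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def gp_predict(X, y, x_star, noise=1e-6):
    # GP posterior at x_star: mean = k_*^T K^{-1} y, var = k_** - k_*^T K^{-1} k_*
    K = [[sq_exp(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(X)] for i, a in enumerate(X)]
    alpha = solve(K, y)
    k_star = [sq_exp(x, x_star) for x in X]
    mean = sum(ks * a for ks, a in zip(k_star, alpha))
    w = solve(K, k_star)
    var = sq_exp(x_star, x_star) - sum(ks * wi for ks, wi in zip(k_star, w))
    return mean, var

X = [-1.0, 0.0, 1.0]
y = [math.sin(x) for x in X]
m, v = gp_predict(X, y, 0.0)   # predicting at a training input recovers its target
```

At a training input the posterior mean reproduces the observed target (up to the noise jitter) and the posterior variance collapses toward the noise level, which is the behaviour the book derives for noise-free interpolation.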
Gaussian Process Priors with Uncertain Inputs – Application to Multiple-Step Ahead Time Series Forecasting
, 2003
Abstract

Cited by 49 (15 self)
We consider the problem of multiple-step ahead prediction in time series analysis using the nonparametric Gaussian process model. k-step ahead forecasting of a discrete-time nonlinear dynamic system can be performed by doing repeated one-step ahead predictions. For a state-space model of the form y_t = f(y_{t-1}, …, y_{t-L}), the prediction of y at time t + k is based on the point estimates of the previous outputs. In this paper, we show how, using an analytical Gaussian approximation, we can formally incorporate the uncertainty about intermediate regressor values, thus updating the uncertainty on the current prediction.
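The repeated one-step-ahead scheme this abstract starts from (before the paper's uncertainty propagation is added) can be sketched as a naive loop. Here `one_step` stands in for any fitted one-step predictor, e.g. a GP posterior mean; the AR(1) model used below is purely illustrative.

```python
def iterate_forecast(one_step, history, k, L):
    # naive k-step ahead forecast: feed point estimates of earlier
    # outputs back in as regressors, ignoring their uncertainty
    buf = list(history)            # most recent observation last
    for _ in range(k):
        buf.append(one_step(buf[-L:]))
    return buf[-k:]

# toy AR(1) "fitted model": y_{t+1} = 0.5 * y_t
preds = iterate_forecast(lambda lags: 0.5 * lags[-1], [1.0], k=3, L=1)
```

Because each step feeds back a point estimate, errors compound silently; replacing the point estimates with a Gaussian over the regressors is exactly the refinement the paper develops.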
Algorithms and Representations for Reinforcement Learning
, 2005
Abstract

Cited by 37 (7 self)
“If we knew what it was we were doing, it would not be called research, would it?”
Accelerating Evolutionary Algorithms with Gaussian Process Fitness Function Models
 in: IEEE Transactions on Systems, Man and Cybernetics
, 2004
Abstract

Cited by 35 (1 self)
We present an overview of evolutionary algorithms that use empirical models of the fitness function to accelerate convergence, distinguishing between Evolution Control and the Surrogate Approach. We describe the Gaussian process model and propose using it as an inexpensive fitness function surrogate. Implementation issues such as efficient and numerically stable computation, exploration vs. exploitation, local modeling, multiple objectives and constraints, and failed evaluations are addressed. Our resulting Gaussian Process Optimization Procedure (GPOP) clearly outperforms other evolutionary strategies on standard test functions as well as on a real-world problem: the optimization of stationary gas turbine compressor profiles.
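The core surrogate idea can be sketched in a few lines (this is not the paper's GPOP procedure): rank candidates on a cheap model and spend expensive true evaluations only on the most promising ones. The function names and the quadratic objectives below are illustrative stand-ins.

```python
def prescreen_and_evaluate(candidates, surrogate_f, true_f, n_true=2):
    # rank all candidates on the cheap surrogate (lower is better),
    # then spend expensive true evaluations only on the top n_true
    promising = sorted(candidates, key=surrogate_f)[:n_true]
    evaluated = [(true_f(x), x) for x in promising]
    return min(evaluated)          # (best true fitness, best candidate)

true_f = lambda x: (x - 1.0) ** 2        # "expensive" objective
surrogate = lambda x: (x - 0.9) ** 2     # cheap, slightly biased model
best_fit, best_x = prescreen_and_evaluate([-2.0, 0.0, 1.0, 3.0],
                                          surrogate, true_f)
```

Even a biased surrogate keeps the true optimum in the pre-screened set here, which is why a GP (cheap to evaluate, with calibrated uncertainty) is an attractive choice for the surrogate role.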
Dependent Gaussian processes
 In Advances in Neural Information Processing Systems 17
, 2005
Abstract

Cited by 28 (0 self)
Gaussian processes are usually parameterised in terms of their covariance functions. However, this makes it difficult to deal with multiple outputs, because ensuring that the covariance matrix is positive definite is problematic. An alternative formulation is to treat Gaussian processes as white noise sources convolved with smoothing kernels, and to parameterise the kernel instead. Using this, we extend Gaussian processes to handle multiple, coupled outputs.
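The construction can be illustrated discretely: two outputs produced by convolving the same white-noise sequence with different smoothing kernels are automatically correlated, with no covariance matrix to check for positive definiteness. This is only a finite sketch of the idea; the kernel shapes and lengths are arbitrary.

```python
import random

def convolve(noise, kernel):
    # discrete convolution of a white-noise sequence with a smoothing kernel
    n, m = len(noise), len(kernel)
    return [sum(kernel[j] * noise[i - j] for j in range(m) if 0 <= i - j < n)
            for i in range(n + m - 1)]

random.seed(0)
w = [random.gauss(0.0, 1.0) for _ in range(200)]   # one shared noise source
y1 = convolve(w, [0.2] * 5)    # output 1: short smoothing kernel
y2 = convolve(w, [0.1] * 9)    # output 2: longer smoothing kernel

# shared noise source -> the two outputs covary
n = min(len(y1), len(y2))
m1 = sum(y1[:n]) / n
m2 = sum(y2[:n]) / n
cov = sum((a - m1) * (b - m2) for a, b in zip(y1, y2)) / n
```

Because both outputs are linear functionals of the same Gaussian source, their joint distribution is Gaussian by construction, which is the point of the white-noise-convolution parameterisation.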
Healing the relevance vector machine through augmentation
 In Proc. of the 22nd International Conference on Machine Learning (ICML 2005)
, 2005
Abstract

Cited by 10 (1 self)
The Relevance Vector Machine (RVM) is a sparse approximate Bayesian kernel method. It provides full predictive distributions for test cases. However, the predictive uncertainties have the unintuitive property that they get smaller the further you move away from the training cases. We give a thorough analysis. Inspired by the analogy to non-degenerate Gaussian Processes, we suggest augmentation to solve the problem. The purpose of the resulting model, RVM*, is primarily to corroborate the theoretical and experimental analysis. Although RVM* could be used in practical applications, it is no longer a truly sparse model. Experiments show that sparsity comes at the expense of worse predictive distributions. Bayesian inference based on Gaussian Processes (GPs) has become widespread in the machine learning community. However, their naïve applicability is marred by computational constraints. A number of recent publications have addressed this issue by means of sparse approximations, although ideologically sparseness is at variance with Bayesian principles. In this paper we view sparsity purely as a way to achieve computational convenience, not as under other non-Bayesian paradigms where sparseness itself is seen as a means to ensure good generalization.
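The unintuitive variance behaviour is easy to reproduce with a toy degenerate model (this is a hypothetical one-basis illustration, not the RVM itself): if the model is f(x) = w·φ(x) with a single localized basis function, the predictive variance is φ(x)²·Var[w], which vanishes far from the basis centre instead of growing as a non-degenerate GP's would.

```python
import math

def rbf(x, c=0.0, ell=1.0):
    # localized Gaussian basis function centred at c
    return math.exp(-(x - c) ** 2 / (2 * ell ** 2))

def degenerate_var(x, w_var=1.0):
    # predictive variance of the one-basis "degenerate" model
    # f(x) = w * phi(x):  Var[f(x)] = phi(x)^2 * Var[w]
    return rbf(x) ** 2 * w_var

near = degenerate_var(0.5)   # close to the basis centre
far = degenerate_var(5.0)    # far from all "training" support
```

The variance far from the data is nearly zero, i.e. the model is most confident exactly where it has seen nothing, which is the pathology the paper's augmentation is designed to heal.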
Multiple-step ahead prediction for non-linear dynamic systems – A Gaussian Process treatment with propagation of the uncertainty
, 2003
Abstract

Cited by 7 (2 self)
We consider the problem of multiple-step ahead prediction in time series analysis using the nonparametric Gaussian process model. k-step ahead forecasting of a discrete-time nonlinear dynamic system can be performed by doing repeated one-step ahead predictions. For a state-space model of the form y_t = f(y_{t-1}, …, y_{t-L}), the prediction of y at time t + k is based on the estimates ŷ_{t+k-1}, …, ŷ_{t+k-L} of the previous outputs.
Incremental Gaussian Processes
Abstract

Cited by 6 (2 self)
In this paper, we consider Tipping's relevance vector machine (RVM) [1] and formalize an incremental training strategy as a variant of the expectation-maximization (EM) algorithm that we call subspace EM.
Truncated Covariance Matrices and Toeplitz Methods in Gaussian Processes
 In ICANN99
, 1999
Abstract

Cited by 5 (1 self)
Gaussian processes are a limit extension of neural networks. Standard Gaussian process techniques use a squared exponential covariance function. Here, the use of truncated covariances is proposed. Such covariances have compact support. Their use speeds up matrix inversion and increases precision. Furthermore, they allow the use of speedy, memory-efficient Toeplitz inversion for high-dimensional grid-based Gaussian process predictors. Gaussian process methods are a natural extension of Bayesian neural network approaches. However, Gaussian processes suffer from the need to invert an n × n matrix, where n is the number of data points; this takes O(n³) floating-point operations. For many real-life problems there is some control over how data is collected, and this data often takes a regular form. For example, data can be collected at regular time intervals or at points on a grid (e.g. video pictures). Often this structure can be used to ensure covariance matrices ...
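Both structural claims can be illustrated in a few lines: a compactly supported covariance makes the covariance matrix sparse (mostly exact zeros), and on a regularly spaced grid the matrix is Toeplitz (constant along each diagonal), which is what enables fast inversion. The taper below is a Wendland-type polynomial chosen for illustration; it is not the specific truncated covariances the paper constructs.

```python
def trunc_cov(r, cutoff=0.3):
    # compact-support covariance: exactly zero for |r| >= cutoff.
    # Wendland-style taper (1 - t)^4 (4t + 1); illustrative only,
    # not the paper's construction.
    t = abs(r) / cutoff
    if t >= 1.0:
        return 0.0
    return (1.0 - t) ** 4 * (4.0 * t + 1.0)

xs = [i / 20.0 for i in range(21)]                 # regular 1-D grid
K = [[trunc_cov(a - b) for b in xs] for a in xs]

# compact support -> most entries are exact zeros (a banded matrix)
zeros = sum(1 for row in K for v in row if v == 0.0)

# regular spacing -> Toeplitz: each entry depends only on i - j
is_toeplitz = all(abs(K[i][j] - K[i + 1][j + 1]) < 1e-12
                  for i in range(20) for j in range(20))
```

The exact zeros are what distinguish truncation from a merely small squared-exponential tail: banded plus Toeplitz structure is what the fast grid-based solvers exploit.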
Gaussian Process Implicit Surfaces
Abstract

Cited by 5 (0 self)
Many applications in computer vision and computer graphics require the definition of curves and surfaces. Implicit surfaces [7] are a popular choice for this because they are smooth, can be appropriately constrained by known geometry, and require no special treatment for topology changes. Given a scalar function f: R^d → R, one can define a manifold S of dimension d − 1 wherever f(x) passes through a certain value (e.g., 0): S_0 ≜ {x ∈ R^d : f(x) = 0}. (1) In this paper we introduce Gaussian processes (GPs) to this area by deriving a covariance function equivalent to the thin plate spline regularizer [2], in which smoothness of a function f(x) is encouraged by the energy E(f) = ∫ ‖∇∇^T f(x)‖² dx. (2)
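The zero-level-set definition in (1) is easy to make concrete. Below, f(x, y) = x² + y² − 1 is a hand-written scalar function (not a GP) whose zero set is the unit circle; a bisection search locates a point of S_0 along a ray, which is the basic operation needed to render or sample such a surface.

```python
def f(x, y):
    # scalar function whose zero level set S_0 is the unit circle in R^2:
    # negative inside, zero on the circle, positive outside
    return x * x + y * y - 1.0

def bisect_zero(g, a, b, tol=1e-9):
    # locate a root of g on [a, b] by bisection; assumes
    # g(a) and g(b) have opposite signs
    fa = g(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if (g(m) > 0) == (fa > 0):
            a, fa = m, g(m)
        else:
            b = m
    return 0.5 * (a + b)

# crossing of the implicit circle along the positive x-axis
r = bisect_zero(lambda t: f(t, 0.0), 0.0, 2.0)
```

The same sign-change search works unchanged if the topology of the zero set changes, which is the robustness property the abstract cites in favour of implicit surfaces.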