Results 1 - 10 of 74
Dependent Gaussian processes - In NIPS, 2004
"... Gaussian processes are usually parameterised in terms of their covari-ance functions. However, this makes it difficult to deal with multiple outputs, because ensuring that the covariance matrix is positive definite is problematic. An alternative formulation is to treat Gaussian processes as white no ..."
Abstract - Cited by 50 (0 self)
Gaussian processes are usually parameterised in terms of their covariance functions. However, this makes it difficult to deal with multiple outputs, because ensuring that the covariance matrix is positive definite is problematic. An alternative formulation is to treat Gaussian processes as white noise sources convolved with smoothing kernels, and to parameterise the kernel instead. Using this, we extend Gaussian processes to handle multiple, coupled outputs.
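The construction the abstract refers to can be sketched as follows (my notation, not the paper's): each output is obtained by convolving a shared white-noise process $w$ with an output-specific smoothing kernel $G_d$, so every cross-covariance is positive semi-definite by construction:

$$ y_d(x) = \int G_d(x - z)\, w(z)\, dz, \qquad \operatorname{cov}\big(y_d(x),\, y_{d'}(x')\big) = \int G_d(x - z)\, G_{d'}(x' - z)\, dz. $$

Coupling between outputs arises because they share the same underlying noise source.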
Kernels for Vector-Valued Functions: a Review, 2011
"... Kernel methods are among the most popular techniques in machine learning. From a frequentist/discriminative perspective they play a central role in regularization theory as they provide a natural choice for the hypotheses space and the regularization functional through the notion of reproducing kern ..."
Abstract - Cited by 32 (2 self)
Kernel methods are among the most popular techniques in machine learning. From a frequentist/discriminative perspective they play a central role in regularization theory as they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a Bayesian/generative perspective they are the key in the context of Gaussian processes, where the kernel function is also known as the covariance function. Traditionally, kernel methods have been used in supervised learning problems with scalar outputs and indeed there has been a considerable amount of work devoted to designing and learning kernels. More recently there has been an increasing interest in methods that deal with multiple outputs, motivated partly by frameworks like multitask learning. In this paper, we review different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and functional methods.
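One of the simplest valid constructions reviewed in this literature is the intrinsic coregionalization model, in which a positive semi-definite coregionalization matrix is combined with a scalar kernel via a Kronecker product. A minimal NumPy sketch (function names and parameters are illustrative, not taken from the paper):

```python
import numpy as np

def rbf(X, X2, lengthscale=1.0):
    """Scalar RBF kernel: k(x, x') = exp(-||x - x'||^2 / (2 * lengthscale^2))."""
    sq_dists = np.sum((X[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * sq_dists / lengthscale ** 2)

def icm_covariance(X, X2, A, lengthscale=1.0):
    """Intrinsic coregionalization model covariance:
    K = kron(B, k(X, X2)) with B = A @ A.T, positive semi-definite by construction."""
    B = A @ A.T                    # D x D coregionalization matrix
    K = rbf(X, X2, lengthscale)    # N x M covariance over inputs
    return np.kron(B, K)           # (D*N) x (D*M) multi-output covariance

# Example: 3 outputs, rank-2 coregionalization, 5 inputs in 1-D.
X = np.random.randn(5, 1)
A = np.random.randn(3, 2)
K = icm_covariance(X, X, A)        # (15, 15), symmetric PSD
```

Richer kernels surveyed in the review, such as sums of separable terms (the linear model of coregionalization) and process-convolution kernels, generalize this basic pattern.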
Latent Force Models
"... Purely data driven approaches for machine learning present difficulties when data is scarce relative to the complexity of the model or when the model is forced to extrapolate. On the other hand, purely mechanistic approaches need to identify and specify all the interactions in the problem at hand (w ..."
Abstract - Cited by 31 (6 self)
Purely data-driven approaches for machine learning present difficulties when data is scarce relative to the complexity of the model or when the model is forced to extrapolate. On the other hand, purely mechanistic approaches need to identify and specify all the interactions in the problem at hand (which may not be feasible) and still leave the issue of how to parameterize the system. In this paper, we present a hybrid approach using Gaussian processes and differential equations to combine data-driven modelling with a physical model of the system. We show how different, physically inspired kernel functions can be developed through sensible, simple, mechanistic assumptions about the underlying system. The versatility of our approach is illustrated with three case studies from computational biology, motion capture and geostatistics.
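As a concrete example of the kind of physically inspired kernel the abstract describes (notation mine), a first-order latent force model drives each output through an ordinary differential equation forced by latent Gaussian processes $u_r$:

$$ \frac{d x_d(t)}{dt} + D_d\, x_d(t) = B_d + \sum_{r=1}^{R} S_{dr}\, u_r(t). $$

Solving the ODE in closed form makes each $x_d$ a Gaussian process whose covariance function encodes the decay rates $D_d$ and sensitivities $S_{dr}$, so mechanistic structure enters directly through the kernel.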
Sparse convolved Gaussian processes for multi-output regression - In Advances in Neural Information Processing Systems 21, 2009
"... We present a sparse approximation approach for dependent output Gaussian processes (GP). Employing a latent function framework, we apply the convolution process formalism to establish dependencies between output variables, where each latent function is represented as a GP. Based on these latent func ..."
Abstract - Cited by 30 (5 self)
We present a sparse approximation approach for dependent output Gaussian processes (GPs). Employing a latent function framework, we apply the convolution process formalism to establish dependencies between output variables, where each latent function is represented as a GP. Based on these latent functions, we establish an approximation scheme using a conditional independence assumption between the output processes, leading to an approximation of the full covariance which is determined by the locations at which the latent functions are evaluated. We show results of the proposed methodology for synthetic data and real-world applications on pollution prediction and a sensor network.
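In rough terms (my notation), the approximation conditions all outputs on the latent functions evaluated at a set of inducing inputs, collected in $\mathbf{u}$, and assumes the outputs are conditionally independent given $\mathbf{u}$:

$$ p(\mathbf{f}_1, \ldots, \mathbf{f}_D \mid \mathbf{u}) \approx \prod_{d=1}^{D} p(\mathbf{f}_d \mid \mathbf{u}), $$

so the full covariance is replaced by a low-rank term through $\mathbf{u}$ plus per-output corrections, and its cost is governed by the number of points at which the latent functions are evaluated.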
Computationally efficient convolved multiple output Gaussian processes - Journal of Machine Learning Research
"... Recently there has been an increasing interest in regression methods that deal with multiple outputs. This has been motivated partly by frameworks like multitask learning, multisensor networks or structured output data. From a Gaussian processes perspective, the problem reduces to specifying an appr ..."
Abstract - Cited by 27 (2 self)
Recently there has been an increasing interest in regression methods that deal with multiple outputs. This has been motivated partly by frameworks like multitask learning, multisensor networks or structured output data. From a Gaussian processes perspective, the problem reduces to specifying an appropriate covariance function that, whilst being positive semi-definite, captures the dependencies between all the data points and across all the outputs. One approach to account for non-trivial correlations between outputs employs convolution processes. Under a latent function interpretation of the convolution transform we establish dependencies between output variables. The main drawbacks of this approach are the associated computational and storage demands. In this paper we address these issues. We present different efficient approximations for dependent output Gaussian processes constructed through the convolution formalism. We exploit the conditional independencies present naturally in the model. This leads to a form of the covariance similar in spirit to the so-called PITC and FITC approximations for a single output. We show experimental results with synthetic and real data; in particular, we show results in school exam score prediction, pollution prediction and gene expression data.
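A minimal sketch of the PITC-style structure mentioned above (my notation and function names; the paper's actual approximations differ in detail): the full multi-output covariance is replaced by a low-rank term through the latent variables plus exact blocks on the per-output diagonal.

```python
import numpy as np

def pitc_style_covariance(K_ff_blocks, K_fu, K_uu):
    """PITC-style approximate covariance (illustrative only):
    K_approx = K_fu K_uu^{-1} K_uf + blockdiag(K_ff - K_fu K_uu^{-1} K_uf),
    with one exact block per output.
    K_ff_blocks : list of (N_d, N_d) exact per-output covariance blocks
    K_fu        : (N, M) cross-covariance between all outputs and the latent values u
    K_uu        : (M, M) covariance of the latent functions at the inducing inputs
    """
    low_rank = K_fu @ np.linalg.solve(K_uu, K_fu.T)   # rank <= M
    approx = low_rank.copy()
    start = 0
    for K_dd in K_ff_blocks:                          # restore exact per-output blocks
        n_d = K_dd.shape[0]
        approx[start:start + n_d, start:start + n_d] = K_dd
        start += n_d
    return approx
```

With M inducing points and N total observations, exploiting this structure brings inference down from O(N^3) toward roughly O(N M^2), which is the computational saving the abstract targets.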
Particle learning of Gaussian process models for sequential design and optimization - Working Paper, 2010
"... We develop a simulation-based method for the online updating of Gaussian process regression and classification models. Our method exploits sequential Monte Carlo to produce a thrifty sequential design algorithm, in terms of computational speed, compared to the established MCMC alternative. The latte ..."
Abstract - Cited by 21 (9 self)
We develop a simulation-based method for the online updating of Gaussian process regression and classification models. Our method exploits sequential Monte Carlo to produce a sequential design algorithm that is thrifty, in terms of computational speed, compared to the established MCMC alternative. The latter is less ideal for sequential design since it must be restarted and iterated to convergence with the inclusion of each new design point. We illustrate some attractive ensemble aspects of our SMC approach, and how active learning heuristics may be implemented via particles to optimize a noisy function or to explore classification boundaries online.
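A heavily simplified sketch of the resample/propagate pattern behind such a sequential Monte Carlo update (this is generic particle filtering over GP model states, not the authors' exact algorithm; `log_pred_lik` and `propagate` are hypothetical callbacks):

```python
import numpy as np

def particle_update(particles, log_pred_lik, propagate, x_new, y_new, rng):
    """One online step: reweight each particle (a GP model state, e.g. its
    hyperparameters and sufficient statistics) by the predictive likelihood
    of the new observation, resample, then propagate the survivors."""
    log_w = np.array([log_pred_lik(p, x_new, y_new) for p in particles])
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)   # resample
    return [propagate(particles[i], x_new, y_new, rng) for i in idx]
```

Because each new design point triggers only a reweight/resample/propagate step rather than a full restart of an MCMC chain, this is the sense in which such an approach is thrifty for sequential design.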
Parameter space exploration with Gaussian process trees - Proceedings of the International Conference on Machine Learning (pp. 353–360), Omnipress & ACM Digital Library, 2004
"... Computer experiments often require dense sweeps over input parameters to obtain a qualitative understanding of their response. ..."
Abstract - Cited by 19 (4 self)
Computer experiments often require dense sweeps over input parameters to obtain a qualitative understanding of their response.
Dimension reduction and alleviation of confounding for spatial generalized linear mixed models - Journal of the Royal Statistical Society: Series B (Statistical Methodology), 2013
"... Abstract. Non-gaussian spatial data are very common in many disciplines. For instance, count data are common in disease mapping, and binary data are common in ecology. When fitting spatial regressions for such data, one needs to account for dependence to ensure reliable inference for the regression ..."
Abstract - Cited by 14 (1 self)
Non-Gaussian spatial data are very common in many disciplines. For instance, count data are common in disease mapping, and binary data are common in ecology. When fitting spatial regressions for such data, one needs to account for dependence to ensure reliable inference for the regression coefficients. The spatial generalized linear mixed model (SGLMM) offers a very popular and flexible approach to modeling such data, but the SGLMM suffers from three major shortcomings: (1) uninterpretability of parameters due to spatial confounding, (2) variance inflation due to spatial confounding, and (3) high-dimensional spatial random effects that make fully Bayesian inference for such models computationally challenging. We propose a new parameterization of the SGLMM that alleviates spatial confounding and speeds computation by greatly reducing the dimension of the spatial random effects. We illustrate the application of our approach to simulated binary, count, and Gaussian spatial datasets, and to a large infant mortality dataset.
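The reparameterization the abstract describes can be summarized as follows (my notation; this follows the general idea of projecting the spatial random effects onto a small basis orthogonal to the fixed effects): with adjacency matrix $A$ of the spatial graph and $P^{\perp} = I - X(X^{\top}X)^{-1}X^{\top}$,

$$ g\big(\mathbb{E}[Y_i]\big) = \mathbf{x}_i^{\top}\boldsymbol{\beta} + \mathbf{m}_i^{\top}\boldsymbol{\delta}, \qquad M = \text{leading } q \text{ eigenvectors of } P^{\perp} A P^{\perp}, $$

with $q \ll n$, so the random effects $\boldsymbol{\delta}$ are low-dimensional and, by construction, not confounded with the regression coefficients $\boldsymbol{\beta}$.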