CiteSeerX

Space and space-time modeling using process convolutions. In Quantitative methods for current environmental issues (2002)

by D. M. Higdon
Results 1 - 10 of 74 citing documents

Dependent Gaussian processes

by Phillip Boyle, Marcus Frean - In NIPS, 2004
"... Gaussian processes are usually parameterised in terms of their covari-ance functions. However, this makes it difficult to deal with multiple outputs, because ensuring that the covariance matrix is positive definite is problematic. An alternative formulation is to treat Gaussian processes as white no ..."
Abstract - Cited by 50 (0 self) - Add to MetaCart
Gaussian processes are usually parameterised in terms of their covariance functions. However, this makes it difficult to deal with multiple outputs, because ensuring that the covariance matrix is positive definite is problematic. An alternative formulation is to treat Gaussian processes as white noise sources convolved with smoothing kernels, and to parameterise the kernel instead. Using this, we extend Gaussian processes to handle multiple, coupled outputs.

Citation Context

...through kernel convolutions. A Gaussian process V(s) can be constructed over a region S by convolving a continuous white noise process X(s) with a smoothing kernel h(s): V(s) = h(s) ⋆ X(s) for s ∈ S [7]. To this can be added a second white noise source, representing measurement uncertainty, and together this gives a model for observations Y. This view of GPs is shown in graphical form in Figure 1(a...
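
The construction quoted above is easy to see numerically. Below is a minimal sketch of the process-convolution view (grid size, kernel width, and noise scale are illustrative assumptions, not values from the paper):

import numpy as np

# Sketch of the process-convolution view of a GP: discretize white noise X
# on a fine grid and convolve it with a Gaussian smoothing kernel h to get
# a smooth process V = h * X.
rng = np.random.default_rng(0)

grid = np.linspace(0.0, 10.0, 1000)            # region S, discretized
dx = grid[1] - grid[0]
X = rng.normal(size=grid.size) / np.sqrt(dx)   # approximate white noise

ell = 0.5                                      # kernel width (assumed value)
def h(s):
    return np.exp(-0.5 * (s / ell) ** 2)       # Gaussian smoothing kernel

# V(s) = integral of h(s - u) X(u) du, approximated by a discrete sum
V = np.array([np.sum(h(s - grid) * X) * dx for s in grid])

# Observations Y add a second white-noise source (measurement uncertainty)
Y = V + rng.normal(scale=0.1, size=V.size)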

Kernels for Vector-Valued Functions: a Review

by Mauricio A. Álvarez, Lorenzo Rosasco, Neil D. Lawrence, 2011
"... Kernel methods are among the most popular techniques in machine learning. From a frequentist/discriminative perspective they play a central role in regularization theory as they provide a natural choice for the hypotheses space and the regularization functional through the notion of reproducing kern ..."
Abstract - Cited by 32 (2 self) - Add to MetaCart
Kernel methods are among the most popular techniques in machine learning. From a frequentist/discriminative perspective they play a central role in regularization theory, as they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a Bayesian/generative perspective they are key in the context of Gaussian processes, where the kernel function is also known as the covariance function. Traditionally, kernel methods have been used in supervised learning problems with scalar outputs, and indeed there has been a considerable amount of work devoted to designing and learning kernels. More recently there has been an increasing interest in methods that deal with multiple outputs, motivated partly by frameworks like multitask learning. In this paper, we review different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and functional methods.

Citation Context

...smoothing kernel. If the base process is a Gaussian process, it turns out that the convolved process is also a Gaussian process. We can therefore exploit convolutions to construct covariance functions [7, 87, 37, 38]. In a similar way to the linear model of coregionalization, we consider Q groups of functions, where a particular group q has elements u_q^i(z), for i = 1, ..., R_q. Each member of the group has th...
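
For the common case of Gaussian smoothing kernels driven by a single shared white-noise process, the implied cross-covariance between two outputs is available in closed form: it is the convolution of the two kernels. A small sketch, where all lengthscales and variances are chosen purely for illustration:

import numpy as np

# Cross-covariance between two outputs that share one white-noise latent
# process, each smoothed by its own Gaussian kernel h_k(r) = v_k exp(-r^2/(2 ell_k^2)):
# cov(f_i(s), f_j(t)) = integral of h_i(s-u) h_j(t-u) du, which is Gaussian again.
def cross_cov(s, t, ell_i, ell_j, v_i=1.0, v_j=1.0):
    a, b = ell_i ** 2, ell_j ** 2
    scale = v_i * v_j * np.sqrt(2.0 * np.pi * a * b / (a + b))
    return scale * np.exp(-0.5 * (s - t) ** 2 / (a + b))

# The resulting joint covariance over two outputs is positive semi-definite
# by construction:
x = np.linspace(0, 5, 50)
K11 = cross_cov(x[:, None], x[None, :], 0.3, 0.3)
K22 = cross_cov(x[:, None], x[None, :], 1.0, 1.0)
K12 = cross_cov(x[:, None], x[None, :], 0.3, 1.0)
K = np.block([[K11, K12], [K12.T, K22]])
assert np.min(np.linalg.eigvalsh(K)) > -1e-8   # PSD up to rounding error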

Latent Force Models

by Mauricio Alvarez, David Luengo, Neil D. Lawrence
"... Purely data driven approaches for machine learning present difficulties when data is scarce relative to the complexity of the model or when the model is forced to extrapolate. On the other hand, purely mechanistic approaches need to identify and specify all the interactions in the problem at hand (w ..."
Abstract - Cited by 31 (6 self) - Add to MetaCart
Purely data driven approaches for machine learning present difficulties when data is scarce relative to the complexity of the model or when the model is forced to extrapolate. On the other hand, purely mechanistic approaches need to identify and specify all the interactions in the problem at hand (which may not be feasible) and still leave the issue of how to parameterize the system. In this paper, we present a hybrid approach using Gaussian processes and differential equations to combine data driven modelling with a physical model of the system. We show how different, physically inspired kernel functions can be developed through sensible, simple, mechanistic assumptions about the underlying system. The versatility of our approach is illustrated with three case studies from computational biology, motion capture and geostatistics.

Citation Context

...observed outputs, which provides a general framework for multi-output GP regression. We are not the first to suggest the use of convolution processes for multi-output regression; they were proposed by (Higdon, 2002) and built on by (Boyle and Frean, 2005). The ideas in these papers have also recently been made more computationally practical through sparse approximations suggested by (Alvarez and Lawrence, 2009...

Sparse convolved Gaussian processes for multi-output regression

by Mauricio Alvarez, Neil D. Lawrence - In Advances in Neural Information Processing Systems 21, 2009
"... We present a sparse approximation approach for dependent output Gaussian processes (GP). Employing a latent function framework, we apply the convolution process formalism to establish dependencies between output variables, where each latent function is represented as a GP. Based on these latent func ..."
Abstract - Cited by 30 (5 self) - Add to MetaCart
We present a sparse approximation approach for dependent output Gaussian processes (GP). Employing a latent function framework, we apply the convolution process formalism to establish dependencies between output variables, where each latent function is represented as a GP. Based on these latent functions, we establish an approximation scheme using a conditional independence assumption between the output processes, leading to an approximation of the full covariance which is determined by the locations at which the latent functions are evaluated. We show results of the proposed methodology for synthetic data and real world applications on pollution prediction and a sensor network.

Citation Context

...outputs. In geostatistics this is known as cokriging. Whilst cross covariances allow us to improve our predictions of one output given the others, because the correlations between outputs are modelled [6, 2, 15, 12], they also come with a computational and storage overhead. The main aim of this paper is to address these overheads in the context of convolution processes [6, 2]. One neat approach to account for non...

Computationally efficient convolved multiple output Gaussian processes

by Mauricio A. Álvarez, Neil D. Lawrence, Edward Rasmussen - Journal of Machine Learning Research
"... Recently there has been an increasing interest in regression methods that deal with multiple outputs. This has been motivated partly by frameworks like multitask learning, multisensor networks or structured output data. From a Gaussian processes perspective, the problem reduces to specifying an appr ..."
Abstract - Cited by 27 (2 self) - Add to MetaCart
Recently there has been an increasing interest in regression methods that deal with multiple outputs. This has been motivated partly by frameworks like multitask learning, multisensor networks or structured output data. From a Gaussian processes perspective, the problem reduces to specifying an appropriate covariance function that, whilst being positive semi-definite, captures the dependencies between all the data points and across all the outputs. One approach to account for non-trivial correlations between outputs employs convolution processes. Under a latent function interpretation of the convolution transform we establish dependencies between output variables. The main drawbacks of this approach are the associated computational and storage demands. In this paper we address these issues. We present different efficient approximations for dependent output Gaussian processes constructed through the convolution formalism. We exploit the conditional independencies present naturally in the model. This leads to a form of the covariance similar in spirit to the so-called PITC and FITC approximations for a single output. We show experimental results with synthetic and real data; in particular, we show results on school exam score prediction, pollution prediction and gene expression data.
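
The conditional-independence approximation described in this abstract has the same algebraic shape as PITC for a single output: a Nyström term through the latent (inducing) values plus a block-diagonal correction per output. A minimal numeric sketch, where the kernel choice, the inducing inputs and the two-output block structure are all assumptions made for illustration:

import numpy as np

# PITC-style approximation: conditioning each output on the latent function
# at M inducing inputs makes outputs conditionally independent, so the full
# covariance is replaced by a Nystrom term plus a block-diagonal correction.
def rbf(A, B, ell=1.0):
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# Two outputs observed at 100 points each (an assumption for the sketch),
# stacked into one vector; z holds the latent process's inducing inputs.
x = np.concatenate([np.linspace(0, 10, 100), np.linspace(0, 10, 100)])
z = np.linspace(0, 10, 15)

Kff = rbf(x, x)
Kfu = rbf(x, z)
Kuu = rbf(z, z) + 1e-8 * np.eye(z.size)

Q = Kfu @ np.linalg.solve(Kuu, Kfu.T)   # Nystrom: "explained by the latents"

# Keep the exact residual covariance only inside each output's block.
K_approx = Q.copy()
for b in [slice(0, 100), slice(100, 200)]:
    K_approx[b, b] += Kff[b, b] - Q[b, b]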

Analyzing nonstationary spatial data using . . .

by Hyoung-moon Kim, Bani K. Mallick, C. C. Holmes
"... ..."
Abstract - Cited by 26 (0 self) - Add to MetaCart
Abstract not found

Particle learning of Gaussian process models for sequential design and optimization. Working Paper

by Robert B. Gramacy, Nicholas G. Polson, 2010
"... We develop a simulation-based method for the online updating of Gaussian process regression and classification models. Our method exploits sequential Monte Carlo to produce a thrifty sequential design algorithm, in terms of computational speed, compared to the established MCMC alternative. The latte ..."
Abstract - Cited by 21 (9 self) - Add to MetaCart
We develop a simulation-based method for the online updating of Gaussian process regression and classification models. Our method exploits sequential Monte Carlo to produce a thrifty sequential design algorithm, in terms of computational speed, compared to the established MCMC alternative. The latter is less ideal for sequential design since it must be restarted and iterated to convergence with the inclusion of each new design point. We illustrate some attractive ensemble aspects of our SMC approach, and how active learning heuristics may be implemented via particles to optimize a noisy function or to explore classification boundaries online.
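
The sequential Monte Carlo idea behind this abstract can be sketched generically: carry a population of particles, each a candidate parameter setting; when a new observation arrives, reweight each particle by its predictive likelihood and resample when the weights degenerate. The sketch below is a generic illustration, not the authors' algorithm; the toy Gaussian likelihood is a stand-in, and a real GP version would score each particle with the GP's one-step-ahead predictive density instead.

import numpy as np

rng = np.random.default_rng(0)
N = 500
particles = rng.uniform(0.1, 2.0, size=N)    # toy parameter: noise variance
weights = np.full(N, 1.0 / N)

def smc_update(particles, weights, y):
    # Toy stand-in likelihood N(y; 0, theta); swap in the GP predictive here.
    lik = np.exp(-0.5 * y ** 2 / particles) / np.sqrt(2 * np.pi * particles)
    weights = weights * lik
    weights /= weights.sum()
    ess = 1.0 / np.sum(weights ** 2)          # effective sample size
    if ess < 0.5 * len(particles):            # resample when degenerate
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

for y in rng.normal(scale=1.0, size=20):      # stream of observations
    particles, weights = smc_update(particles, weights, y)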

Parameter space exploration with Gaussian process trees

by Robert B. Gramacy, Herbert K. H. Lee, William G. Macready - Proceedings of the International Conference on Machine Learning (pp. 353-360). Omnipress & ACM Digital Library, 2004
"... Computer experiments often require dense sweeps over input parameters to obtain a qualitative understanding of their response. ..."
Abstract - Cited by 19 (4 self) - Add to MetaCart
Computer experiments often require dense sweeps over input parameters to obtain a qualitative understanding of their response.

Citation Context

...ther adaptive sample is chosen from a new set of N' LH samples. 5.1. Synthetic Data. 1-d Sinusoidal dataset: Our first example is a simulated dataset on the input space [0, 60]. The true response is (Higdon, 2002): t(x) = [sin(πx/5) + (1/5)cos(4πx/5)] · θ(x - 35.75) (5), where θ is the step function defined by θ(x) = 1 if x > 0 and θ(x) = 0 otherwise. Zero-mean Gaussian noise with sd = 0.1 is added to the ...
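
Reconstructed as above, the test function is straightforward to implement. A short sketch (the bracket placement in t(x) follows my reading of the garbled snippet, and the sampling scheme is illustrative):

import numpy as np

# The 1-d sinusoidal test response from the snippet above. The
# multiplicative placement of the step is an assumption based on the
# garbled layout; the step location 35.75 and sd = 0.1 come from the text.
def step(x):
    return (x > 0).astype(float)                # theta(x): 1 if x > 0, else 0

def t(x):
    return (np.sin(np.pi * x / 5) + 0.2 * np.cos(4 * np.pi * x / 5)) \
        * step(x - 35.75)

rng = np.random.default_rng(0)
x = rng.uniform(0, 60, size=100)                # input space [0, 60]
y = t(x) + rng.normal(scale=0.1, size=x.size)   # added Gaussian noise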

Adaptive design and analysis of supercomputer experiments

by Robert B. Gramacy, Herbert K. H. Lee, 2009
"... ..."
Abstract - Cited by 19 (5 self) - Add to MetaCart
Abstract not found

Dimension reduction and alleviation of confounding for spatial generalized linear mixed models

by John Hughes, Murali Haran - Journal of the Royal Statistical Society: Series B (Statistical Methodology), 2013
"... Abstract. Non-gaussian spatial data are very common in many disciplines. For instance, count data are common in disease mapping, and binary data are common in ecology. When fitting spatial regressions for such data, one needs to account for dependence to ensure reliable inference for the regression ..."
Abstract - Cited by 14 (1 self) - Add to MetaCart
Non-Gaussian spatial data are very common in many disciplines. For instance, count data are common in disease mapping, and binary data are common in ecology. When fitting spatial regressions for such data, one needs to account for dependence to ensure reliable inference for the regression coefficients. The spatial generalized linear mixed model (SGLMM) offers a very popular and flexible approach to modeling such data, but the SGLMM suffers from three major shortcomings: (1) uninterpretability of parameters due to spatial confounding, (2) variance inflation due to spatial confounding, and (3) high-dimensional spatial random effects that make fully Bayesian inference for such models computationally challenging. We propose a new parameterization of the SGLMM that alleviates spatial confounding and speeds computation by greatly reducing the dimension of the spatial random effects. We illustrate the application of our approach to simulated binary, count, and Gaussian spatial datasets, and to a large infant mortality dataset.

Citation Context

...that our model makes feasible the analyses of large areal datasets. Several recent papers have focused on dimension reduction for point-level, i.e., Gaussian process-based, spatial models (cf., e.g., Higdon, 2002; Cressie and Johannesson, 2008; Banerjee, Gelfand, Finley, and Sang, 2008; Furrer, Genton, and Nychka, 2006; Rue and Tjelmeland, 2002). To our knowledge, this paper is the first to propose a principl...
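
The reparameterization this abstract describes can be sketched with a Moran-basis projection: restrict the spatial random effects to the leading eigenvectors of P_perp A P_perp, where P_perp projects orthogonally to the fixed effects X and A is the areal adjacency matrix. The sketch below is a hedged illustration; the dimension q, the simulated adjacency, and all variable names are my assumptions, not the authors' code.

import numpy as np

def moran_basis(X, A, q):
    # Project away the fixed-effect column space (alleviates confounding),
    # then keep the q eigenvectors of the Moran operator with the largest
    # eigenvalues (cuts the random-effect dimension from n to q).
    n = X.shape[0]
    P_perp = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)
    evals, evecs = np.linalg.eigh(P_perp @ A @ P_perp)
    order = np.argsort(evals)[::-1]           # largest eigenvalues first
    return evecs[:, order[:q]]                # n x q basis M

# Usage: spatial effects become M @ delta with delta in R^q instead of R^n.
rng = np.random.default_rng(0)
n, q = 100, 10
X = np.column_stack([np.ones(n), rng.normal(size=n)])
A = (rng.random((n, n)) < 0.05).astype(float) # toy 0/1 adjacency
A = np.triu(A, 1); A = A + A.T                # symmetrize, zero diagonal
M = moran_basis(X, A, q)
spatial_effects = M @ rng.normal(size=q)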
