Results 1–10 of 74
An Explicit Link between Gaussian Fields and . . .
Preprints in Mathematical Sciences, 2010
Abstract

Cited by 21 (7 self)
Continuously indexed Gaussian fields (GFs) are the most important ingredient in spatial statistical modelling and geostatistics. The specification through the covariance function gives an intuitive interpretation of its properties. On the computational side, GFs are hampered by the big-n problem, since the cost of factorising dense matrices is cubic in the dimension. Although computational power today is at an all-time high, this still seems to be a computational bottleneck in applications. Along with GFs, there is the class of Gaussian Markov random fields (GMRFs), which are discretely indexed. The Markov property makes the involved precision matrix sparse, which enables the use of numerical algorithms for sparse matrices that, for fields in R^2, only use the square root of the time required by general algorithms. The specification of a GMRF is through its full conditional distributions, but its marginal properties are not transparent in such a parametrisation. In this paper, we show that using an approximate stochastic weak solution to (linear) stochastic partial differential equations (SPDEs), we can, for some GFs in the Matérn class, provide an explicit link, for any triangulation of R^d, between GFs and GMRFs. The consequence is that we can take the best from the two worlds and do the modelling using GFs but do the computations using GMRFs. Perhaps more importantly, …
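The sparse-precision point in this abstract can be illustrated with a small sketch. This is not the paper's actual SPDE discretisation: the precision matrix below is a generic first-order GMRF on a regular lattice, and the parametrisation is purely illustrative. It shows why a sparse precision matrix makes solves cheap where a dense covariance would be cubic.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def lattice_precision(n, kappa=1.0):
    """Precision matrix of a simple first-order GMRF on an n x n lattice:
    Q = kappa^2 * I + G, where G is the graph Laplacian of the grid.
    (Illustrative parametrisation, not the paper's SPDE discretisation.)"""
    I1 = sp.identity(n, format="csr")
    D = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")
    G = sp.kron(D, I1) + sp.kron(I1, D)   # 2-D grid Laplacian
    return (kappa ** 2) * sp.identity(n * n, format="csr") + G

n = 50
Q = lattice_precision(n)                  # 2500 x 2500, ~5 nonzeros per row
b = np.random.default_rng(0).standard_normal(n * n)
mu = spla.spsolve(Q.tocsc(), b)           # sparse solve exploits the Markov structure
print(mu.shape)                           # (2500,)
```

The same solve with a dense 2500 x 2500 covariance matrix would cost O(n^3); sparse factorisations of 2-D lattice precisions scale roughly with the square root of that, which is the abstract's point.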
Elliptical slice sampling
 JMLR: W&CP
Abstract

Cited by 18 (2 self)
Many probabilistic models introduce strong dependencies between variables using a latent multivariate Gaussian distribution or a Gaussian process. We present a new Markov chain Monte Carlo algorithm for performing inference in models with multivariate Gaussian priors. Its key properties are: 1) it has simple, generic code applicable to many models, 2) it has no free parameters, and 3) it works well for a variety of Gaussian-process-based models. These properties make our method ideal for use while model building, removing the need to spend time deriving and tuning updates for more complex algorithms.
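The algorithm described in this abstract is short enough to sketch in full. The update below follows the published elliptical slice sampling recipe (draw an auxiliary prior sample, pick a slice level, shrink an angle bracket until a point on the ellipse is accepted); the toy likelihood at the end is my own illustration, not from the paper.

```python
import numpy as np

def elliptical_slice(f, chol_sigma, log_lik, rng):
    """One elliptical slice sampling update for f ~ N(0, Sigma),
    where chol_sigma is a (lower) Cholesky factor of Sigma."""
    nu = chol_sigma @ rng.standard_normal(f.shape)      # auxiliary draw from the prior
    log_y = log_lik(f) + np.log(rng.uniform())          # slice level
    theta = rng.uniform(0.0, 2.0 * np.pi)
    theta_min, theta_max = theta - 2.0 * np.pi, theta   # initial bracket
    while True:
        f_new = f * np.cos(theta) + nu * np.sin(theta)  # point on the ellipse
        if log_lik(f_new) > log_y:
            return f_new                                # accepted: no tuning, no rejections
        if theta < 0.0:                                 # shrink the bracket towards 0
            theta_min = theta
        else:
            theta_max = theta
        theta = rng.uniform(theta_min, theta_max)

# toy usage: N(0, I) prior, Gaussian likelihood centred at 1
rng = np.random.default_rng(1)
chol = np.eye(3)
log_lik = lambda f: -0.5 * np.sum((f - 1.0) ** 2)
f = np.zeros(3)
for _ in range(200):
    f = elliptical_slice(f, chol, log_lik, rng)
```

Note the two advertised properties: the code is generic in `log_lik`, and there are no free step-size parameters, since the bracket-shrinking loop always terminates with an accepted point.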
Approximate Bayesian inference in spatial generalized linear mixed models
, 2006
Abstract

Cited by 10 (5 self)
In this paper we propose fast approximate methods for computing posterior marginals in spatial generalized linear mixed models. We consider the common geostatistical special case with a high-dimensional latent spatial variable and observations at only a few known registration sites. Our methods of inference are deterministic, using no random sampling. We present two methods of approximate inference. The first is very fast to compute, and via examples we find that this approximation is 'practically sufficient'. By this expression we mean that the results obtained by this approximate method do not show any bias or dispersion effects that might affect decision making. The other approximation is an improved version of the first one, and via examples we demonstrate that the inferred posterior approximations of this improved version are 'practically exact'. By this expression we mean that one would have to run Markov chain Monte Carlo simulations for longer than is typically done to detect any indications of bias or dispersion error effects in the approximate results. The two methods of approximate inference can help to expand the scope of geostatistical models, for instance in the context of model choice, model assessment, and sampling design. The …
Gaussian process regression with Student-t likelihood
Abstract

Cited by 6 (0 self)
In Gaussian process regression the observation model is commonly assumed to be Gaussian, which is convenient from a computational perspective. However, the drawback is that the predictive accuracy of the model can be significantly compromised if the observations are contaminated by outliers. A robust observation model, such as the Student-t distribution, reduces the influence of outlying observations and improves the predictions. The problem, however, is the analytically intractable inference. In this work, we discuss the properties of a Gaussian process regression model with the Student-t likelihood and utilize the Laplace approximation for approximate inference. We compare our approach to a variational approximation and a Markov chain Monte Carlo scheme, which utilize the commonly used scale mixture representation of the Student-t distribution.
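The robustness claim in this abstract has a simple mechanical explanation that a few lines can demonstrate: the derivative of the Gaussian negative log-likelihood grows linearly in the residual, while the Student-t score is bounded, so one outlier has limited pull on the fit. This is a standard property of the Student-t distribution, not code from the paper.

```python
import numpy as np

def gauss_score(r, sigma=1.0):
    """d/dr of the Gaussian negative log-likelihood: unbounded in |r|."""
    return r / sigma ** 2

def student_t_score(r, nu=4.0, sigma=1.0):
    """d/dr of the Student-t negative log-likelihood,
    -log p(r) = const + (nu+1)/2 * log(1 + r^2/(nu*sigma^2)):
    bounded in |r|, so outliers exert limited influence."""
    return (nu + 1.0) * r / (nu * sigma ** 2 + r ** 2)

r = np.array([0.5, 2.0, 10.0, 100.0])
print(gauss_score(r))       # grows linearly with the residual
print(student_t_score(r))   # peaks, then decays back towards zero
```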
Portfolio Allocation for Bayesian Optimization
Abstract

Cited by 6 (3 self)
Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It uses Bayesian methods to sample the objective efficiently using an acquisition function which incorporates the posterior estimate of the objective. However, there are several different parameterized acquisition functions in the literature, and it is often unclear which one to use. Instead of using a single acquisition function, we adopt a portfolio of acquisition functions governed by an online multi-armed bandit strategy. We propose several portfolio strategies, the best of which we call GP-Hedge, and show that this method outperforms the best individual acquisition function. We also provide a theoretical bound on the algorithm's performance.
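The bandit mechanism can be sketched without a full Gaussian process loop. Below, the Hedge rule selects among three stand-in "acquisition functions"; the arm names, the toy objective, and the use of the true objective as the reward signal are all illustrative simplifications (the paper rewards arms with the GP posterior mean at each arm's nominated point).

```python
import numpy as np

def gp_hedge_select(gains, eta, rng):
    """Hedge rule: pick arm k with probability proportional to exp(eta * gain_k)."""
    p = np.exp(eta * (gains - gains.max()))   # subtract max for numerical stability
    p /= p.sum()
    return rng.choice(len(gains), p=p)

rng = np.random.default_rng(0)
objective = lambda x: -(x - 0.3) ** 2          # toy black-box being maximised
acquisitions = [lambda: rng.uniform(0.0, 1.0), # stand-in: global proposal
                lambda: rng.uniform(0.2, 0.4), # stand-in: exploit near the optimum
                lambda: rng.uniform(0.8, 1.0)] # stand-in: explore the far edge
gains = np.zeros(3)
for t in range(200):
    proposals = [a() for a in acquisitions]    # every arm nominates a point
    k = gp_hedge_select(gains, eta=1.0, rng=rng)
    _ = objective(proposals[k])                # evaluate the chosen point (GP update here)
    # reward every arm at its own nominee (stand-in for the GP posterior mean)
    gains += np.array([objective(x) for x in proposals])
print(np.argmax(gains))                        # the exploitative arm (index 1) dominates
```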
Implementing Approximate Bayesian Inference for Survival Analysis using Integrated Nested Laplace Approximations
, 2010
Abstract

Cited by 5 (3 self)
In this report, we investigate the use of INLA (Martino and Rue, 2008) to solve Bayesian inferential problems in Bayesian survival analysis. In particular, we consider Exponential- and Weibull-distributed lifetimes with and without censoring and frailty, and Cox models with piecewise constant and piecewise linear baseline hazard. We demonstrate that all these models can (in most cases) be expressed as a latent Gaussian model (LGM), so that the integrated nested Laplace approximations proposed by Rue et al. (2009) can be applied. We compare the results obtained with INLA to those obtained from extensive runs of Markov chain Monte Carlo methods. The results obtained are again 'practically exact' and support the general experience of Rue et al. (2009).
Efficient Sampling for Gaussian Process Inference using Control Variables
Abstract

Cited by 5 (1 self)
Sampling functions in Gaussian process (GP) models is challenging because of the highly correlated posterior distribution. We describe an efficient Markov chain Monte Carlo algorithm for sampling from the posterior process of the GP model. This algorithm uses control variables, which are auxiliary function values that provide a low-dimensional representation of the function. At each iteration, the algorithm proposes new values for the control variables and generates the function from the conditional GP prior. The control variable input locations are found by continuously minimizing an objective function. We demonstrate the algorithm on regression and classification problems, and we use it to estimate the parameters of a differential equation model of gene regulation.
A stick-breaking likelihood for categorical data analysis with latent Gaussian models
In AISTATS, 2012
Abstract

Cited by 5 (2 self)
The development of accurate models and efficient algorithms for the analysis of multivariate categorical data is an important and long-standing problem in machine learning and computational statistics. In this paper, we focus on modeling categorical data using Latent Gaussian Models (LGMs). We propose a novel stick-breaking likelihood function for categorical LGMs that exploits accurate linear and quadratic bounds on the logistic log-partition function, leading to an effective variational inference and learning framework. We thoroughly compare our approach to existing algorithms for multinomial logit/probit likelihoods on several problems, including inference in multinomial Gaussian process classification and learning in latent factor models. Our extensive comparisons demonstrate that our stick-breaking model effectively captures correlation in discrete data and is well suited for the analysis of categorical data.
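The stick-breaking construction itself is a generic map from K-1 real-valued latents to K category probabilities, sketched below. The paper's contribution, the linear and quadratic bounds on the logistic log-partition function used for variational inference, is not shown here.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def stick_breaking_probs(eta):
    """Map K-1 latents eta to K probabilities by breaking a unit 'stick':
    p_k = sigmoid(eta_k) * prod_{j<k} (1 - sigmoid(eta_j)),
    with the final category taking whatever stick remains."""
    v = sigmoid(np.asarray(eta, dtype=float))
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v)])  # stick left before each break
    return np.append(v, 1.0) * remaining

p = stick_breaking_probs([0.0, 0.0, 0.0])   # four categories, all breaks at 1/2
print(p)                                     # [0.5  0.25  0.125  0.125]
```

Because each break is an independent logistic transform of one latent, each factor can be bounded separately, which is what makes the variational treatment in the paper tractable.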