Results 1–10 of 57
Implementing approximate Bayesian inference for latent Gaussian models using integrated nested Laplace approximations: A manual for the inla program
, 2008
Abstract
Cited by 79 (16 self)
Structured additive regression models are perhaps the most commonly used class of models in statistical applications. This class includes, among others, (generalised) linear models, (generalised) additive models, smoothing spline models, state-space models, semiparametric regression, spatial and spatio-temporal models, log-Gaussian Cox processes, and geostatistical and geoadditive models. In this paper we consider approximate Bayesian inference in a popular subset of structured additive regression models, latent Gaussian models, where the latent field is Gaussian, controlled by a few hyperparameters, and the response variables are non-Gaussian. The posterior marginals are not available in closed form owing to the non-Gaussian response variables. For such models, Markov chain Monte Carlo methods can be implemented, but they are not without problems, in terms of both convergence and computational time. In some practical applications, the extent of these problems is such that Markov chain Monte Carlo is simply not an appropriate tool for routine analysis. We show that, by using an integrated nested Laplace approximation and its simplified version, we can directly compute very accurate approximations to the posterior marginals. The main benefit of these approximations
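The core computation behind these approximations can be illustrated with a toy example. The sketch below is not from the paper: it applies a Laplace approximation to the posterior of a single latent Gaussian variable with a Poisson observation (a hypothetical one-dimensional model) and checks the resulting Gaussian against numerical integration.

```python
import numpy as np
from scipy.integrate import quad

# Toy latent Gaussian model (hypothetical): x ~ N(0, 1), y | x ~ Poisson(exp(x)).
y = 4.0

def log_post(x):
    """Unnormalised log posterior: y*x - exp(x) - x^2/2."""
    return y * x - np.exp(x) - 0.5 * x**2

# Newton iterations for the posterior mode
x = 0.0
for _ in range(50):
    grad = y - np.exp(x) - x          # d/dx log posterior
    hess = -np.exp(x) - 1.0           # d^2/dx^2 log posterior
    x -= grad / hess
mode = x
sd = (np.exp(mode) + 1.0) ** -0.5     # Laplace: N(mode, -1/hess)

# Exact posterior mean by quadrature, for comparison with the Laplace mean
Z = quad(lambda t: np.exp(log_post(t)), -10, 10)[0]
exact_mean = quad(lambda t: t * np.exp(log_post(t)) / Z, -10, 10)[0]
```

In INLA this idea is nested: Laplace approximations are applied to the latent field conditional on the hyperparameters, and the results are numerically integrated over the hyperparameter space.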
Spatial modelling using a new class of nonstationary covariance functions
 Environmetrics
, 2006
Abstract
Cited by 27 (0 self)
We introduce a new class of nonstationary covariance functions for spatial modelling. Nonstationary covariance functions allow the model to adapt to spatial surfaces whose variability changes with location. The class includes a nonstationary version of the Matérn stationary covariance, in which the differentiability of the spatial surface is controlled by a parameter, freeing one from fixing the differentiability in advance. The class allows one to knit together local covariance parameters into a valid global nonstationary covariance, regardless of how the local covariance structure is estimated. We employ this new nonstationary covariance in a fully Bayesian model in which the unknown spatial process has a Gaussian process (GP) distribution with a nonstationary covariance function from the class. We model the nonstationary structure in a computationally efficient way that creates nearly stationary local behavior and for which stationarity is a special case. We also suggest non-Bayesian approaches to nonstationary kriging. To assess the method, we compare the Bayesian nonstationary GP model with a Bayesian stationary GP model, various standard spatial smoothing approaches, and nonstationary models that can adapt to function heterogeneity. In simulations, the nonstationary GP model adapts to function heterogeneity, unlike the stationary models, and also outperforms the other nonstationary models. On a real dataset, GP models outperform the competitors, but while the nonstationary GP gives qualitatively more sensible results, it fails to outperform the stationary GP on held-out data, illustrating the difficulty in fitting complex spatial functions with relatively few observations. The nonstationary covariance model could also be used for non-Gaussian data and embedded in additive models, as well as in more complicated hierarchical spatial or spatio-temporal models. More complicated models may require simpler parameterizations for computational efficiency.
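The "knitting together" of local covariance parameters can be sketched in one dimension using the well-known Paciorek–Schervish construction with a Matérn (ν = 3/2) correlation. The length-scale function below is a hypothetical choice for illustration, not one taken from the paper:

```python
import numpy as np

def matern32(d):
    """Stationary Matern correlation with smoothness nu = 3/2."""
    return (1.0 + np.sqrt(3.0) * d) * np.exp(-np.sqrt(3.0) * d)

def nonstat_kernel(x, ell):
    """Paciorek-Schervish style nonstationary covariance in 1-D:
    local length scales ell(x) are combined into a valid global kernel."""
    li2 = ell[:, None] ** 2
    lj2 = ell[None, :] ** 2
    avg = 0.5 * (li2 + lj2)
    pref = (li2 * lj2) ** 0.25 / np.sqrt(avg)          # normalising prefactor
    q = np.abs(x[:, None] - x[None, :]) / np.sqrt(avg)  # scaled distance
    return pref * matern32(q)

x = np.linspace(0.0, 1.0, 40)
ell = 0.05 + 0.3 * x          # hypothetical smoothly varying length scale
K = nonstat_kernel(x, ell)
eigs = np.linalg.eigvalsh(K)  # the construction guarantees K is PSD
```

The key property, as the abstract notes, is that any estimate of the local length-scale field yields a valid (positive semi-definite) global covariance.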
Lang, S.: Generalized structured additive regression based on Bayesian P-splines
 Computational Statistics & Data Analysis
Abstract
Cited by 27 (7 self)
Generalized additive models (GAMs) for modelling nonlinear effects of continuous covariates are now well-established tools for the applied statistician. In this paper we develop Bayesian GAMs and extensions to generalized structured additive regression based on one- or two-dimensional P-splines as the main building block. The approach extends previous work by Lang and Brezger (2003) for Gaussian responses. Inference relies on Markov chain Monte Carlo (MCMC) simulation techniques, and is based either on iteratively weighted least squares (IWLS) proposals or on latent utility representations of (multi-)categorical regression models. Our approach covers the most common univariate response distributions, e.g. the binomial, Poisson or gamma distribution, as well as multi-categorical responses. As we demonstrate through two applications, on the forest health status of trees and a space-time analysis of health insurance data, the approach allows realistic modelling of complex problems. We consider the enormous flexibility and extensibility of our approach a main advantage of Bayesian inference based on MCMC techniques over more traditional approaches. Software for the methodology presented in the paper is provided within the public domain package BayesX. Key words: geoadditive models, IWLS proposals, multi-categorical response, structured additive
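The P-spline building block can be sketched in the Gaussian special case, where the posterior mode under a second-order random-walk prior on the coefficients coincides with a difference-penalized least-squares fit. This is a simplified, non-MCMC sketch; the basis size and smoothing parameter are illustrative:

```python
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)

# Cubic B-spline basis on equidistant knots (the P-spline building block)
k = 3
knots = np.concatenate([np.repeat(0.0, k), np.linspace(0, 1, 20), np.repeat(1.0, k)])
n_basis = len(knots) - k - 1
B = BSpline(knots, np.eye(n_basis), k)(x)   # design matrix, shape (200, n_basis)

# Second-order difference penalty = Gaussian RW2 prior on the coefficients
D = np.diff(np.eye(n_basis), n=2, axis=0)
lam = 1.0                                   # illustrative smoothing parameter
beta = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
fit = B @ beta
```

With a known noise variance σ² and RW2 prior variance τ², λ corresponds to the ratio σ²/τ²; the full Bayesian treatment in the paper samples the coefficients and variances by MCMC instead.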
Penalized structured additive regression for space-time data: a Bayesian perspective
 STATISTICA SINICA
, 2004
Abstract
Cited by 16 (11 self)
We propose extensions of penalized spline generalized additive models for analyzing space-time regression data and study them from a Bayesian perspective. Nonlinear effects of continuous covariates and time trends are modelled through Bayesian versions of penalized splines, while correlated spatial effects follow a Markov random field prior. This allows us to treat all functions and effects within a unified general framework by assigning appropriate priors with different forms and degrees of smoothness. Inference can be performed either with full Bayes (FB) or empirical Bayes (EB) posterior analysis. FB inference using MCMC techniques is a slight extension of previous work. For EB inference, a computationally efficient solution is developed on the basis of a generalized linear mixed model representation. The second approach can be viewed as posterior mode estimation and is closely related to penalized likelihood estimation in a frequentist setting. Variance components, corresponding to inverse smoothing parameters, are then estimated by marginal likelihood. We carefully compare both inferential procedures in simulation studies and illustrate them through data applications. The methodology is available in the open-domain statistical package BayesX and as an S-Plus/R function.
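The empirical Bayes step, estimating a variance component by marginal likelihood in a mixed-model representation, can be sketched as follows. Everything here is an assumption of the sketch rather than a detail from the paper: a proper Gaussian prior on the coefficients (the real representation uses a partially improper penalty prior), a Gaussian-bump design matrix, and a known noise variance.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 150
x = np.linspace(0.0, 1.0, n)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)

# Simplified mixed-model representation: y = Z b + e, b ~ N(0, tau2 I),
# e ~ N(0, sig2 I), with a hypothetical Gaussian-bump design Z.
centers = np.linspace(0.0, 1.0, 15)
Z = np.exp(-0.5 * ((x[:, None] - centers[None, :]) / 0.08) ** 2)
sig2 = 0.09   # noise variance assumed known for this sketch

def log_marglik(tau2):
    """Log marginal likelihood of y (up to a constant) after integrating out b."""
    V = sig2 * np.eye(n) + tau2 * Z @ Z.T
    _, logdet = np.linalg.slogdet(V)
    return -0.5 * (logdet + y @ np.linalg.solve(V, y))

grid = np.logspace(-4, 2, 60)
tau2_hat = grid[np.argmax([log_marglik(t) for t in grid])]

# Posterior mode of b = penalized fit with smoothing parameter sig2 / tau2_hat
lam = sig2 / tau2_hat
b = np.linalg.solve(Z.T @ Z + lam * np.eye(len(centers)), Z.T @ y)
fit = Z @ b
```

The grid search stands in for the Newton-type updates a real EB implementation would use; the point is that the inverse smoothing parameter is a variance component estimated from the marginal likelihood.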
Function Estimation With Locally Adaptive Dynamic Models
 Computational Statistics
, 1998
Abstract
Cited by 11 (8 self)
In this paper, we present a Bayesian nonparametric approach, which is more closely related to spline fitting with locally adaptive penalties. Abramovich and Steinberg (1996) generalize the common penalized least squares criterion for smoothing splines with a global smoothing parameter by introducing a variable smoothing parameter into the roughness penalty. For estimation, they propose a two-step procedure: first, a smoothing spline is fitted with a constant smoothing parameter chosen by generalized cross-validation; then an estimate for the variable smoothing parameter is constructed, based on the derivatives of this pilot estimate, and is plugged into their locally adaptive penalty to fit the smoothing spline in a second step. Ruppert and Carroll (2000) propose P-splines based on a truncated power series basis and difference penalties on the regression coefficients with locally adaptive smoothing parameters. The latter are obtained by linear interpolation from a smaller number of smoothing parameters, defined for a subset of knots and estimated by generalized cross-validation.
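The Ruppert and Carroll scheme just described, local smoothing parameters defined at a subset of knots and linearly interpolated to all knots, can be sketched with a degree-one truncated power basis. The local lambda values below are illustrative; in the actual method they would be chosen by generalized cross-validation:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
x = np.linspace(0.0, 1.0, n)
# Spatially heterogeneous truth: flat on the left, wiggly on the right
truth = np.where(x < 0.5, 0.0, np.sin(8 * np.pi * (x - 0.5)))
y = truth + rng.normal(0, 0.2, n)

# Truncated-line basis: intercept, slope, and (x - k)_+ terms
knots = np.linspace(0.0, 1.0, 30, endpoint=False)[1:]
X = np.column_stack([np.ones(n), x] + [np.maximum(x - k, 0.0) for k in knots])

# Local smoothing parameters at a subset of knots, interpolated to all knots
sub_knots = np.array([0.1, 0.5, 0.9])
sub_lam = np.array([500.0, 5.0, 0.1])   # heavy smoothing left, light right
lam = np.interp(knots, sub_knots, sub_lam)

# Penalize only the truncated-line coefficients, each with its own lambda
P = np.diag(np.concatenate([[0.0, 0.0], lam]))
beta = np.linalg.solve(X.T @ X + P, X.T @ y)
fit = X @ beta
```

The result is a fit that stays smooth over the flat region while remaining flexible over the wiggly one, which a single global smoothing parameter cannot do.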
A Bayesian semiparametric latent variable model for binary, ordinal and continuous responses. Dissertation
, 2005
Abstract
Cited by 8 (2 self)
In this article we introduce a latent variable model (LVM) for mixed ordinal and continuous responses, where covariate effects on the continuous latent variables are modelled through a flexible semiparametric predictor. We extend existing LVMs with simple linear covariate effects by including nonparametric components for nonlinear effects of continuous covariates and interactions with other covariates, as well as spatial effects. Full Bayesian modelling is based on penalized spline and Markov random field priors and is performed by computationally efficient Markov chain Monte Carlo (MCMC) methods. We apply our approach to a large German social science survey which motivated our methodological development.
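The latent-variable MCMC machinery underlying such models can be illustrated in its simplest case: binary probit regression with the classic Albert–Chib Gibbs sampler, alternating between latent utilities and coefficients. This is a minimal sketch without the semiparametric and spatial components, with a hypothetical simulated dataset:

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(3)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([-0.3, 1.2])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)  # probit data

# Albert-Chib Gibbs sampler with a vague prior beta ~ N(0, 100 I)
V = np.linalg.inv(X.T @ X + 0.01 * np.eye(2))   # posterior covariance given z
beta = np.zeros(2)
draws = []
for it in range(500):
    mu = X @ beta
    # z_i | y_i, beta ~ N(mu_i, 1) truncated to (0, inf) if y_i = 1, else (-inf, 0)
    lo = np.where(y == 1, -mu, -np.inf)
    hi = np.where(y == 1, np.inf, -mu)
    z = mu + truncnorm.rvs(lo, hi, random_state=rng)
    beta = rng.multivariate_normal(V @ (X.T @ z), V)
    if it >= 100:                                # discard burn-in
        draws.append(beta)
post_mean = np.mean(draws, axis=0)
```

Conditional on the latent utilities the model is an ordinary Gaussian regression, which is exactly what makes latent-utility representations attractive for ordinal and mixed responses.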
Some asymptotic results on generalized penalized spline smoothing
 B
Abstract
Cited by 8 (3 self)
The paper discusses asymptotic properties of penalized spline smoothing when the spline basis increases with the sample size. The proof is provided in a generalized smoothing model allowing for non-normal responses. The results are extended in two ways. First, assuming the spline coefficients to be a priori normally distributed links the smoothing framework to generalized linear mixed models (GLMMs). We consider the asymptotic rates for which the Laplace approximation is justified and the resulting fits in the mixed model correspond to penalized spline estimates. Secondly, we adopt a fully Bayesian viewpoint by imposing a prior distribution on all parameters and coefficients. We argue that, at the postulated rates at which the spline basis dimension increases with the sample size, the posterior distribution of the spline coefficients is approximately normal. The validity of this result is investigated in finite samples by comparing Markov chain Monte Carlo (MCMC) results with their asymptotic approximation in a simulation study.
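The idea of checking approximate posterior normality in finite samples can be illustrated in a conjugate toy model (my illustration, not the paper's simulation design): an exact Beta posterior compared against its Laplace (Gaussian) approximation at the posterior mode.

```python
import numpy as np
from scipy.stats import beta as beta_dist, norm

# Toy check of "the posterior is approximately normal":
# y ~ Binomial(n, p) with a flat prior gives p | y ~ Beta(s+1, n-s+1),
# which we compare with its Laplace approximation at the posterior mode.
n, s = 400, 130
post = beta_dist(s + 1, n - s + 1)

p_mode = s / n                                        # posterior mode
curv = s / p_mode**2 + (n - s) / (1 - p_mode) ** 2    # -d2/dp2 log posterior
approx = norm(p_mode, curv ** -0.5)

# Total-variation-style discrepancy on a grid (shrinks as n grows)
grid = np.linspace(1e-4, 1 - 1e-4, 2000)
step = grid[1] - grid[0]
tv = 0.5 * np.sum(np.abs(post.pdf(grid) - approx.pdf(grid))) * step
```

Repeating this for growing n shows the discrepancy vanishing, the conjugate analogue of the asymptotic normality the paper establishes for spline coefficients.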
Approximate Bayesian Inference for Survival Models
, 2010
Abstract
Cited by 5 (2 self)
Bayesian analysis of time-to-event data, usually called survival analysis, has received increasing attention in recent years. In Cox-type models it allows the use of information from the full likelihood instead of a partial likelihood, so that the baseline hazard function and the model parameters can be estimated jointly. In general, Bayesian methods permit full and exact posterior inference for any parameter or predictive quantity of interest. On the other hand, Bayesian inference often relies on Markov chain Monte Carlo (MCMC) techniques which, from the user's point of view, may appear slow at delivering answers. In this paper, we show how a new inferential tool named integrated nested Laplace approximations (INLA) can be adapted and applied to many survival models, making Bayesian analysis both fast and accurate without having to rely on MCMC-based inference.
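For intuition about how the full likelihood uses both events and censored follow-up time, consider the simplest fully Bayesian survival model: a constant hazard with a conjugate Gamma prior. This is a drastic simplification of the Cox-type models the paper handles with INLA, and all numbers below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(4)

# Exponential lifetimes with constant hazard lam, right-censored at c,
# Gamma(a0, b0) prior on the hazard (shape/rate parameterization).
lam_true, c = 0.5, 3.0
t = rng.exponential(1.0 / lam_true, size=200)
event = t < c                  # True if the event was observed
t_obs = np.minimum(t, c)       # observed follow-up time

# Full likelihood: prod lam^{d_i} exp(-lam * t_i)  ->  conjugate Gamma posterior
a0, b0 = 1.0, 1.0
a_post = a0 + event.sum()      # prior shape + number of events
b_post = b0 + t_obs.sum()      # prior rate  + total time at risk
post_mean = a_post / b_post
post_sd = np.sqrt(a_post) / b_post
```

Censored subjects contribute only exposure time, not events, yet still inform the posterior; in the nonconjugate models of the paper this same full-likelihood information is handled by Laplace approximations instead of closed forms.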
Simultaneous probability statements for Bayesian P-splines. Revised for Statistical Modeling
, 2006
Abstract
Cited by 4 (1 self)
P-splines are a popular approach for fitting nonlinear effects of continuous covariates in semiparametric regression models. Recently, a Bayesian version of P-splines has been developed on the basis of Markov chain Monte Carlo simulation techniques for inference. In this work we adopt and generalize the concept of Bayesian contour probabilities to Bayesian P-splines within a generalized additive models framework. More specifically, we aim at computing the maximum credible level (sometimes called the Bayesian p-value) for which a particular parameter vector of interest lies within the corresponding highest posterior density (HPD) region. We are particularly interested in parameter vectors that correspond to a constant, linear or, more generally, polynomial fit. As an alternative to HPD regions, simultaneous credible intervals can be used to define pseudo contour probabilities. Efficient algorithms for computing contour and pseudo contour probabilities are developed. The performance of the approach is assessed through simulation studies and applications to data on the Munich rental guide and on undernutrition in Zambia and Tanzania.
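The contour probability can be estimated from posterior samples by density ordering: the largest HPD credible level still containing a reference point equals the fraction of draws with higher posterior density than that point. A sketch for a one-dimensional Gaussian posterior, where the answer is known in closed form (in practice the posterior density would itself have to be estimated, e.g. by a kernel density estimate):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# Hypothetical posterior N(1, 0.5^2); samples stand in for MCMC output
mu, sd = 1.0, 0.5
samples = rng.normal(mu, sd, size=100_000)

def contour_prob(samples, theta0, dens):
    """Fraction of posterior draws with higher density than theta0:
    the maximum credible level for which theta0 lies in the HPD region."""
    return np.mean(dens(samples) > dens(theta0))

dens = norm(mu, sd).pdf
theta0 = 0.0                  # reference point, e.g. 'the effect is zero'
cp = contour_prob(samples, theta0, dens)

# For a Gaussian posterior this equals P(|Z| < |theta0 - mu| / sd)
exact = 1.0 - 2.0 * norm.cdf(-abs(theta0 - mu) / sd)
```

A small contour probability means theta0 sits deep inside the HPD region (the simpler fit is plausible); a value near 1 means even very wide HPD regions exclude it.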