Results 1–10 of 52
Spatial modelling using a new class of nonstationary covariance functions
Environmetrics, 2006
Abstract

Cited by 60 (0 self)
We introduce a new class of nonstationary covariance functions for spatial modelling. Nonstationary covariance functions allow the model to adapt to spatial surfaces whose variability changes with location. The class includes a nonstationary version of the Matérn stationary covariance, in which the differentiability of the spatial surface is controlled by a parameter, freeing one from fixing the differentiability in advance. The class allows one to knit together local covariance parameters into a valid global nonstationary covariance, regardless of how the local covariance structure is estimated. We employ this new nonstationary covariance in a fully Bayesian model in which the unknown spatial process has a Gaussian process (GP) distribution with a nonstationary covariance function from the class. We model the nonstationary structure in a computationally efficient way that creates nearly stationary local behavior and for which stationarity is a special case. We also suggest non-Bayesian approaches to nonstationary kriging. To assess the method, we compare the Bayesian nonstationary GP model with a Bayesian stationary GP model, various standard spatial smoothing approaches, and nonstationary models that can adapt to function heterogeneity. In simulations, the nonstationary GP model adapts to function heterogeneity, unlike the stationary models, and also outperforms the other nonstationary models. On a real dataset, GP models outperform the competitors, but while the nonstationary GP gives qualitatively more sensible results, it fails to outperform the stationary GP on held-out data, illustrating the difficulty in fitting complex spatial functions with relatively few observations. The nonstationary covariance model could also be used for non-Gaussian data and embedded in additive models as well as in more complicated, hierarchical spatial or spatiotemporal models. More complicated models may require simpler parameterizations for computational efficiency.
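The "knitting together local covariance parameters" idea can be illustrated concretely. Below is a minimal 1-D sketch of a nonstationary squared-exponential covariance of the Paciorek–Schervish kind described in this abstract; the length-scale function `ell` and all numeric values are illustrative assumptions, not the paper's parameterization:

```python
import numpy as np

def nonstationary_se(x1, x2, ell, sigma2=1.0):
    """Nonstationary squared-exponential covariance in 1-D.

    ell: callable giving a positive local length-scale ell(x); a constant
    ell recovers a stationary RBF kernel as a special case.
    """
    l1 = ell(x1)[:, None] ** 2            # local variances ell(x)^2
    l2 = ell(x2)[None, :] ** 2
    avg = 0.5 * (l1 + l2)                 # averaged local kernels
    pref = (l1 * l2) ** 0.25 / np.sqrt(avg)
    q = (x1[:, None] - x2[None, :]) ** 2 / avg   # nonstationary quadratic form
    return sigma2 * pref * np.exp(-q)

x = np.linspace(0.0, 1.0, 50)
ell = lambda t: 0.1 + 0.3 * t             # length-scale grows with location
K = nonstationary_se(x, x, ell)
# A valid covariance: symmetric and positive semi-definite.
print(np.allclose(K, K.T), np.linalg.eigvalsh(K).min() > -1e-8)
```

With a constant length-scale the prefactor collapses to one and a stationary RBF kernel is recovered, mirroring the abstract's claim that stationarity is a special case.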
Dependent Gaussian processes
In Advances in Neural Information Processing Systems 17, 2005
Abstract

Cited by 44 (0 self)
Gaussian processes are usually parameterised in terms of their covariance functions. However, this makes it difficult to deal with multiple outputs, because ensuring that the covariance matrix is positive definite is problematic. An alternative formulation is to treat Gaussian processes as white noise sources convolved with smoothing kernels, and to parameterise the kernel instead. Using this, we extend Gaussian processes to handle multiple, coupled outputs.
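When the smoothing kernels are Gaussian, convolving a single shared white-noise source yields a closed-form cross-covariance, which makes the multiple-output construction easy to sketch. The kernel widths, amplitudes, and grid below are illustrative assumptions:

```python
import numpy as np

def conv_cov(x1, x2, s1, s2, v1=1.0, v2=1.0):
    """Cross-covariance of two GPs obtained by convolving ONE white-noise
    source with Gaussian smoothing kernels of widths s1 and s2."""
    amp = v1 * v2 * np.sqrt(2 * np.pi) * s1 * s2 / np.sqrt(s1**2 + s2**2)
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return amp * np.exp(-d2 / (2 * (s1**2 + s2**2)))

x = np.linspace(-2.0, 2.0, 30)
s_a, s_b = 0.3, 0.8                       # assumed smoothing-kernel widths
K = np.block([[conv_cov(x, x, s_a, s_a), conv_cov(x, x, s_a, s_b)],
              [conv_cov(x, x, s_b, s_a), conv_cov(x, x, s_b, s_b)]])
# The stacked 2-output covariance is positive semi-definite by construction.
print(K.shape, np.linalg.eigvalsh(K).min() > -1e-8)
```

Because both outputs are convolutions of the same noise source, validity of the joint covariance comes for free, sidestepping the difficulty of hand-designing a positive-definite cross-covariance.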
Posterior Consistency in Nonparametric Regression Problems under Gaussian Process Priors
2004
Abstract

Cited by 35 (2 self)
Posterior consistency can be thought of as a theoretical justification of the Bayesian method. One of the most popular approaches to nonparametric Bayesian regression is to put a nonparametric prior distribution on the unknown regression function using Gaussian processes. In this paper, we study posterior consistency in nonparametric regression problems using Gaussian process priors. We use an extension of the theorem of Schwartz (1965) for non-identically distributed observations, verifying its conditions when using Gaussian process priors for the regression function with normal or double exponential (Laplace) error distributions. We define a metric topology on the space of regression functions and then establish almost sure consistency of the posterior distribution. Our metric topology is weaker than the popular L1 topology. With additional assumptions, we prove almost sure consistency when the regression functions have L1 topologies. When the covariate (predictor) is assumed to be a random variable, we prove almost sure consistency for the joint density function of the response and predictor using the Hellinger metric.
Kernels for Vector-Valued Functions: a Review
2011
Abstract

Cited by 27 (2 self)
Kernel methods are among the most popular techniques in machine learning. From a frequentist/discriminative perspective they play a central role in regularization theory, as they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a Bayesian/generative perspective they are key in the context of Gaussian processes, where the kernel function is also known as the covariance function. Traditionally, kernel methods have been used in supervised learning problems with scalar outputs, and indeed there has been a considerable amount of work devoted to designing and learning kernels. More recently there has been an increasing interest in methods that deal with multiple outputs, motivated partly by frameworks like multi-task learning. In this paper, we review different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and functional methods.
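One of the simplest multi-output constructions covered by this line of work is the intrinsic coregionalization model, where the multi-output kernel factorizes as a coregionalization matrix B times a scalar input kernel, K((x,i),(x',j)) = B[i,j]·k(x,x'). A hedged NumPy sketch (the RBF input kernel, B, and all sizes are illustrative choices, not the review's notation):

```python
import numpy as np

rng = np.random.default_rng(0)

def icm_kernel(x1, x2, B, lengthscale=0.5):
    """Intrinsic coregionalization model: separable multi-output kernel
    K((x,i),(x',j)) = B[i,j] * k(x,x'), here with an RBF input kernel."""
    k = np.exp(-((x1[:, None] - x2[None, :]) ** 2) / (2 * lengthscale**2))
    return np.kron(B, k)                  # outputs-major block layout

# A valid coregionalization matrix: B = W W^T + diag(kappa) is PSD.
W = rng.normal(size=(3, 2))
B = W @ W.T + np.diag([0.1, 0.1, 0.1])
x = np.linspace(0.0, 1.0, 20)
K = icm_kernel(x, x, B)                   # 60 x 60 joint covariance
print(K.shape, np.linalg.eigvalsh(K).min() > -1e-8)
```

Since the Kronecker product of two positive semi-definite matrices is positive semi-definite, validity of the joint kernel follows directly from validity of B and k.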
Counting People with Low-Level Features and Bayesian Regression
Abstract

Cited by 19 (2 self)
An approach is proposed for estimating the size of inhomogeneous crowds, composed of pedestrians travelling in different directions, without explicit object segmentation or tracking. Instead, the crowd is segmented into components of homogeneous motion using the mixture of dynamic textures motion model. A set of holistic low-level features is extracted from each segmented region, and a function that maps features into estimates of the number of people per segment is learned with Bayesian regression. Two Bayesian regression models are examined. The first is a combination of Gaussian process regression (GPR) with a compound kernel, which accounts for both the global and local trends of the count mapping but is limited by real-valued outputs that do not match the discrete counts. We address this limitation with a second model, based on a Bayesian treatment of Poisson regression that introduces a prior distribution on the linear weights of the model. Since exact inference is analytically intractable, a closed-form approximation is derived that is computationally efficient and kernelizable, enabling the representation of nonlinear functions. An approximate marginal likelihood is also derived for kernel hyperparameter learning. The two regression-based crowd counting methods are evaluated on a large pedestrian dataset containing very distinct camera views, pedestrian traffic, and outliers such as bikes or skateboarders. Experimental results show that regression-based counts are accurate regardless of the crowd size, outperforming the count estimates produced by state-of-the-art pedestrian detectors. Results on two hours of video demonstrate the efficiency and robustness of regression-based crowd size estimation over long periods of time. Index Terms—surveillance, crowd analysis, Bayesian regression, Gaussian processes, Poisson regression
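The first of the two models, GPR with a compound kernel, can be sketched in a few lines: an RBF term tracks local deviations while a linear term captures the global feature-to-count trend. The single synthetic "segment feature", the kernel weights, and the noise level below are invented for illustration and do not reproduce the paper's feature set or hyperparameters:

```python
import numpy as np

def compound_kernel(X1, X2, ls=2.0, a=1.0, b=0.5):
    """RBF-plus-linear compound kernel: the linear term models the global
    feature-to-count trend, the RBF term local deviations from it."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return a * np.exp(-d2 / (2 * ls**2)) + b * (X1 @ X2.T)

rng = np.random.default_rng(2)
feat = rng.uniform(0, 10, size=(80, 1))             # synthetic segment feature
counts = 2.0 * feat[:, 0] + rng.normal(0, 1.0, 80)  # roughly linear count trend
K = compound_kernel(feat, feat) + 0.5 * np.eye(80)  # add observation noise
kq = compound_kernel(np.array([[5.0]]), feat)
pred = float(kq @ np.linalg.solve(K, counts))       # GPR posterior mean
print(pred)                                         # near the true trend 2 * 5
```

As the abstract notes, the real-valued GPR output does not match discrete counts, which is what motivates the paper's Bayesian Poisson regression alternative.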
Gaussian process modeling of large scale terrain
In the proceedings of the International Conference on Robotics and Automation (ICRA), 2009
Abstract

Cited by 18 (7 self)
Building a model of large-scale terrain that can adequately handle uncertainty and incompleteness in a statistically sound way is a challenging problem. This work proposes the use of Gaussian processes as models of large-scale terrain. The proposed model naturally provides a multi-resolution representation of space, incorporates and handles uncertainties aptly, and copes with incompleteness of sensory information. Gaussian process regression techniques are applied to estimate and interpolate elevation information across the field (to fill gaps in occluded areas). The estimates obtained are the best linear unbiased estimates for the data under consideration. A single nonstationary (neural network) Gaussian process is shown to be powerful enough to model large and complex terrain, effectively handling issues relating to discontinuous data. A local approximation method based on a "moving window" methodology and implemented using KD-Trees is also proposed. This enables the approach to handle extremely large datasets, thereby completely addressing its scalability issues. Experiments are performed on large-scale datasets taken from real mining applications. These datasets include sparse mine planning data, which is representative of a GPS-based survey, as well as dense laser scanner data taken at different mine sites. Further, extensive statistical performance evaluation and benchmarking of the technique has been performed through cross-validation experiments. These show that for dense and/or flat data, the proposed approach performs very competitively with grid-based approaches using standard interpolation techniques and with triangulated irregular networks using triangle-based interpolation; for sparse and/or complex data, however, it significantly outperforms them.
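The "moving window" idea can be sketched directly: for each query point, fit an exact GP only on its nearest training points. A brute-force neighbour search below stands in for the paper's KD-tree lookup, and the kernel, window size, and synthetic terrain are illustrative assumptions:

```python
import numpy as np

def local_gp_predict(X, y, Xq, k_nn=15, ls=1.0, noise=1e-2):
    """Moving-window GP regression: each query is answered by an exact GP
    fit only on its k nearest training points, keeping every solve small."""
    rbf = lambda A, B: np.exp(-((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
                              / (2 * ls**2))
    preds = np.empty(len(Xq))
    for i, xq in enumerate(Xq):
        # Brute-force k-nearest lookup (a KD-tree would do this in O(log n)).
        idx = np.argpartition(((X - xq) ** 2).sum(1), k_nn)[:k_nn]
        Xl, yl = X[idx], y[idx]
        K = rbf(Xl, Xl) + noise * np.eye(k_nn)
        preds[i] = rbf(xq[None, :], Xl)[0] @ np.linalg.solve(K, yl)
    return preds

rng = np.random.default_rng(1)
X = rng.uniform(0, 5, size=(400, 2))              # sampled "terrain" locations
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=400)  # noisy elevations
Xq = np.array([[2.5, 2.5], [1.0, 4.0]])
print(local_gp_predict(X, y, Xq))                 # interpolated elevations
```

Each prediction costs O(k³) for a window of k points instead of O(n³) for the full dataset, which is what makes the windowed approach viable on very large scans.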
Adaptive nonstationary kernel regression for terrain modelling
In Proc. of the Robotics: Science and Systems Conference (RSS), 2007
Abstract

Cited by 17 (1 self)
Three-dimensional digital terrain models are of fundamental importance in many areas, such as the geosciences and outdoor robotics. Accurate modeling requires the ability to deal with varying data density and to balance smoothing against the preservation of discontinuities. The latter is particularly important for robotics applications, as discontinuities that arise, for example, at steps, stairs, or building walls are important features for path planning or terrain segmentation tasks. In this paper, we present an extension of the well-established Gaussian process regression approach that utilizes nonstationary covariance functions to locally adapt to the structure of the terrain data. In this way, we achieve strong smoothing in flat areas while at the same time preserving edges and corners. The derived model yields predictive distributions for terrain elevations at arbitrary locations, and thus allows one to fill gaps in the data and to perform conservative predictions in occluded areas.
Nonstationary Gaussian Process Regression using Point Estimates of Local Smoothness
Abstract

Cited by 13 (3 self)
Gaussian processes using nonstationary covariance functions are a powerful tool for Bayesian regression with input-dependent smoothness. A common approach is to model the local smoothness by a latent process that is integrated over using Markov chain Monte Carlo approaches. In this paper, we demonstrate that an approximation that uses the estimated mean of the local smoothness yields good results and allows one to employ efficient gradient-based optimization techniques for jointly learning the parameters of the latent and the observed processes. Extensive experiments on both synthetic and real-world data, including challenging problems in robotics, show the relevance and feasibility of our approach.
Learning Predictive Terrain Models for Legged Robot Locomotion
Abstract

Cited by 13 (0 self)
Legged robots require accurate models of their environment in order to plan and execute paths. We present a probabilistic technique based on Gaussian processes that allows terrain models to be learned and updated efficiently using sparse approximation techniques. The major benefit of our terrain model is its ability to predict elevations at unseen locations more reliably than alternative approaches, while also yielding estimates of the uncertainty in the prediction. In particular, our nonstationary Gaussian process model adapts its covariance to the situation at hand, allowing more accurate inference of terrain height at points that have not been observed directly. We show how a conventional motion planner can use the learned terrain model to plan a path to a goal location, using a terrain-specific cost model to accept or reject candidate footholds. In experiments with a real quadruped robot equipped with a laser range finder, we demonstrate the usefulness of our approach and discuss its benefits compared to simpler terrain models such as elevation grids.
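Sparse approximations of the kind mentioned here cut the exact GP's O(n³) cost by conditioning on m << n inducing points. A minimal subset-of-regressors sketch in 1-D (the abstract does not specify this exact approximation; the data, kernel, and inducing locations are invented for illustration):

```python
import numpy as np

def rbf(A, B, ls=0.4):
    """Squared-exponential kernel on column-vector inputs."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ls**2))

rng = np.random.default_rng(3)
X = rng.uniform(0, 4, size=(500, 1))              # n = 500 "terrain" samples
y = np.cos(2 * X[:, 0]) + 0.05 * rng.normal(size=500)
Z = np.linspace(0, 4, 12)[:, None]                # m = 12 inducing points
noise = 0.05**2

# Subset-of-regressors posterior mean: O(n m^2) instead of O(n^3).
Kmm, Kmn = rbf(Z, Z), rbf(Z, X)
w = np.linalg.solve(noise * Kmm + Kmn @ Kmn.T, Kmn @ y)
pred = float(rbf(np.array([[1.0]]), Z) @ w)
print(pred)                                       # close to cos(2.0)
```

Because updating the m-dimensional solve is cheap, such approximations can be re-run as new range measurements arrive, which is the role sparse techniques play in efficient online terrain learning.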
INTELLIGENT MAPS FOR AUTONOMOUS KILOMETER-SCALE SCIENCE SURVEY
2008
Abstract

Cited by 13 (4 self)
We present a new approach to site survey by autonomous surface robots. In our method the agent constructs an intelligent map, a multi-scale model of the explored environment incorporating in situ and remote sensing data. The agent learns the model's parameters on the fly and exploits its predictions to guide adaptive navigation and sampling. In this manner the agent can respond appropriately to novel correlations, resource constraints, and execution errors. Rover tests at Amboy Crater, California demonstrate improved performance over non-adaptive strategies for a geologic survey task.