Results 1-10 of 269
Prediction With Gaussian Processes: From Linear Regression To Linear Prediction And Beyond
 Learning and Inference in Graphical Models
, 1997
"... The main aim of this paper is to provide a tutorial on regression with Gaussian processes. We start from Bayesian linear regression, and show how by a change of viewpoint one can see this method as a Gaussian process predictor based on priors over functions, rather than on priors over parameters. Th ..."
Abstract

Cited by 195 (4 self)
The main aim of this paper is to provide a tutorial on regression with Gaussian processes. We start from Bayesian linear regression, and show how by a change of viewpoint one can see this method as a Gaussian process predictor based on priors over functions, rather than on priors over parameters. This leads into a more general discussion of Gaussian processes in section 4. Section 5 deals with further issues, including hierarchical modelling and the setting of the parameters that control the Gaussian process, the covariance functions for neural network models, and the use of Gaussian processes in classification problems.

1 Introduction
In the last decade neural networks have been used to tackle regression and classification problems, with some notable successes. It has also been widely recognized that they form a part of a wide variety of nonlinear statistical techniques that can be used for...
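The change of viewpoint the abstract describes can be made concrete: once a prior over functions is specified through a covariance function, prediction follows from conditioning a joint Gaussian. The following is an illustrative sketch only, with an assumed squared-exponential covariance, unit signal variance, and 1-D inputs; it is not code from the paper:

```python
import numpy as np

# Minimal Gaussian process regression sketch (assumptions: squared-
# exponential covariance, unit signal variance, 1-D inputs).
def gp_predict(X_train, y_train, X_test, length_scale=0.2, noise=1e-2):
    def k(A, B):
        d2 = (A[:, None] - B[None, :]) ** 2
        return np.exp(-0.5 * d2 / length_scale ** 2)

    K = k(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = k(X_test, X_train)
    # Predictive mean and variance from conditioning the joint Gaussian.
    mean = K_s @ np.linalg.solve(K, y_train)
    var = 1.0 - np.sum(K_s * np.linalg.solve(K, K_s.T).T, axis=1)
    return mean, var
```

Here the prior is placed directly on functions via the covariance `k`; the Bayesian linear-regression view would instead place a Gaussian prior on weights and recover an equivalent (degenerate) covariance.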
Bayesian Experimental Design: A Review
 Statistical Science
, 1995
"... This paper reviews the literature on Bayesian experimental design, both for linear and nonlinear models. A unified view of the topic is presented by putting experimental design in a decision theoretic framework. This framework justifies many optimality criteria, and opens new possibilities. Various ..."
Abstract

Cited by 171 (1 self)
This paper reviews the literature on Bayesian experimental design, both for linear and nonlinear models. A unified view of the topic is presented by putting experimental design in a decision theoretic framework. This framework justifies many optimality criteria, and opens new possibilities. Various design criteria become part of a single, coherent approach.
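One classical optimality criterion that such a decision-theoretic framework justifies is D-optimality, which maximizes the log-determinant of the information matrix X^T X. The sketch below greedily selects design points from a candidate set; the greedy strategy and the `ridge` stabilizer are illustrative assumptions, not constructions from the paper:

```python
import numpy as np

def d_optimal_greedy(candidates, n_points, ridge=1e-8):
    # Greedily pick rows of `candidates` (design-matrix rows) so as to
    # maximize log det(X^T X), the classical D-optimality criterion.
    # `ridge` keeps the determinant finite while the design is rank-deficient.
    chosen = []
    for _ in range(n_points):
        best, best_val = None, -np.inf
        for i in range(len(candidates)):
            if i in chosen:
                continue
            X = candidates[chosen + [i]]
            val = np.linalg.slogdet(X.T @ X + ridge * np.eye(X.shape[1]))[1]
            if val > best_val:
                best, best_val = i, val
        chosen.append(best)
    return chosen
```

For a quadratic model on [-1, 1] this recovers the familiar three-point design at the endpoints and the center.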
A Rigorous Framework for Optimization of Expensive Functions by Surrogates
, 1998
"... The goal of the research reported here is to develop rigorous optimization algorithms to apply to some engineering design problems for which direct application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approxima ..."
Abstract

Cited by 132 (17 self)
The goal of the research reported here is to develop rigorous optimization algorithms to apply to some engineering design problems for which direct application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and managing the use of these approximations as surrogates for optimization. The result is to obtain convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example. Key Words: approximation concepts, surrogate optimization, response surfaces, pattern search methods, derivative-free optimization, design and analysis of computer experiments (DACE), computational engineering.
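A stripped-down caricature of managing surrogate approximations during optimization: fit a local surrogate, step to its minimizer within a trust region, and accept or shrink based on the true (expensive) objective. The one-variable setting, the quadratic surrogate, and the accept/shrink rule are illustrative assumptions, not the paper's framework:

```python
import numpy as np

def surrogate_minimize(f, x0, radius=1.0, iters=30):
    x, fx = x0, f(x0)
    for _ in range(iters):
        # Sample the expensive function at three local points.
        pts = np.array([x - radius, x, x + radius])
        vals = np.array([f(p) for p in pts])
        a, b, c = np.polyfit(pts, vals, 2)      # local quadratic surrogate
        cand = -b / (2 * a) if a > 0 else x - radius * np.sign(b)
        cand = np.clip(cand, x - radius, x + radius)
        fc = f(cand)
        if fc < fx:       # surrogate step improved the true objective
            x, fx = cand, fc
        else:             # reject the step and refine: shrink the region
            radius *= 0.5
    return x, fx
```

The key idea this mimics is that acceptance is always judged on the true objective, never on the surrogate itself.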
A Framework for Evolutionary Optimization with Approximate Fitness Functions
 IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION
, 2002
"... It is a common engineering practice to use approximate models instead of the original computationally expensive model in optimization. When an approximate model is used for evolutionary optimization, the convergence properties of the evolutionary algorithm are unclear due to the approximation error. ..."
Abstract

Cited by 72 (12 self)
It is a common engineering practice to use approximate models instead of the original computationally expensive model in optimization. When an approximate model is used for evolutionary optimization, the convergence properties of the evolutionary algorithm are unclear due to the approximation error. In this paper, extensive empirical studies on the convergence of an evolution strategy are carried out on two benchmark problems. It is found that incorrect convergence will occur if the approximate model has false optima. To address this problem, individual- and generation-based evolution control is introduced and the resulting effects on the convergence properties are presented. A framework for managing approximate models in generation-based evolution control is proposed. This framework is well suited for parallel evolutionary optimization, as it is able to guarantee the correct convergence of the evolutionary algorithm while reducing computation costs as much as possible. Control o...
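The idea of generation-based evolution control can be sketched as follows: every k-th generation is evaluated with the expensive true fitness, while in-between generations use a cheap surrogate built from an archive of true evaluations. The nearest-neighbour surrogate, the elitist comparison, and all parameter values below are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def true_fitness(x):
    # Stand-in for an expensive simulation (sphere function).
    return float(np.sum(x ** 2))

def es_with_control(dim=2, lam=10, gens=41, k=4, sigma=0.3):
    parent = rng.normal(size=dim)
    archive_x = [parent]
    archive_y = [true_fitness(parent)]
    for g in range(gens):
        cand = [parent + sigma * rng.normal(size=dim) for _ in range(lam)]
        cand.append(parent)                      # elitist comparison
        if g % k == 0:                           # controlled generation
            fits = [true_fitness(x) for x in cand]
            archive_x.extend(cand)
            archive_y.extend(fits)
        else:                                    # surrogate generation:
            A = np.array(archive_x)              # nearest-neighbour lookup
            fits = [archive_y[int(np.argmin(np.sum((A - x) ** 2, axis=1)))]
                    for x in cand]
        parent = cand[int(np.argmin(fits))]
    return parent, true_fitness(parent)
```

The controlled generations play the role described in the abstract: they keep the search anchored to the true model so that false optima of the surrogate cannot attract the population indefinitely.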
Computer Experiments
, 1996
"... Introduction Deterministic computer simulations of physical phenomena are becoming widely used in science and engineering. Computers are used to describe the flow of air over an airplane wing, combustion of gasses in a flame, behavior of a metal structure under stress, safety of a nuclear reactor, a ..."
Abstract

Cited by 67 (5 self)
Introduction
Deterministic computer simulations of physical phenomena are becoming widely used in science and engineering. Computers are used to describe the flow of air over an airplane wing, combustion of gases in a flame, behavior of a metal structure under stress, safety of a nuclear reactor, and so on. Some of the most widely used computer models, and the ones that lead us to work in this area, arise in the design of the semiconductors used in the computers themselves. A process simulator starts with a data structure representing an unprocessed piece of silicon and simulates steps such as oxidation, etching and ion injection that produce a semiconductor device such as a transistor. A device simulator takes a description of such a device and simulates the flow of current through it under varying conditions to determine properties of the device such as its switching speed and the critical voltage at which it switches. A circuit simulator takes a list of devices and the
Comparative Studies Of Metamodeling Techniques Under Multiple Modeling Criteria
 Structural and Multidisciplinary Optimization
, 2000
"... 1 Despite the advances in computer capacity, the enormous computational cost of complex engineering simulations makes it impractical to rely exclusively on simulation for the purpose of design optimization. To cut down the cost, surrogate models, also known as metamodels, are constructed from and ..."
Abstract

Cited by 52 (3 self)
Despite the advances in computer capacity, the enormous computational cost of complex engineering simulations makes it impractical to rely exclusively on simulation for the purpose of design optimization. To cut down the cost, surrogate models, also known as metamodels, are constructed from and then used in lieu of the actual simulation models. In this paper, we systematically compare four popular metamodeling techniques (Polynomial Regression, Multivariate Adaptive Regression Splines, Radial Basis Functions, and Kriging) based on multiple performance criteria, using fourteen test problems representing different classes of problems. Our objective in this study is to investigate the advantages and disadvantages of these four metamodeling techniques using multiple modeling criteria and multiple test problems rather than a single measure of merit and a single test problem.

1 Introduction
Simulation-based analysis tools are finding increased use during preliminary design to explore desi...
A Comparison Of Approximation Modeling Techniques: Polynomial Versus Interpolating Models
, 1998
"... Two methods of creating approximation models are compared through the calculation of the modeling accuracy on test problems involving one, five, and ten independent variables. Here, the test problems are representative of the modeling challenges typically encountered in realistic engineering optimiz ..."
Abstract

Cited by 44 (10 self)
Two methods of creating approximation models are compared through the calculation of the modeling accuracy on test problems involving one, five, and ten independent variables. Here, the test problems are representative of the modeling challenges typically encountered in realistic engineering optimization problems. The first approximation model is a quadratic polynomial created using the method of least squares. This type of polynomial model has seen considerable use in recent engineering optimization studies due to its computational simplicity and ease of use. However, quadratic polynomial models may be of limited accuracy when the response data to be modeled have multiple local extrema. The second approximation model employs an interpolation scheme known as kriging developed in the fields of spatial statistics and geostatistics. This class of interpolating model has the flexibility to model response data with multiple local extrema. However, this flexibility is obtained at an increase...
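The contrast between the two model classes can be illustrated on a one-variable response with multiple local extrema: a least-squares quadratic versus an interpolating model (here a Gaussian radial-basis-function interpolant as a simple kriging-like stand-in; the test function and shape parameter are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def fit_quadratic(x, y):
    coeffs = np.polyfit(x, y, 2)          # least-squares quadratic
    return lambda t: np.polyval(coeffs, t)

def fit_rbf(x, y, eps=5.0):
    phi = lambda r: np.exp(-(eps * r) ** 2)
    A = phi(np.abs(x[:, None] - x[None, :]))
    w = np.linalg.solve(A, y)             # interpolation conditions
    return lambda t: phi(np.abs(np.asarray(t)[:, None] - x[None, :])) @ w

x = np.linspace(0, 1, 15)
y = np.sin(4 * np.pi * x)                 # response with multiple extrema
quad, rbf = fit_quadratic(x, y), fit_rbf(x, y)
t = np.linspace(0, 1, 200)
err_quad = np.max(np.abs(quad(t) - np.sin(4 * np.pi * t)))
err_rbf = np.max(np.abs(rbf(t) - np.sin(4 * np.pi * t)))
```

The interpolant tracks the oscillations that a single quadratic cannot represent, which is exactly the trade-off the abstract describes.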
Bayesian Treed Gaussian Process Models with an Application to Computer Modeling
 Journal of the American Statistical Association
, 2007
"... This paper explores nonparametric and semiparametric nonstationary modeling methodologies that couple stationary Gaussian processes and (limiting) linear models with treed partitioning. Partitioning is a simple but effective method for dealing with nonstationarity. Mixing between full Gaussian proce ..."
Abstract

Cited by 44 (15 self)
This paper explores nonparametric and semiparametric nonstationary modeling methodologies that couple stationary Gaussian processes and (limiting) linear models with treed partitioning. Partitioning is a simple but effective method for dealing with nonstationarity. Mixing between full Gaussian processes and simple linear models can yield a more parsimonious spatial model while significantly reducing computational effort. The methodological developments and statistical computing details which make this approach efficient are described in detail. Illustrations of our model are given for both synthetic and real datasets. Key words: recursive partitioning, nonstationary spatial model, nonparametric regression, Bayesian model averaging
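A toy sketch of the partitioning idea: split the input space and fit an independent model in each region. The single axis-aligned split and the per-region linear fits below are illustrative assumptions; the paper's model uses full Bayesian treed Gaussian processes, which are far richer than this:

```python
import numpy as np

def treed_linear_fit(x, y, split):
    # One axis-aligned split; an independent least-squares line per region.
    fits = {}
    for name, mask in (("left", x < split), ("right", x >= split)):
        X = np.column_stack([np.ones(mask.sum()), x[mask]])
        beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
        fits[name] = beta

    def predict(t):
        t = np.asarray(t, dtype=float)
        b = np.where(t < split, 0, 1)        # region index per query point
        betas = np.array([fits["left"], fits["right"]])
        return betas[b, 0] + betas[b, 1] * t

    return predict
```

Even this caricature shows why partitioning helps with nonstationarity: each region gets its own parameters instead of one global fit.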
An EMO Algorithm Using the Hypervolume Measure as Selection Criterion
 2005 Int'l Conference, March 2005
, 2005
"... The hypervolume measure is one of the most frequently applied measures for comparing the results of evolutionary multiobjective optimization algorithms (EMOA). The idea to use this measure for selection is selfevident. A steadystate EMOA will be devised, that combines concepts of nondominated sor ..."
Abstract

Cited by 40 (8 self)
The hypervolume measure is one of the most frequently applied measures for comparing the results of evolutionary multiobjective optimization algorithms (EMOA). The idea of using this measure for selection is self-evident. A steady-state EMOA is devised that combines concepts of non-dominated sorting with a selection operator based on the hypervolume measure. The algorithm computes a well-distributed set of solutions of bounded size, thereby focusing on interesting regions of the Pareto front(s). By means of standard benchmark problems, the algorithm is compared to other well-established EMOA. The results show that our new algorithm achieves good convergence to the Pareto front and outperforms standard methods in the hypervolume covered.
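In the two-objective minimization case the hypervolume is simply the area dominated by the front relative to a reference point, and can be computed by summing horizontal slabs. This helper is an illustrative sketch, not the paper's implementation:

```python
def hypervolume_2d(front, ref):
    # 2-D hypervolume (dominated area) for minimization, relative to `ref`.
    # Sorting a non-dominated front by f1 ascending makes f2 descend, so
    # each point contributes one horizontal slab of dominated area.
    pts = sorted(front)
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv
```

A selection operator of the kind described above would, for example, discard the member whose removal decreases this value the least.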
Predicting the Output from a Complex Computer Code when Fast Approximations are Available
 Biometrika
, 1998
"... this paper. ..."