Calculations of Sobol indices for the Gaussian process metamodel
Reliability Engineering & System Safety, 2009
Cited by 35 (3 self)
Abstract:
Global sensitivity analysis of complex numerical models can be performed by calculating variance-based importance measures of the input variables, such as the Sobol indices. However, these techniques require a large number of model evaluations and are often impractical for time-expensive computer codes. A well-known and widely used remedy consists in replacing the computer code by a metamodel, which predicts the model responses with negligible computation time and renders the estimation of Sobol indices straightforward. In this paper, we discuss the Gaussian process model, which gives analytical expressions for the Sobol indices. Two approaches to computing the Sobol indices are studied: the first based on the predictor of the Gaussian process model, and the second based on the global stochastic process model. Comparisons between the two estimates on analytical examples show the superiority of the second approach in terms of convergence and robustness. Moreover, the second approach accounts for the modeling error of the Gaussian process model by directly providing confidence intervals on the Sobol indices. These techniques are finally applied to a real case of hydrogeological modeling.
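Once a cheap metamodel is available, Sobol index estimation does become straightforward. As a minimal sketch (a generic pick-freeze Monte Carlo estimator, not the paper's analytical GP formulas), applied to a hypothetical additive surrogate whose first-order indices are known to be (16, 4, 1)/21:

```python
import numpy as np

def surrogate(x):
    """Stand-in for a fitted GP predictor: a cheap additive test function
    with known first-order Sobol indices S = (16, 4, 1) / 21."""
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] + x[:, 2]

def first_order_sobol(model, dim, n=100_000, seed=0):
    """Pick-freeze (Saltelli-type) Monte Carlo estimate of the first-order
    Sobol indices, feasible only because 'model' is cheap to evaluate."""
    rng = np.random.default_rng(seed)
    a = rng.random((n, dim))
    b = rng.random((n, dim))
    ya, yb = model(a), model(b)
    var = np.var(ya)
    s = np.empty(dim)
    for i in range(dim):
        ab = b.copy()
        ab[:, i] = a[:, i]              # "freeze" coordinate i from A
        s[i] = np.mean(ya * (model(ab) - yb)) / var
    return s

s = first_order_sobol(surrogate, dim=3)
```

Each index costs n extra model evaluations, which is exactly why the expensive code must first be replaced by a metamodel.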
An efficient methodology for modeling complex computer codes with Gaussian processes
, 2008
Cited by 30 (8 self)
Abstract:
Complex computer codes are often too time-expensive to be used directly for uncertainty propagation studies, global sensitivity analysis, or optimization problems. A well-known and widely used method to circumvent this difficulty consists in replacing the complex computer code by a reduced model, called a metamodel or response surface, that represents the computer code at an acceptable computational cost. One particular class of metamodels is studied: the Gaussian process model, which is characterized by its mean and covariance functions. A specific estimation procedure is developed to fit a Gaussian process model in complex cases (nonlinear relations, highly dispersed or discontinuous output, high-dimensional input, inadequate sampling designs, ...). The efficiency of this algorithm is compared to that of other existing algorithms on an analytical test case. The proposed methodology is also illustrated on a complex hydrogeological computer code simulating radionuclide transport in groundwater.
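A minimal sketch of the kriging predictor underlying such a metamodel, assuming a zero-mean GP with a squared-exponential covariance and illustrative hyperparameters (the paper's estimation procedure for complex cases is not reproduced here):

```python
import numpy as np

def rbf_kernel(a, b, length_scale=0.2):
    """Squared-exponential correlation between 1-D input vectors."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_fit_predict(x_train, y_train, x_test, jitter=1e-8):
    """Posterior mean and variance of a zero-mean GP (kriging predictor)."""
    k = rbf_kernel(x_train, x_train) + jitter * np.eye(len(x_train))
    l = np.linalg.cholesky(k)
    alpha = np.linalg.solve(l.T, np.linalg.solve(l, y_train))
    k_star = rbf_kernel(x_test, x_train)
    v = np.linalg.solve(l, k_star.T)
    # Predictive variance shrinks to ~0 at the design points
    return k_star @ alpha, 1.0 - np.sum(v ** 2, axis=0)

# Stand-in for an expensive computer code, evaluated on a small design
x = np.linspace(0.0, 1.0, 8)
y = np.sin(2.0 * np.pi * x)
xs = np.linspace(0.0, 1.0, 50)
mean, var = gp_fit_predict(x, y, xs)
```

The predictor interpolates the design points, and its variance quantifies the metamodel's own uncertainty away from them.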
Gaussian Process Models for Computer Experiments With Qualitative and Quantitative Factors
, 2007
Cited by 24 (5 self)
Abstract:
Modeling experiments with qualitative and quantitative factors is an important issue in computer modeling. A framework for building Gaussian process models that incorporate both types of factors is proposed. The key to the development of these new models is an approach for constructing correlation functions with qualitative and quantitative factors. An iterative estimation procedure is developed for the proposed models. Modern optimization techniques are used in the estimation to ensure the validity of the constructed correlation functions. The proposed method is illustrated with an example involving a known function and a real example modeling the thermal distribution of a data center.
Key words: cokriging; design of experiments; kriging; multivariate Gaussian processes; semidefinite programming.
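One common way to build such a correlation function (a sketch, not necessarily the authors' exact construction) is to multiply a Gaussian correlation over the quantitative input by an exchangeable cross-correlation across the levels of the qualitative factor; the elementwise (Schur) product of two positive semidefinite correlation matrices remains a valid correlation matrix:

```python
import numpy as np

def mixed_corr(xq, z, theta=1.0, c=0.5):
    """Correlation over points with one quantitative coordinate xq and one
    qualitative factor z: a Gaussian correlation for the quantitative part
    times an exchangeable cross-correlation (c across different levels)."""
    r_quant = np.exp(-theta * (xq[:, None] - xq[None, :]) ** 2)
    same = (z[:, None] == z[None, :]).astype(float)
    r_qual = same + c * (1.0 - same)      # 1 on the same level, c otherwise
    return r_quant * r_qual               # Schur product stays PSD

xq = np.array([0.1, 0.4, 0.4, 0.9])       # quantitative input
z = np.array([0, 0, 1, 1])                # qualitative level
r = mixed_corr(xq, z)
```

The validity checks that the paper enforces via semidefinite programming reduce here to the cross-correlation parameter c keeping the qualitative part positive semidefinite.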
Orthogonal-Maximin Latin Hypercube Designs
Cited by 20 (1 self)
Abstract:
A randomly generated Latin hypercube design (LHD) can be quite structured: the variables may be highly correlated, or the design may not have good space-filling properties. There are procedures for finding good LHDs by minimizing the pairwise correlations or by maximizing the intersite distances. In this article we show that these two criteria need not agree with each other. In fact, maximization of intersite distances can result in LHDs where the variables are highly correlated, and vice versa. We therefore propose a multiobjective optimization approach that finds good LHDs by combining correlation and distance performance measures. We also propose a new exchange algorithm for efficiently generating such designs. Several examples show that the new algorithm is fast and that the optimal designs are good in terms of both the correlation and distance criteria.
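The two criteria can be computed as sketched below; the best-of-many-random-LHDs search is a crude stand-in for the article's exchange algorithm, and the 1/10 weighting of the combined objective is an arbitrary illustrative choice:

```python
import numpy as np

def random_lhd(n, k, rng):
    """Random Latin hypercube design: each column is an independent
    random permutation of the n levels 0, ..., n-1."""
    return np.column_stack([rng.permutation(n) for _ in range(k)])

def max_abs_corr(d):
    """Largest absolute pairwise column correlation (to be minimized)."""
    c = np.corrcoef(d.T)
    return float(np.max(np.abs(c[np.triu_indices_from(c, 1)])))

def min_intersite_dist(d):
    """Smallest Euclidean distance between design points (to be maximized)."""
    n = len(d)
    return min(float(np.linalg.norm(d[i] - d[j]))
               for i in range(n) for j in range(i + 1, n))

rng = np.random.default_rng(1)
# Combined objective: low correlation AND large minimum distance
best = min((random_lhd(10, 3, rng) for _ in range(200)),
           key=lambda d: max_abs_corr(d) - min_intersite_dist(d) / 10.0)
```

Scoring candidates on the combined objective, rather than on either criterion alone, is exactly the multiobjective compromise the article argues for.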
Generation of yield-aware Pareto surfaces for hierarchical circuit design space exploration
 in Proc. Des. Autom. Conf., 2006
Cited by 15 (1 self)
Abstract:
Pareto surfaces in the performance space determine the range of feasible performance values for a circuit topology in a given technology. We present a nondominated-sorting-based global optimization algorithm that generates the nominal Pareto front efficiently using a simulator-in-a-loop approach. The solutions on this Pareto front, combined with efficient Monte Carlo approximation ideas, are then used to compute the yield-aware Pareto fronts. We show experimental results for both the nominal and yield-aware Pareto fronts for power and phase noise of a voltage-controlled oscillator (VCO) circuit. The presented methodology computes yield-aware Pareto fronts in approximately 5-6 times the time required for a single circuit synthesis run and is thus practically efficient. We also show applications of yield-aware Pareto fronts in finding the optimal VCO circuit to meet the system-level specifications of a phase-locked loop.
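The nondominated-sorting step at the core of such an algorithm can be sketched as follows (minimization in every objective; the performance values are invented for illustration):

```python
def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in at
    least one (all objectives minimized)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Nondominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (power, phase noise) pairs for candidate circuit sizings
perf = [(1.0, 5.0), (2.0, 2.0), (5.0, 1.0), (3.0, 3.0), (4.0, 4.0)]
front = pareto_front(perf)
```

Only the nondominated points need the expensive Monte Carlo yield analysis, which is what keeps the overall flow practical.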
ACCURATE EMULATORS FOR LARGE-SCALE COMPUTER EXPERIMENTS
, 1203
Cited by 11 (2 self)
Abstract:
Large-scale computer experiments are becoming increasingly important in science. A multi-step procedure for modeling such experiments is introduced to statisticians, which builds an accurate interpolator in multiple steps. In practice, the procedure shows substantial improvements in overall accuracy, but its theoretical properties are not well established. We introduce the terms nominal and numeric error and decompose the overall error of an interpolator into nominal and numeric portions. Bounds on the numeric and nominal error are developed to show theoretically that substantial gains in overall accuracy can be attained with the multi-step approach.
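The nominal/numeric distinction can be illustrated with an ordinary interpolator, under my own reading of the terms (an assumption, not the paper's definitions): numeric error is the floating-point failure to reproduce the data at the design sites, while the error measured on a dense grid combines that numeric portion with the nominal (approximation) error of the exact interpolant:

```python
import numpy as np

def poly_interpolate(x, y):
    """Exact-in-theory polynomial interpolator via the (ill-conditioned)
    Vandermonde system; in floating point it no longer reproduces the
    data exactly, which is the numeric portion of its error."""
    coef = np.linalg.solve(np.vander(x, increasing=True), y)
    return lambda t: np.vander(t, len(coef), increasing=True) @ coef

x = np.linspace(0.0, 1.0, 12)
p = poly_interpolate(x, np.sin(x))
t = np.linspace(0.0, 1.0, 200)

# Numeric error: residual at the design sites themselves
numeric_error = np.max(np.abs(p(x) - np.sin(x)))
# Overall error on a dense grid: nominal + numeric portions together
overall_error = np.max(np.abs(p(t) - np.sin(t)))
```

With more design points or rougher targets the linear algebra degrades and the numeric portion can come to dominate, which is the regime the paper's bounds address.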
Global sensitivity analysis for models with spatially dependent outputs
 Environmetrics
Cited by 9 (0 self)
Abstract:
The global sensitivity analysis of a complex numerical model often calls for the estimation of variance-based importance measures, named Sobol' indices. Metamodel-based techniques have been developed to replace the CPU time-expensive computer code with an inexpensive mathematical function that predicts the computer code output. The common metamodel-based sensitivity analysis methods are well suited to computer codes with scalar outputs. However, in the environmental domain, as in many areas of application, the numerical model outputs are often spatial maps, which may also vary with time. In this paper, we introduce an innovative method to obtain a spatial map of Sobol' indices with a minimal number of numerical model computations. It is based upon the functional decomposition of the spatial output onto a wavelet basis and the metamodeling of the wavelet coefficients by Gaussian processes. An analytical example is presented.
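The decomposition step can be sketched with a single level of the Haar transform (a simple stand-in for the paper's wavelet basis); in the full method each coefficient, viewed as a function of the model inputs, would get its own Gaussian process metamodel, and the invertibility of the transform is what lets coefficient-wise results be mapped back to the spatial domain:

```python
import numpy as np

def haar_step(v):
    """One level of the orthonormal Haar transform: averages and details."""
    a = (v[0::2] + v[1::2]) / np.sqrt(2.0)
    d = (v[0::2] - v[1::2]) / np.sqrt(2.0)
    return a, d

# Hypothetical spatial output of the code for one input configuration
y_map = np.sin(np.linspace(0.0, np.pi, 8))
approx, detail = haar_step(y_map)

# Invert the step: each half-resolution coefficient pair rebuilds two sites
rec = np.empty(8)
rec[0::2] = (approx + detail) / np.sqrt(2.0)
rec[1::2] = (approx - detail) / np.sqrt(2.0)
```

Truncating the small detail coefficients is what keeps the number of metamodels, and hence of model computations, minimal.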
CONTROLLED STRATIFICATION FOR QUANTILE ESTIMATION
, 802
Cited by 9 (3 self)
Abstract:
In this paper we propose and discuss variance reduction techniques for the estimation of quantiles of the output of a complex model with random input parameters. These techniques are based on the use of a reduced model, such as a metamodel or a response surface. The reduced model can be used as a control variate; or a rejection method can be implemented to sample the realizations of the input parameters in prescribed relevant strata; or the reduced model can be used to determine a good biased distribution of the input parameters for an importance sampling strategy. The different strategies are analyzed and their asymptotic variances computed, which shows the benefit of an adaptive controlled stratification method. This method is finally applied to a real example: the computation of the peak cladding temperature during a large-break loss-of-coolant accident in a nuclear reactor.
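A rough sketch of the controlled-stratification idea, with invented stand-ins for the expensive model and its reduced model, and an arbitrary budget allocation favoring the strata where the reduced model places the target quantile:

```python
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: x ** 2 + 0.1 * np.sin(20.0 * x)   # "expensive" model stand-in
g = lambda x: x ** 2                             # cheap reduced model

# 1) Use the reduced model (cheap, so a huge pool is affordable) to define
#    strata of the input space via quantiles of g(X).
x_pool = rng.random(100_000)
edges = np.quantile(g(x_pool), [0.0, 0.5, 0.8, 0.9, 1.0])
strata = np.digitize(g(x_pool), edges[1:-1])
probs = np.array([np.mean(strata == s) for s in range(4)])

# 2) Spend the full-model budget mostly in the strata expected to contain
#    the 95% quantile (here the upper ones), reweighting each sample.
budget = [50, 50, 100, 300]
samples, weights = [], []
for s, n_s in enumerate(budget):
    xs = rng.choice(x_pool[strata == s], size=n_s)
    samples.append(f(xs))
    weights.append(np.full(n_s, probs[s] / n_s))
y = np.concatenate(samples)
w = np.concatenate(weights)

# 3) Weighted empirical 95% quantile from only sum(budget) full-model runs
order = np.argsort(y)
cdf = np.cumsum(w[order])
q95 = y[order][np.searchsorted(cdf, 0.95)]
```

The variance reduction comes from oversampling the strata that matter for the quantile while the weights keep the estimator consistent.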
Global sensitivity analysis of stochastic computer models with generalized additive models
, 2008
Recent developments in nonregular fractional factorial designs
, 2009
Cited by 7 (2 self)
Abstract:
Nonregular fractional factorial designs such as Plackett-Burman designs and other orthogonal arrays are widely used in various screening experiments for their run-size economy and flexibility. The traditional analysis focuses on main effects only. Hamada and Wu (1992) went beyond the traditional approach and proposed an analysis strategy demonstrating that some interactions can be entertained and estimated beyond a few significant main effects. Their groundbreaking work stimulated much of the recent development in optimality criteria, construction, and analysis of nonregular designs. This paper reviews important developments in nonregular designs, including projection properties, generalized resolution, generalized minimum aberration criteria, optimality results, construction methods, and analysis strategies.
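For concreteness, the 12-run Plackett-Burman design can be generated from its standard cyclic generator row, and its balance and main-effect orthogonality checked numerically:

```python
import numpy as np

# Standard Plackett-Burman 12-run generator row (+ + - + + + - - - + -)
gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])

# 11 cyclic shifts of the generator, plus a final row of -1s,
# give the 12-run design for up to 11 two-level factors.
rows = [np.roll(gen, i) for i in range(11)] + [-np.ones(11, dtype=int)]
d = np.array(rows)

# Balance: each column has six +1s and six -1s.
# Orthogonality: the Gram matrix of the main-effect columns is 12 * I.
gram = d.T @ d
```

Orthogonality of the main-effect columns is what the traditional main-effects-only analysis relies on; the complex aliasing of interactions is what the reviewed strategies exploit.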