Calculations of Sobol indices for the Gaussian process metamodel
 Reliability Engineering & System Safety
, 2009
Abstract

Cited by 21 (2 self)
Global sensitivity analysis of complex numerical models can be performed by calculating variance-based importance measures of the input variables, such as the Sobol indices. However, these techniques require a large number of model evaluations and are often impractical for time-expensive computer codes. A well-known and widely used remedy consists in replacing the computer code by a metamodel, which predicts the model responses with negligible computation time and renders the estimation of Sobol indices straightforward. In this paper, we discuss the Gaussian process model, which gives analytical expressions of the Sobol indices. Two approaches to computing the Sobol indices are studied: the first is based on the predictor of the Gaussian process model, and the second on the global stochastic process model. Comparisons between the two estimates, made on analytical examples, show the superiority of the second approach in terms of convergence and robustness. Moreover, the second approach accounts for the modeling error of the Gaussian process model by directly providing confidence intervals on the Sobol indices. These techniques are finally applied to a real case of hydrogeological modeling.
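The first approach described in this abstract, plugging a cheap surrogate's predictor into a Monte Carlo Sobol estimator, can be sketched with a pick-freeze estimator. The analytic test function and all names below are illustrative stand-ins, not the paper's code.

```python
import numpy as np

def first_order_sobol(model, dim, i, n=100_000, rng=None):
    """Pick-freeze Monte Carlo estimate of the first-order Sobol index S_i.

    Draw two independent input samples A and B, replace column i of B by
    that of A ("freeze" X_i), and estimate the partial variance due to X_i
    from the covariance of the two output vectors.
    """
    rng = np.random.default_rng(rng)
    A = rng.uniform(size=(n, dim))
    B = rng.uniform(size=(n, dim))
    Bi = B.copy()
    Bi[:, i] = A[:, i]                       # freeze coordinate i
    yA, yBi = model(A), model(Bi)
    cov = np.mean(yA * yBi) - np.mean(yA) * np.mean(yBi)
    return cov / np.var(yA, ddof=1)

# Additive test function Y = X1 + 2*X2 on [0,1]^2:
# Var(Y) = 1/12 + 4/12, so S1 = 0.2 and S2 = 0.8 exactly.
model = lambda X: X[:, 0] + 2.0 * X[:, 1]
S1 = first_order_sobol(model, dim=2, i=0, rng=0)
S2 = first_order_sobol(model, dim=2, i=1, rng=0)
```

In practice `model` would be the metamodel's predictor rather than the expensive code, which is what makes the large `n` affordable.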
An efficient methodology for modeling complex computer codes with Gaussian processes
, 2008
Abstract

Cited by 20 (5 self)
Complex computer codes are often too time-expensive to be used directly for uncertainty propagation studies, global sensitivity analysis, or optimization problems. A well-known and widely used way around this difficulty consists in replacing the complex computer code by a reduced model, called a metamodel or a response surface, that represents the computer code and requires acceptable calculation time. One particular class of metamodels is studied: the Gaussian process model, which is characterized by its mean and covariance functions. A specific estimation procedure is developed to fit a Gaussian process model in complex cases (nonlinear relations, highly dispersed or discontinuous output, high-dimensional input, inadequate sampling designs, ...). The efficiency of this algorithm is compared with that of other existing algorithms on an analytical test case. The proposed methodology is also illustrated for the case of a complex hydrogeological computer code simulating radionuclide transport in groundwater.
Gaussian Process Models for Computer Experiments With Qualitative and Quantitative Factors
, 2007
Abstract

Cited by 18 (4 self)
Modeling experiments with qualitative and quantitative factors is an important issue in computer modeling. A framework for building Gaussian process models that incorporate both types of factors is proposed. The key to the development of these new models is an approach for constructing correlation functions with qualitative and quantitative factors. An iterative estimation procedure is developed for the proposed models. Modern optimization techniques are used in the estimation to ensure the validity of the constructed correlation functions. The proposed method is illustrated with an example involving a known function and a real example of modeling the thermal distribution of a data center.
KEY WORDS: Cokriging; Design of experiments; Kriging; Multivariate Gaussian processes; Semidefinite programming.
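One common reading of "correlation functions with qualitative and quantitative factors" is a product of a Gaussian kernel over the quantitative inputs and a cross-level correlation matrix over the qualitative levels. The sketch below assumes that product form and an exchangeable level structure; every name in it is hypothetical, not the paper's notation.

```python
import numpy as np

def mixed_corr(x1, x2, z1, z2, theta, tau):
    """Correlation between runs (x1, z1) and (x2, z2): a Gaussian kernel
    over the quantitative inputs times a cross-level correlation
    tau[z1, z2] over the qualitative levels."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    quant = np.exp(-np.sum(theta * (x1 - x2) ** 2))
    return quant * tau[z1, z2]

# Exchangeable qualitative structure with 3 levels: correlation 1 within
# a level, constant c between distinct levels; PSD for c in (0, 1).
c = 0.5
tau = np.full((3, 3), c)
np.fill_diagonal(tau, 1.0)
theta = np.array([1.0, 1.0])

r_same = mixed_corr([0.2, 0.7], [0.2, 0.7], 0, 0, theta, tau)  # same point, same level
r_diff = mixed_corr([0.2, 0.7], [0.2, 0.7], 0, 1, theta, tau)  # same point, level changed
```

Guaranteeing that `tau` is positive semidefinite for general (non-exchangeable) structures is exactly where the semidefinite-programming machinery mentioned in the keywords comes in.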
Generation of yield-aware Pareto surfaces for hierarchical circuit design space exploration
 in Proc. Des. Autom. Conf., 2006
Abstract

Cited by 13 (1 self)
Pareto surfaces in the performance space determine the range of feasible performance values for a circuit topology in a given technology. We present a non-dominated-sorting-based global optimization algorithm to generate the nominal Pareto front efficiently using a simulator-in-a-loop approach. The solutions on this Pareto front, combined with efficient Monte Carlo approximation ideas, are then used to compute the yield-aware Pareto fronts. We show experimental results for both the nominal and yield-aware Pareto fronts for power and phase noise for a voltage-controlled oscillator (VCO) circuit. The presented methodology computes yield-aware Pareto fronts in approximately 5-6 times the time required for a single circuit synthesis run and is thus practically efficient. We also show applications of yield-aware Pareto fronts to find the optimal VCO circuit that meets the system-level specifications of a phase-locked loop.
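The core of non-dominated sorting is extracting the Pareto-optimal subset of a candidate set. A minimal minimization version (a generic sketch, not the paper's implementation) is:

```python
import numpy as np

def pareto_front(points):
    """Return indices of the non-dominated points (minimization in all
    objectives) via pairwise dominance checks; O(n^2) but adequate for
    the small candidate sets inside an optimization loop."""
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        dominated = any(
            np.all(q <= p) and np.any(q < p)   # q at least as good everywhere, better somewhere
            for j, q in enumerate(pts) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

# Toy power/phase-noise trade-off, lower is better in both objectives:
# (2.5, 3.5) is dominated by (2.0, 3.0) and is dropped from the front.
perf = [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0), (2.5, 3.5)]
front = pareto_front(perf)
```

Full non-dominated sorting, as used in NSGA-style algorithms, repeats this extraction on the remaining points to assign successive ranks.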
Orthogonal-Maximin Latin Hypercube Designs
Abstract

Cited by 11 (1 self)
A randomly generated Latin hypercube design (LHD) can be quite structured: the variables may be highly correlated, or the design may not have good space-filling properties. There are procedures to find good LHDs by minimizing the pairwise correlations or by maximizing the intersite distances. In this article we show that these two criteria need not agree with each other. In fact, maximization of intersite distances can result in LHDs where the variables are highly correlated, and vice versa. Therefore, we propose a multiobjective optimization approach to finding good LHDs that combines correlation and distance performance measures. We also propose a new exchange algorithm for efficiently generating such designs. Several examples are presented to show that the new algorithm is fast and that the optimal designs are good in terms of both the correlation and distance criteria.
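The two competing criteria can be made concrete on a randomly generated LHD. This is a generic sketch of the criteria, not the article's exchange algorithm, and the function names are invented for illustration.

```python
import numpy as np
from itertools import combinations

def random_lhd(n, k, rng=None):
    """Random n-run Latin hypercube in [0,1]^k: each column is an
    independent random permutation of the n equally spaced bin centers."""
    rng = np.random.default_rng(rng)
    return np.column_stack(
        [(rng.permutation(n) + 0.5) / n for _ in range(k)]
    )

def max_abs_correlation(D):
    """Largest absolute pairwise column correlation (smaller is better)."""
    C = np.corrcoef(D, rowvar=False)
    return np.max(np.abs(C - np.eye(D.shape[1])))

def min_intersite_distance(D):
    """Smallest Euclidean distance between any two design points
    (larger is better for space-filling)."""
    return min(np.linalg.norm(a - b) for a, b in combinations(D, 2))

D = random_lhd(10, 3, rng=0)
```

A multiobjective search in the spirit of the article would trade `max_abs_correlation` off against `min_intersite_distance` instead of optimizing either one alone.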
Sliced space-filling designs
 Biometrika
Abstract

Cited by 4 (4 self)
We propose an approach to constructing a new type of design, a sliced space-filling design, intended for computer experiments with qualitative and quantitative factors. The approach starts by constructing a Latin hypercube design based on a special orthogonal array for the quantitative factors and then partitions the design into groups corresponding to different level combinations of the qualitative factors. The points in each group have good space-filling properties. Some illustrative examples are given.
Global sensitivity analysis of stochastic computer models with generalized additive models
, 2008
Controlled stratification for quantile estimation
, 802
Abstract

Cited by 3 (2 self)
In this paper we propose and discuss variance reduction techniques for the estimation of quantiles of the output of a complex model with random input parameters. These techniques are based on the use of a reduced model, such as a metamodel or a response surface. The reduced model can be used as a control variate; or a rejection method can be implemented to sample the realizations of the input parameters in prescribed relevant strata; or the reduced model can be used to determine a good biased distribution of the input parameters for the implementation of an importance sampling strategy. The different strategies are analyzed and the asymptotic variances are computed, which shows the benefit of an adaptive controlled stratification method. This method is finally applied to a real example (computation of the peak cladding temperature during a large-break loss-of-coolant accident in a nuclear reactor).
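The stratification idea, using the reduced model to carve the input space into strata and reading the quantile off a weighted empirical CDF, can be sketched as follows. The equal allocation, the rejection loop, and all function names are simplifying assumptions for illustration, not the paper's adaptive scheme.

```python
import numpy as np

def stratified_quantile(model, reduced, sampler, alpha,
                        n_strata=4, n_per_stratum=2000, n_pilot=10_000, rng=None):
    """Quantile estimate by stratifying on a cheap reduced model.

    Strata are equiprobable bins of the reduced-model output, located from
    a pilot sample; within each stratum the true model is evaluated on
    inputs accepted by rejection, and the quantile is read off the
    weighted empirical CDF.
    """
    rng = np.random.default_rng(rng)
    pilot = reduced(sampler(n_pilot, rng))
    edges = np.quantile(pilot, np.linspace(0, 1, n_strata + 1))
    edges[0], edges[-1] = -np.inf, np.inf
    p = 1.0 / n_strata                       # strata are (nearly) equiprobable
    ys, ws = [], []
    for j in range(n_strata):
        kept = []
        while len(kept) < n_per_stratum:     # rejection sampling into stratum j
            x = sampler(n_per_stratum, rng)
            m = reduced(x)
            kept.extend(x[(edges[j] <= m) & (m < edges[j + 1])])
        xj = np.array(kept[:n_per_stratum])
        yj = model(xj)
        ys.append(yj)
        ws.append(np.full(len(yj), p / len(yj)))
    y, w = np.concatenate(ys), np.concatenate(ws)
    order = np.argsort(y)
    cdf = np.cumsum(w[order])
    return y[order][np.searchsorted(cdf, alpha)]

# Demo: reduced model equals the true model, inputs U(0,1), so the
# alpha-quantile of Y = X is alpha itself.
model = lambda x: x[:, 0]
sampler = lambda n, rng: rng.uniform(size=(n, 1))
q95 = stratified_quantile(model, model, sampler, alpha=0.95, rng=1)
```

The paper's adaptive controlled stratification additionally tunes the allocation toward the strata near the target quantile, which equal allocation deliberately omits here.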
KrigInv: An efficient and user-friendly implementation of batch-sequential inversion strategies based on kriging. Computational Statistics & Data Analysis (2013)
 Statistics and Computing
, 2001
Abstract

Cited by 3 (1 self)
batch-sequential inversion strategies based on Kriging