Results 1–10 of 21
The damage costs of climate change toward more comprehensive calculations
 Environmental and Resource Economics
, 1995
Abstract

Cited by 99 (28 self)
Abstract. It is argued that estimating the damage costs of a certain benchmark climate change is not sufficient. What is needed are cost functions and confidence intervals. Although these are contained in the integrated models and their technical manuals, this paper brings them into the open in order to stimulate discussion. After briefly reviewing the benchmark climate change damage costs, region-specific cost functions are presented which distinguish tangible from intangible losses, and the losses due to a changing climate from those due to a changed climate. Furthermore, cost functions are assumed to be quadratic, as an approximation of the unknown but presumably convex functions. Results from the damage module of the integrated climate-economy model FUND are presented. Next, uncertainties are incorporated and expected damages are calculated. It is shown that, because of convex loss functions and right-skewed uncertainties, the risk premium is substantial, calling for more action than analysis based on best-guess estimates. The final section explores some needs for further scientific research. Key words: climate change damage costs; cost functions; uncertainty.
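The convexity argument in this abstract can be illustrated with a small sketch (not the FUND model; all numbers are hypothetical): with a quadratic loss and a right-skewed distribution for the uncertain temperature change, the expected damage exceeds the damage at the best-guess value, which is the risk premium the paper refers to.

```python
import random

random.seed(0)

a = 1.0                      # hypothetical damage coefficient
def damage(t):
    return a * t * t         # convex (quadratic) loss function

# Right-skewed uncertainty: lognormal temperature change, median exp(mu) = 1
mu, sigma = 0.0, 0.5
samples = [random.lognormvariate(mu, sigma) for _ in range(100_000)]

best_guess = 1.0             # median (best-guess) temperature change
expected_damage = sum(damage(t) for t in samples) / len(samples)

# Convexity + right skew => expected damage exceeds damage at the best guess
risk_premium = expected_damage - damage(best_guess)
print(round(expected_damage, 3), risk_premium > 0)
```

By Jensen's inequality the premium is strictly positive for any non-degenerate distribution; the skew makes it large.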
Worst-Case Analysis and Optimization of VLSI Circuit Performances
, 1995
Abstract

Cited by 21 (1 self)
In this paper, we present a new approach for realistic worst-case analysis of VLSI circuit performances and a novel methodology for circuit performance optimization. Circuit performance measures are modeled as response surfaces of the designable and uncontrollable (noise) parameters. Worst-case analysis proceeds by first computing the worst-case circuit performance value and then determining the worst-case noise parameter values by solving a nonlinear programming problem. A new circuit optimization technique is developed to find an optimal design point at which all of the circuit specifications are met under worst-case conditions. This worst-case design optimization method is formulated as a constrained multi-criteria optimization. The methodologies described in this paper are applied to several VLSI circuits to demonstrate their accuracy and efficiency. Keywords: worst-case analysis, worst-case design optimization. I. Introduction. Inevitable fluctuations in the manufacturing process...
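The response-surface step described here can be sketched in miniature (our toy, not the paper's tool): a performance measure is modeled as a quadratic surface of two noise parameters normalized to [-1, 1], and the worst case is the surface maximum over that box, found here by a dense grid search rather than a nonlinear programming solver.

```python
# Hypothetical fitted response surface for a circuit delay
def delay(p1, p2):
    return 1.0 + 0.3 * p1 - 0.2 * p2 + 0.1 * p1 * p2 + 0.05 * p2 * p2

def worst_case(f, steps=200):
    # Enumerate a grid over the normalized noise-parameter box [-1, 1]^2
    grid = [-1.0 + 2.0 * i / steps for i in range(steps + 1)]
    return max((f(a, b), a, b) for a in grid for b in grid)

wc_value, wc_p1, wc_p2 = worst_case(delay)
print(round(wc_value, 3), wc_p1, wc_p2)
```

For this surface the maximum sits at the corner (1, -1), which is typical: the worst case often lies on the boundary of the noise-parameter region.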
Feasibility and Performance Region Modeling of Analog and Digital Circuits
 Analog Integrated Circuits and Signal Processing
, 1996
Abstract

Cited by 18 (0 self)
Hierarchy plays a significant role in the design of digital and analog circuits. At each level of the hierarchy it becomes essential to evaluate whether a sub-block design is feasible and, if so, which design style is the best candidate for the particular problem. This paper proposes a general methodology for evaluating the feasibility and the performance of sub-blocks at all levels of the hierarchy. A vertical binary search technique is used to generate the feasibility macromodel, and a layered volume-slicing methodology with radial basis functions is used to generate the performance macromodel. Macromodels have been developed and verified for both analog and digital blocks. Analog macromodels have been developed at three different levels of hierarchy (current mirror, op-amp, and A/D converter). The impact of different fabrication processes on the performance of analog circuits has also been explored. Though the modeling technique has been fine-tuned to handle analog circuits, the approach is ...
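The binary-search idea behind such feasibility macromodels can be sketched as follows (a stand-in with a made-up feasibility test, not the paper's method): starting from a known feasible point, bisect along a ray until the feasibility boundary is bracketed to within a tolerance.

```python
def feasible(x, y):
    return x * x + y * y <= 1.0      # hypothetical feasibility check (unit disc)

def boundary_on_ray(dx, dy, t_hi=10.0, tol=1e-9):
    """Return t such that (t*dx, t*dy) lies on the feasibility boundary."""
    t_lo = 0.0                       # the origin is assumed feasible
    while t_hi - t_lo > tol:
        mid = 0.5 * (t_lo + t_hi)
        if feasible(mid * dx, mid * dy):
            t_lo = mid               # still inside: move the lower bracket up
        else:
            t_hi = mid               # outside: move the upper bracket down
    return 0.5 * (t_lo + t_hi)

t = boundary_on_ray(1.0, 1.0)        # boundary along the 45-degree ray
print(round(t, 6))
```

Repeating this over many ray directions yields boundary samples from which a macromodel (e.g. with radial basis functions, as in the paper) can be fitted.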
Adaptive Response Surface Method – A Global Optimization Scheme for Computation-intensive Design Problems
 JOURNAL OF ENGINEERING OPTIMIZATION
, 2001
Abstract

Cited by 10 (2 self)
For design problems involving computation-intensive analysis or simulation processes, approximation models are usually introduced to reduce computation time. Most approximation-based optimization methods make step-by-step improvements to the approximation model by adjusting the limits of the design variables. In this work, a new approximation-based optimization method for computation-intensive design problems, the adaptive response surface method (ARSM), is presented. The ARSM creates quadratic approximation models for the computation-intensive design objective function in a gradually reduced design space. The ARSM was designed to avoid being trapped at a local optimum and to identify the global design optimum with a modest number of objective function evaluations. Extensive tests of the ARSM as a global optimization scheme using benchmark problems, as well as an industrial design application of the method, are presented. Advantages and limitations of the approach are also discussed.
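A toy one-dimensional version of the fit-and-shrink loop conveys the idea (the shrink rule and objective here are ours, not the paper's): fit a quadratic to a few samples of an expensive objective over the current design interval, step to the quadratic's minimizer, and shrink the interval around it.

```python
import numpy as np

def expensive_objective(x):
    return np.exp(0.3 * x) + x ** 2   # stand-in for a costly simulation

lo, hi = -6.0, 6.0
for _ in range(12):
    xs = np.linspace(lo, hi, 7)                      # 7 "simulation runs"
    a, b, c = np.polyfit(xs, expensive_objective(xs), 2)
    # Minimizer of the quadratic surrogate (fall back to best sample if a <= 0)
    x_star = -b / (2 * a) if a > 0 else xs[np.argmin(expensive_objective(xs))]
    x_star = min(max(x_star, lo), hi)
    # Shrink the design interval to half its width, centred on x_star
    half = 0.25 * (hi - lo)
    lo, hi = x_star - half, x_star + half

print(round(float(x_star), 3))
```

Each pass spends only a handful of objective evaluations, which is the point of surrogate-based schemes like the ARSM; the shrinking window is what drives convergence.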
Convexity-based Algorithms for Design Centering
 IN PROCEEDINGS OF THE IEEE/ACM INTERNATIONAL CONFERENCE ON COMPUTER-AIDED DESIGN
, 1993
Abstract

Cited by 7 (4 self)
A new technique for design centering, and for polytope approximation of the feasible region of a design, is presented. In the first phase, the feasible region is approximated by a convex polytope, using a method based on a theorem on convex sets. As a natural consequence of this approach, a good approximation to the design center is obtained. In the next phase, the exact design center is estimated using one of two techniques that we present in this paper. The first inscribes the largest Hessian ellipsoid, which is known to be a good approximation to the shape of the polytope, within the polytope. This represents an improvement over previous methods, such as simplicial approximation, where a hypersphere or a crudely estimated ellipsoid is inscribed within the approximating polytope. However, when the pdfs of the design parameters are known, the design center does not necessarily correspond to the center of the largest inscribed ellipsoid. Hence, a second technique is developed which incorporates the probability distributions of the parameters, under the assumption that their variation is modeled by Gaussian probability distributions. The problem is formulated as a convex programming problem, and an efficient algorithm is used to calculate the design center, using fast and efficient Monte Carlo methods to estimate the yield gradient. An example is provided to illustrate how ellipsoid-based methods fail to incorporate the probability density functions, and it is solved using the convex programming-based algorithm.
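A simpler stand-in for the inscribed-ellipsoid step (not the paper's Hessian-ellipsoid construction) is the Chebyshev centre: the centre of the largest hypersphere inscribed in the approximating polytope {x : Ax <= b}, which reduces to a single linear program. The polytope below is the unit square, so the answer is known in advance.

```python
import numpy as np
from scipy.optimize import linprog

# Unit square as {x : A x <= b}
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.array([1.0, 0.0, 1.0, 0.0])

norms = np.linalg.norm(A, axis=1)
# Variables (x1, x2, r): maximize r subject to  a_i . x + ||a_i|| r <= b_i
res = linprog(c=[0.0, 0.0, -1.0],                 # minimize -r
              A_ub=np.hstack([A, norms[:, None]]),
              b_ub=b,
              bounds=[(None, None), (None, None), (0, None)])
center, radius = res.x[:2], res.x[2]
print(center, round(radius, 3))
```

As the abstract notes, when the parameter pdfs are known this geometric centre need not maximize yield; it is only the distribution-free starting point.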
Screening the Input Variables to a Computer Model Via Analysis of Variance and Visualization
 in Screening: Methods for Experimentation in Industry, Drug Discovery and Genetics
, 2006
Abstract

Cited by 7 (0 self)
An experiment involving a complex computer model or code may have tens or even hundreds of input variables and, hence, the identification of the more important variables (screening) is often crucial. Methods are described for decomposing a complex input-output relationship into effects. Effects are more easily understood because each is due to only one or a small number of input variables. They can be assessed for importance either visually or via a functional analysis of variance. Effects are estimated from flexible approximations to the input-output relationships of the computer model. This allows complex nonlinear and interaction relationships to be identified. The methodology is demonstrated on a computer model of the relationship between environmental policy and the world economy.
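A minimal sketch of the variance-decomposition idea (our toy, not the chapter's method): estimate each input's main-effect variance on a grid by averaging the other input out, then rank the inputs. Here the hypothetical code depends strongly on x1 and weakly on x2.

```python
def f(x1, x2):
    return 4.0 * x1 + 0.5 * x2      # hypothetical input-output code

n = 101
grid = [i / (n - 1) for i in range(n)]          # grid on [0, 1]
ys = [[f(a, b) for b in grid] for a in grid]

def variance(vals):
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

# Main effect of an input: average the other input out, take the variance
main1 = variance([sum(row) / n for row in ys])                          # x1
main2 = variance([sum(ys[i][j] for i in range(n)) / n for j in range(n)])  # x2
print(main1 > main2)
```

For this additive function the ratio main1/main2 equals the squared coefficient ratio (4/0.5)^2 = 64, so a screening step would keep x1 and drop x2.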
Bayesian Validation of a Computer Model for Vehicle Collision
Abstract

Cited by 6 (3 self)
A key question in the evaluation of computer models is: does the computer model adequately represent reality? A complete Bayesian approach to answering this question is developed for the challenging practical context in which the computer model (and reality) produce functional data. The methodology is particularly suited to treating the major issues associated with the validation process: quantifying multiple sources of error and uncertainty in computer models; combining multiple sources of information; and being able to adapt to different, but related, scenarios through hierarchical modeling. It is also shown how one can formally test whether the computer model reproduces reality. The approach is illustrated through the study of a computer model developed to simulate vehicle crashworthiness.
Sensitivity analysis in linear and nonlinear models: A review
"... Question: How do the inputs affect the outputs? ..."
Model-based Design Analysis and Yield Optimization
 IEEE TRANS. ON SEMICONDUCTOR MANUFACTURING
Abstract

Cited by 3 (0 self)
Fluctuations are inherent to any fabrication process. Integrated circuits and microelectromechanical systems are particularly affected by these variations, and due to high quality requirements the effect on the devices' performance has to be understood quantitatively. In recent years it has become possible to model the performance of such complex systems on the basis of design specifications, and model-based sensitivity analysis has made its way into industrial engineering. We show how an efficient Bayesian approach, using a Gaussian process prior, can replace the commonly used brute-force Monte Carlo scheme, making it possible to apply the analysis to computationally costly models. We introduce a number of global, statistically justified sensitivity measures for design analysis and optimization. Two models of integrated systems serve as case studies to introduce the analysis and to assess its convergence properties. We show that the Bayesian Monte Carlo scheme can save costly simulation runs and can ensure a reliable accuracy of the analysis.
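The core mechanism behind such Gaussian-process surrogates can be shown in a few lines (a bare-bones sketch, not the paper's implementation): fit a GP with an RBF kernel to a handful of runs of a costly model, then evaluate the cheap posterior mean in place of the model itself.

```python
import numpy as np

def kernel(x1, x2, length=1.0):
    # Squared-exponential (RBF) covariance between two 1-D point sets
    return np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / length) ** 2)

def costly_model(x):
    return np.sin(x)                  # stand-in for an expensive simulator

x_train = np.linspace(0.0, 6.0, 12)   # a dozen "simulation runs"
y_train = costly_model(x_train)

# Precompute the GP weights (jitter keeps the Gram matrix well-conditioned)
K = kernel(x_train, x_train) + 1e-8 * np.eye(len(x_train))
alpha = np.linalg.solve(K, y_train)

def surrogate(x_new):
    return kernel(x_new, x_train) @ alpha   # GP posterior mean

x_test = np.array([1.5, 3.0, 4.5])
print(np.max(np.abs(surrogate(x_test) - costly_model(x_test))))
```

Sensitivity measures or yield estimates can then be computed from thousands of surrogate evaluations at negligible cost, which is the saving over brute-force Monte Carlo that the abstract describes.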
Design and Analysis of Robust Total Joint Replacements: Finite Element Model Experiments with Environmental Variables
Abstract

Cited by 2 (0 self)
Summary. Assessing the performance of orthopaedic devices using complex computer simulations can be prohibitively time consuming. One reason is that the performance of such devices depends on multiple factors. Some of these factors are controllable design variables, such as prosthesis material properties and geometry, while others are uncontrollable environmental variables, such as tissue properties and loading conditions. Chang et al. (1999) address these computational challenges using an efficient statistical predictor and optimization methodology, and illustrate the method by optimizing a flexible hip implant subject to multiple environmental conditions. In the present study, we extend this ...
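The design-versus-environment split described in this summary can be sketched as a tiny robust-design problem (all numbers are made up): performance depends on a controllable design variable d and an uncontrollable environmental variable e, and the robust choice minimizes the worst case over the environment.

```python
def performance(d, e):
    # Hypothetical stress measure: design target at d = 1, loading penalty e*d
    return (d - 1.0) ** 2 + 0.5 * e * d

designs = [i / 100 for i in range(201)]      # candidate designs, d in [0, 2]
envs = [-1.0, 0.0, 1.0]                      # plausible loading conditions

# Minimax: pick the design whose worst-case performance is smallest
robust_d = min(designs, key=lambda d: max(performance(d, e) for e in envs))
print(robust_d)
```

The robust design (d = 0.75 here) backs off from the nominal optimum d = 1 to hedge against the adverse loading condition, which is the qualitative behaviour robust total-joint designs are after.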