Results 1-10 of 29
First-order incremental block-based statistical timing analysis
In DAC, 2004
Abstract

Cited by 156 (6 self)
Variability in digital integrated circuits makes timing verification an extremely challenging task. In this paper, a canonical first-order delay model is proposed that takes into account both correlated and independent randomness. A novel linear-time block-based statistical timing algorithm is employed to propagate timing quantities like arrival times and required arrival times through the timing graph in this canonical form. At the end of the statistical timing, the sensitivities of all timing quantities to each of the sources of variation are available. Excessive sensitivities can then be targeted by manual or automatic optimization methods to improve the robustness of the design. The statistical timing analysis is incremental, and is therefore suitable for use in the inner loop of physical synthesis or other optimization programs. The second novel contribution of this paper is the computation of local and global criticality probabilities. For a very small cost in CPU time, the probability of each edge or node of the timing graph being critical is computed. These criticality probabilities provide additional useful diagnostics to synthesis, optimization, test generation and path enumeration programs. Numerical results are presented on industrial ASIC chips with over two million logic gates.
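The canonical first-order form described above, a = a₀ + Σᵢ aᵢ ΔXᵢ + a_{n+1} ΔRₐ, makes the "add" step of block-based propagation cheap: the mean and the correlated sensitivities add term by term, while the purely independent parts combine in root-sum-square. A minimal sketch of that step (class and variable names are illustrative, not from the paper; variation sources are assumed unit-variance):

```python
import math
from dataclasses import dataclass

@dataclass
class Canonical:
    """First-order canonical form: mean + sum(sens[i] * dX_i) + indep * dR."""
    mean: float   # nominal value a0
    sens: tuple   # sensitivities to globally correlated variation sources
    indep: float  # coefficient of the purely independent random term

    def add(self, other):
        # Delay addition along a path: means and correlated sensitivities
        # add directly; the independent terms are uncorrelated, so their
        # coefficients combine in root-sum-square.
        return Canonical(
            self.mean + other.mean,
            tuple(a + b for a, b in zip(self.sens, other.sens)),
            math.hypot(self.indep, other.indep),
        )

    def sigma(self):
        # With unit-variance sources, the variance is the sum of squared
        # sensitivities plus the squared independent coefficient.
        return math.sqrt(sum(s * s for s in self.sens) + self.indep ** 2)

# Hypothetical gate delay followed by a wire delay, two correlated sources:
g = Canonical(10.0, (1.0, 0.5), 0.3)
w = Canonical(4.0, (0.2, 0.1), 0.4)
total = g.add(w)
```

After the add, the result is again in canonical form, which is what lets the algorithm stay linear-time and keep per-source sensitivities available for diagnostics.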
Gate Sizing Using Incremental Parameterized Statistical Timing Analysis
In ICCAD, 2005
Abstract

Cited by 32 (2 self)
Abstract — As technology scales into the sub-90nm domain, manufacturing variations become an increasingly significant portion of circuit delay. As a result, delays must be modeled as statistical distributions during both analysis and optimization. This paper uses incremental, parametric statistical static timing analysis (SSTA) to perform gate sizing with a required yield target. Both correlated and uncorrelated process parameters are considered by using a first-order linear delay model with fitted process sensitivities. The fitted sensitivities are verified to be accurate with circuit simulations. Statistical information in the form of criticality probabilities is used to actively guide the optimization process, which reduces runtime and improves area and performance. The gate sizing results show a significant improvement in worst slack at 99.86% yield over deterministic optimization.
A General Framework for Accurate Statistical Timing Analysis Considering Correlations
In DAC, 2005
Abstract

Cited by 31 (6 self)
The impact of parameter variations on timing due to process and environmental variations has become significant in recent years. With each new technology node this variability is becoming more prominent. In this work, we present a general Statistical Timing Analysis (STA) framework that captures spatial correlations between gate delays. Our technique does not make any assumption about the distributions of the parameter variations, gate delays and arrival times. We propose a Taylor-series-expansion-based polynomial representation of gate delays and arrival times which is able to effectively capture the nonlinear dependencies that arise due to increasing parameter variations. In order to reduce the computational complexity introduced by polynomial modeling during STA, we propose an efficient linear-modeling-driven polynomial STA scheme. On average, the degree-2 polynomial scheme achieved a 7.3x speedup compared to Monte Carlo, with 0.049 units of RMS error w.r.t. Monte Carlo. Our technique is generic and can be applied to arbitrary variations in the underlying parameters.
A New Statistical Max Operation for Propagating Skewness in Statistical Timing Analysis
Abstract

Cited by 5 (1 self)
Statistical static timing analysis (SSTA) is emerging as a solution for predicting the timing characteristics of digital circuits under process variability. For computing the statistical max of two arrival time probability distributions, existing analytical SSTA approaches use the results given by Clark in [8]. These analytical results are exact when the two operand arrival time distributions are jointly Gaussian. Due to the nonlinear max operation, arrival time distributions are typically skewed. Furthermore, the nonlinear dependence of gate delays and non-Gaussian process parameters also make the arrival time distributions asymmetric. Therefore, for computing the max accurately, a new approach is required that accounts for the inherent skewness in arrival time distributions. In this work, we present an analytical solution for computing the statistical max operation. First, the skewness in the arrival time distribution is modeled by matching its first three moments to a so-called skewed normal distribution. Then, by extending Clark's work to handle skewed normal distributions, we derive analytical expressions for computing the moments of the max. We then show, using initial simulation results, that a skewness-based max operation has significant potential to improve the accuracy of the statistical max operation in SSTA while retaining its computational efficiency.
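For reference, the Gaussian result of Clark that this work extends gives the mean of max(X, Y) in closed form for jointly Gaussian X and Y. A minimal sketch of that baseline formula (function names are mine; the paper's skew-normal extension is not reproduced here):

```python
import math

def phi(x):
    """Standard normal pdf."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def clark_max_mean(mu1, s1, mu2, s2, rho):
    """Clark's exact E[max(X, Y)] for jointly Gaussian X ~ N(mu1, s1^2),
    Y ~ N(mu2, s2^2) with correlation rho."""
    theta = math.sqrt(s1 * s1 + s2 * s2 - 2.0 * rho * s1 * s2)
    if theta == 0.0:
        # Perfectly correlated equal-variance operands: max is deterministic
        # in the difference, so the larger mean wins.
        return max(mu1, mu2)
    a = (mu1 - mu2) / theta
    return mu1 * Phi(a) + mu2 * Phi(-a) + theta * phi(a)

# Two independent standard normals: the known value E[max] = 1/sqrt(pi).
m = clark_max_mean(0.0, 1.0, 0.0, 1.0, 0.0)
```

Because the true max of two Gaussians is itself skewed, iterating this formula discards skewness at every node, which is exactly the inaccuracy the paper's skew-normal max is designed to address.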
Moment closures for performance models with highly nonlinear rates
In 9th European Performance Engineering Workshop (EPEW), 2012
Construction of Skew-Normal Random Variables: Are They Linear Combinations of Normal and Half-Normal?
Abstract

Cited by 1 (0 self)
Skew-normal distributions extend the normal distributions through a shape parameter α; they reduce to the standard normal random variable Z for α = 0 and to |Z|, the half-normal, when α → ∞. In spite of the skewness they (dis)inherit some properties of normal random variables: the square of a skew-normal random variable has a chi-square distribution with one degree of freedom, but the sum of two independent skew-normal random variables is not generally skew-normal. We review and explain this lack of closure and other properties of skew-normal random variables via their representations as a special linear combination of independent normal and half-normal random variables. Analogues of such representations are used to define multivariate skew-normal distributions with a closure property similar to that of multivariate normal distributions.
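The representation referred to above can be checked by simulation: if Z₀ and Z₁ are independent standard normals and X = δ|Z₀| + √(1−δ²) Z₁, then X is skew-normal with shape α = δ/√(1−δ²), E[X] = δ√(2/π), and X² is chi-square with one degree of freedom. A quick Monte Carlo sketch (sample size and seed are arbitrary choices of mine):

```python
import math
import random

def skew_normal_sample(delta, rng):
    """Draw X = delta*|Z0| + sqrt(1-delta^2)*Z1, which is skew-normal
    with shape alpha = delta / sqrt(1 - delta^2)."""
    z0, z1 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
    return delta * abs(z0) + math.sqrt(1.0 - delta * delta) * z1

rng = random.Random(42)
delta = 0.6
xs = [skew_normal_sample(delta, rng) for _ in range(200_000)]

mean = sum(xs) / len(xs)
mean_sq = sum(x * x for x in xs) / len(xs)

# Expected: mean close to delta * sqrt(2/pi), and mean_sq close to 1,
# since X^2 ~ chi-square(1) regardless of delta.
```

The chi-square property of X² follows directly from this construction: (δ|Z₀| + √(1−δ²)Z₁) has unit variance and its square loses the sign information that distinguishes it from a plain normal's square.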
Mean-field analysis of Markov models with reward feedback
Abstract

Cited by 1 (1 self)
Abstract. We extend the population continuous time Markov chain formalism so that the state space is augmented with continuous variables accumulated over time as functions of component populations. System feedback can be expressed using accumulations that in turn can influence the Markov chain behaviour via functional transition rates. We show how to obtain mean-field differential equations capturing means and higher-order moments of the discrete populations and continuous accumulation variables. We also provide first- and second-order convergence results and suggest a novel normal moment closure that can greatly improve the accuracy of means and higher moments. We demonstrate how such a framework is suitable for modelling feedback from globally-accumulated quantities such as energy consumption, cost or temperature. Finally, we present a worked example modelling a hypothetical heterogeneous computing cluster and its interaction with air conditioning units.
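A normal moment closure of the kind mentioned sets the third central moment to zero, which expresses the third raw moment through the first two and is exact for a Gaussian. A minimal sketch of that closure rule (the raw-moment notation m1, m2 is mine, not the paper's):

```python
def normal_closure_third_raw(m1, m2):
    """Normal (Gaussian) moment closure: setting the third central moment
    E[(X - m1)^3] to zero gives E[X^3] = 3*m1*m2 - 2*m1^3."""
    return 3.0 * m1 * m2 - 2.0 * m1 ** 3

# Sanity check against a Gaussian, where the closure is exact:
# for X ~ N(mu, sigma^2), m1 = mu, m2 = mu^2 + sigma^2, and the true
# third raw moment is mu^3 + 3*mu*sigma^2.
mu, sigma = 2.0, 0.5
approx = normal_closure_third_raw(mu, mu ** 2 + sigma ** 2)
exact = mu ** 3 + 3.0 * mu * sigma ** 2
```

In the moment ODEs, this rule is what truncates the otherwise infinite hierarchy: whenever a third-order moment appears on the right-hand side, it is replaced by this expression in the lower moments.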
European Central Bank, 2007
Abstract
 Add to MetaCart
We propose and compare procedures for inference about mean squared prediction error (MSPE) when comparing a benchmark model against a small number of alternatives that nest the benchmark. We evaluate two procedures that adjust MSPE differences in accordance with Clark and West (2007); one examines the maximum t-statistic, the other computes a chi-squared statistic. We also examine two procedures that do not adjust the MSPE differences: a chi-squared statistic, and White's (2000) reality check. In our simulations, the two statistics that adjust MSPE differences have the most accurate size, and the procedure that looks at the maximum t-statistic has the best power. We illustrate our procedures by comparing forecasts of different models for U.S. inflation.
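The Clark and West (2007) adjustment referred to above subtracts the term (ŷ_bench − ŷ_alt)² from the alternative model's squared error before differencing MSPEs, and a t-test is then run on the mean of the adjusted series. A minimal sketch for a single alternative (function name and toy data are mine; the maximum-t and chi-squared multi-model procedures the paper proposes are not shown):

```python
import math

def clark_west_tstat(y, f_bench, f_alt):
    """Clark-West (2007) adjusted MSPE difference and its t-statistic.

    f_t = (y_t - f_bench_t)^2 - [(y_t - f_alt_t)^2 - (f_bench_t - f_alt_t)^2]
    A positive mean of f_t favours the alternative (nesting) model.
    """
    f = [(yt - b) ** 2 - ((yt - a) ** 2 - (b - a) ** 2)
         for yt, b, a in zip(y, f_bench, f_alt)]
    n = len(f)
    mean = sum(f) / n
    var = sum((x - mean) ** 2 for x in f) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Toy example: the alternative forecasts perfectly, the benchmark always
# predicts zero, so the adjusted statistic should be clearly positive.
t = clark_west_tstat([1.0, 2.0, 3.0], [0.0, 0.0, 0.0], [1.0, 2.0, 3.0])
```

The adjustment matters because, under the null that the benchmark is correct, the larger model's estimated parameters add noise to its forecasts; subtracting (ŷ_bench − ŷ_alt)² removes that noise term from the comparison.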
Gaming and Strategic Ambiguity in Incentive Provision
Abstract
A central tenet of economics is that people respond to incentives. While an appropriately crafted incentive scheme can achieve the second-best optimum in the presence of moral hazard, the principal must be very well informed about the environment (e.g. the agent's preferences and the production technology) in order to achieve this. Indeed it is often suggested that incentive schemes can be gamed by an agent with superior knowledge of the environment, and furthermore that lack of transparency about the nature of the incentive scheme can reduce gaming. We provide a formal theory of these phenomena. We show that random or ambiguous incentive schemes induce more balanced efforts from an agent who performs multiple tasks and who is better informed about the environment than the principal is. On the other hand, such random schemes impose more risk on the agent per unit of effort induced. By identifying settings in which random schemes are especially effective in inducing balanced efforts, we show that, if tasks are sufficiently complementary for the principal, random incentive schemes can dominate the best deterministic scheme. (JEL L13, L22)