Results 1–10 of 12
Fast statistical timing analysis handling arbitrary delay correlations
in Proc. IEEE/ACM Design Autom. Conf.
Abstract

Cited by 38 (3 self)
An efficient statistical timing analysis algorithm that can handle arbitrary (spatial and structural) causes of delay correlation is described. The algorithm derives the entire cumulative distribution function of the circuit delay using a new mathematical formulation. Spatial as well as structural correlations between gate and wire delays can be taken into account. The algorithm can handle node delays described by non-Gaussian distributions. Because the analytical computation of an exact cumulative distribution function for a probabilistic graph with arbitrary distributions is infeasible, we find tight upper and lower bounds on the true cumulative distribution. An efficient algorithm to compute the bounds is based on a PERT-like single traversal of the subgraph containing the set of N deterministically longest paths. The efficiency and accuracy of the algorithm are demonstrated on a set of ISCAS'85 benchmarks. Across all the benchmarks, the average RMS error between the exact distribution and the lower bound is 0.7%, and the average maximum error at the 95th percentile is 0.6%. The computation of bounds for the largest benchmark takes 39 seconds.
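The bound computation described above rests on a PERT-like traversal: the arrival time at each node is its own delay plus the maximum arrival time over its fan-in. A minimal Monte Carlo sketch of that recursion, on an invented four-gate circuit with uniform delays (all names and numbers are hypothetical), estimates the circuit-delay CDF that the paper bounds analytically:

```python
import random

# Toy timing DAG (hypothetical): each node's delay is uniform in [lo, hi];
# FANIN gives the predecessors of each node; "d" is the output.
DELAYS = {"a": (1.0, 2.0), "b": (1.5, 2.5), "c": (0.5, 1.0), "d": (1.0, 3.0)}
FANIN = {"a": [], "b": [], "c": ["a", "b"], "d": ["c"]}
TOPO = ["a", "b", "c", "d"]  # topological order of the DAG

def sample_circuit_delay(rng):
    """One Monte Carlo sample: arrival(v) = delay(v) + max arrival over fan-in."""
    arrival = {}
    for node in TOPO:
        lo, hi = DELAYS[node]
        own = rng.uniform(lo, hi)
        arrival[node] = own + max((arrival[p] for p in FANIN[node]), default=0.0)
    return arrival["d"]

def delay_cdf(t, n=20000, seed=0):
    """Empirical CDF of the circuit delay evaluated at threshold t."""
    rng = random.Random(seed)
    return sum(sample_circuit_delay(rng) <= t for _ in range(n)) / n
```

The paper replaces this sampling loop with analytic bounds computed over the N deterministically longest paths; the sketch only illustrates the arrival-time recursion being bounded.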
Polynomial-Time Techniques for Approximate Timing Analysis of Asynchronous Systems
, 1998
Abstract

Cited by 10 (2 self)
As designers strive to build systems on chips with ever-diminishing device sizes, and as clock speeds of a gigahertz and above are being contemplated, the limitations of synchronous circuits are beginning to surface. Consequently, there has been a renewed interest in asynchronous design techniques that use judicious timing assumptions to obtain fast circuits with low hardware overhead. However, the correct operation of these circuits depends on certain timing constraints being satisfied in the actual implementation. Since statistical variations in manufacturing and operating conditions result in uncertainties in component delays on a chip, it is important to analyze asynchronous systems with uncertain component delays to check for timing constraint violations and to determine sufficient conditions for their correct operation. Unfortunately, several timing analysis problems are computationally intractable when component delays are uncertain but bounded. This thesis presents polynomial-time techniques for approximate timing analysis of asynchronous systems with bounded component delays. Although the algorithms are conservative in the worst case, experiments indicate that they are fairly accurate in practice.
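The "uncertain but bounded" delay model lends itself to conservative interval arithmetic, which a short sketch can illustrate (the component delays and the constraint here are hypothetical, not from the thesis): a timing constraint "A arrives before B" certainly holds if the worst case of path A still beats the best case of path B.

```python
def interval_sum(*intervals):
    """Sum of delay intervals: lower bounds add, upper bounds add."""
    return (sum(lo for lo, _ in intervals), sum(hi for _, hi in intervals))

def certainly_before(a, b):
    """True if delay interval a always finishes before interval b.
    Conservative: may return False for constraints that actually hold."""
    return a[1] < b[0]

# Hypothetical two-component paths with bounded delays.
path_a = interval_sum((1.0, 1.2), (0.3, 0.5))  # combined interval (1.3, 1.7)
path_b = interval_sum((1.5, 2.0), (0.4, 0.6))  # combined interval (1.9, 2.6)
```

The conservatism the abstract mentions shows up exactly here: a constraint can hold for every feasible delay assignment yet fail this interval test.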
A Survey on Solution Methods for Task Graph Models
, 1993
Abstract

Cited by 9 (4 self)
We give in this paper a survey of models developed in the literature using the concept of task graphs, focusing on solution techniques. Different types of task graphs are considered, from PERT networks to random task graphs. Reviewed solution methods include exact computations and bounds. 1 Introduction, Concepts and Notations The purpose of this paper is to survey models based on stochastic task graph representations and the solution techniques that have been developed for them. The reason for doing this in the framework of the QMIPS project is that task graphs appear to be of central importance in the modeling and analysis of parallel programs and architectures. Yet, the solution of task graph problems is difficult in general. No really satisfactory and sufficiently general solutions have been proposed to date, and research is still active in the area. The term "task graphs" now covers a wide variety of models. We shall begin the survey with what appears to be the initi...
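A standard bound reviewed in this literature is the classical PERT approximation, which replaces each random task duration by its mean and therefore underestimates the expected makespan whenever branches run in parallel (by Jensen's inequality, E[max] ≥ max E). A small sketch with two hypothetical parallel tasks shows the gap against a Monte Carlo estimate:

```python
import random
import statistics

# Two parallel branches (hypothetical) with uniform durations, joined at a sink.
TASKS = [(2.0, 4.0), (1.0, 5.0)]  # (lo, hi) per branch

def expected_makespan(n=50000, seed=1):
    """Monte Carlo estimate of E[max of the branch durations]."""
    rng = random.Random(seed)
    return statistics.fmean(
        max(rng.uniform(lo, hi) for lo, hi in TASKS) for _ in range(n)
    )

# The PERT estimate takes the max of the mean durations; by Jensen's
# inequality this is a lower bound on the true expected makespan.
pert_bound = max((lo + hi) / 2 for lo, hi in TASKS)  # = 3.0
```

For these two branches the true expected makespan is strictly above 3.0, which is the systematic optimism of the deterministic PERT analysis that the surveyed bounds aim to correct.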
Persistency Model and Its Applications in Choice Modeling
, 2006
Abstract

Cited by 2 (0 self)
Given a discrete optimization problem Z(c̃) = max{c̃'x : x ∈ X}, with objective coefficients c̃ chosen randomly from a distribution θ, we would like to evaluate the expected value E_θ(Z(c̃)) and the probability P_θ(x*_i(c̃) = k), where x*(c̃) is an optimal solution to Z(c̃). We call this the persistency problem for a discrete optimization problem under an uncertain objective, and P_θ(x*_i(c̃) = k) the persistence value of the variable x_i at k. In general, this is a difficult problem to solve, even if θ is well specified. In this paper, we show that a subclass of this problem can be solved in polynomial time. In particular, we assume that θ belongs to the class of distributions Θ with given marginal distributions, or given marginal moment conditions. Under these models, we show that the persistency problem for θ* ∈ argmax_{θ∈Θ} E_θ[Z(c̃)] can be solved via a concave maximization problem. The persistency model solved using this formulation can be used to obtain important qualitative insights into the behaviour of stochastic discrete optimization problems. We demonstrate how the approach can be used to obtain insights into problems in discrete choice modeling. Using a set of survey data from a transport choice modeling study, we calibrate the random utility
Persistence in Discrete Optimization under Data Uncertainty
, 2004
Abstract

Cited by 2 (2 self)
An important question in discrete optimization under uncertainty is to understand the persistency of a decision variable, i.e., the probability that it is part of an optimal solution. For instance, in project management, when the task activity times are random, the challenge is to determine a set of critical activities that will potentially lie on the longest path. In the spanning tree and shortest path network problems, when the arc lengths are random, the challenge is to preprocess the network and determine a smaller set of arcs that will most probably be a part of the optimal solution under different realizations of the arc lengths. Building on a characterization of moment cones for single-variate problems, and its associated semidefinite constraint representation, we develop a limited marginal moment model to compute the persistency of a decision variable. Under this model, we show that finding the persistency is tractable for zero-one optimization problems with a polynomial-sized representation of the convex hull of the feasible region. Through extensive experiments, we show that the persistency computed under the limited marginal moment model is often close to the simulated persistency value under various distributions that satisfy the prescribed marginal moments and are generated independently.
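The "simulated persistency" used as the comparison point above can be computed directly by sampling: draw the random objective, solve the problem instance, and count how often each variable is optimal. A minimal sketch for a pick-one-of-three selection problem (the item utilities and their uniform marginals are invented for illustration):

```python
import random

# Hypothetical choice problem: pick exactly one of three items to maximize
# a random utility; each utility is uniform on its own (lo, hi) marginal.
ITEM_RANGES = [(0.0, 1.0), (0.2, 0.8), (0.4, 0.6)]

def simulated_persistency(n=20000, seed=2):
    """Fraction of sampled instances in which each item is the optimal pick:
    an empirical estimate of the persistence value P(x_i = 1)."""
    rng = random.Random(seed)
    wins = [0] * len(ITEM_RANGES)
    for _ in range(n):
        c = [rng.uniform(lo, hi) for lo, hi in ITEM_RANGES]
        wins[max(range(len(c)), key=c.__getitem__)] += 1
    return [w / n for w in wins]
```

Simulation needs a full joint distribution (here, independence is assumed); the marginal moment model in the paper instead bounds the persistency over all joint distributions consistent with the marginals.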
τAU: Timing Analysis Under Uncertainty
Abstract
Due to the excessive reduction in gate length, dopant concentrations, and oxide thickness, even the slightest variations in these quantities can result in significant variations in the performance of a device. This has resulted in a need for efficient and accurate techniques for performing statistical analysis of circuits. In this paper we propose a methodology based on Bayesian networks for computing the exact probability distribution of the delay of a circuit. For large circuits where it is not possible to compute the exact distribution, we propose methods to reduce the problem size and obtain a tight lower bound on the exact distribution.
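For a circuit small enough to enumerate, the exact output-delay distribution can be obtained by summing probabilities over all joint assignments of discretely distributed gate delays. A toy sketch with three hypothetical gates, where gate c waits for both a and b (the sketch assumes independent gates; the Bayesian-network formulation in the paper is precisely what handles correlated delays):

```python
from collections import defaultdict
from itertools import product

# Hypothetical discrete delay distributions (delay value -> probability).
GATE_PMF = {
    "a": {1: 0.5, 2: 0.5},
    "b": {1: 0.3, 3: 0.7},
    "c": {1: 1.0},
}

def exact_output_pmf():
    """Enumerate every joint delay assignment (independence assumed) and
    accumulate the probability of each output delay max(a, b) + c."""
    pmf = defaultdict(float)
    gates = list(GATE_PMF)
    for combo in product(*(GATE_PMF[g].items() for g in gates)):
        assign = dict(zip(gates, combo))  # gate -> (delay, prob)
        prob = 1.0
        for g in gates:
            prob *= assign[g][1]
        delay = max(assign["a"][0], assign["b"][0]) + assign["c"][0]
        pmf[delay] += prob
    return dict(pmf)
```

Enumeration is exponential in the number of gates, which is why the paper turns to problem-size reduction and lower bounds for large circuits.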
A Methodology to Improve Timing Yield in the Presence of Process Variations
Abstract
The ability to control variations in the IC fabrication process is rapidly diminishing as feature sizes continue towards the sub-100 nm regime. As a result, there is increasing uncertainty in the performance of CMOS circuits. Accounting for the worst-case values of all parameters will result in an unacceptably low timing yield. Design for Variability, which involves designing to achieve a given level of confidence in the performance of ICs, is fast becoming an indispensable part of IC design methodology. This paper describes a method to identify certain paths in the circuit that are responsible for the spread of timing performance. The method is based on defining a disutility function of the gate and path delays, which includes both the means and variances of the delay random variables. Based on the moments of this disutility function, an algorithm is presented which selects a subset of paths (called undominated paths) as being most responsible for the variation in timing performance. Next, a statistical gate sizing algorithm is presented, which is aimed at minimizing the delay variability of the nodes in the selected paths subject to constraints on the critical path delay and the area penalty. Monte Carlo simulations with ISCAS'85 benchmark circuits show that our statistical optimization approach results in significant improvements in timing yield over traditional deterministic sizing methods.
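The undominated-path idea can be sketched as a Pareto filter over (mean, variance) pairs: a path is kept unless some other path has both a larger-or-equal mean and a larger-or-equal variance, with at least one strict. The actual disutility function in the paper is richer; the path statistics below are invented for illustration.

```python
# Hypothetical per-path delay statistics: (mean delay, delay variance).
PATHS = {"p1": (10.0, 4.0), "p2": (9.0, 9.0), "p3": (8.0, 1.0), "p4": (9.5, 2.0)}

def undominated(paths):
    """Keep the Paths on the (mean, variance) Pareto front: those not
    dominated by another path with >= mean and >= variance (one strict).
    These are the paths most responsible for the timing spread."""
    keep = {}
    for name, (m, v) in paths.items():
        dominated = any(
            m2 >= m and v2 >= v and (m2, v2) != (m, v)
            for n2, (m2, v2) in paths.items() if n2 != name
        )
        if not dominated:
            keep[name] = (m, v)
    return keep
```

Here p3 and p4 drop out (p4 dominates p3; p1 dominates p4), leaving the high-mean and high-variance extremes for the sizing step to target.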
CIP-Data Library, Technische Universiteit Eindhoven
"... Towards predictable deep-submicron manufacturing ..."