Results 1–10 of 14
Statistical Timing Analysis Under Spatial Correlations
IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 2005
Abstract — Process variations are of increasing concern in today’s technologies and can significantly affect circuit performance. We present an efficient statistical timing analysis algorithm that predicts the probability distribution of the circuit delay, considering both inter-die and intra-die variations while accounting for the effects of spatial correlations of intra-die parameter variations. The procedure uses a first-order Taylor series expansion to approximate the gate and interconnect delays. Next, principal component analysis techniques are employed to transform the set of correlated parameters into an uncorrelated set. The statistical timing computation is then easily performed with a PERT-like circuit graph traversal. The runtime of our algorithm is linear in the number of gates and interconnects, as well as in the number of varying parameters and grid partitions used to model spatial correlations. The accuracy of the method is verified with Monte Carlo simulation. On average, for a 100 nm technology, the errors of the mean and standard deviation values computed by the proposed method are … and … respectively, and the errors of predicting the … and … confidence points are … and … respectively. A test case with about 17,800 gates was solved in about … seconds, with high accuracy as compared to a Monte Carlo simulation that required more than … hours.
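The canonical first-order delay form this abstract describes can be sketched in a few lines. This is a minimal illustration under our own naming, not the paper's implementation: after the PCA step, each gate or interconnect delay is a mean plus sensitivities to uncorrelated standard-normal components, so the SUM operation of the PERT-like traversal is exact and variances simply add.

```python
import math

class Delay:
    """Canonical first-order delay: d = mean + sum(c_i * z_i), where the
    z_i are uncorrelated standard-normal principal components produced by
    the PCA step (class and field names are ours)."""
    def __init__(self, mean, coefs):
        self.mean = mean
        self.coefs = list(coefs)      # sensitivity to each component
    def sigma(self):
        # components are uncorrelated, so variances simply add
        return math.sqrt(sum(c * c for c in self.coefs))
    def __add__(self, other):         # SUM: exact for linear forms
        return Delay(self.mean + other.mean,
                     [a + b for a, b in zip(self.coefs, other.coefs)])

# two gates on a path, expressed in the same shared component basis
path = Delay(1.0, [0.1, 0.2]) + Delay(2.0, [0.3, 0.0])
```

The companion MAX operation is not closed under this linear form and needs a moment-matching approximation such as Clark's; applying SUM and MAX once per edge in topological order is what makes the traversal linear-time.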
Digital Circuit Optimization via Geometric Programming
Operations Research, 2005
This paper concerns a method for digital circuit optimization based on formulating the problem as a geometric program (GP) or generalized geometric program (GGP), which can be transformed to a convex optimization problem and then solved very efficiently. We start with a basic gate scaling problem, with delay modeled as a simple resistor-capacitor (RC) time constant, and then add various layers of complexity and modeling accuracy, such as accounting for differing signal fall and rise times and the effects of signal transition times. We then consider more complex formulations such as robust design over corners, multi-mode design, statistical design, and problems in which threshold and power supply voltages are also variables to be chosen. Finally, we look at the detailed design of gates and interconnect wires, again using a formulation that is compatible with GP or GGP.
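The GP transformation this abstract relies on can be demonstrated numerically. The sketch below uses a toy two-gate RC-chain delay with made-up coefficients (ours, not the paper's): every term is a monomial in the gate scale factors, so the delay is a posynomial, and substituting x = exp(y) makes it convex in y, which is what lets such problems be solved efficiently.

```python
import math
import random

def posynomial_delay(x1, x2):
    """Toy two-gate RC-chain delay; every term is a monomial in the
    positive scale factors x1, x2, so the sum is a posynomial.
    Coefficients are illustrative, not from the paper."""
    return 1.0 / x1 + x2 / x1 + 1.0 / x2 + 4.0 * x2

def delay_in_log_vars(y1, y2):
    # the GP trick: substitute x = exp(y); posynomials become convex in y
    return posynomial_delay(math.exp(y1), math.exp(y2))

# numeric spot-check of midpoint convexity in the log variables
rng = random.Random(0)
for _ in range(1000):
    a = (rng.uniform(-2, 2), rng.uniform(-2, 2))
    b = (rng.uniform(-2, 2), rng.uniform(-2, 2))
    mid = delay_in_log_vars((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
    assert mid <= (delay_in_log_vars(*a) + delay_in_log_vars(*b)) / 2 + 1e-9
```

A real sizing run would hand this convex log-space problem to a GP solver; the point here is only that the RC delay model lands in the GP-compatible class.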
A heuristic for optimizing stochastic activity networks with applications to statistical digital circuit sizing
IEEE Transactions on Circuits and Systems I, 2004
A deterministic activity network (DAN) is a collection of activities, each with some duration, along with a set of precedence constraints, which specify that activities begin only when certain others have finished. One critical performance measure for an activity network is its makespan, which is the minimum time required to complete all activities. In a stochastic activity network (SAN), the durations of the activities and the makespan are random variables. The analysis of SANs is quite involved, but can be carried out numerically by Monte Carlo analysis. This paper concerns the optimization of a SAN, i.e., the choice of some design variables that affect the probability distributions of the activity durations. We concentrate on the problem of minimizing a quantile (e.g., 95%) of the makespan, subject to constraints on the variables. This problem has many applications, ranging from project management to digital integrated circuit (IC) sizing (the latter being our motivation). While there are effective methods for optimizing DANs, the SAN optimization problem is much more difficult; the few existing methods cannot handle large-scale problems.
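The quantile objective described above is easy to evaluate by Monte Carlo even though it is hard to optimize, which is the paper's point. A minimal sketch (the DAG, the truncated-Gaussian duration model, and all names are our illustrative choices):

```python
import random

def makespan(durations, preds, order):
    """Longest-path completion time of an activity network (a DAG)."""
    finish = {}
    for a in order:                                   # topological order
        start = max((finish[p] for p in preds[a]), default=0.0)
        finish[a] = start + durations[a]
    return max(finish.values())

def makespan_quantile(dists, preds, order, q=0.95, trials=20000, seed=3):
    """Monte Carlo estimate of the q-quantile of a SAN's makespan;
    durations are Gaussians truncated at zero (our modeling choice)."""
    rng = random.Random(seed)
    samples = sorted(
        makespan({a: max(0.0, rng.gauss(mu, s))
                  for a, (mu, s) in dists.items()}, preds, order)
        for _ in range(trials))
    return samples[int(q * trials)]

# diamond-shaped example network: A precedes B and C, which precede D
preds = {'A': [], 'B': ['A'], 'C': ['A'], 'D': ['B', 'C']}
order = ['A', 'B', 'C', 'D']
```

Optimizing would mean choosing design variables that shift the (mu, s) pairs subject to constraints, with this noisy quantile as the objective; that is the hard part the paper's heuristic addresses.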
Statistical timing analysis with two-sided constraints
In IEEE/ACM International Conference on Computer-Aided Design, 2005
Advances in computation of the maximum of a set of random variables
In Proceedings of the 7th International Symposium on Quality Electronic Design, 2006
This paper quantifies the approximation error in Clark’s approach [1] to computing the maximum (max) of Gaussian random variables, a fundamental operation in statistical timing. We show that a finite look-up table can be used to store these errors. Based on the error computations, approaches to different orderings for pairwise max operations on a set of Gaussians are proposed. Experiments show accuracy improvements in the computation of the max of multiple Gaussians by up to 50% in comparison to the traditional approach. To the best of our knowledge, this is the first work addressing these issues.
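Clark's moment-matching formulas, whose approximation error the paper quantifies, fit in a few lines. For two jointly Gaussian variables the first two moments computed below are exact; the error the paper studies arises from re-treating the (non-Gaussian) max as Gaussian in subsequent pairwise operations. Parameter names are ours.

```python
import math

def phi(x):   # standard normal pdf
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def Phi(x):   # standard normal cdf
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def clark_max(m1, s1, m2, s2, rho=0.0):
    """Mean and std-dev of max(X, Y) for X ~ N(m1, s1^2), Y ~ N(m2, s2^2)
    with correlation rho, via Clark's 1961 moment formulas."""
    a2 = s1 * s1 + s2 * s2 - 2 * rho * s1 * s2
    if a2 <= 0:                       # X - Y is deterministic
        return (m1, s1) if m1 >= m2 else (m2, s2)
    a = math.sqrt(a2)
    alpha = (m1 - m2) / a
    mean = m1 * Phi(alpha) + m2 * Phi(-alpha) + a * phi(alpha)
    second = ((m1 * m1 + s1 * s1) * Phi(alpha)
              + (m2 * m2 + s2 * s2) * Phi(-alpha)
              + (m1 + m2) * a * phi(alpha))
    return mean, math.sqrt(max(second - mean * mean, 0.0))
```

For two independent standard normals the exact mean of the max is 1/sqrt(pi), about 0.564, which `clark_max(0, 1, 0, 1)` reproduces; the ordering question the paper studies appears once three or more variables are maxed pairwise.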
Clock schedule verification under process variations
In IEEE/ACM International Conference on Computer-Aided Design (ICCAD-2004), 2004
With the aggressive scaling down of feature sizes in VLSI fabrication, process variations have become a critical issue in design, especially for high-performance ICs. Because they usually employ level-sensitive latches for speed, high-performance IC designs need their clock schedules verified. Under process variations, this verification must compute the probability of correct clocking. Because of complex statistical correlations, it is difficult for traditional iterative approaches to obtain accurate results. Instead, a statistical check of the structural conditions for correct clocking is proposed, where the central problem is to compute the probability of having a positive cycle in a graph with random edge weights. The proposed method traverses the graph only once to avoid the correlations among iterations, and it considers not only data delay variations but also clock skew variations. Experimental results show that the proposed approach has an error of 0.14% on average in comparison with Monte Carlo simulations.
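The central problem named above, the probability of a positive cycle in a graph with random edge weights, has an easy brute-force baseline: sample the weights and run standard negative-cycle detection on the negated graph. This Monte Carlo sketch is ours, not the paper's single-traversal method (which exists precisely to avoid this cost), and its independence assumption ignores the correlations the paper handles.

```python
import random

def has_positive_cycle(n, edges):
    """Bellman-Ford on negated weights: a negative cycle there is a
    strictly positive cycle in the original graph.  edges: (u, v, w)."""
    dist = [0.0] * n                  # virtual source reaching every node
    for _ in range(n - 1):
        for u, v, w in edges:
            if dist[u] - w < dist[v]:
                dist[v] = dist[u] - w
    # any edge that still relaxes lies on (or reaches) a positive cycle
    return any(dist[u] - w < dist[v] - 1e-12 for u, v, w in edges)

def prob_positive_cycle(n, rand_edges, trials=20000, seed=1):
    """rand_edges: (u, v, mu, sigma) with independent Gaussian weights
    (a deliberate oversimplification of the paper's correlated model)."""
    rng = random.Random(seed)
    hits = sum(
        has_positive_cycle(n, [(u, v, mu + sigma * rng.gauss(0, 1))
                               for u, v, mu, sigma in rand_edges])
        for _ in range(trials))
    return hits / trials
```

Each trial costs a full Bellman-Ford pass, O(n * |E|), which is why a method that traverses the graph once and propagates distributions directly is attractive.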
A yield model for integrated circuits and its application to statistical timing analysis
IEEE Transactions on Computer-Aided Design
Abstract—A model for process-induced parameter variations is proposed, combining die-to-die, within-die systematic, and within-die random variations. This model is put to use toward finding suitable timing margins and device file settings, to verify whether a circuit meets a desired timing yield. While this parameter model is cognizant of within-die correlations, it does not require specific variation models, layout information, or prior knowledge of intra-chip covariance trends. The approach works with a “generic” critical path, leading to what is referred to as a “process-specific” statistical-timing-analysis technique that depends only on the process technology, transistor parameters, and circuit style. A key feature is that the variation model can be easily built from process data. The derived results are “full-chip,” applicable with ease to circuits with millions of components. As such, this provides a way to do statistical timing analysis without the need for a detailed statistical analysis of every path in the design. Index Terms—Correlations, die-to-die variations, generic critical path, parametric yield, principal component analysis, statistical timing analysis, timing margin, virtual corner, within-die variations.
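Read at its simplest, the margin question above reduces to one line of arithmetic. The sketch below assumes the three variation components are independent Gaussians whose variances add in quadrature, which is our simplification of the abstract's model; the function name is ours.

```python
import math
from statistics import NormalDist

def timing_margin(sigma_d2d, sigma_wid_sys, sigma_wid_rand, yield_target):
    """Margin to add to the nominal delay of the generic critical path so
    that P(delay <= nominal + margin) >= yield_target, assuming the three
    variation components are independent and Gaussian."""
    sigma = math.sqrt(sigma_d2d ** 2 + sigma_wid_sys ** 2
                      + sigma_wid_rand ** 2)
    return NormalDist().inv_cdf(yield_target) * sigma
```

For example, a 97.5% yield target costs about 1.96 combined sigmas of margin; the paper's contribution is choosing the sigmas from process data without per-path statistical analysis.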
Timing Yield Estimation Using Statistical Static Timing Analysis
In Proceedings of IEEE International Symposium on Circuits and Systems, 2005
Abstract—As process variations become a significant problem in deep-submicron technology, a shift from deterministic static timing analysis to statistical static timing analysis for high-performance circuit designs could reduce the excessive conservatism built into current timing design methods. In this paper, we address the timing yield problem for sequential circuits and propose a statistical approach to handle it. Our approach considers the spatial and path-reconvergence correlations between path delays, setup and hold time constraints, and clock skew due to process variations. We propose a method to obtain the timing yield based on the delay distributions of register-to-register paths in the circuit. The timing yield results obtained by our approach have average errors of less than 1.0% in comparison with Monte Carlo simulation. Experimental results show that shortest-path variations and clock skew due to process variations have a considerable impact on circuit timing, which can bias the timing yield results. In addition, the correlation between the longest and shortest path delays is not significant.
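A Monte Carlo baseline of the kind the authors compare against can be sketched directly from the quantities named above. The shared-component correlation model and every parameter below are our own illustrative choices, not the paper's.

```python
import math
import random

def timing_yield(T, setup, hold, paths, skew_sigma, rho=0.5,
                 trials=20000, seed=11):
    """Monte Carlo timing yield for register-to-register paths.
    Each path is (mu_long, sig_long, mu_short, sig_short); rho is the
    fraction of each path's variation tied to a shared global component,
    a crude stand-in for spatial correlation (our simplification)."""
    rng = random.Random(seed)
    mix = math.sqrt(1 - rho * rho)
    passed = 0
    for _ in range(trials):
        g = rng.gauss(0.0, 1.0)                  # shared (correlated) part
        skew = rng.gauss(0.0, skew_sigma)        # clock skew variation
        ok = True
        for mu_l, s_l, mu_s, s_s in paths:
            long_d = mu_l + s_l * (rho * g + mix * rng.gauss(0, 1))
            short_d = mu_s + s_s * (rho * g + mix * rng.gauss(0, 1))
            if long_d + setup + skew > T or short_d < hold + skew:
                ok = False                       # setup or hold violated
                break
        passed += ok
    return passed / trials
```

Including the short-path (hold) check alongside the long-path (setup) check is exactly the effect the abstract reports as significant.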
Time and Space Efficient Method for Accurate Computation of Error Detection Probabilities in VLSI Circuits
IEE Proc. of Computers, 2005
Abstract—We propose a novel fault/error model based on a graphical probabilistic framework. We arrive at the Logic Induced Fault Encoded Directed Acyclic Graph (LIFE-DAG), which is proven to be a Bayesian network capturing all spatial dependencies induced by the circuit logic. Bayesian networks are minimal, exact representations of the joint probability distribution of the underlying probabilistic dependencies; they not only use conditional independencies in modeling but also exploit them to achieve minimality and efficient probabilistic inference. The detection probabilities also act as a measure of soft-error susceptibility (an increased threat in nano-domain logic blocks) that depends on the structural correlations of the internal nodes and also on the input patterns. Based on this model, we show that we are able to estimate detection probabilities of faults/errors on ISCAS’85 benchmarks with high accuracy, linear space complexity, and a reduction in estimation time of about five times over corresponding BDD-based approaches.
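The quantity being estimated, a fault's detection probability, is easy to define by exhaustive enumeration on a toy circuit. This brute-force sketch (circuit and fault are our own example) scales exponentially in inputs; the paper's Bayesian-network inference is what makes benchmark-scale circuits tractable.

```python
from itertools import product

def detection_probability(good, faulty, n_inputs):
    """Fraction of uniformly random input vectors on which the faulty
    circuit's output differs from the fault-free one (exhaustive)."""
    diffs = sum(good(bits) != faulty(bits)
                for bits in product((0, 1), repeat=n_inputs))
    return diffs / 2 ** n_inputs

# toy circuit: out = (a AND b) OR c, with the AND output stuck at 0
def fault_free(bits):
    return (bits[0] & bits[1]) | bits[2]

def stuck_at_0(bits):
    return bits[2]          # the AND gate's output is forced to 0
```

Here the fault is visible only when a = b = 1 and c = 0, one of eight input vectors, so the detection probability is 0.125; structural correlations among internal nodes are exactly what the LIFE-DAG model captures without enumeration.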
A Heuristic Method for Statistical Digital Circuit Sizing
In this paper we give a brief overview of a heuristic method for approximately solving a statistical digital circuit sizing problem by reducing it to a related deterministic sizing problem that includes extra margins in each of the gate delays to account for the variation. Since the method is based on solving a deterministic sizing problem, it readily handles large-scale problems. Numerical experiments show that the resulting designs are often substantially better than designs in which the variation in delay is ignored, and often quite close to the global optimum. Moreover, the designs seem to be good despite the simplicity of the statistical model (which ignores gate delay distribution shape, correlations, and so on). We illustrate the method on a 32-bit Ladner-Fischer adder with a simple resistor-capacitor (RC) delay model and a Pelgrom model of delay variation.
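The heuristic's core, a deterministic surrogate with per-gate delay margins, can be contrasted with a Monte Carlo estimate of the true delay quantile. The toy chain, the independence assumption, and all numbers below are our illustration, not the paper's experiments.

```python
import random

def margined_delay(delays, sigmas, kappa):
    """Deterministic surrogate used by the heuristic: pad every gate
    delay by kappa times its standard deviation, then analyze as usual."""
    return sum(d + kappa * s for d, s in zip(delays, sigmas))

def mc_quantile(delays, sigmas, q=0.95, trials=50000, seed=7):
    """Monte Carlo estimate of the q-quantile of the true path delay,
    assuming independent Gaussian gate delays (deliberately simple,
    in the spirit of the abstract's statistical model)."""
    rng = random.Random(seed)
    samples = sorted(sum(rng.gauss(d, s) for d, s in zip(delays, sigmas))
                     for _ in range(trials))
    return samples[int(q * trials)]
```

For a four-gate chain with unit delays and sigma 0.1 each, the independent sigmas add in quadrature to 0.2, so the true 95% quantile is about 4.33, while the kappa = 1.645 margined delay is 4.66: the surrogate over-covers on chains, which is why a modest margin already produces good designs.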