Results 1–10 of 35
Gate Sizing Using Incremental Parameterized Statistical Timing Analysis
In ICCAD, 2005
Cited by 25 (1 self)
Abstract — As technology scales into the sub-90nm domain, manufacturing variations become an increasingly significant portion of circuit delay. As a result, delays must be modeled as statistical distributions during both analysis and optimization. This paper uses incremental, parametric statistical static timing analysis (SSTA) to perform gate sizing with a required yield target. Both correlated and uncorrelated process parameters are considered by using a first-order linear delay model with fitted process sensitivities. The fitted sensitivities are verified to be accurate with circuit simulations. Statistical information in the form of criticality probabilities is used to actively guide the optimization process, which reduces runtime and improves area and performance. The gate sizing results show a significant improvement in worst slack at 99.86% yield over deterministic optimization.
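The first-order linear (canonical) delay model this abstract refers to can be sketched as follows. The class name, the sensitivity values, and the use of a 3-sigma point for the 99.86% yield target are illustrative assumptions, not details taken from the paper:

```python
import math

# First-order canonical delay model: d = d0 + sum_i a_i * X_i + b * R,
# where the X_i are shared (correlated) process parameters, R is an
# independent random term, and all variations are unit normal.
class CanonicalDelay:
    def __init__(self, nominal, sens, indep):
        self.nominal = nominal  # mean delay d0
        self.sens = sens        # sensitivities a_i to shared parameters
        self.indep = indep      # sensitivity b to the independent term

    def sigma(self):
        # Standard deviation under independent unit-normal parameters.
        return math.sqrt(sum(a * a for a in self.sens) + self.indep ** 2)

    def quantile(self, z):
        # Delay at z standard deviations; z = 3 corresponds to the
        # one-sided ~99.86% yield point of a normal distribution.
        return self.nominal + z * self.sigma()

d = CanonicalDelay(nominal=100.0, sens=[5.0, 3.0], indep=2.0)
print(round(d.sigma(), 3))        # sqrt(25 + 9 + 4) = sqrt(38) ≈ 6.164
print(round(d.quantile(3.0), 3))  # delay at the ~99.86% yield point
```

An optimizer would size gates to improve `quantile(3.0)` of the worst path rather than a single deterministic corner delay.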
Analysis and modeling of CD variation for statistical static timing
In ICCAD, 2006
Cited by 18 (0 self)
Statistical static timing analysis (SSTA) has become a key method for analyzing the effect of process variation in aggressively scaled CMOS technologies. Much research has focused on the modeling of spatial correlation in SSTA. However, the vast majority of these works used artificially generated process data to test the proposed models. Hence, it is difficult to determine the actual effectiveness of these methods, the conditions under which they are necessary, and whether they lead to a significant increase in accuracy that warrants their increased runtime and complexity. In this paper, we study 5 different correlation models and their associated SSTA methods using 35,420 critical dimension (CD) measurements that were extracted from 23 reticles on 5 wafers in a 130nm CMOS process. Based on the measured CD data, we analyze the correlation as a function of distance and generate 5 distinct correlation models, ranging from simple models which incorporate one or two variation components to more complex models that utilize principal component analysis and Quadtrees. We then study the accuracy of the different models and compare their SSTA results with the result of running STA directly on the extracted data. We also examine the tradeoff between model accuracy and runtime, as well as the impact of die size on model accuracy. We show that, especially for small dies (< 6.6mm x 5.7mm), the simple models provide comparable accuracy to that of the more complex ones, while incurring significantly less runtime and implementation difficulty. The results of this study demonstrate that correlation models for SSTA must be carefully tested on actual process data and must be used judiciously.
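The simplest of the correlation models the abstract mentions, a decomposition into a die-to-die component plus a within-die component, can be sketched as follows. The measurement values are invented for illustration; the paper works from 35,420 real CD measurements:

```python
from statistics import mean, pvariance

# Decompose CD measurements into die-to-die and within-die variance
# components, the simplest of the one/two-component correlation models.
# The numbers below are made up, not the paper's measurement data.
dies = {
    "die1": [101.0, 99.5, 100.5, 100.0],
    "die2": [103.0, 102.5, 103.5, 103.0],
    "die3": [98.0, 98.5, 97.5, 98.0],
}

die_means = {d: mean(v) for d, v in dies.items()}
die_to_die_var = pvariance(die_means.values())   # variance across die means
within_die_var = mean(                           # average within-die variance
    pvariance(v) for v in dies.values()
)
print(die_to_die_var, within_die_var)
```

In such a model every gate on the same die shares one fully correlated die-to-die term, while the within-die term is treated as independent per gate; richer models (PCA, Quadtrees) refine the spatial structure of the within-die part.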
Criticality computation in parameterized statistical timing
In Proc. Design Automation Conf., 2006
Cited by 11 (1 self)
Chips manufactured in 90 nm technology have shown large parametric variations, and a worsening trend is predicted. These parametric variations make circuit optimization difficult since different paths are frequency-limiting in different parts of the multidimensional process space. Therefore, it is desirable to have a new diagnostic metric for robust circuit optimization. This paper presents a novel algorithm to compute the criticality probability of every edge in the timing graph of a design with linear complexity in the circuit size. Using industrial benchmarks, we verify the correctness of our criticality computation via Monte Carlo simulation. We also show that for large industrial designs with 442,000 gates, our algorithm computes all edge criticalities in less than 160 seconds.
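The criticality probability the paper computes analytically can be illustrated with the Monte Carlo baseline it validates against: the criticality of a path (or edge) is the fraction of process samples in which it is the slowest. The two-path circuit and its delay model below are hypothetical:

```python
import random

# Monte Carlo estimate of path criticality probabilities: the fraction
# of process samples in which each path limits the clock. The two-path
# "circuit" and its delay distributions are illustrative only.
def sample_path_delays(rng):
    x = rng.gauss(0.0, 1.0)                          # shared process parameter
    path_a = 100.0 + 4.0 * x + rng.gauss(0.0, 1.0)   # high sensitivity
    path_b = 101.0 + 1.0 * x + rng.gauss(0.0, 1.0)   # slower nominal
    return [path_a, path_b]

def criticality(n_samples=100_000, seed=0):
    rng = random.Random(seed)
    wins = [0, 0]
    for _ in range(n_samples):
        delays = sample_path_delays(rng)
        wins[delays.index(max(delays))] += 1
    return [w / n_samples for w in wins]

probs = criticality()
print(probs)  # criticality probabilities; they sum to 1
```

Note that the path with the smaller nominal delay still carries substantial criticality because of its larger process sensitivity, which is exactly why a single deterministic critical path is a poor optimization target.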
Interval-based Robust Statistical Techniques for Non-negative Convex Functions, with Application to Timing Analysis of Computer Chips
In Proceedings of the Second International Workshop on Reliable Engineering Computing, 2006
Cited by 9 (4 self)
In chip design, one of the main objectives is to decrease the clock cycle. At the design stage, this time is usually estimated by using worst-case (interval) techniques, in which we only use the bounds on the parameters that lead to delays. This analysis does not take into account that the probability of the worst-case values is usually very small; thus, the resulting estimates are over-conservative, leading to unnecessary overdesign and underperformance of circuits. If we knew the exact probability distributions of the corresponding parameters, then we could use Monte Carlo simulations (or the corresponding analytical techniques) to get the desired estimates. In practice, however, we only have partial information about the corresponding distributions, and we want to produce estimates that are valid for all distributions which are consistent with this information.
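The gap between a worst-case interval bound and a probabilistic estimate can be demonstrated on a toy path. The stage bounds and the uniform-distribution assumption are illustrative, chosen only to show why the interval bound is pessimistic:

```python
import random

# Contrast the worst-case (interval) bound on total path delay with a
# Monte Carlo estimate of a high quantile. Stage bounds are illustrative;
# each stage delay is assumed uniform on [lo, hi].
stages = [(9.0, 11.0), (4.5, 5.5), (14.0, 16.0)]  # (lo, hi) per stage

worst_case = sum(hi for _, hi in stages)  # interval upper bound: 32.5

def mc_quantile(q=0.999, n=200_000, seed=1):
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.uniform(lo, hi) for lo, hi in stages) for _ in range(n)
    )
    return totals[int(q * n)]

print(worst_case)               # attained only if every stage is worst at once
print(round(mc_quantile(), 2))  # 99.9th percentile, below the interval bound
```

The interval bound requires all three stages to hit their maxima simultaneously, an event of vanishing probability, which is the over-conservatism the abstract describes; the paper's contribution is producing such estimates when only partial distribution information is available.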
Nonlinear statistical static timing analysis for non-Gaussian variation sources
In Proc. DAC, 2007
Cited by 8 (3 self)
Existing statistical static timing analysis (SSTA) techniques suffer from limited modeling capability by using a linear delay model with Gaussian distribution, or have scalability problems due to expensive operations involved to handle non-Gaussian variation sources or nonlinear delays. To overcome these limitations, we propose a novel SSTA technique to handle both nonlinear delay dependency and non-Gaussian variation sources simultaneously. We develop algorithms to perform all statistical atomic operations (such as max and add) efficiently via either closed-form formulas or one-dimensional lookup tables. The resulting timing quantity provably preserves the correlation with variation sources to the third order. We prove that the complexity of our algorithm is linear in both the number of variation sources and the circuit size, hence our algorithm scales well for …
Statistical timing yield optimization by gate sizing
In TCAD, 2006
Cited by 5 (3 self)
Abstract—In this paper, we propose a statistical gate sizing approach to maximize the timing yield of a given circuit, under area constraints. Our approach involves statistical gate delay modeling, statistical static timing analysis, and gate sizing. Experiments performed in an industrial framework on combinational International Symposium on Circuits and Systems (ISCAS'85) and Microelectronics Center of North Carolina (MCNC) benchmarks show absolute timing yield gains of 30% on average, over deterministic timing optimization, for at most a 10% area penalty. It is further shown that circuits optimized using our metric have larger timing yields than the same circuits optimized using a worst-case metric, for iso-area solutions. Finally, we present an insight into statistical properties of gate delays for a commercial 0.13 µm technology library which intuitively provides one reason why statistical timing-driven optimization does better than deterministic timing-driven optimization. Index Terms—Gate sizing, optimization, statistical gate delay modeling, statistical timing analysis, timing yield, variability, VLSI.
Quadratic Statistical MAX Approximation for Parametric Yield Estimation of Analog/RF Integrated Circuits
Cited by 3 (2 self)
Abstract—In this paper, we propose an efficient numerical algorithm for estimating the parametric yield of analog/RF circuits, considering large-scale process variations. Unlike many traditional approaches that assume normal performance distributions, the proposed approach is particularly developed to handle multiple correlated non-normal performance distributions, thereby providing better accuracy than the traditional techniques. Starting from a set of quadratic performance models, the proposed parametric yield estimation conceptually maps multiple correlated performance constraints to a single auxiliary constraint by using a MAX operator. As such, the parametric yield is uniquely determined by the probability distribution of the auxiliary constraint and, therefore, can easily be computed. In addition, two novel numerical algorithms are derived from moment matching and statistical Taylor expansion, respectively, to facilitate efficient quadratic statistical MAX approximation. We prove that these two algorithms are mathematically equivalent if the performance distributions are normal. Our numerical examples demonstrate that the proposed algorithm provides an error reduction of 6.5 times compared to a normal-distribution-based method while achieving a runtime speedup of 10–20 times over the Monte Carlo analysis with 10^3 samples. Index Terms—Analog/RF circuits, MAX operator, parametric yield.
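The reduction of several performance constraints to one auxiliary MAX constraint can be illustrated with a Monte Carlo baseline. The quadratic performance models below are invented; the paper's contribution is an analytical approximation of this MAX distribution, not the sampling shown here:

```python
import random

# Parametric yield as P(max_i g_i(X) <= 0): each performance constraint
# g_i is normalized so that g_i <= 0 means "spec met", and a MAX operator
# collapses them into one auxiliary constraint. The quadratic performance
# models and specs below are made up for illustration.
def aux_constraint(x1, x2):
    gain = 10.0 + 2.0 * x1 - 0.5 * x1 * x1   # quadratic model of gain
    power = 5.0 + 1.5 * x2 + 0.3 * x2 * x2   # quadratic model of power
    g_gain = 8.0 - gain                      # spec: gain >= 8
    g_power = power - 7.0                    # spec: power <= 7
    return max(g_gain, g_power)              # single auxiliary constraint

def mc_yield(n=100_000, seed=2):
    rng = random.Random(seed)
    ok = sum(
        aux_constraint(rng.gauss(0, 1), rng.gauss(0, 1)) <= 0.0
        for _ in range(n)
    )
    return ok / n

y = mc_yield()
print(y)  # fraction of samples meeting all specs simultaneously
```

Once the constraints are folded into one auxiliary quantity, the yield is simply the probability that this quantity is non-positive, which is what the paper approximates analytically.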
Non-Gaussian Statistical Parameter Modeling for SSTA with Confidence Interval Analysis
Cited by 3 (0 self)
Abstract — Most of the existing statistical static timing analysis (SSTA) algorithms assume that the process parameters are given with 100% confidence (zero error) and preferably follow Gaussian distributions. These assumptions are actually quite questionable and require careful attention. In this paper, we aim to provide solid statistical analysis methods to analyze the measurement data from test chips and extract the statistical distribution, either Gaussian or non-Gaussian, which can be used in advanced SSTA algorithms with confidence interval or error bound information. This paper makes two contributions. First, we develop a moment-matching-based quadratic function modeling method to fit the first three moments of given measurement data, which may not follow Gaussian distributions. Second, we provide a systematic way to analyze the confidence intervals of our modeling strategies. The confidence interval analysis gives solid guidelines for test-chip data collection. Extensive experimental results demonstrate the accuracy of our algorithm.
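The moment-matching quadratic model the abstract describes can be sketched as follows, assuming the common canonical form y = a + b·x + c·x² with x standard normal (the paper's exact formulation may differ). For this form the first three moments have closed expressions, so fitting reduces to solving one cubic:

```python
import math

# Moment-matching quadratic model y = a + b*x + c*x**2 with x ~ N(0,1),
# fitted to reproduce a given mean m, variance v, and third central
# moment s3 (skewed, non-Gaussian data). Target moments are illustrative.
# Moment relations for this form: E[y] = a + c, Var[y] = b^2 + 2c^2,
# E[(y-E[y])^3] = 6*b^2*c + 8*c^3 = 6*v*c - 4*c^3 once b^2 = v - 2c^2.
def fit_quadratic(m, v, s3):
    # Solve 4c^3 - 6vc + s3 = 0 for c by bisection, restricting c so
    # that b^2 = v - 2c^2 stays non-negative.
    f = lambda c: 4 * c**3 - 6 * v * c + s3
    lo, hi = -math.sqrt(v / 2), math.sqrt(v / 2)
    if f(lo) * f(hi) > 0:
        raise ValueError("skewness not representable by this model")
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(lo) * f(mid) > 0 else (lo, mid)
    c = 0.5 * (lo + hi)
    b = math.sqrt(max(v - 2 * c * c, 0.0))
    a = m - c
    return a, b, c

a, b, c = fit_quadratic(m=1.0, v=4.0, s3=3.0)
# Verify the fitted model reproduces the target moments:
print(round(a + c, 6), round(b * b + 2 * c * c, 6),
      round(6 * b * b * c + 8 * c**3, 6))
```

The moments here are point estimates from measured data; the paper's second contribution is attaching confidence intervals to them, which this sketch does not attempt.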
Parameterized block-based non-Gaussian statistical gate timing analysis
In Proc. of Asia and South Pacific Design Automation Conference, 2006
Cited by 3 (2 self)
… circuits becomes an increasingly challenging task due to the gate and wire variability. Therefore, statistical timing analysis (denoted by σTA) is becoming unavoidable. This paper introduces a new framework for performing statistical gate timing analysis for non-Gaussian sources of variation in block-based σTA. First, an approach is described to approximate a variational RC-π load by using a canonical first-order model. Next, an accurate variation-aware gate timing analysis based on statistical input transition, statistical gate timing library, and statistical RC-π load is presented. Finally, to achieve the aforementioned objective, a statistical effective capacitance calculation method is presented. Experimental results show an average error of 6% for gate delay and output transition time with respect to the Monte Carlo simulation with 10^4 samples, while the runtime is nearly two orders of magnitude shorter.
Parameterized timing analysis with general delay models and arbitrary variation sources
In Design Automation Conference, 2008
Cited by 2 (2 self)
In general, parameterized timing analysis must be able to handle nonlinear delay models and must account for delay variability due to both random process parameters (with arbitrary distributions) and uncertain non-random parameters (which may depend on the operating environment). Existing statistical static timing analysis (SSTA) techniques suffer from at least one of the following limitations: 1) restricting their modeling capabilities to linear models, 2) limiting random variations to Gaussian distributions, or 3) being unable to handle uncertain non-random parameters. In this work, we propose a general mathematical framework for resolving the max operation in parameterized timing analysis that can overcome all the above limitations. The framework is general, in the sense that it can be applied to a general class of nonlinear delay models. In addition, it can handle both statistical process parameters, with arbitrary distributions, as well as uncertain non-random parameters. The max operator is efficiently resolved using carefully chosen linear combinations that preserve the inherent nonlinearity of the delay model used. Our general technique is tested for two applications, namely multi-corner timing analysis and nonlinear non-Gaussian SSTA, where we show that its complexity is linear in both the number of process parameters and the size of the circuit. Our results show that, on average, all timing characteristics of circuit delay are predicted with less than 2% error for multi-corner analysis, and less than 1% error for SSTA.