Results 1–10 of 11
An Efficient Algorithm for Statistical Minimization of Total Power under Timing Yield Constraints
In DAC, 2005
Cited by 38 (1 self)
Power minimization under variability is formulated as a rigorous statistical robust optimization program with a guarantee of power and timing yields. Both power and timing metrics are treated probabilistically. Power reduction is performed by simultaneous sizing and dual threshold voltage assignment. An extremely fast runtime is achieved by casting the problem as a second-order conic problem and solving it with efficient interior-point optimization methods. Compared to deterministic optimization, the new algorithm, on average, reduces static power by 31% and total power by 17% without loss of parametric yield. The runtime on a variety of public and industrial benchmarks is 30X faster than other known statistical power minimization algorithms.
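The conic casting described in this abstract can be illustrated with a small sketch (an assumption-laden reconstruction, not the paper's code: the delay model, matrix `A`, sizing vector `x`, and yield target `eta` are invented). With a Gaussian delay of mean mu(x) and standard deviation ||Ax||_2, the timing-yield constraint P(delay <= T) >= eta becomes the second-order cone constraint mu(x) + Phi^{-1}(eta) * ||Ax||_2 <= T:

```python
# Sketch only: checks the conic form of one probabilistic timing
# constraint. All names and numbers below are illustrative assumptions.
import numpy as np
from statistics import NormalDist

def soc_timing_feasible(x, mu_coef, A, T, eta=0.9987):
    """Second-order cone form of P(delay <= T) >= eta.

    Mean delay is modeled as mu_coef @ x; the delay standard deviation
    as ||A @ x||_2 (linear dependence on process parameters).
    """
    k = NormalDist().inv_cdf(eta)          # Phi^{-1}(eta), ~3 for 3-sigma yield
    mean_delay = float(mu_coef @ x)
    sigma = float(np.linalg.norm(A @ x))
    return mean_delay + k * sigma <= T

# Toy instance: two gates with small, independent variability.
x = np.array([1.0, 1.0])
mu_coef = np.array([2.0, 3.0])
A = 0.1 * np.eye(2)
print(soc_timing_feasible(x, mu_coef, A, T=6.0))
```

An interior-point solver would optimize gate sizes subject to many such cone constraints at once; this sketch only checks feasibility of a single one.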
Gate Sizing Using Incremental Parameterized Statistical Timing Analysis
In ICCAD, 2005
Cited by 25 (1 self)
Abstract—As technology scales into the sub-90nm domain, manufacturing variations become an increasingly significant portion of circuit delay. As a result, delays must be modeled as statistical distributions during both analysis and optimization. This paper uses incremental, parametric statistical static timing analysis (SSTA) to perform gate sizing with a required yield target. Both correlated and uncorrelated process parameters are considered by using a first-order linear delay model with fitted process sensitivities. The fitted sensitivities are verified against circuit simulations to be accurate. Statistical information in the form of criticality probabilities is used to actively guide the optimization process, which reduces runtime and improves area and performance. The gate sizing results show a significant improvement in worst slack at 99.86% yield over deterministic optimization.
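The criticality probabilities used to guide sizing can be illustrated with a Monte Carlo sketch (the two-path circuit and its first-order delay model are invented for illustration; the paper computes these quantities analytically via parameterized SSTA, not by sampling):

```python
# Hedged sketch: a path's criticality probability is the chance it
# sets the circuit delay. Shared parameter dP models correlation.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
dP = rng.standard_normal(n)                        # correlated process parameter
# First-order linear delay model: d = nominal + sensitivity * dP + local noise.
path_a = 10.0 + 0.8 * dP + 0.3 * rng.standard_normal(n)
path_b = 10.2 + 0.2 * dP + 0.3 * rng.standard_normal(n)

circuit_delay = np.maximum(path_a, path_b)
crit_a = np.mean(path_a >= path_b)                 # P(path A is critical)
crit_b = 1.0 - crit_a
print(f"criticality A={crit_a:.3f}, B={crit_b:.3f}")
```

An optimizer would spend sizing effort preferentially on gates lying on high-criticality paths, which is the runtime/quality win the abstract describes.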
Robust Gate Sizing via Mean Excess Delay Minimization
Cited by 1 (1 self)
We introduce mean excess delay as a statistical measure of circuit delay in the presence of parameter variations. The β-mean excess delay is defined as the expected delay of the circuits that exceed the β-quantile of the delay, so it is always an upper bound on the β-quantile. However, in contrast to the β-quantile, it preserves the convexity properties of the underlying delay distribution. We apply the β-mean excess delay to the circuit sizing problem and use it to minimize the delay quantile over the gate sizes. We use the Analytic Center Cutting Plane Method to perform the minimization and apply this sizing to the ISCAS '85 benchmarks. Depending on the structure of the circuit, it can make significant improvements on the 95%-quantile.
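A minimal numerical sketch of the β-mean excess delay (the same construction as conditional value-at-risk in finance); the lognormal delay distribution and sample count are arbitrary assumptions, not from the paper:

```python
# Sketch: the beta-mean excess delay is the mean of the samples beyond
# the beta-quantile, so it always upper-bounds that quantile.
import numpy as np

def mean_excess_delay(samples, beta):
    q = np.quantile(samples, beta)     # beta-quantile of the delay samples
    tail = samples[samples > q]        # the circuits that exceed it
    return q, tail.mean()

rng = np.random.default_rng(1)
delays = rng.lognormal(mean=2.0, sigma=0.25, size=200_000)  # assumed skewed delay
q95, med95 = mean_excess_delay(delays, 0.95)
print(f"95%-quantile={q95:.3f}  mean-excess={med95:.3f}")
```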
On the Futility of Statistical Power Optimization
Cited by 1 (1 self)
In response to the increasing variations in integrated-circuit manufacturing, the current trend is to create designs that take these variations into account statistically. In this paper we try to quantify the difference between the statistical and deterministic optima of leakage power while making no assumptions about the delay model. We develop a framework for deriving a theoretical upper bound on the suboptimality incurred by using the deterministic optimum as an approximation for the statistical optimum. On average, the bound is 2.4% for a suite of benchmark circuits in a 45 nm technology. We further give an intuitive explanation and show, by using solution rank orders, that the practical suboptimality gap is much lower. Therefore, the need for statistical power modeling for the purpose of optimization is questionable.
Evaluating the Effectiveness of Statistical Gate Sizing for Power Optimization
2005
Cited by 1 (0 self)
We evaluate the effectiveness of statistical gate sizing to minimize circuit power. We develop reliable posynomial models for delay and power that are accurate to within 5-10% of 130 nm library data. We formulate statistical sizing as a geometric program, accounting for randomness in gate delays. For various ISCAS'85 circuits, statistical sizing at a 99.8% target yield provides 25% power reduction compared to a 3σ worst-case deterministic approach. However, this can be replicated by deterministic sizing using a less conservative corner. Statistical sizing, under assumptions of variational independence, is still conservative, and further power reductions can be achieved for the same timing target and yield.
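The abstract's point that a per-gate 3σ corner is conservative relative to a statistical 3σ path delay can be checked with assumed numbers: under independence, standard deviations add in quadrature along a path, so the statistical 3σ point sits well below the sum of per-gate corners.

```python
# All numbers are assumed for illustration; independent identical gates.
import math

n_gates = 20
mu, sigma = 1.0, 0.1                                   # per-gate delay stats
corner_path = n_gates * (mu + 3 * sigma)               # sum of 3-sigma corners
stat_path = n_gates * mu + 3 * sigma * math.sqrt(n_gates)  # statistical 3-sigma
print(corner_path, round(stat_path, 3))
```

This gap is the margin a "less conservative corner" can reclaim deterministically, which is why the statistical and corner-based optima end up close.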
Timing Budgeting under Arbitrary Process Variations
Timing budgeting under process variations is an important step in a statistical optimization flow. We propose a novel formulation of the problem in which budgets are statistical instead of deterministic as in existing works. This new formulation considers changes in both the means and variances of delays, and thus can reduce the timing violation introduced by ignoring changes in the variances. We transform the problem into a linear programming problem using a robust optimization technique. Our approach can be used in late-stage design where detailed distribution information is known, and is most useful in early-stage design since it does not assume specific underlying distributions. In addition, with the help of block-level timing budgeting, our approach can reduce timing pessimism. We apply our approach to the leakage power minimization problem. The results demonstrate that it can reduce the timing violation from 690ps to 0ps, and the worst total leakage power by 17.50% on average.
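A toy numerical sketch of the motivation above (all numbers are assumed): a budget computed from mean delays alone can violate a 3σ timing constraint once the accompanying variance growth is accounted for, which is what a statistical budget guards against.

```python
# Illustrative only; not the paper's robust-LP formulation.
T = 10.0                         # timing constraint (assumed)
mu, sigma = 8.0, 0.5             # path delay stats before budgeting (assumed)
k = 3.0                          # 3-sigma robustness level

slack_on_mean = T - mu - k * sigma       # slack a mean-only budget would hand out
d_mu, d_sigma = slack_on_mean, 0.2       # spend it all, but sigma also grows

violated = mu + d_mu + k * (sigma + d_sigma) > T
print(violated)   # the variance change alone breaks the 3-sigma constraint
```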
A Statistical Circuit Optimization Algorithm under Thermal and Timing Constraints
Process variation has become a crucial challenge for both the interconnect delay and the reliability of nanometer integrated circuit designs. Furthermore, the dramatic increase in power consumption and integration density has led to high operating temperatures. Temperature, as well as electromigration (EM) and power, also significantly affects the delay and reliability of interconnects. Considering process variation, we use statistical methods to simultaneously optimize circuit area, delay, power, thermal, and EM reliability by sizing circuit components (both wires and gates). We model the problem as a second-order conic program and solve it with the interior-point optimization method. Experimental results show that our statistical algorithm can efficiently find desired solutions that satisfy all delay, power, and thermal constraints.
Set of Gaussian Random Variables
Abstract—This paper quantifies the approximation error when results obtained by Clark (Oper. Res., vol. 9, p. 145, 1961) are employed to compute the maximum (max) of Gaussian random variables, which is a fundamental operation in statistical timing. We show that a finite lookup table can be used to store these errors. Based on the error computations, approaches to different orderings for pairwise max operations on a set of Gaussians are proposed. Experimental results show accuracy improvements in the computation of the max of multiple Gaussians, in comparison to the traditional approach. In addition, we present an approach to compute the tightness probabilities of Gaussian random variables with dynamic runtime-accuracy tradeoff options. We replace the required numerical computations for their estimation with closed-form expressions based on Taylor series expansion that involve table lookup and a few fundamental arithmetic operations. Experimental results demonstrate an average speedup of 2× using our approach for computing the maximum of two Gaussians, in comparison to the traditional approach, without any accuracy penalty. Index Terms—Computer-aided design (CAD), Gaussian approximation, statistical timing, very large-scale integration
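Clark's moment-matching formulas, whose approximation error this paper tabulates, are compact: with θ² = σ₁² + σ₂² − 2ρσ₁σ₂ and α = (μ₁−μ₂)/θ, the tightness probability is Φ(α) and E[max] = μ₁Φ(α) + μ₂Φ(−α) + θφ(α). A minimal standard-library implementation (the numeric inputs are illustrative):

```python
# Clark (1961) first-moment formula for max(X, Y) of two correlated
# Gaussians; assumes theta > 0 (X and Y not perfectly correlated copies).
import math
from statistics import NormalDist

def clark_max(mu1, s1, mu2, s2, rho):
    """Return (E[max(X, Y)], tightness probability P(X > Y))."""
    theta = math.sqrt(s1 * s1 + s2 * s2 - 2 * rho * s1 * s2)  # std of X - Y
    alpha = (mu1 - mu2) / theta
    phi = math.exp(-0.5 * alpha * alpha) / math.sqrt(2 * math.pi)  # N(0,1) pdf
    Phi = NormalDist().cdf(alpha)                                  # N(0,1) cdf
    mean_max = mu1 * Phi + mu2 * (1.0 - Phi) + theta * phi
    return mean_max, Phi

m, t = clark_max(1.0, 0.3, 1.1, 0.4, rho=0.2)
print(f"E[max]={m:.4f}  tightness P(X>Y)={t:.4f}")
```

The error the paper studies comes from re-using this Gaussian approximation of the max as an input to further max operations; the ordering of those pairwise operations is what its lookup tables help choose.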
Analysis and Optimization under Crosstalk and Variability in Deep Sub-Micron VLSI Circuits
, 2006
With very large scale integrated (VLSI) circuit fabrication entering the deep sub-micron era, devices are scaled down to finer geometries, clocks are run at higher frequencies, and more functionality is integrated into one chip. All of this brings the great promise of “system-on-a-chip”, but also introduces challenging new issues in the design process. As a result of the increasing frequency and density, coupling effects, or crosstalk, between neighboring wires are increased. These effects can cause functionality and timing failures in a circuit. The dynamic power consumed in charging or discharging coupling capacitances is timing dependent and contributes significantly to a circuit’s power consumption. In addition, manufacturing process variations (e.g. VT, Le) and environmental variations (e.g. Vdd, temperature) contribute to uncertainties that deeply impact the timing characteristics of a circuit. This variability makes timing verification, and consequently timing-driven circuit optimization, extremely difficult. Although worst-case analyses for circuit optimization are simpler, they are not desirable since they severely over-constrain the optimization problem and result in designs that have excessive penalties in terms of area or power consumption. In this research, we investigate the essential problems of timing verification, power estimation,
Evaluating Statistical Power Optimization
Abstract—In response to the increasing variations in integrated-circuit manufacturing, the current trend is to create designs that take these variations into account statistically. In this paper, we quantify the difference between the statistical and deterministic optima of leakage power while making no assumptions about the delay model. We develop a framework for deriving a theoretical upper bound on the suboptimality incurred by using the deterministic optimum as an approximation for the statistical optimum. We show that for the mean power measure, the deterministic optimum is an excellent approximation, and that for the mean-plus-standard-deviation measures, the optimality gap increases as the amount of inter-die variation grows, for a suite of benchmark circuits in a 45 nm technology. For large variations, we show that there are excellent linear approximations that can be used to approximate the effects of variation. Therefore, the need to develop special statistical power optimization algorithms is questionable. Index Terms—Algorithms, gate sizing, optimization, physical design, statistical power.