Results 1 - 9 of 9
A heuristic for optimizing stochastic activity networks with applications to statistical digital circuit sizing
 IEEE Transactions on Circuits and Systems I
, 2004
Abstract

Cited by 12 (4 self)
A deterministic activity network (DAN) is a collection of activities, each with some duration, along with a set of precedence constraints, which specify that activities begin only when certain others have finished. One critical performance measure for an activity network is its makespan, which is the minimum time required to complete all activities. In a stochastic activity network (SAN), the durations of the activities and the makespan are random variables. The analysis of SANs is quite involved, but can be carried out numerically by Monte Carlo analysis. This paper concerns the optimization of a SAN, i.e., the choice of some design variables that affect the probability distributions of the activity durations. We concentrate on the problem of minimizing a quantile (e.g., 95%) of the makespan, subject to constraints on the variables. This problem has many applications, ranging from project management to digital integrated circuit (IC) sizing (the latter being our motivation). While there are effective methods for optimizing DANs, the SAN optimization problem is much more difficult; the few existing methods cannot handle large-scale problems.
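The Monte Carlo analysis the abstract mentions can be sketched in a few lines: sample activity durations, propagate finish times through the DAG in topological order, and read a quantile off the sorted makespan samples. The toy network, normal duration model, and quantile routine below are invented for illustration and are not taken from the paper:

```python
import random

# Toy stochastic activity network (SAN). The DAG and the normal duration
# parameters below are invented for illustration; they are not from the paper.
ACTIVITIES = {
    "a": {"preds": [], "mean": 2.0, "std": 0.3},
    "b": {"preds": ["a"], "mean": 3.0, "std": 0.5},
    "c": {"preds": ["a"], "mean": 1.5, "std": 0.2},
    "d": {"preds": ["b", "c"], "mean": 2.5, "std": 0.4},
}
TOPO_ORDER = ["a", "b", "c", "d"]  # any valid topological order works

def sample_makespan():
    """One Monte Carlo trial: sample durations, propagate finish times."""
    finish = {}
    for name in TOPO_ORDER:
        act = ACTIVITIES[name]
        start = max((finish[p] for p in act["preds"]), default=0.0)
        duration = max(0.0, random.gauss(act["mean"], act["std"]))
        finish[name] = start + duration
    return max(finish.values())  # makespan = latest finish time

def makespan_quantile(q=0.95, trials=10_000):
    """Estimate the q-quantile of the makespan by sorting MC samples."""
    samples = sorted(sample_makespan() for _ in range(trials))
    return samples[int(q * trials) - 1]

random.seed(0)
q95 = makespan_quantile()
print(f"Estimated 95% makespan quantile: {q95:.2f}")
```

Optimizing over design variables would wrap a search loop around this estimator, which is exactly why the quantile objective is expensive and why the paper pursues a heuristic instead.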
A Vector-based Approach for Power Supply Noise Analysis
 in Int’l Test Conf
, 2005
Abstract

Cited by 5 (1 self)
Excessive power supply noise can lead to overkill during delay test. A static compaction solution is described to prevent such overkill. Low-cost power supply noise models are developed and used in compaction. An error analysis of these models is given. This paper improves on prior work in terms of models and algorithms to increase accuracy and performance. Experimental results are given on ISCAS89 circuits.
Massive Statistical Process Variations: A Grand Challenge for Testing Nanoelectronic Circuits
 INTERNATIONAL CONFERENCE ON DEPENDABLE SYSTEMS AND NETWORKS WORKSHOPS (DSNW)
, 2010
Abstract

Cited by 3 (3 self)
Increasing parameter variations, high defect densities and a growing susceptibility to external noise in nanoscale technologies have led to a paradigm shift in design. Classical design strategies based on worst-case or average assumptions have been replaced by statistical design, and new robust and variation-tolerant architectures have been developed. At the same time, testing has become extremely challenging, as parameter variations may lead to an unacceptable behavior or change the impact of defects. Furthermore, for robust designs a precise quality assessment is required, particularly showing the remaining robustness in the presence of manufacturing defects. The paper pinpoints the key challenges for testing nanoelectronic circuits in more detail, covering the range from variation-aware fault modeling via methods for statistical testing and their algorithmic foundations to robustness analysis and quality binning.
A Heuristic Method for Statistical Digital Circuit Sizing
Abstract

Cited by 1 (1 self)
In this paper we give a brief overview of a heuristic method for approximately solving a statistical digital circuit sizing problem, by reducing it to a related deterministic sizing problem that includes extra margins in each of the gate delays to account for the variation. Since the method is based on solving a deterministic sizing problem, it readily handles large-scale problems. Numerical experiments show that the resulting designs are often substantially better than one in which the variation in delay is ignored, and often quite close to the global optimum. Moreover, the designs seem to be good despite the simplicity of the statistical model (which ignores gate distribution shape, correlations, and so on). We illustrate the method on a 32-bit Ladner-Fischer adder, with a simple resistor-capacitor (RC) delay model, and a Pelgrom model of delay variation.
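The margin idea in the abstract — replacing each random gate delay by a deterministic surrogate that includes an extra margin — can be sketched as follows. The gate data, the margin factor kappa ≈ 1.65 (a one-sided 95% normal factor), and the path model are all assumptions chosen for illustration; this is not the paper's actual sizing algorithm:

```python
# Illustrative sketch of the margin idea: replace each random gate delay by
# its nominal value plus a multiple of its standard deviation, then analyze
# the resulting deterministic circuit. All gate data below are invented.

GATE_DELAYS = {          # gate -> (nominal delay in ns, std-dev in ns)
    "inv1": (0.10, 0.02),
    "nand2": (0.15, 0.03),
    "xor1": (0.25, 0.05),
}

def margined_delay(nominal, std, kappa=1.65):
    """Deterministic surrogate: nominal + kappa * std.

    kappa ~ 1.65 is the one-sided 95% factor for a normal distribution,
    an assumption made here for the sake of the example."""
    return nominal + kappa * std

def path_delay(path, kappa=1.65):
    """Delay of a path under the margined deterministic model."""
    return sum(margined_delay(*GATE_DELAYS[g], kappa) for g in path)

print(round(path_delay(["inv1", "nand2", "xor1"]), 3))  # 0.665
```

Because the margined problem is an ordinary deterministic sizing problem, any standard DAN optimizer can be applied to it unchanged, which is what makes the reduction scale.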
Modeling and Simulation of Time Domain Faults in Digital Systems
Abstract

Cited by 1 (0 self)
The purpose of this paper is to present and discuss a novel modeling and fault simulation technique for two types of dynamic faults in digital systems: transient power supply voltage drops and transient delays in logic elements or signal paths. Techniques and tools currently used for permanent faults are reused for dynamic (permanent and intermittent) faults. For transient power supply voltage drops (VDD), two approaches are proposed: delay fault injection in all logic elements of the CUT (Circuit Under Test), or modulation of the clock and observation rate. For transient delays (e.g., SEU), single delay injection is performed at logic element level. Delay modulation is carried out by fault injection using the PLI interface of a commercial Verilog simulation tool. Preliminary results, demonstrated on the c7552 ISCAS'85 benchmark circuit, show that CUTs with long critical paths are very sensitive to power supply transients. Moreover, a pseudo-random test pattern can be used to identify the dependence of the CUT sensitivity to delay faults on defect size, for a given clock period.
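The core of single delay injection is simple: add an extra delay at one logic element and check whether the slowed path now violates the clock period. The paper does this through the Verilog PLI inside a commercial simulator; the standalone Python model below is only a toy abstraction of that idea, with invented path delays:

```python
# Toy model of delay-fault injection at logic-element level. The real work
# uses the Verilog PLI inside a simulator; here a path is just a list of
# gate delays, and a transient fault adds an extra delay at one gate.
# A fault is "detected" when the slowed path violates the clock period.

def path_is_failing(gate_delays, clock_period, fault_gate=None, extra=0.0):
    """True if the path delay, with an optional injected delay at index
    fault_gate, exceeds the clock period."""
    total = 0.0
    for i, delay in enumerate(gate_delays):
        total += delay + (extra if i == fault_gate else 0.0)
    return total > clock_period

path = [0.2, 0.3, 0.25, 0.15]   # invented gate delays along one path, ns

# Fault-free, the path meets a 1.0 ns clock period ...
assert not path_is_failing(path, clock_period=1.0)
# ... but a 0.2 ns transient delay injected at gate 1 is caught.
assert path_is_failing(path, clock_period=1.0, fault_gate=1, extra=0.2)
```

This also shows why long critical paths are the sensitive ones: the less slack a path has against the clock period, the smaller the injected delay needed to produce a failure.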
The Confluence of Manufacturing Test and Design Validation
Abstract
In gigascale, nanometer devices, we see an increasing intersection of manufacturing test, design verification, and design for robustness. While each of these domains has its own unique failure modes and errors, we see a huge potential in shared solutions, especially in the following areas:

1. Delay testing and timing verification. The noise-, process-, thermal- and power-induced delay variations make circuit delays much less predictable. At the same time, due to small and subtle defects, devices are more likely to marginally violate performance specifications under scaled technologies. Like the signal lines, clock lines are also becoming more susceptible to variations and defects. For timing verification, such trends invalidate traditional, static (i.e., vectorless) timing verification paradigms and create a demand for dynamic solutions that would require carefully crafted test vectors for accurate timing simulation. For delay testing, we need test vectors that can exercise various worst-case timing scenarios to screen out devices with parametric variations and small timing defects. To reduce the overall costs and effort involved in design and test, we need to develop models, tools, and methodologies that can generate high-quality, cost-effective test vectors that serve both applications [1].

2. Verification and test sharing the same computational technologies. Some aspects of functional verification are now routinely performed with tools that employ ATPG and/or SAT techniques [2]. For example, establishing the equivalence between two implementations of the same design is now usually done using formal equivalence-checking tools, which have more or less replaced simulators as the preferred method. Such tools use ATPG, SAT and BDD engines, which are also widely used in tools for generating manufacturing tests [3]. Property verification via model checking, a technique which
Variation-Aware Fault Modeling
 19TH IEEE ASIAN TEST SYMPOSIUM
, 2010
Abstract
To achieve a high product quality for nanoscale systems, both realistic defect mechanisms and process variations must be taken into account. While existing approaches for variation-aware digital testing either restrict themselves to special classes of defects or assume given probability distributions to model variabilities, the proposed approach combines defect-oriented testing with statistical library characterization. It uses Monte Carlo simulations at electrical level to extract delay distributions of cells in the presence of defects and for the defect-free case. This allows distinguishing the effects of process variations on the cell delay from defect-induced cell delays under process variations. To provide a suitable interface for test algorithms at higher levels of abstraction, the distributions are represented as histograms and stored in a histogram data base (HDB). Thus, the computationally expensive defect analysis needs to be performed only once as a preprocessing step for library characterization, and statistical test algorithms do not require any low-level information beyond the HDB. The generation of the HDB is demonstrated for primitive cells in 45nm technology.
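The HDB workflow the abstract outlines — Monte Carlo sampling of cell delays, binned into per-cell histograms for both the defect-free and defective case — can be sketched as below. The normal delay model, bin width, cell names, and the extra defect delay are invented assumptions for illustration, not the paper's 45nm library data:

```python
import random
from collections import Counter

# Sketch of building histogram data base (HDB) entries for one cell, in the
# spirit of the abstract: Monte Carlo samples of the cell delay under process
# variation, binned into a histogram. Delay model, bin width, cell names and
# the defect delay are all invented for this example.

def cell_delay_histogram(mean, std, bin_width=0.01, trials=5000,
                         extra_defect_delay=0.0):
    """Sample cell delays and bin them; extra_defect_delay shifts the
    distribution to model a delay defect inside the cell."""
    hist = Counter()
    for _ in range(trials):
        delay = random.gauss(mean, std) + extra_defect_delay
        hist[round(delay / bin_width) * bin_width] += 1
    return hist

random.seed(1)
hdb = {
    "NAND2/defect-free": cell_delay_histogram(mean=0.10, std=0.01),
    "NAND2/delay-defect": cell_delay_histogram(mean=0.10, std=0.01,
                                               extra_defect_delay=0.05),
}

# The histogram modes differ by roughly the injected defect delay, which is
# what lets a test algorithm separate defect-induced delay from variation.
mode_good = hdb["NAND2/defect-free"].most_common(1)[0][0]
mode_bad = hdb["NAND2/delay-defect"].most_common(1)[0][0]
print(round(mode_bad - mode_good, 2))
```

The point of the design is visible even in this sketch: the expensive sampling runs once per cell during characterization, and test algorithms afterwards consult only the stored histograms.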
Variation-Aware Fault Grading
Abstract
This article may be used for research, teaching and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan or sublicensing, systematic supply or distribution in any form to anyone is expressly forbidden.
Towards Variation-Aware Test Methods
Abstract
This article may be used for research, teaching and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan or sublicensing, systematic supply or distribution in any form to anyone is expressly forbidden.