Results 1-10 of 53
Graph-based algorithms for Boolean function manipulation
IEEE Transactions on Computers, 1986
"... In this paper we present a new data structure for representing Boolean functions and an associated set of manipulation algorithms. Functions are represented by directed, acyclic graphs in a manner similar to the representations introduced by Lee [1] and Akers [2], but with further restrictions on th ..."
Abstract

Cited by 3499 (47 self)
In this paper we present a new data structure for representing Boolean functions and an associated set of manipulation algorithms. Functions are represented by directed, acyclic graphs in a manner similar to the representations introduced by Lee [1] and Akers [2], but with further restrictions on the ordering of decision variables in the graph. Although a function requires, in the worst case, a graph of size exponential in the number of arguments, many of the functions encountered in typical applications have a more reasonable representation. Our algorithms have time complexity proportional to the sizes of the graphs being operated on, and hence are quite efficient as long as the graphs do not grow too large. We present experimental results from applying these algorithms to problems in logic design verification that demonstrate the practicality of our approach.
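The restricted graph representation described above can be sketched in a few lines. This is an illustrative reconstruction, not the paper's implementation: the class and method names (`BDD`, `mk`, `apply`) are my own, and the variable order is assumed fixed as x0 < x1 < ...

```python
class BDD:
    """Reduced ordered BDD with terminals 0 and 1; internal ids start at 2."""

    def __init__(self):
        self.unique = {}   # (var, low, high) -> node id (hash-consing table)
        self.info = {}     # node id -> (var, low, high)
        self.next_id = 2

    def mk(self, var, low, high):
        if low == high:             # reduction: drop a redundant test
            return low
        key = (var, low, high)
        if key not in self.unique:  # reduction: share identical subgraphs
            self.unique[key] = self.next_id
            self.info[self.next_id] = key
            self.next_id += 1
        return self.unique[key]

    def var(self, i):
        return self.mk(i, 0, 1)     # BDD of the single variable x_i

    def apply(self, op, u, v, cache=None):
        """Combine two BDDs under a Boolean operator, memoized so the
        cost is proportional to the product of the graph sizes."""
        if cache is None:
            cache = {}
        if (u, v) in cache:
            return cache[(u, v)]
        if u < 2 and v < 2:         # both terminals: evaluate directly
            return int(op(bool(u), bool(v)))
        vu = self.info[u][0] if u >= 2 else float("inf")
        vv = self.info[v][0] if v >= 2 else float("inf")
        i = min(vu, vv)             # branch on the earlier variable
        u0, u1 = self.info[u][1:] if vu == i else (u, u)
        v0, v1 = self.info[v][1:] if vv == i else (v, v)
        r = self.mk(i,
                    self.apply(op, u0, v0, cache),
                    self.apply(op, u1, v1, cache))
        cache[(u, v)] = r
        return r


b = BDD()
x0, x1 = b.var(0), b.var(1)
f = b.apply(lambda p, q: p and q, x0, x1)  # x0 AND x1
g = b.apply(lambda p, q: p and q, x1, x0)  # same function, operands swapped
```

Because the reduced ordered form is canonical for a fixed variable order, equivalence checking, the logic-verification application the abstract mentions, reduces to comparing node ids: `f` and `g` above come out identical.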
Symbolic manipulation of Boolean functions using a graphical representation
In DAC, 1985
"... In this paper we describe a data structure for representing Boolean functions and an associated set of manipulation algorithms. Functions are represented by directed, acyclic graphs in a manner similar to the representations of Lee and Akers, but with further restrictions on the ordering of decision ..."
Abstract

Cited by 71 (3 self)
In this paper we describe a data structure for representing Boolean functions and an associated set of manipulation algorithms. Functions are represented by directed, acyclic graphs in a manner similar to the representations of Lee and Akers, but with further restrictions on the ordering of decision variables in the graph. Although a function requires, in the worst case, a graph of size exponential in the number of arguments, many of the functions encountered in typical applications have a more reasonable representation. Our algorithms are quite efficient as long as the graphs being operated on do not grow too large. We present performance measurements obtained while applying these algorithms to problems in logic design verification.
Binary decision diagrams in theory and practice
2001
"... Decision diagrams (DDs) are the stateoftheart data structure in VLSI CAD and have been successfully applied in many other fields.DDs are widely used and are also integrated in commercial tools.This special section comprises six contributed articles on various aspects of the theory and application ..."
Abstract

Cited by 31 (7 self)
Decision diagrams (DDs) are the state-of-the-art data structure in VLSI CAD and have been successfully applied in many other fields. DDs are widely used and are also integrated in commercial tools. This special section comprises six contributed articles on various aspects of the theory and application of DDs. As preparation for these contributions, the present article reviews the basic definitions of binary decision diagrams (BDDs). We provide a brief overview and study theoretical and practical aspects. Basic properties of BDDs are discussed and manipulation algorithms are described. Extensions of BDDs are investigated, and by this we give a deeper insight into the basic data structure. Finally, we outline several applications of BDDs and their extensions and suggest a number of articles and books for those who wish to pursue the topic in more depth.
EaseCAM: An Energy and Storage Efficient TCAM-Based Router Architecture for IP Lookup
IEEE Transactions on Computers, 2005
"... Abstract—Ternary Content Addressable Memories (TCAMs) have been emerging as a popular device in designing routers for packet forwarding and classifications. Despite their premise on highthroughput, large TCAM arrays are prohibitive due to their excessive power consumption and lack of scalable desig ..."
Abstract

Cited by 20 (2 self)
Ternary Content Addressable Memories (TCAMs) have been emerging as a popular device in designing routers for packet forwarding and classification. Despite their promise of high throughput, large TCAM arrays are prohibitive due to their excessive power consumption and lack of scalable design schemes. This paper presents a TCAM-based router architecture that is energy and storage efficient. We introduce prefix aggregation and expansion techniques to compact the effective TCAM size in a router. Pipelined and paging schemes are employed in the architecture to activate a limited number of entries in the TCAM array during an IP lookup. The new architecture provides low power, fast incremental updating, and fast table lookup. Heuristic algorithms for page filling, fast prefix update, and memory management are also provided. Results are illustrated with two large routers (bbnplanet and attcanada) to demonstrate the effectiveness of our approach. Index Terms: Router, TCAMs, IP lookup, partition, compaction, page table.
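Prefix aggregation, one of the compaction ideas named in the abstract, can be illustrated schematically. This is a generic sketch of the idea, not EaseCAM's actual algorithm, and it assumes that sibling prefixes map to the same next hop.

```python
def aggregate(prefixes):
    """Repeatedly merge sibling prefixes P+'0' and P+'1' into the single
    shorter prefix P, so each merged pair costs one TCAM entry, not two."""
    pset = set(prefixes)
    changed = True
    while changed:
        changed = False
        for p in list(pset):
            # the sibling differs from p only in the last bit
            sibling = p[:-1] + ("1" if p.endswith("0") else "0")
            if sibling in pset:
                pset -= {p, sibling}
                pset.add(p[:-1])
                changed = True
                break
    return pset

table = aggregate(["010", "011", "10"])   # "010"/"011" collapse to "01"
```

Expansion works in the opposite direction, splitting a short prefix into longer ones so that entries can be grouped into the pages the abstract describes.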
BOOM: A Heuristic Boolean Minimizer
International Conference on Computer-Aided Design, 2001
"... We present a twolevel Boolean minimization tool (BOOM) based on a new implicant generation paradigm. In contrast to all previous minimization methods, where the implicants are generated bottomup, the proposed approach uses a topdown approach. Thus instead of increasing the dimensionality of impli ..."
Abstract

Cited by 16 (5 self)
We present a two-level Boolean minimization tool (BOOM) based on a new implicant generation paradigm. In contrast to all previous minimization methods, where implicants are generated bottom-up, the proposed approach works top-down: instead of increasing the dimensionality of implicants by omitting literals from their terms, the dimension of a term is gradually decreased by adding new literals. Unlike most other minimization tools such as ESPRESSO, BOOM does not use the definition of the function to be minimized as a basis for the solution; the original coverage influences the solution only indirectly, through the number of literals used. Most minimization methods use the two basic phases introduced by Quine and McCluskey, known as prime implicant (PI) generation and covering problem solution. More modern methods, like ESPRESSO, combine these two phases, reducing the number of PIs to be processed. BOOM takes this approach as well: the search for new literals to be included in a term aims at maximum coverage of the output function. The function to be minimized is defined by its on-set and off-set listed in a truth table, so the don't-care set, which often represents the dominant part of the truth table, need not be specified explicitly. The proposed minimization method is efficient above all for functions with a large number of input variables but only a few defined care terms. The minimization procedure is very fast, hence if the first solution does not meet the requirements, it can be improved iteratively. The method has been tested on several different kinds of problems. The MCNC standard benchmarks were solved many times in order to evaluate the minimality of the solution and the runtime.
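The top-down direction can be sketched as follows. This is my own illustrative code under simplifying assumptions (explicit on-set/off-set minterm lists, a single greedy pass); the real BOOM uses more refined literal-selection heuristics and iterative improvement.

```python
def covers(term, minterm):
    """term maps variable index -> required value; minterm is a 0/1 tuple."""
    return all(minterm[v] == val for v, val in term.items())

def expand_implicant(on_set, off_set, n_vars):
    """Grow a term top-down: start from the universal cube (no literals)
    and add literals until no off-set minterm is covered, greedily
    preferring literals that keep the most on-set minterms covered."""
    term = {}
    while any(covers(term, m) for m in off_set):
        best = None
        for v in range(n_vars):
            if v in term:
                continue
            for val in (0, 1):
                cand = {**term, v: val}
                on_cov = sum(covers(cand, m) for m in on_set)
                off_cov = sum(covers(cand, m) for m in off_set)
                score = (on_cov, -off_cov)
                if on_cov and (best is None or score > best[0]):
                    best = (score, cand)
        if best is None:
            return None            # no literal preserves on-set coverage
        term = best[1]
    return term

# 2-input AND: on-set {x0 x1}, off-set everything else
implicant = expand_implicant([(1, 1)], [(0, 0), (0, 1), (1, 0)], 2)
```

Note how the loop literally decreases the dimension of the term one literal at a time, the reverse of bottom-up expansion from minterms.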
Column-Matching BIST Exploiting Test Don't-Cares
Proc. 8th IEEE European Test Workshop (ETW'03), Maastricht, The Netherlands, 2003
"... We propose a new testperclock BIST method for combinational or fullscan circuits. Our aim is to design a combinational block transforming the LFSR code words into deterministic test patterns precomputed by some ATPG tool. The proposed algorithm is an enhancement of a column matching method, in w ..."
Abstract

Cited by 15 (7 self)
We propose a new test-per-clock BIST method for combinational or full-scan circuits. Our aim is to design a combinational block transforming the LFSR code words into deterministic test patterns precomputed by some ATPG tool. The proposed algorithm is an enhancement of a column-matching method, in which as many of the decoder's output variables as possible are implemented as mere wires, i.e., without any logic. The enhancement consists in extending the method to test sets containing don't cares. These don't cares allow us to reach a higher number of column matches, which significantly reduces the BIST logic.
Espresso-HF: A Heuristic Hazard-Free Minimizer for Two-Level Logic
 In DAC
"... We present a new heuristic algorithm for hazardfree minimization of twolevel logic. On nearly all examples, the algorithm finds an exactly minimumcost cover. It also solves several problems which have not been previously solved using existing exact minimizers. We believe this is the first heurist ..."
Abstract

Cited by 12 (5 self)
We present a new heuristic algorithm for hazard-free minimization of two-level logic. On nearly all examples, the algorithm finds an exactly minimum-cost cover. It also solves several problems which have not been previously solved using existing exact minimizers. We believe this is the first heuristic method based on Espresso to solve the general hazard-free two-level minimization problem for multiple-input change transitions.
FC-Min: A Fast Multi-Output Boolean Minimizer
Proc. 29th Euromicro Symposium on Digital Systems Design (DSD'03), 2003
"... We present a novel heuristic algorithm for twolevel Boolean minimization. In contrast to the other approaches, the proposed method firstly finds the coverage of the onsets and from that it derives the group implicants. No prime implicants of the single functions are being computed; only the necess ..."
Abstract

Cited by 9 (3 self)
We present a novel heuristic algorithm for two-level Boolean minimization. In contrast to other approaches, the proposed method first finds a coverage of the on-sets and from that derives the group implicants. No prime implicants of the single functions are computed; only the implicants needed to cover the on-sets are produced. This reverse approach makes the algorithm extremely fast and minimizes memory demands. It is most efficient for functions with a large number of output variables, where other minimization algorithms (e.g. ESPRESSO) are too slow. It is also very efficient for highly unspecified functions, i.e. functions with only a few defined terms.
Column-Matching Based BIST Design Method
Proc. 7th IEEE European Test Workshop (ETW 2002), Corfu, Greece, 2002
"... A new method of testperclock BIST design for combinational circuits is proposed. The fundamental problem of matching the PRPG outputs with the required test patterns is solved as a general design problem in the field of combinational logic. A test set generated by an ATPG is compared with the PRPG ..."
Abstract

Cited by 9 (2 self)
A new method of test-per-clock BIST design for combinational circuits is proposed. The fundamental problem of matching the PRPG outputs with the required test patterns is solved as a general design problem in the field of combinational logic. A test set generated by an ATPG is compared with the PRPG-generated sequence. The solution is based on a novel search algorithm, which identifies the best matches between pairs of columns of the two sets.
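The core matching step can be sketched in miniature. This is a simplified illustration, not the paper's search algorithm: it assumes a fixed row correspondence and exact 0/1 patterns, whereas the actual method also reorders rows and handles the unmatched columns with synthesized decoding logic.

```python
def match_columns(prpg, tests):
    """prpg, tests: lists of rows (0/1 tuples) with equal row counts.
    Returns {test_column: prpg_column} for columns that agree in every
    row and can therefore be implemented as plain wires."""
    matches = {}
    for t in range(len(tests[0])):
        t_col = [row[t] for row in tests]
        for p in range(len(prpg[0])):
            if [row[p] for row in prpg] == t_col:
                matches[t] = p   # decoder output t is a wire from PRPG bit p
                break
    return matches

prpg = [(0, 1, 1), (1, 0, 1), (1, 1, 0)]   # three PRPG code words
tests = [(1, 0), (0, 1), (1, 1)]           # three required test patterns
wires = match_columns(prpg, tests)
```

Every matched column removes one output of the decoding block entirely, which is why maximizing the number of matches minimizes the BIST logic.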
Fast Heuristic and Exact Algorithms for Two-Level Hazard-Free Logic Minimization
1998
"... None of the available minimizers for twolevel hazardfree logic minimization can synthesize very large circuits. This limitation has forced researchers to resort to manual and automated circuit partitioning techniques. This paper ..."
Abstract

Cited by 9 (2 self)
None of the available minimizers for two-level hazard-free logic minimization can synthesize very large circuits. This limitation has forced researchers to resort to manual and automated circuit partitioning techniques. This paper ...