Results 1-10 of 65
Symbolic Boolean manipulation with ordered binary-decision diagrams
 ACM Computing Surveys
, 1992
"... Ordered BinaryDecision Diagrams (OBDDS) represent Boolean functions as directed acyclic graphs. They form a canonical representation, making testing of functional properties such as satmfiability and equivalence straightforward. A number of operations on Boolean functions can be implemented as grap ..."
Abstract

Cited by 879 (11 self)
 Add to MetaCart
Ordered BinaryDecision Diagrams (OBDDS) represent Boolean functions as directed acyclic graphs. They form a canonical representation, making testing of functional properties such as satmfiability and equivalence straightforward. A number of operations on Boolean functions can be implemented as graph algorithms on OBDD
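The canonicity described above can be sketched in a few lines: interning nodes in a "unique table" and removing redundant tests makes equivalent functions collapse to the identical graph, so equivalence becomes a pointer comparison and satisfiability a constant-time check. The toy below (truth-table input, fixed variable order; all names are illustrative, not the paper's implementation) shows the idea.

```python
# A toy reduced OBDD, assuming a fixed variable order x0 < x1 < ...
# (illustrative sketch, not the paper's implementation).

TRUE, FALSE = "1", "0"   # terminal nodes
_unique = {}             # unique table: (var, low, high) -> interned node

def mk(var, low, high):
    """Make (or reuse) the node testing `var`, applying the reduction rules."""
    if low == high:      # redundant test: both branches agree
        return low
    return _unique.setdefault((var, low, high), (var, low, high))

def from_tt(tt, var=0):
    """Build the OBDD of a truth table (tuple of 0/1, length 2**n) by
    Shannon expansion: the first half is the var=0 cofactor."""
    if len(tt) == 1:
        return TRUE if tt[0] else FALSE
    half = len(tt) // 2
    return mk(var, from_tt(tt[:half], var + 1), from_tt(tt[half:], var + 1))

# Two syntactically different definitions of XOR yield the identical node.
f = from_tt(tuple(a ^ b for a in (0, 1) for b in (0, 1)))
g = from_tt(tuple(int((a and not b) or (b and not a))
                  for a in (0, 1) for b in (0, 1)))
print(f is g)            # True: equivalence is a pointer test
print(f is not FALSE)    # True: satisfiability is a constant-time check
```

Real OBDD packages add an operation cache and an `apply` routine for Boolean connectives, but the unique table above is what makes the representation canonical.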
Generic ILP versus Specialized 0-1 ILP: An Update
 IN INTERNATIONAL CONFERENCE ON COMPUTER-AIDED DESIGN
, 2002
"... Optimized solvers for the Boolean Satisfiability (SAT) problem have many applications in areas such as hardware and software verification, FPGA routing, planning, etc. Further uses are complicated by the need to express "counting constraints" in conjunctive normal form (CNF). Expressing such constra ..."
Abstract

Cited by 77 (21 self)
 Add to MetaCart
Optimized solvers for the Boolean Satisfiability (SAT) problem have many applications in areas such as hardware and software verification, FPGA routing, planning, etc. Further uses are complicated by the need to express "counting constraints" in conjunctive normal form (CNF). Expressing such constraints by pure CNF leads to more complex SAT instances. Alternatively, those constraints can be handled by Integer Linear Programming (ILP), but generic ILP solvers may ignore the Boolean nature of 01 variables. Therefore specialized 01 ILP solvers extend SAT solvers to handle these socalled "pseudoBoolean" constraints. This work
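The CNF blow-up mentioned above is visible on the simplest counting constraint, at-most-one. The standard pairwise encoding below (shown as an illustration, not code from the paper; clauses use the DIMACS convention of positive integers for variables and negative for negations) needs quadratically many clauses, whereas one pseudo-Boolean constraint x1+x2+x3+x4 <= 1 states the same thing in a single line.

```python
# Pairwise CNF encoding of at-most-one, a special case of the "counting
# constraints" discussed above (illustrative textbook encoding).

from itertools import combinations

def at_most_one(variables):
    """One clause (NOT xi OR NOT xj) per pair: n*(n-1)/2 clauses total."""
    return [[-i, -j] for i, j in combinations(variables, 2)]

clauses = at_most_one([1, 2, 3, 4])
print(len(clauses))   # 6 clauses for 4 variables, versus one
                      # pseudo-Boolean line: x1 + x2 + x3 + x4 <= 1
```

More compact CNF encodings exist (sequential counters, commander variables), but all add auxiliary variables or clauses; avoiding that trade-off is exactly the motivation for pseudo-Boolean solvers.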
Rough Mereology: A New Paradigm For Approximate Reasoning
, 1996
"... We are concerned with formal models of reasoning under uncertainty. Many approaches to this problem are known in the literature e.g. DempsterShafer theory, bayesianbased reasoning, belief networks, fuzzy logics etc. We propose rough mereology as a foundation for approximate reasoning about complex ..."
Abstract

Cited by 57 (25 self)
 Add to MetaCart
We are concerned with formal models of reasoning under uncertainty. Many approaches to this problem are known in the literature e.g. DempsterShafer theory, bayesianbased reasoning, belief networks, fuzzy logics etc. We propose rough mereology as a foundation for approximate reasoning about complex objects. Our notion of a complex object includes approximate proofs understood as schemes constructed to support our assertions about the world on the basis of our incomplete or uncertain knowledge. 1 Introduction We present a formal model of approximate reasoning about processes of synthesis of complex systems. First ideas of this approach have been presented in [15], [24], [25], [27], [28], [29], [30], [31]. Our research has been stimulated by the demand for solutions of the following groups of problems, estimated in [1] to be crucial for the progress in the area of automated design and manufacturing. These groups of problems are concerned with the treatment of: Group 1. Poorly defined...
Dynamic Reducts as a Tool for Extracting Laws from Decision Tables
, 1994
"... . We apply rough set methods and boolean reasoning for knowledge discovery from decision tables. It is not always possible to extract general laws from experimental data by computing first all reducts [12] of a decision table and next decision rules on the basis of these reducts. We investigate a pr ..."
Abstract

Cited by 53 (13 self)
 Add to MetaCart
. We apply rough set methods and boolean reasoning for knowledge discovery from decision tables. It is not always possible to extract general laws from experimental data by computing first all reducts [12] of a decision table and next decision rules on the basis of these reducts. We investigate a problem how information about the reduct set changes in a random sampling process of a given decision table could be used to generate these laws. The reducts stable in the process of decision table sampling are called dynamic reducts. Dynamic reducts define the set of attributes called the dynamic core. This is the set of attributes included in all dynamic reducts. The set of decision rules can be computed from the dynamic core or from the best dynamic reducts. We report the results of experiments with different data sets, e.g. market data, medical data, textures and handwritten digits. The results are showing that dynamic reducts can help to extract laws from decision tables. Key words: evol...
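The sampling idea above can be sketched under simplifying assumptions (a tiny consistent table, brute-force reduct search; the data and helper names are invented, and the paper's algorithms are far more refined): reducts of the full table that reappear as reducts of random subtables are the stable, "dynamic" ones.

```python
# Brute-force sketch of dynamic reducts. A table is a list of rows whose
# last entry is the decision; an attribute subset is "consistent" if rows
# that agree on it never disagree on the decision, and a reduct is a
# minimal consistent subset. (Illustrative toy, not the paper's code.)

from itertools import combinations
import random

def consistent(table, attrs):
    seen = {}
    for row in table:
        key = tuple(row[a] for a in attrs)
        if seen.setdefault(key, row[-1]) != row[-1]:
            return False
    return True

def reducts(table, n_attrs):
    found = []
    for k in range(n_attrs + 1):
        for attrs in combinations(range(n_attrs), k):
            if consistent(table, attrs) and \
               not any(set(r) <= set(attrs) for r in found):
                found.append(attrs)
    return found

def dynamic_reducts(table, n_attrs, samples=20, frac=0.7, seed=0):
    """Count, for each reduct of the full table, in how many random
    subtables it is again a reduct: the stable ones are 'dynamic'."""
    rng = random.Random(seed)
    counts = {r: 0 for r in reducts(table, n_attrs)}
    for _ in range(samples):
        sub = rng.sample(table, int(frac * len(table)))
        sub_reducts = set(reducts(sub, n_attrs))
        for r in counts:
            counts[r] += r in sub_reducts
    return counts

# Toy table: the decision (last column) equals attribute 0.
table = [(0,0,0,0), (0,1,0,0), (0,0,1,0), (1,0,1,1), (1,1,1,1), (1,1,0,1)]
print(dynamic_reducts(table, 3))   # attribute {0} survives every sample
```

Here the only reduct, {attribute 0}, remains a reduct in all sampled subtables, so it would enter the dynamic core; attributes whose counts drop under sampling would be discarded as artifacts of the particular data.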
BDS: A BDD-Based Logic Optimization System
 Proc. of DAC 2000
, 2000
"... This paper describes a new BDDbased logic optimization system, BDS. It is based on a recently developed theory for BDDbased logic decomposition, which supports both algebraic and Boolean factorization. New techniques, which are crucial to the manipulation of BDDs in a partitioned Boolean network e ..."
Abstract

Cited by 53 (0 self)
 Add to MetaCart
This paper describes a new BDDbased logic optimization system, BDS. It is based on a recently developed theory for BDDbased logic decomposition, which supports both algebraic and Boolean factorization. New techniques, which are crucial to the manipulation of BDDs in a partitioned Boolean network environment, are described in detail. The experimental results show that BDS has a capability to handle very large circuits. It offers a superior runtime advantage over SIS, with comparable results in terms of circuit area and often improved delay.
Rough Sets: A Tutorial
, 1998
"... A rapid growth of interest in rough set theory [290] and its applications can be lately seen in the number of international workshops, conferences and seminars that are either directly dedicated to rough sets, include the subject in their programs, or simply accept papers that use this approach t ..."
Abstract

Cited by 53 (3 self)
 Add to MetaCart
A rapid growth of interest in rough set theory [290] and its applications can be lately seen in the number of international workshops, conferences and seminars that are either directly dedicated to rough sets, include the subject in their programs, or simply accept papers that use this approach to solve problems at hand. A large number of high quality papers on various aspects of rough sets and their applications have been published in recent years as a result of this attention. The theory has been followed by the development of several software systems that implement rough set operations. In Section 12 we present a list of software systems based on rough sets. Some of the toolkits, provide advanced graphical environments that support the process of developing and validating rough set classifiers. Rough sets are applied in many domains, such as, for instance, medicine, finance, telecommunication, vibration analysis, conflict resolution, intelligent agents, image analysis, p...
A Comparison of Dynamic and non-Dynamic Rough Set Methods for Extracting Laws from Decision Tables
, 1998
"... We report results of experiments on several data sets, in particular: Monk's problems data (see [58]), medical data (lymphography, breast cancer, primary tumor  see [30]) and StatLog's data (see [32]). We compare standard methods for extracting laws from decision tables (see [43], [52]), based on r ..."
Abstract

Cited by 50 (5 self)
 Add to MetaCart
We report results of experiments on several data sets, in particular: Monk's problems data (see [58]), medical data (lymphography, breast cancer, primary tumor  see [30]) and StatLog's data (see [32]). We compare standard methods for extracting laws from decision tables (see [43], [52]), based on rough set (see [42]) and boolean reasoning (see [8]), with the method based on dynamic reducts and dynamic rules (see [3],[4],[5],[6]). We also compare the results of computer experiments on those data sets obtained by applying our system based on rough set methods with the results on the same data sets obtained with help of several data analysis systems known from literature.
Multi-Level Logic Optimization by Implication Analysis
 In International Conference on Computer Aided Design
, 1994
"... This paper proposes a new approach to multilevel logic optimization based on ATPG (Automatic Test Pattern Generation). Previous ATPGbased methods for logic minimization suffered from the limitation that they were quite restricted in the set of possible circuit transformations. We show that the ATP ..."
Abstract

Cited by 50 (7 self)
 Add to MetaCart
This paper proposes a new approach to multilevel logic optimization based on ATPG (Automatic Test Pattern Generation). Previous ATPGbased methods for logic minimization suffered from the limitation that they were quite restricted in the set of possible circuit transformations. We show that the ATPGbased method presented here allows (in principle) the transformation of a given combinational network C into an arbitrary, structurally different but functionally equivalent combinational network C'. Furthermore, powerful heuristics are presented in order to decide what network manipulations are promising for minimizing the circuit. By identifying indirect implications between signals in the circuit, transformations can be derived which are "good" candidates for the minimization of the circuit. In particular, it is shown that Recursive Learning can derive "good" Boolean divisors justifying the effort to attempt a Boolean division. For 9 out of 10 ISCAS85 benchmark circuits our tool HANNI...
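The correctness criterion behind such transformations can be shown in miniature: a network C and its restructured version C' must agree on every input vector. The two tiny netlists below are invented examples; exhaustive simulation only works for small circuits, which is why practical tools rely on ATPG or BDD techniques instead.

```python
# Checking functional equivalence of two structurally different
# combinational networks by exhaustive simulation (toy example).

from itertools import product

def C(a, b, c):
    """Original network: two AND gates feeding an OR."""
    return (a and b) or (a and c)

def C_prime(a, b, c):
    """Restructured network: the factored form a AND (b OR c)."""
    return a and (b or c)

equivalent = all(bool(C(*v)) == bool(C_prime(*v))
                 for v in product((False, True), repeat=3))
print(equivalent)   # True: same function, different structure
```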
Fault Classes and Error Detection Capability of Specification-Based Testing
 ACM Transactions on Software Engineering and Methodology
, 1999
"... This paper describes a method for computing the conditions that must be covered by a test set for the test set to guarantee detection of the particular fault class. It is shown that there is a coverage hierarchy to fault classes that is consistent with, and may therefore explain, experimental result ..."
Abstract

Cited by 41 (3 self)
 Add to MetaCart
This paper describes a method for computing the conditions that must be covered by a test set for the test set to guarantee detection of the particular fault class. It is shown that there is a coverage hierarchy to fault classes that is consistent with, and may therefore explain, experimental results on fault based testing. The method is also shown to be effective for computing MCDCadequate [9] tests
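The underlying detection condition can be sketched directly: a test set guarantees detection of a fault iff it contains at least one point where the faulty expression differs from the specification. The predicate and the two fault classes below (a variable negation fault and an operator reference fault) are invented examples, not taken from the paper, but they illustrate why a test set can be adequate for one fault class and miss another.

```python
# Checking whether a test set detects faults from two classic fault
# classes applied to a Boolean specification (illustrative example).

spec = lambda a, b, c: (a or b) and c           # specified condition
vnf  = lambda a, b, c: (a or (not b)) and c     # variable negation fault on b
orf  = lambda a, b, c: (a and b) and c          # operator reference fault: or -> and

def detects(tests, good, bad):
    """True iff some test point distinguishes the faulty expression."""
    return any(good(*t) != bad(*t) for t in tests)

tests = [(True, True, True), (False, False, True)]
print(detects(tests, spec, vnf))   # True: second test exposes the VNF
print(detects(tests, spec, orf))   # False: this ORF needs a or b, not both
```

The asymmetry (the same two tests catch the VNF but not the ORF) is the kind of relationship the paper organizes into a coverage hierarchy over fault classes.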
Propositional Belief Base Update and Minimal Change
 Artificial Intelligence
, 1999
"... In this paper we examine ten concrete propositional update operations of the literature. We start by completely characterizing their relative strength and their computational complexity. Then we evaluate the competing update operations w.r.t. the postulates proposed by Katsuno and Mendelzon. It turn ..."
Abstract

Cited by 39 (5 self)
 Add to MetaCart
In this paper we examine ten concrete propositional update operations of the literature. We start by completely characterizing their relative strength and their computational complexity. Then we evaluate the competing update operations w.r.t. the postulates proposed by Katsuno and Mendelzon. It turns out that the majority violates most of the postulates. We argue that all violated postulates are undesirable except one. After that we evaluate the update operations w.r.t. another property which has been investigated extensively in the literature, viz. that disjunctive updates should not be identified with the exclusive disjunction. We argue that this is desirable, and show that the argument gives further support to the rejection of two of the postulates. Finally we study how the different approaches accommodate general laws governing the world, alias integrity constraints. Summing up our results, we conclude that only two of the update operations are satisfactory. Key words: Belief chan...
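One classical minimal-change update operation in this family is Winslett's possible-models approach (PMA); the sketch below, over a two-atom language, is only an illustration of "update by minimal change" and makes no claim about which of the paper's ten operations it corresponds to. Each model of the old base is replaced by the closest models of the update formula, closeness measured by the symmetric difference of true atoms.

```python
# PMA-style minimal-change update over a tiny propositional language
# (illustrative sketch; formulas are given as Python predicates).

from itertools import product

ATOMS = ("p", "q")

def models(formula):
    """All valuations (dicts over ATOMS) satisfying `formula`."""
    return [v for bits in product((False, True), repeat=len(ATOMS))
            for v in [dict(zip(ATOMS, bits))] if formula(v)]

def diff(u, v):
    """Number of atoms on which two valuations disagree."""
    return sum(u[a] != v[a] for a in ATOMS)

def update(base, new):
    """Each old model moves to its diff-minimal models of the new formula."""
    new_models = models(new)
    result = []
    for old in models(base):
        best = min(diff(old, c) for c in new_models)
        result += [c for c in new_models
                   if diff(old, c) == best and c not in result]
    return result

# Updating "p and q" by "not p": minimal change keeps q true.
updated = update(lambda v: v["p"] and v["q"], lambda v: not v["p"])
print(updated)   # [{'p': False, 'q': True}]
```

Note how the result differs from revision by simple conjunction: the model {p: False, q: False} also satisfies "not p" but is discarded because it changes more atoms than necessary.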