Results 1–10 of 111
Discovering Frequent Closed Itemsets for Association Rules
, 1999
Abstract

Cited by 333 (10 self)
In this paper, we address the problem of finding frequent itemsets in a database. Using the closed itemset lattice framework, we show that this problem can be reduced to the problem of finding frequent closed itemsets. Based on this result, we can construct efficient data mining algorithms by limiting the search space to the closed itemset lattice rather than the subset lattice. Moreover, we show that the set of all frequent closed itemsets suffices to determine a reduced set of association rules, thus addressing another important data mining problem: limiting the number of rules produced without information loss. We propose a new algorithm, called A-Close, that uses a closure mechanism to find frequent closed itemsets. We conducted experiments to compare our approach to the commonly used frequent itemset search approach. These experiments showed that our approach is very valuable for dense and/or correlated data, which represent an important part of existing databases.
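The closure mechanism this abstract refers to can be illustrated with the standard Galois closure from the closed-itemset literature. The sketch below is illustrative only; the function names and the toy transactions are ours, not the authors' implementation:

```python
# Illustrative sketch (our own naming, not the paper's algorithm): the Galois
# closure of an itemset X is the intersection of all transactions containing X,
# and X is *closed* exactly when it equals its own closure.

def closure(itemset, transactions):
    """Return the Galois closure of `itemset` over `transactions`."""
    covering = [t for t in transactions if itemset <= t]
    if not covering:
        return frozenset(itemset)  # X occurs nowhere; treat it as its own closure
    result = set(covering[0])
    for t in covering[1:]:
        result &= t
    return frozenset(result)

transactions = [frozenset("ABC"), frozenset("ABD"), frozenset("AB"), frozenset("C")]

# 'A' always occurs together with 'B', so {A} is not closed:
print(closure(frozenset("A"), transactions))  # frozenset({'A', 'B'})
# {A, B} equals its own closure, so it is closed:
print(closure(frozenset("AB"), transactions) == frozenset("AB"))  # True
```

Searching only such closed itemsets shrinks the search space exactly as the abstract describes, since every itemset has the same support as its closure.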
Ontology-Based Integration of Information: A Survey of Existing Approaches
, 2001
Abstract

Cited by 196 (1 self)
We review the use of ontologies for the integration of heterogeneous information sources. Based on an in-depth evaluation of existing approaches to this problem, we discuss how ontologies are used to support the integration task. We evaluate and compare the languages used to represent the ontologies and the use of mappings between ontologies as well as to connect ontologies with information sources. We also enquire into ontology engineering methods and tools used to develop ontologies for information integration. Based on the results of our analysis, we summarize the state of the art in ontology-based information integration and name areas for further research activities.
Efficient Mining Of Association Rules Using Closed Itemset Lattices
 INFORMATION SYSTEMS
, 1999
Abstract

Cited by 116 (7 self)
Discovering association rules is one of the most important tasks in data mining. Many efficient algorithms have been proposed in the literature. The most notable are Apriori, Mannila's algorithm, Partition, Sampling, and DIC, all of which are based on the Apriori mining method: pruning the subset lattice (itemset lattice). In this paper we propose an efficient algorithm, called Close, based on a new mining method: pruning the closed set lattice (closed itemset lattice). This lattice, which is a sub-order of the subset lattice, is closely related to Wille's concept lattice in formal concept analysis. Experiments comparing Close to an optimized version of Apriori showed that Close is very efficient for mining dense and/or correlated data such as census-style data, and performs reasonably well for market-basket-style data.
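The key property behind pruning the closed itemset lattice is that every itemset has the same support as its closure, so the frequent closed itemsets determine the supports of all frequent itemsets. A minimal sketch of that property (hypothetical helper names and toy data, not the Close algorithm itself):

```python
from itertools import combinations

def support(itemset, transactions):
    """Number of transactions containing `itemset`."""
    return sum(1 for t in transactions if itemset <= t)

def closure(itemset, transactions):
    """Intersection of all transactions containing `itemset` (assumes support > 0)."""
    covering = [set(t) for t in transactions if itemset <= t]
    result = covering[0]
    for t in covering[1:]:
        result &= t
    return frozenset(result)

transactions = [frozenset("ABC"), frozenset("ABC"), frozenset("AB"), frozenset("BC")]
items = sorted({i for t in transactions for i in t})

# Every itemset with nonzero support has the same support as its closure,
# so supports need only be stored for the (fewer) closed itemsets.
for k in range(1, len(items) + 1):
    for combo in combinations(items, k):
        x = frozenset(combo)
        if support(x, transactions) > 0:
            assert support(x, transactions) == support(closure(x, transactions), transactions)
print("support(X) == support(closure(X)) for every occurring itemset X")
```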
Computing Iceberg Concept Lattices with TITANIC
, 2002
Abstract

Cited by 82 (13 self)
We introduce the notion of iceberg concept lattices...
On the Inference of Configuration Structures from Source Code
 In Proceedings of the 16th International Conference on Software Engineering
, 1994
Abstract

Cited by 79 (5 self)
We apply mathematical concept analysis to the problem of inferring configuration structures from existing source code. Concept analysis has been developed by German mathematicians over recent years; it can be seen as a discrete analogue of Fourier analysis. Based on this theory, our tool accepts source code in which configuration-specific statements are controlled by the preprocessor. The algorithm computes a so-called concept lattice which, when visually displayed, allows remarkable insight into the structure and properties of possible configurations. The lattice not only displays fine-grained dependencies between configuration threads, but also visualizes the overall quality of configuration structures according to software engineering principles. The paper presents a short introduction to concept analysis, as well as experimental results on various programs.
1 Introduction
A simple and widely used technique for configuration management is the use of the C preprocessor. ...
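The object-attribute analysis underlying such a concept lattice can be sketched in a few lines. The example below is hypothetical (the flag names and the brute-force enumeration are ours, not the tool's algorithm): a formal concept is a pair (extent, intent) where the extent is exactly the set of objects sharing the intent, and the intent is exactly the set of attributes common to the extent.

```python
from itertools import combinations

# Hypothetical context: code pieces (objects) vs. the preprocessor flags
# (attributes) that guard them. Not taken from the paper's case studies.
context = {
    "piece1": {"UNIX"},
    "piece2": {"UNIX", "X11"},
    "piece3": {"DOS"},
}
ALL_ATTRS = set().union(*context.values())

def common_attrs(objs):
    """Attributes shared by every object in `objs` (all attributes for the empty set)."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(ALL_ATTRS)

def objs_with(attrs):
    """Objects possessing every attribute in `attrs`."""
    return {o for o, a in context.items() if attrs <= a}

def concepts():
    """Enumerate all formal concepts by brute force over object subsets."""
    found = set()
    objects = list(context)
    for r in range(len(objects) + 1):
        for subset in combinations(objects, r):
            intent = frozenset(common_attrs(set(subset)))
            extent = frozenset(objs_with(intent))
            found.add((extent, intent))
    return found

for extent, intent in sorted(concepts(), key=lambda c: (len(c[0]), sorted(c[0]))):
    print(sorted(extent), "<->", sorted(intent))
```

Ordering these concepts by extent inclusion yields the lattice whose visual display the abstract describes.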
Conceptual Graphs and Formal Concept Analysis
, 1997
Abstract

Cited by 53 (7 self)
It is shown how Conceptual Graphs and Formal Concept Analysis may be combined to obtain a formalization of Elementary Logic which is useful for knowledge representation and processing. For this, a translation of conceptual graphs to formal contexts and concept lattices is described through an example. Using a suitable mathematization of conceptual graphs, the basics of a unified mathematical theory for Elementary Logic are proposed.
Contents
1. Formalization of Elementary Logic
2. From Conceptual Graphs to Formal Contexts
3. Mathematization of Conceptual Structures
1 Formalization of Elementary Logic
Conceptual Graphs and Formal Concept Analysis have both been used to a large extent for knowledge representation and processing. This has given rise to the desire to combine the two approaches in order to derive benefits from both disciplines and their experiences. There is even a fundamental reason for associating Conceptual Graphs and Formal Concept Analysis, which lies in their roots reaching far back...
Reengineering of Configurations Based on Mathematical Concept Analysis
 ACM Transactions on Software Engineering and Methodology
, 1996
Abstract

Cited by 50 (6 self)
We apply mathematical concept analysis to the problem of reengineering configurations. Concept analysis reconstructs a taxonomy of concepts from a relation between objects and attributes. We use concept analysis to infer configuration structures from existing source code. Our tool NORA/RECS accepts source code in which configuration-specific code pieces are controlled by the preprocessor. The algorithm computes a so-called concept lattice which, when visually displayed, offers remarkable insight into the structure and properties of possible configurations. The lattice not only displays fine-grained dependencies between configurations, but also visualizes the overall quality of configuration structures according to software engineering principles. In a second step, interferences between configurations can be analyzed in order to restructure or simplify configurations. Interferences showing up in the lattice indicate high coupling and low cohesion between configuration concepts. Source files can then be simplified according to the lattice structure. Finally, we show how governing expressions can be simplified by utilizing an isomorphism theorem of mathematical concept analysis.
Formal Concept Analysis in Information Science
 ANNUAL REVIEW OF INFORMATION SCIENCE AND TECHNOLOGY
, 1996
Conceptual Knowledge Discovery in Databases Using Formal Concept Analysis Methods
, 1998
Abstract

Cited by 37 (15 self)
In this paper we discuss Conceptual Knowledge Discovery in Databases (CKDD) as it is developing in the field of Conceptual Knowledge Processing (cf. [29], [30]). Conceptual Knowledge Processing is based on the mathematical theory of Formal Concept Analysis, which has become a successful theory for data analysis during the last 18 years. This approach relies on the pragmatic philosophy of Ch. S. Peirce [15], who claims that we can only analyze and argue within restricted contexts where we always rely on pre-knowledge and common sense. The development of Formal Concept Analysis led to the software system TOSCANA, which is presented as a CKDD tool in this paper. TOSCANA is a flexible navigation tool that allows dynamic browsing through and zooming into the data. It supports the exploration of large databases by visualizing conceptual aspects inherent to the data. We want to clarify that CKDD can be understood as a human-centered approach to Knowledge Discovery in Databases. The current discussion about human-centered Knowledge Discovery is therefore briefly summarized in Section 1.
On Modeling Data Mining with Granular Computing
 Proceedings of COMPSAC 2001
, 2001
Abstract

Cited by 34 (16 self)
The main objective of this paper is to advocate for formal and mathematical modeling of data mining, which unfortunately has not received much attention. A framework for rule mining based on granular computing is proposed. It is developed in Tarski's style through the notions of a model and satisfiability. The model is a database consisting of a finite set of objects described by a finite set of attributes. Within this framework, a concept is defined as a pair consisting of the intension, an expression in a certain language over the set of attributes, and the extension, a subset of the universe, of the concept. An object satisfies the expression of a concept if the object has the properties specified by the expression, and such an object belongs to the extension of the concept. Rules are used to describe relationships between concepts. A rule is expressed in terms of the intensions of the two concepts and is interpreted in terms of their extensions. Two interpretations of rules are examined in detail: one based on logical implication and the other on conditional probability.
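The two rule interpretations mentioned at the end of the abstract can be made concrete with a small worked example. The data and helper names below are hypothetical, not the paper's formalism:

```python
# Hypothetical sketch of the two rule interpretations: a rule phi => psi holds
# *logically* if every object satisfying phi also satisfies psi, and holds
# *probabilistically* to the degree P(psi | phi), its conditional probability.

universe = [
    {"color": "red",  "shape": "round"},
    {"color": "red",  "shape": "round"},
    {"color": "red",  "shape": "square"},
    {"color": "blue", "shape": "round"},
]

def extension(attr, value):
    """Extension of the concept 'attr = value': the objects satisfying it."""
    return {i for i, obj in enumerate(universe) if obj[attr] == value}

phi = extension("color", "red")    # objects 0, 1, 2
psi = extension("shape", "round")  # objects 0, 1, 3

# Logical implication: the extension of phi must be contained in that of psi.
implication_holds = phi <= psi           # False: object 2 is red but square
# Conditional probability: |phi & psi| / |phi|.
confidence = len(phi & psi) / len(phi)   # 2/3

print(implication_holds, round(confidence, 3))  # False 0.667
```

The logical reading is the strict special case of the probabilistic one: the implication holds exactly when the conditional probability equals 1.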