### Lazy Classification with Interval Pattern Structures: Application to Credit Scoring

Abstract. Pattern structures allow one to approach the knowledge-extraction problem for arbitrary object descriptions. They provide a way to apply Formal Concept Analysis (FCA) techniques to non-binary contexts. However, in order to produce classification rules, a concept lattice must be built, and for non-binary contexts this procedure may take considerable time and resources. To tackle this problem, we introduce a modification of the lazy associative classification algorithm and apply it to credit scoring. The resulting classification quality is compared to existing methods adopted in bank systems.
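The lazy scheme the abstract refers to can be illustrated with a minimal sketch (function names and toy data are hypothetical, not taken from the paper): for a test object, its interval description is intersected with each training description, and a vote is cast whenever the resulting pattern covers training objects of a single class only.

```python
# Minimal sketch of lazy classification with interval pattern structures.
# A description is a list of (low, high) intervals, one per attribute.

def interval_meet(d1, d2):
    """Similarity of two descriptions: componentwise interval hull."""
    return [(min(a1, a2), max(b1, b2)) for (a1, b1), (a2, b2) in zip(d1, d2)]

def subsumes(pattern, desc):
    """pattern is more general than desc: each desc interval lies inside it."""
    return all(a <= c and d <= b for (a, b), (c, d) in zip(pattern, desc))

def lazy_classify(test_desc, train):
    """train: list of (description, class_label) pairs."""
    votes = {}
    for desc, label in train:
        pattern = interval_meet(test_desc, desc)
        # Labels of all training objects covered by the candidate pattern.
        covered = {lab for d, lab in train if subsumes(pattern, d)}
        if len(covered) == 1:          # pattern is "pure": cast a vote
            votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get) if votes else None
```

For instance, with training data `[([(1, 2)], 'good'), ([(1, 3)], 'good'), ([(8, 9)], 'bad')]`, the test description `[(1.5, 1.5)]` is classified as `'good'`.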

### Bridging DBpedia Categories and DL-Concept Definitions using Formal Concept Analysis

Abstract. The popularization and quick growth of Linked Open Data (LOD) have led to challenging aspects regarding quality assessment and data exploration of the RDF triples that shape the LOD cloud. In particular, we are interested in the completeness of the data and its potential to provide concept definitions in terms of necessary and sufficient conditions. In this work we propose a novel technique based on Formal Concept Analysis which organizes RDF data into a concept lattice. This allows the discovery of implications, which are used to automatically detect missing information and then to complete the RDF data.
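The implication-discovery step can be sketched as follows. This is a toy illustration in the spirit of the abstract, not the paper's algorithm: entities and properties form a binary context, and an implication A → B holds when every object having A also has B.

```python
from itertools import permutations

# Toy object-property context; the entities and properties are made up.
context = {
    "Berlin":  {"City", "Capital", "LocatedInGermany"},
    "Munich":  {"City", "LocatedInGermany"},
    "Hamburg": {"City", "LocatedInGermany"},
    "Paris":   {"City", "Capital"},
}

def implication_holds(ctx, premise, conclusion):
    """A -> B holds iff every object having all of A also has all of B."""
    return all(conclusion <= props for props in ctx.values() if premise <= props)

# Mine all single-attribute implications that hold in the context.
attributes = set().union(*context.values())
implications = [(a, b) for a, b in permutations(sorted(attributes), 2)
                if implication_holds(context, {a}, {b})]
```

An implication that *almost* holds (violated by few objects) is precisely the kind of pattern that flags potentially missing RDF triples.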

### Practical Computing with Pattern Structures in FCART Environment

Abstract. A new general and efficient architecture for working with pattern structures, an extension of FCA for dealing with “complex” descriptions, is introduced and implemented in a subsystem of the Formal Concept Analysis Research Toolbox (FCART). The architecture is universal in terms of possible dataset structures, formats, and techniques of pattern-structure manipulation.

### Preface

, 2013

This is the second edition of the FCA4AI workshop, the first edition being associated with …

### A local discretization of continuous data for lattices: Technical aspects

Abstract. In recent years, Galois lattices (GLs) have been used in data mining, and defining a GL from complex (i.e., non-binary) data is a recent challenge [1,2]. Indeed, a GL is classically defined from a binary table (called a context); therefore, in the presence of continuous data, a discretization step is generally needed to convert continuous data into discrete data. Discretization is classically performed before the GL construction in a global way. However, local discretization is reported to give better classification rates than global discretization when used jointly with other symbolic classification methods such as decision trees (DTs). Using a result of lattice theory that brings together sets of objects and specific nodes of the lattice, we identify subsets of data on which to perform a local discretization for GLs. Experiments are performed to assess the efficiency and the effectiveness of the proposed algorithm compared to global discretization.

1 Discretization process

The discretization process consists in converting continuous attributes into discrete attributes [3]. This conversion can produce scaled attributes or disjoint intervals; we focus on the latter. Such a transformation is necessary for some classification models, such as symbolic models, which cannot handle continuous attributes [4]. Consider a continuous data set D = (O, F), where each object in O is described by p continuous attributes in F. The discretization process is performed by iterating an attribute-splitting step, according to a splitting criterion (entropy [3], Gini [5], χ2 [6], ...), until a stopping criterion S is satisfied (a maximal number of intervals to create, a purity measure, ...). More formally, for one discretization step, to select the best attribute to cut, let (v_1, ..., v_N) be the sorted values of a continuous attribute V ∈ F, where each v_i corresponds to a value taken by one object of the data set D. The set of possible cut-points is C_V = (c_V^1, ..., c_V^{N−1}) …
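A single splitting step as described above can be sketched in a few lines (hypothetical code, using the entropy criterion the text mentions): every midpoint between consecutive sorted values is a candidate cut-point, scored by the weighted entropy of the two resulting sides.

```python
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    counts = {lab: labels.count(lab) for lab in set(labels)}
    return -sum(c / n * log2(c / n) for c in counts.values())

def best_cut(values, labels):
    """Choose the cut-point minimizing the weighted entropy of both sides."""
    pairs = sorted(zip(values, labels))
    best_score, best_cut_point = float("inf"), None
    for i in range(1, len(pairs)):
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2   # midpoint candidate
        left = [lab for v, lab in pairs if v <= cut]
        right = [lab for v, lab in pairs if v > cut]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if score < best_score:
            best_score, best_cut_point = score, cut
    return best_cut_point
```

On values [1, 2, 8, 9] with labels ['a', 'a', 'b', 'b'] this selects the midpoint 5.0, which separates the two classes perfectly.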

### Homogeneity and Stability in Conceptual Analysis

Abstract. This work comes within the field of data analysis using Galois lattices. We consider ordinal and numerical single or interval data, as well as data that consist of frequency/probability distributions on a finite set of categories. Data are represented and dealt with in a common framework, by defining a generalization operator that determines intents by intervals. In the case of distribution data, the obtained concepts are more homogeneous and more easily interpretable than those obtained by using the maximum and minimum operators previously proposed. Since the number of obtained concepts is often rather large, and to limit the influence of atypical elements, we propose to identify stable concepts using interval distances in a cross-validation-like approach.
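A minimal sketch of the two ingredients named above, under stated assumptions: the interval-based generalization operator (componentwise hull over a set of object descriptions) and an interval distance, here the Hausdorff distance on intervals, which is one common choice rather than necessarily the one used in the paper.

```python
def interval_intent(descriptions):
    """Intent of a set of objects: componentwise interval hull of their
    descriptions (each description is a list of (low, high) pairs)."""
    return [(min(lo for lo, hi in col), max(hi for lo, hi in col))
            for col in zip(*descriptions)]

def interval_distance(i, j):
    """Hausdorff distance between two closed intervals (one common choice)."""
    return max(abs(i[0] - j[0]), abs(i[1] - j[1]))
```

For example, `interval_intent([[(1, 2)], [(0, 5)], [(3, 4)]])` yields `[(0, 5)]`: the intent widens just enough to cover every object, which is why atypical elements can inflate it, motivating the stability filtering.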

### Revisiting Numerical Pattern Mining with Formal Concept Analysis

Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence

We investigate the problem of mining numerical data with Formal Concept Analysis. The usual way is to use a scaling procedure, transforming numerical attributes into binary ones, leading either to a loss of information or of efficiency, in particular w.r.t. the volume of extracted patterns. By contrast, we propose to work directly on numerical data in a more precise and efficient way. For that, the notions of closed patterns, generators, and equivalence classes are revisited in the numerical context. Moreover, two algorithms are proposed and tested in an evaluation involving real-world data, showing the quality of the present approach.
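The notion of a closed pattern in the numerical setting can be sketched as follows (hypothetical code over a toy numerical table, not the paper's algorithms): the closure of an interval pattern is the tightest set of intervals around the objects it covers, and a pattern is closed when applying closure leaves it unchanged.

```python
def extent(pattern, data):
    """Objects whose numerical description falls inside every interval."""
    return [g for g, values in data.items()
            if all(lo <= v <= hi for (lo, hi), v in zip(pattern, values))]

def closure(pattern, data):
    """Closed pattern: componentwise [min, max] over the pattern's extent."""
    columns = zip(*(data[g] for g in extent(pattern, data)))
    return [(min(col), max(col)) for col in columns]
```

With `data = {"g1": (1, 5), "g2": (2, 6), "g3": (9, 3)}`, the pattern `[(0, 3), (0, 10)]` covers g1 and g2 and closes to `[(1, 2), (5, 6)]`; closing again changes nothing, so many distinct patterns collapse into one equivalence class sharing this closed representative.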

### Formal Concept Analysis

, 2014

This document is the author-deposited version. You are advised to consult the publisher's version if you wish to cite from it. Published version: ANDREWS, Simon (2015). A ‘Best-of-Breed’ approach for designing a fast algorithm