Results 1 - 10 of 34
Clausal Discovery
Machine Learning, 1996
Abstract

Cited by 184 (33 self)
The clausal discovery engine Claudien is presented. Claudien is an inductive logic programming engine that fits in the knowledge discovery in databases and data mining paradigm as it discovers regularities that are valid in data. As such Claudien performs a novel induction task, which is called characteristic induction from closed observations, and which is related to existing formalizations of induction in logic. In characteristic induction from closed observations, the regularities are represented by clausal theories, and the data using Herbrand interpretations. Claudien also employs a novel declarative bias mechanism to define the set of clauses that may appear in a hypothesis.

Keywords: Inductive Logic Programming, Knowledge Discovery in Databases, Data Mining, Learning, Induction, Semantics for Induction, Logic of Induction, Parallel Learning.

1 Introduction

Despite the fact that the areas of knowledge discovery in databases [Fayyad et al., 1995] and inductive logic programming ...
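The induction task described above hinges on testing whether a clausal regularity is valid in a Herbrand interpretation. A minimal Python sketch of that test, assuming finite interpretations given as sets of ground atoms and uppercase-initial argument names marking variables; the family example and all names are hypothetical illustrations, not taken from Claudien:

```python
from itertools import product

def substitutions(variables, constants):
    """Enumerate every ground substitution of the given variables."""
    for values in product(constants, repeat=len(variables)):
        yield dict(zip(variables, values))

def ground(atom, theta):
    """Apply substitution theta to an atom given as (predicate, args)."""
    pred, args = atom
    return (pred, tuple(theta.get(a, a) for a in args))

def clause_valid(body, head, interpretation, constants):
    """A clause body -> head is valid in a finite Herbrand
    interpretation if every grounding that makes all body atoms
    true also makes at least one head atom true."""
    variables = sorted({a for _, args in body + head for a in args
                        if a[0].isupper()})
    for theta in substitutions(variables, constants):
        if all(ground(b, theta) in interpretation for b in body):
            if not any(ground(h, theta) in interpretation for h in head):
                return False
    return True

# Hypothetical closed observation about a family.
interp = {("parent", ("ann", "bob")), ("female", ("ann",)),
          ("mother", ("ann", "bob"))}
consts = ["ann", "bob"]

# Candidate regularity: parent(X, Y), female(X) -> mother(X, Y).
body = [("parent", ("X", "Y")), ("female", ("X",))]
head = [("mother", ("X", "Y"))]
print(clause_valid(body, head, interp, consts))  # True
```

A discovery engine in this style would enumerate candidate clauses from a declarative bias and keep exactly those that pass this validity test on every observation.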
Separate-and-conquer rule learning
Artificial Intelligence Review, 1999
Abstract

Cited by 135 (29 self)
This paper is a survey of inductive rule learning algorithms that use a separate-and-conquer strategy. This strategy can be traced back to the AQ learning system and still enjoys popularity as can be seen from its frequent use in inductive logic programming systems. We will put this wide variety of algorithms into a single framework and analyze them along three different dimensions, namely their search, language and overfitting avoidance biases.
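The separate-and-conquer strategy itself is easy to sketch. Below is a propositional simplification in Python, assuming examples are attribute-value dicts and candidate tests are (attribute, value) pairs; a greedy coverage heuristic stands in for the many search biases the survey analyses, and all names are illustrative:

```python
def coverage(test, examples):
    """Number of examples satisfying an (attribute, value) test."""
    attr, value = test
    return sum(ex.get(attr) == value for ex in examples)

def separate_and_conquer(positives, negatives, tests):
    """Covering loop: 'conquer' by greedily growing one rule (a
    conjunction of tests) until it covers no negatives, then
    'separate' by removing the positives it covers, and repeat."""
    rules, remaining = [], list(positives)
    while remaining:
        rule, pos_cov, neg_cov = [], list(remaining), list(negatives)
        while neg_cov:
            # Greedy heuristic: keep positives, exclude negatives.
            best = max(tests, key=lambda t: coverage(t, pos_cov)
                                            - coverage(t, neg_cov))
            before = (len(pos_cov), len(neg_cov))
            rule.append(best)
            pos_cov = [e for e in pos_cov if e.get(best[0]) == best[1]]
            neg_cov = [e for e in neg_cov if e.get(best[0]) == best[1]]
            if (len(pos_cov), len(neg_cov)) == before or not pos_cov:
                break  # refinement made no progress
        if neg_cov or not pos_cov:
            break  # no consistent rule found; give up
        rules.append(rule)
        remaining = [e for e in remaining
                     if not all(e.get(a) == v for a, v in rule)]
    return rules

pos = [{"shape": "square", "color": "red"},
       {"shape": "square", "color": "blue"}]
neg = [{"shape": "circle", "color": "red"}]
tests = [("shape", "square"), ("shape", "circle"),
         ("color", "red"), ("color", "blue")]
print(separate_and_conquer(pos, neg, tests))  # [[('shape', 'square')]]
```

ILP systems replace the (attribute, value) tests with first-order literals, but the outer covering loop is the same.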
Inductive Constraint Logic
1995
Abstract

Cited by 86 (19 self)
A novel approach to learning first-order logic formulae from positive and negative examples is presented. Whereas present inductive logic programming systems employ examples as true and false ground facts (or clauses), we view examples as interpretations which are true or false for the target theory. This viewpoint allows us to reconcile the inductive logic programming paradigm with classical attribute-value learning, in the sense that the latter is a special case of the former. Because of this property, we are able to adapt AQ and CN2 type algorithms in order to enable learning of full first-order formulae. However, whereas classical learning techniques have concentrated on concept representations in disjunctive normal form, we will use a clausal representation, which corresponds to a conjunctive normal form where each conjunct forms a constraint on positive examples. This representation duality also reverses the role of positive and negative examples, both in the heuristics and in the a...
First-order jk-clausal theories are PAC-learnable
Artificial Intelligence, 1994
Abstract

Cited by 64 (27 self)
We present positive PAC-learning results for the nonmonotonic inductive logic programming setting. In particular, we show that first-order range-restricted clausal theories that consist of clauses with up to k literals of size at most j each are polynomial-sample polynomial-time PAC-learnable with one-sided error from positive examples only. In our framework, concepts are clausal theories and examples are finite interpretations. We discuss the problems encountered when learning theories which only have infinite nontrivial models and propose a way to avoid these problems using a representation change called flattening. Finally, we compare our results to PAC-learnability results for the normal inductive logic programming setting.
Distance Between Herbrand Interpretations: a measure for approximations to a target concept
1997
Abstract

Cited by 38 (0 self)
We can use a metric to measure the differences between elements in a domain or subsets of that domain (i.e. concepts). Which particular metric should be chosen depends on the kind of difference we want to measure. The well-known Euclidean metric on R^n and its generalizations are often used for this purpose, but such metrics are not always suitable for concepts where elements have some structure different from real numbers. For example, in (Inductive) Logic Programming a concept is often expressed as an Herbrand interpretation of some first-order language. Every element in an Herbrand interpretation is a ground atom which has a tree structure. We start by defining a metric d on the set of expressions (ground atoms and ground terms), motivated by the structure and complexity of the expressions and the symbols used therein. This metric induces the Hausdorff metric h on the set of all sets of ground atoms, which allows us to measure the distance between Herbrand interpretatio...
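The construction sketched in this abstract, a metric on tree-structured expressions that induces a Hausdorff metric on sets of ground atoms, can be illustrated in a few lines. The element metric below is a simple illustrative variant (root mismatch costs 1, deeper mismatches are damped by halving), not necessarily the paper's exact metric d:

```python
def term_dist(s, t):
    """Metric on ground expressions written as nested tuples, e.g.
    ('f', ('a',), ('b',)) for f(a, b). Differing root symbols or
    arities give distance 1; otherwise the distance is half the worst
    distance between corresponding arguments, so disagreements deeper
    in the tree count for less. (An illustrative variant, not
    necessarily the paper's exact metric d.)"""
    if s == t:
        return 0.0
    if s[0] != t[0] or len(s) != len(t):
        return 1.0
    return 0.5 * max(term_dist(a, b) for a, b in zip(s[1:], t[1:]))

def hausdorff(A, B, d=term_dist):
    """Hausdorff distance between two finite sets of ground atoms,
    induced by the element metric d."""
    if not A or not B:
        return 0.0 if A == B else 1.0
    return max(max(min(d(a, b) for b in B) for a in A),
               max(min(d(a, b) for a in A) for b in B))

# Two Herbrand interpretations differing only deep inside one atom.
I1 = {("p", ("a",)), ("q", ("f", ("a",)))}
I2 = {("p", ("a",)), ("q", ("f", ("b",)))}
print(hausdorff(I1, I2))  # 0.25
```

The disagreement a/b sits two levels deep inside q(f(...)), so it contributes 0.5 * 0.5 * 1 = 0.25, whereas swapping the predicate symbol itself would have cost the full distance 1.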
Learning First-Order Definitions of Functions
Journal of Artificial Intelligence Research, 1996
Abstract

Cited by 35 (1 self)
First-order learning involves finding a clause-form definition of a relation from examples of the relation and relevant background information. In this paper, a particular first-order learning system is modified to customize it for finding definitions of functional relations. This restriction leads to faster learning times and, in some cases, to definitions that have higher predictive accuracy. Other first-order learning systems might benefit from similar specialization.

1. Introduction

Empirical learning is the subfield of AI that develops algorithms for constructing theories from data. Most classification research in this area has used the attribute-value formalism, in which data are represented as vectors of values of a fixed set of attributes and are labelled with one of a small number of discrete classes. A learning system then develops a mapping from attribute values to classes that can be used to classify unseen data. Despite the well-documented successes of algorithms develope...
Polynomial Learnability and Inductive Logic Programming: Methods and Results
New Generation Computing, 1995
Abstract

Cited by 25 (1 self)
Over the last few years, the efficient learnability of logic programs has been studied extensively. Positive and negative learnability results now exist for a number of restricted classes of logic programs that are closely related to the classes used in practice within inductive logic programming. This paper surveys these results, and also introduces some of the more useful techniques for deriving such results. The paper does not assume any prior background in computational learning theory.
Naive Bayesian Classification of Structured Data
2003
Abstract

Cited by 21 (0 self)
In this paper we present 1BC and 1BC2, two systems that perform naive Bayesian classification of structured individuals. The approach of 1BC is to project the individuals along first-order features. These features are built from the individual using structural predicates referring to related objects (e.g. atoms within molecules), and properties applying to the individual or one or several of its related objects (e.g. a bond between two atoms). We describe an individual in terms of elementary features consisting of zero or more structural predicates and one property; these features are treated as conditionally independent in the spirit of the naive Bayes assumption. 1BC2 represents an alternative first-order upgrade to the naive Bayesian classifier by considering probability distributions over structured objects (e.g., a molecule as a set of atoms), and estimating those distributions from the probabilities of its elements (which are assumed to be independent). We present a unifying view on both systems in which 1BC works in language space, and 1BC2 works in individual space. We also present a new, efficient recursive algorithm improving upon the original propositionalisation approach of 1BC. Both systems have been implemented in the context of the first-order descriptive learner Tertius, and we investigate the differences between the two systems both in computational terms and on artificially generated data. Finally, we describe a range of experiments on ILP benchmark data sets demonstrating the viability of our approach.
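The 1BC-style pipeline, projecting a structured individual onto elementary features and combining them under the naive Bayes assumption, can be sketched as follows. The feature language and the molecule data here are hypothetical toys, far simpler than 1BC's structural predicates and properties:

```python
from collections import defaultdict
import math

def extract_features(molecule):
    """Project a structured individual onto boolean features; here a
    'molecule' is just a list of element symbols and each feature says
    the molecule contains an atom of that element. A hypothetical toy
    feature language, not 1BC's actual one."""
    return {f"has_atom({el})" for el in molecule}

def train(examples):
    """examples: list of (molecule, label) pairs."""
    class_counts = defaultdict(int)
    feat_counts = defaultdict(lambda: defaultdict(int))
    features = set()
    for mol, label in examples:
        class_counts[label] += 1
        for f in extract_features(mol):
            feat_counts[label][f] += 1
            features.add(f)
    return class_counts, feat_counts, features

def predict(mol, class_counts, feat_counts, features):
    """Naive Bayes: features are treated as conditionally independent
    given the class; Laplace smoothing avoids zero probabilities."""
    present = extract_features(mol)
    total = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for label, n in class_counts.items():
        lp = math.log(n / total)
        for f in features:
            p = (feat_counts[label][f] + 1) / (n + 2)
            lp += math.log(p if f in present else 1.0 - p)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

data = [(["C", "H", "H"], "organic"), (["C", "O"], "organic"),
        (["Na", "Cl"], "inorganic")]
model = train(data)
print(predict(["C", "H"], *model))  # prints: organic
```

This corresponds to 1BC's "language space" view: the structure only enters through the features extracted from each individual, after which classification is ordinary naive Bayes.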
Nonmonotonic Abductive Inductive Learning
Journal of Applied Logic, 2008
Abstract

Cited by 20 (6 self)
Inductive Logic Programming (ILP) is concerned with the task of generalising sets of positive and negative examples with respect to background knowledge expressed as logic programs. Negation as Failure (NAF) is a key feature of logic programming which provides a means for nonmonotonic commonsense reasoning under incomplete information. But, so far, most ILP research has been aimed at Horn programs which exclude NAF, and has failed to exploit the full potential of normal programs that allow NAF. By contrast, Abductive Logic Programming (ALP), a related task concerned with explaining observations with respect to a prior theory, has been well studied and applied in the context of normal logic programs. This paper shows how ALP can be used to provide a semantics and proof procedure for nonmonotonic ILP that utilises practical methods of language and search bias to reduce the search space. This is done by lifting an existing method called Hybrid Abductive Inductive Learning (HAIL) from Horn clauses to normal logic programs. To demonstrate its potential benefits, the resulting system, called XHAIL, is applied to a process modelling case study involving a nonmonotonic temporal Event Calculus (EC).
Learning Recursive Theories in the Normal ILP Setting
2003
Abstract

Cited by 13 (9 self)
Induction of recursive theories in the normal ILP setting is a difficult learning task whose complexity is equivalent to multiple predicate learning. In this paper we propose computational solutions to some relevant issues raised by the multiple predicate learning problem. A separate-and-parallel-conquer search strategy is adopted to interleave the learning of clauses supplying predicates with mutually recursive definitions. A novel generality order to be imposed on the search space of clauses is investigated, in order to cope with recursion in a more suitable way. The consistency recovery is performed by reformulating the current theory and by applying a layering technique, based on the collapsed dependency graph. The proposed approach has been implemented in the ILP system ATRE and tested on some laboratory-sized and real-world data sets. Experimental results demonstrate that ATRE is able to learn correct theories autonomously and to discover concept dependencies. Finally, related works and their main differences from our approach are discussed.
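The layering idea based on the collapsed dependency graph can be illustrated with a small sketch: predicates that reach each other in the dependency graph (mutually recursive definitions) collapse into one strongly connected component, and components are ordered so each layer depends only on lower ones. This is an illustrative reconstruction of the general technique, not ATRE's actual procedure, and the example predicates are hypothetical:

```python
def reachable(deps, start):
    """All predicates reachable from start in the dependency graph."""
    seen, stack = set(), [start]
    while stack:
        p = stack.pop()
        for q in deps.get(p, ()):
            if q not in seen:
                seen.add(q)
                stack.append(q)
    return seen

def layers(deps):
    """Collapse the dependency graph (predicate -> predicates its
    definition calls) into strongly connected components: mutually
    reachable predicates share a layer, and layers are ordered so
    each depends only on earlier ones."""
    preds = set(deps) | {q for qs in deps.values() for q in qs}
    reach = {p: reachable(deps, p) for p in preds}
    sccs = []
    for p in preds:
        scc = frozenset({p} | {q for q in preds
                               if p in reach[q] and q in reach[p]})
        if scc not in sccs:
            sccs.append(scc)
    def depth(scc):
        below = {q for r in scc for q in reach[r]} - scc
        return 0 if not below else 1 + max(depth(s) for s in sccs
                                           if s & below)
    return sorted(sccs, key=depth)

# odd/even are mutually recursive; path depends on edge and itself.
deps = {"odd": {"even"}, "even": {"odd"}, "path": {"edge", "path"}}
for layer in layers(deps):
    print(sorted(layer))
```

Here odd and even share a layer because each reaches the other, while path lands above edge since its definition depends on it.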