1BC: A First-Order Bayesian Classifier
Proceedings of the 9th International Workshop on Inductive Logic Programming, volume 1634 of Lecture Notes in Artificial Intelligence, 1999
Abstract

Cited by 42 (18 self)
In this paper we present 1BC, a first-order Bayesian Classifier. Our approach is to view individuals as structured terms, and to distinguish between structural predicates referring to subterms (e.g. atoms from molecules), and properties applying to one or several of these subterms (e.g. a bond between two atoms). We describe an individual in terms of elementary features consisting of zero or more structural predicates and one property; these features are considered conditionally independent following the usual naive Bayes assumption. 1BC has been implemented in the context of the first-order descriptive learner Tertius, and we describe several experiments demonstrating the viability of our approach.
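The naive Bayes combination this abstract describes can be sketched as follows. This is a minimal propositional sketch only: the first-order feature construction (structural predicates plus one property) is mocked as pre-extracted boolean feature names, and all identifiers are illustrative, not 1BC's actual interface.

```python
from collections import defaultdict

# Minimal naive Bayes over boolean "elementary features". In 1BC each
# feature is built from structural predicates plus one property; here the
# features are assumed already extracted and given as sets of names.
def train(examples):
    """examples: list of (feature_set, class_label)."""
    class_counts = defaultdict(int)
    feat_counts = defaultdict(lambda: defaultdict(int))
    features = set()
    for feats, label in examples:
        class_counts[label] += 1
        features.update(feats)
        for f in feats:
            feat_counts[label][f] += 1
    return class_counts, feat_counts, features

def classify(model, feats):
    class_counts, feat_counts, features = model
    total = sum(class_counts.values())
    best, best_p = None, -1.0
    for c, n in class_counts.items():
        p = n / total  # class prior
        for f in features:
            # Laplace-smoothed P(f present | c); features are treated as
            # conditionally independent (the naive Bayes assumption).
            p_f = (feat_counts[c][f] + 1) / (n + 2)
            p *= p_f if f in feats else (1 - p_f)
        if p > best_p:
            best, best_p = c, p
    return best

model = train([({"has_bond", "has_carbon"}, "active"),
               ({"has_carbon"}, "active"),
               ({"has_nitrogen"}, "inactive")])
print(classify(model, {"has_bond", "has_carbon"}))  # → active
```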
Confirmation-guided discovery of first-order rules with Tertius
Machine Learning, 2000
Abstract

Cited by 29 (9 self)
This paper deals with learning first-order logic rules from data lacking an explicit classification predicate. Consequently, the learned rules are not restricted to predicate definitions as in supervised inductive logic programming. First-order logic offers the ability to deal with structured, multi-relational knowledge. Possible applications include first-order knowledge discovery, induction of integrity constraints in databases, multiple predicate learning, and learning mixed theories of predicate definitions and integrity constraints. One of the contributions of our work is a heuristic measure of confirmation, trading off novelty and satisfaction of the rule. The approach has been implemented in the Tertius system. The system performs an optimal best-first search, finding the k most confirmed hypotheses, and includes a non-redundant refinement operator to avoid duplicates in the search. Tertius can be adapted to many different domains by tuning its parameters, and it can deal eithe...
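The search strategy described here can be sketched in miniature. The sketch assumes the score never increases under refinement (the condition that makes popping in best-first order optimal); Tertius's actual confirmation measure is not reproduced, so a simple anti-monotone coverage score stands in for it, and the index-increasing refinement plays the role of the non-redundant refinement operator.

```python
import heapq
from itertools import count

# Best-first search for the k highest-scoring conjunctions of boolean
# features. A canonical (index-increasing) refinement operator ensures no
# conjunction is generated twice. Correctness of the top-k result assumes
# the score never increases when a conjunction is refined.
def top_k_rules(features, score, k, max_len=3):
    tie = count()
    frontier = [(-score(()), next(tie), ())]  # heap ordered by score
    results = []
    while frontier and len(results) < k:
        neg_s, _, conj = heapq.heappop(frontier)
        results.append((conj, -neg_s))
        if len(conj) < max_len:
            start = features.index(conj[-1]) + 1 if conj else 0
            for f in features[start:]:  # non-redundant refinement
                child = conj + (f,)
                heapq.heappush(frontier, (-score(child), next(tie), child))
    return results

# Toy data and an anti-monotone stand-in score: fraction of rows covered.
data = [frozenset("ab"), frozenset("a"), frozenset("bc")]
def coverage(conj):
    return sum(all(f in row for f in conj) for row in data) / len(data)

print(top_k_rules(["a", "b", "c"], coverage, k=3))
```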
Naive Bayesian Classification of Structured Data, 2003
Abstract

Cited by 21 (0 self)
In this paper we present 1BC and 1BC2, two systems that perform naive Bayesian classification of structured individuals. The approach of 1BC is to project the individuals along first-order features. These features are built from the individual using structural predicates referring to related objects (e.g. atoms within molecules), and properties applying to the individual or one or several of its related objects (e.g. a bond between two atoms). We describe an individual in terms of elementary features consisting of zero or more structural predicates and one property; these features are treated as conditionally independent in the spirit of the naive Bayes assumption. 1BC2 represents an alternative first-order upgrade to the naive Bayesian classifier by considering probability distributions over structured objects (e.g., a molecule as a set of atoms), and estimating those distributions from the probabilities of its elements (which are assumed to be independent). We present a unifying view on both systems in which 1BC works in language space, and 1BC2 works in individual space. We also present a new, efficient recursive algorithm improving upon the original propositionalisation approach of 1BC. Both systems have been implemented in the context of the first-order descriptive learner Tertius, and we investigate the differences between the two systems both in computational terms and on artificially generated data. Finally, we describe a range of experiments on ILP benchmark data sets demonstrating the viability of our approach.
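The 1BC2-style idea of estimating a distribution over a structured object from the probabilities of its elements can be illustrated as below. This treats a molecule as a bag of atom labels whose probabilities combine under an independence assumption; 1BC2's actual semantics for sets is more refined, and all names and data are illustrative.

```python
from collections import Counter
from math import prod

# Estimate a per-class distribution over elements, then score a
# structured object (a bag of atom labels) by multiplying the element
# probabilities, as the elements are assumed independent.
def element_dist(bags, alphabet):
    counts = Counter(x for bag in bags for x in bag)
    total = sum(counts.values())
    # Laplace smoothing over the element alphabet
    return {a: (counts[a] + 1) / (total + len(alphabet)) for a in alphabet}

def bag_likelihood(bag, dist):
    return prod(dist[x] for x in bag)

alphabet = ["C", "H", "N", "O"]
active = element_dist([["C", "C", "H"], ["C", "O"]], alphabet)
print(round(bag_likelihood(["C", "H"], active), 4))
```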
The Role of Feature Construction in Inductive Rule Learning
Abstract

Cited by 12 (0 self)
This paper proposes a unifying framework for inductive rule learning algorithms. We suggest that the problem of constructing an appropriate inductive hypothesis (set of rules) can be broken down into the following subtasks: rule construction, body construction, and feature construction. Each of these subtasks may have its own declarative bias, search strategies, and heuristics. In particular, we argue that feature construction is a crucial notion in explaining the relations between attribute-value rule learning and inductive logic programming (ILP). We demonstrate this by a general method for transforming ILP problems to attribute-value form, which overcomes some of the traditional limitations of propositionalisation approaches.
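The kind of transformation the abstract refers to, mapping a relational description to attribute-value form, can be sketched as follows. The molecules, the feature definitions, and their names are hypothetical examples of constructed features, not the paper's method.

```python
# Toy propositionalisation: turn a relational description (an individual
# plus facts about its parts) into a fixed-length attribute-value row by
# evaluating first-order features over the individual.
molecules = {
    "m1": {"atoms": [("c", -0.1), ("o", 0.2)], "bonds": 1},
    "m2": {"atoms": [("c", 0.0)], "bonds": 0},
}

def features(mol):
    atoms = mol["atoms"]
    return {
        "has_oxygen": any(e == "o" for e, _ in atoms),  # existential feature
        "n_atoms": len(atoms),                          # aggregate feature
        "max_charge": max(q for _, q in atoms),         # aggregate feature
        "n_bonds": mol["bonds"],
    }

# Each individual becomes one attribute-value row.
table = {name: features(m) for name, m in molecules.items()}
print(table["m1"]["has_oxygen"], table["m2"]["n_atoms"])  # → True 1
```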
An Evolutionary Approach to Concept Learning with Structured Data
In Proceedings of the Fourth International Conference on Artificial Neural Networks and Genetic Algorithms, 1999
Abstract

Cited by 9 (3 self)
This paper details the implementation of a strongly-typed evolutionary programming system (STEPS) and its application to concept learning from highly-structured examples. STEPS evolves concept descriptions in the form of program trees. Predictive accuracy is used as the fitness function to be optimised through genetic operations. Empirical results with representative applications demonstrate promise. 1. Introduction: The aim of concept learning is to induce a general description of a concept from a set of specific examples. The examples and the concept description are expressed in some representation language (e.g., attribute-value language, Horn clauses) and the learning task can be viewed as a search, through the space of all possible concept descriptions, for a description that both characterises the examples provided and generalises to new ones [9]. As concept learning problems of increasing complexity are being tackled, increasingly expressive representation languages are becoming...
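The fitness evaluation described in this abstract can be sketched as follows. The sketch only shows accuracy-as-fitness over candidate concept descriptions; STEPS's typed program trees and genetic operators are not reproduced, and the predicates stand in for evolved trees.

```python
# Fitness in the spirit of STEPS: a candidate concept description (here a
# plain Python predicate standing in for a typed program tree) is scored
# by its predictive accuracy on the labelled examples.
def fitness(candidate, examples):
    return sum(candidate(x) == y for x, y in examples) / len(examples)

examples = [({"size": 3}, True), ({"size": 1}, False), ({"size": 5}, True)]
population = [lambda x: x["size"] > 2, lambda x: x["size"] > 4]
best = max(population, key=lambda c: fitness(c, examples))
print(fitness(best, examples))  # → 1.0
```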
Bridging the gap between distance and generalisation: Symbolic learning in metric spaces, 2008
Abstract

Cited by 7 (4 self)
Distance-based and generalisation-based methods are two families of artificial intelligence techniques that have been successfully used over a wide range of real-world problems. In the first case, general algorithms can be applied to any data representation by just changing the distance. The metric space sets the search and learning space, which is generally instance-oriented. In the second case, models can be obtained for a given pattern language, which can be comprehensible. The generality-ordered space sets the search and learning space, which is generally model-oriented. However, the concepts of distance and generalisation clash in many different ways, especially when knowledge representation is complex (e.g. structured data). This work establishes a framework where these two fields can be integrated in a consistent way. We introduce the concept of distance-based generalisation, which connects all the generalised examples in such a way that all of them are reachable inside the generalisation by using straight paths in the metric space. This makes the metric space and the generality-ordered space coherent (or even dual). Additionally, we also introduce a definition of minimal distance-based generalisation that can be seen as the first formulation of the Minimum Description Length (MDL)/Minimum Message Length (MML) principle in terms of a distance function. We instantiate and develop the framework for the most common data representations and distances, where we show that consistent instances can be found for numerical data, nominal data, sets, lists, tuples, graphs, first-order atoms and clauses. As a result, general learning methods that integrate the best from distance-based and generalisation-based methods can be defined and adapted to any specific problem by appropriately choosing the distance, the pattern language and the generalisation operator.
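For one-dimensional numerical data, the "straight paths stay inside the generalisation" condition can be illustrated very simply: the interval spanned by the examples is convex, so the segment between any two covered examples is covered. This is only an illustration of the condition for the simplest instantiation, not the paper's general framework.

```python
# Distance-based generalisation for 1-D numeric data: generalise a set of
# examples to the interval they span. Because an interval is convex,
# every point on the straight path between two covered examples is
# itself covered, which is the reachability condition described above.
def generalise(examples):
    return (min(examples), max(examples))

def covers(interval, x):
    lo, hi = interval
    return lo <= x <= hi

exs = [2.0, 5.0, 9.0]
g = generalise(exs)
# Midpoint of the straight path between two examples is still covered:
print(covers(g, (2.0 + 9.0) / 2))  # → True
```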
Computational Logic and Machine Learning: A roadmap for Inductive Logic Programming
Technical Report, J. Stefan Institute, 1998
Abstract

Cited by 6 (1 self)
Computational logic has already significantly influenced (symbolic) machine learning through the field of inductive logic programming (ILP), which is concerned with the induction of logic programs from examples and background knowledge. In ILP, the shift of attention from program synthesis to knowledge discovery resulted in advanced techniques that are practically applicable for discovering knowledge in relational databases. Machine learning, and ILP in particular, has the potential to influence computational logic by providing an application area full of industrially significant problems, thus providing a challenge for other techniques in computational logic. This paper gives a brief introduction to ILP, and presents state-of-the-art ILP techniques for relational knowledge discovery as well as some research and organizational directions for further developments in this area. 1. Introduction: Inductive logic programming (ILP) [35, 39, 29] is a research area that has its backgrounds in induct...
First-Order Bayesian Classification with 1BC, 2000
Abstract

Cited by 4 (2 self)
In this paper we present 1BC, a first-order Bayesian Classifier. Our approach is to view individuals as structured objects, and to distinguish between structural predicates referring to parts of individuals (e.g. atoms within molecules), and properties applying to the individual or one or several of its parts (e.g. a bond between two atoms). We describe an individual in terms of elementary features consisting of zero or more structural predicates and one property; these features are considered conditionally independent following the usual naive Bayes assumption. 1BC has been implemented in the context of the first-order descriptive learner Tertius, and we describe several experiments demonstrating the viability of our approach. Keywords: inductive logic programming, naive Bayes, first-order logic. 1. Motivation and scope: In this paper we present 1BC, a first-order Bayesian Classifier. While the propositional Bayesian Classifier makes the naive Bayes assumption of statistical indepe...
A Unifying View of Knowledge Representation for Inductive Learning
In preparation, 2000
Abstract

Cited by 4 (4 self)
This paper provides a foundation for inductive learning based on the use of higher-order logic for knowledge representation. In particular, the paper (i) provides a systematic individuals-as-terms approach to knowledge representation for inductive learning, and demonstrates the utility of types and higher-order constructs for this purpose; (ii) gives a systematic way of constructing predicates for use in induced definitions; (iii) widens the applicability of decision-tree algorithms beyond the usual attribute-value setting to the classification of individuals with complex structure; and (iv) shows how to induce definitions which are comprehensible and have predictive power. The paper contains ten illustrative applications involving a variety of types to which a decision-tree learning system is applied. The effectiveness of the approach is further demonstrated by applying the learning system to two larger benchmark applications. 1. Introduction: Inductive learning focuses on tec...
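The individuals-as-terms idea described here can be illustrated as follows: each individual is a single structured, typed term, and an induced classifier tests predicates over that term rather than over a flat attribute vector. The types, the predicate, and the data are illustrative; the decision tree below is hand-written, standing in for an induced one.

```python
from dataclasses import dataclass

# Individuals-as-terms: an individual is one structured (typed) term, and
# tests are predicates over that term rather than flat attributes.
@dataclass
class Atom:
    element: str
    charge: float

@dataclass
class Molecule:
    atoms: list  # list[Atom]

def classify(m: Molecule) -> str:
    # Hand-written stand-in for an induced decision tree: the single test
    # is a predicate over the structured term.
    if any(a.element == "o" and a.charge > 0.1 for a in m.atoms):
        return "active"
    return "inactive"

m = Molecule([Atom("c", 0.0), Atom("o", 0.25)])
print(classify(m))  # → active
```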