Results 1–10 of 29
1BC: a First-Order Bayesian Classifier
Proceedings of the 9th International Workshop on Inductive Logic Programming, volume 1634 of Lecture Notes in Artificial Intelligence, 1999
"... In this paper we present 1BC, a firstorder Bayesian Classifier. Our approach is to view individuals as structured terms, and to distinguish between structural predicates referring to subterms (e.g. atoms from molecules), and properties applying to one or several of these subterms (e.g. a bond betwe ..."
Cited by 42 (18 self)
Abstract:
In this paper we present 1BC, a first-order Bayesian classifier. Our approach is to view individuals as structured terms, and to distinguish between structural predicates referring to subterms (e.g. atoms from molecules), and properties applying to one or several of these subterms (e.g. a bond between two atoms). We describe an individual in terms of elementary features consisting of zero or more structural predicates and one property; these features are considered conditionally independent following the usual naive Bayes assumption. 1BC has been implemented in the context of the first-order descriptive learner Tertius, and we describe several experiments demonstrating the viability of our approach.
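The scheme the abstract describes, elementary features treated as conditionally independent given the class, reduces at prediction time to an ordinary naive Bayes over boolean features such as "the molecule has an atom with property X". The following is an illustrative sketch under that reading, not 1BC's implementation; the feature names and the Laplace smoothing are assumptions.

```python
from collections import defaultdict
import math

def train(examples):
    """examples: list of (feature_dict, label), where each feature
    is a boolean (e.g. 'has an atom with element carbon')."""
    class_counts = defaultdict(int)
    feat_counts = defaultdict(lambda: defaultdict(int))
    for feats, label in examples:
        class_counts[label] += 1
        for f, v in feats.items():
            if v:
                feat_counts[label][f] += 1
    return class_counts, feat_counts

def predict(class_counts, feat_counts, feats):
    """Pick the class maximising log P(c) + sum_f log P(f | c),
    i.e. the naive Bayes assumption over the elementary features."""
    total = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for c, n in class_counts.items():
        lp = math.log(n / total)
        for f, v in feats.items():
            # Laplace-smoothed estimate of P(f = true | c)
            p_true = (feat_counts[c][f] + 1) / (n + 2)
            lp += math.log(p_true if v else 1 - p_true)
        if lp > best_lp:
            best, best_lp = c, lp
    return best
```

The independence assumption lets each feature's conditional probability be estimated separately, which is what makes the first-order feature space tractable.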
Confirmation-guided discovery of first-order rules with Tertius
Machine Learning, 2000
"... . This paper deals with learning firstorder logic rules from data lacking an explicit classification predicate. Consequently, the learned rules are not restricted to predicate definitions as in supervised inductive logic programming. Firstorder logic offers the ability to deal with structured, mul ..."
Cited by 33 (9 self)
Abstract:
This paper deals with learning first-order logic rules from data lacking an explicit classification predicate. Consequently, the learned rules are not restricted to predicate definitions as in supervised inductive logic programming. First-order logic offers the ability to deal with structured, multi-relational knowledge. Possible applications include first-order knowledge discovery, induction of integrity constraints in databases, multiple predicate learning, and learning mixed theories of predicate definitions and integrity constraints. One of the contributions of our work is a heuristic measure of confirmation, trading off novelty and satisfaction of the rule. The approach has been implemented in the Tertius system. The system performs an optimal best-first search, finding the k most confirmed hypotheses, and includes a non-redundant refinement operator to avoid duplicates in the search. Tertius can be adapted to many different domains by tuning its parameters, and it can deal eithe...
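The "optimal best-first search, finding the k most confirmed hypotheses" can be sketched generically: a frontier ordered by an optimistic upper bound on the score, with a branch pruned once its bound cannot beat the current k-th best, so the result stays exact. The refinement operator, scoring function and bound below are placeholders supplied by the caller, not Tertius's actual confirmation measure.

```python
import heapq
import itertools

def k_best_first(root, refine, score, upper_bound, k):
    """Return the k highest-scoring hypotheses reachable from `root`
    via `refine`, assuming upper_bound(h) >= score of every hypothesis
    in the subtree rooted at h (an admissible optimistic bound)."""
    tie = itertools.count()          # tiebreaker so heap never compares hypotheses
    best = []                        # min-heap of (score, i, hypothesis)
    frontier = [(-upper_bound(root), next(tie), root)]
    while frontier:
        neg_ub, _, hyp = heapq.heappop(frontier)
        if len(best) == k and -neg_ub <= best[0][0]:
            continue                 # bound cannot improve on the k-th best: prune
        s = score(hyp)
        if len(best) < k:
            heapq.heappush(best, (s, next(tie), hyp))
        elif s > best[0][0]:
            heapq.heapreplace(best, (s, next(tie), hyp))
        for child in refine(hyp):
            heapq.heappush(frontier, (-upper_bound(child), next(tie), child))
    return sorted([(s, h) for s, _, h in best], key=lambda t: t[0], reverse=True)
```

Because the bound is admissible, pruning never discards a hypothesis that could enter the top k; only the amount of search saved depends on how tight the bound is.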
Naive Bayesian Classification of Structured Data, 2003
"... In this paper we present 1BC and 1BC2, two systems that perform naive Bayesian classification of structured individuals. The approach of 1BC is to project the individuals along firstorder features. These features are built from the individual using structural predicates referring to related objects ..."
Cited by 26 (0 self)
Abstract:
In this paper we present 1BC and 1BC2, two systems that perform naive Bayesian classification of structured individuals. The approach of 1BC is to project the individuals along first-order features. These features are built from the individual using structural predicates referring to related objects (e.g. atoms within molecules), and properties applying to the individual or one or several of its related objects (e.g. a bond between two atoms). We describe an individual in terms of elementary features consisting of zero or more structural predicates and one property; these features are treated as conditionally independent in the spirit of the naive Bayes assumption. 1BC2 represents an alternative first-order upgrade to the naive Bayesian classifier by considering probability distributions over structured objects (e.g., a molecule as a set of atoms), and estimating those distributions from the probabilities of its elements (which are assumed to be independent). We present a unifying view on both systems in which 1BC works in language space, and 1BC2 works in individual space. We also present a new, efficient recursive algorithm improving upon the original propositionalisation approach of 1BC. Both systems have been implemented in the context of the first-order descriptive learner Tertius, and we investigate the differences between the two systems both in computational terms and on artificially generated data. Finally, we describe a range of experiments on ILP benchmark data sets demonstrating the viability of our approach.
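The 1BC2 idea of deriving a distribution over a structured object from the probabilities of its elements can be illustrated, for sets, by an independent-inclusion model: each possible element is in the set or not, independently. This is a stand-in to make the element-independence assumption concrete; the paper's actual estimators differ in detail.

```python
def set_probability(s, element_probs):
    """P(set) under independent Bernoulli inclusion: multiply p(e) for
    each element present and (1 - p(e)) for each possible element
    absent. `element_probs` maps every possible element to its
    (assumed independent) inclusion probability."""
    p = 1.0
    for e, pe in element_probs.items():
        p *= pe if e in s else (1.0 - pe)
    return p
```

For example, with class-conditional element probabilities, such set probabilities can slot directly into a Bayesian classifier over structured individuals like molecules viewed as sets of atoms.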
The Role of Feature Construction in Inductive Rule Learning
"... This paper proposes a unifying framework for inductive rule learning algorithms. We suggest that the problem of constructing an appropriate inductive hypothesis (set of rules) can be broken down in the following subtasks: rule construction, body construction, and feature construction. Each of the ..."
Cited by 13 (0 self)
Abstract:
This paper proposes a unifying framework for inductive rule learning algorithms. We suggest that the problem of constructing an appropriate inductive hypothesis (set of rules) can be broken down into the following subtasks: rule construction, body construction, and feature construction. Each of these subtasks may have its own declarative bias, search strategies, and heuristics. In particular, we argue that feature construction is a crucial notion in explaining the relations between attribute-value rule learning and inductive logic programming (ILP). We demonstrate this by a general method for transforming ILP problems to attribute-value form, which overcomes some of the traditional limitations of propositionalisation approaches.
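A minimal sketch of such a transformation to attribute-value form: each constructed feature (a structural step reaching related objects plus a property tested on them) becomes one boolean column of a flat table. The data shapes and feature names below are made up for illustration; they are not the paper's method.

```python
def propositionalise(individuals, features):
    """Turn structured individuals into flat boolean rows.
    individuals: dict mapping an individual's id to its related
                 objects (e.g. a molecule's atoms).
    features:    list of (name, predicate) pairs; a predicate tests a
                 property of one related object.
    A feature fires if the property holds for some related object."""
    rows = {}
    for ind, parts in individuals.items():
        rows[ind] = {name: any(pred(p) for p in parts)
                     for name, pred in features}
    return rows
```

The resulting table can then be handed to any ordinary attribute-value rule learner, which is the point of propositionalisation.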
An Evolutionary Approach to Concept Learning with Structured Data
"... This paper details the implementation of a stronglytyped evolutionary programming system (STEPS) and its application to concept learning from highlystructured examples. STEPS evolves concept descriptions in the form of program trees. Predictive accuracy is used as the fitness function to be optimi ..."
Cited by 10 (3 self)
Abstract:
This paper details the implementation of a strongly-typed evolutionary programming system (STEPS) and its application to concept learning from highly-structured examples. STEPS evolves concept descriptions in the form of program trees. Predictive accuracy is used as the fitness function to be optimised through genetic operations. Empirical results with representative applications demonstrate promise.
Bridging the gap between distance and generalisation: Symbolic learning in metric spaces, 2008
"... Distancebased and generalisationbased methods are two families of artificial intelligence techniques that have been successfully used over a wide range of realworld problems. In the first case, general algorithms can be applied to any data representation by just changing the distance. The metric ..."
Cited by 7 (4 self)
Abstract:
Distance-based and generalisation-based methods are two families of artificial intelligence techniques that have been successfully used over a wide range of real-world problems. In the first case, general algorithms can be applied to any data representation by just changing the distance. The metric space sets the search and learning space, which is generally instance-oriented. In the second case, models can be obtained for a given pattern language, which can be comprehensible. The generality-ordered space sets the search and learning space, which is generally model-oriented. However, the concepts of distance and generalisation clash in many different ways, especially when knowledge representation is complex (e.g. structured data). This work establishes a framework where these two fields can be integrated in a consistent way. We introduce the concept of distance-based generalisation, which connects all the generalised examples in such a way that all of them are reachable inside the generalisation by using straight paths in the metric space. This makes the metric space and the generality-ordered space coherent (or even dual). Additionally, we also introduce a definition of minimal distance-based generalisation that can be seen as the first formulation of the Minimum Description Length (MDL)/Minimum Message Length (MML) principle in terms of a distance function. We instantiate and develop the framework for the most common data representations and distances, where we show that consistent instances can be found for numerical data, nominal data, sets, lists, tuples, graphs, first-order atoms and clauses. As a result, general learning methods that integrate the best from distance-based and generalisation-based methods can be defined and adapted to any specific problem by appropriately choosing the distance, the pattern language and the generalisation operator.
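For numerical data with the distance |x − y|, the interval spanned by a sample is a simple illustration of this notion: it covers every example and contains every straight path (segment) between any two of them, so every point of the generalisation is reachable along such paths. This is a sketch under those assumptions, not the paper's formal operator.

```python
def interval_generalisation(examples):
    """Smallest interval covering all real-valued examples. Under the
    metric |x - y|, the straight path between two covered points stays
    inside the interval, making it a distance-based generalisation in
    the sense sketched above (illustrative only)."""
    return (min(examples), max(examples))

def covers(interval, x):
    """Membership test for the generalisation."""
    lo, hi = interval
    return lo <= x <= hi
```

Minimality here corresponds to taking exactly (min, max) rather than any wider interval, echoing the MDL/MML flavour of preferring the tightest generalisation consistent with the data.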
Computational Logic and Machine Learning: A roadmap for Inductive Logic Programming
Technical Report, J. Stefan Institute, 1998
"... Computational logic has already significantly influenced (symbolic) machine learning through the field of inductive logic programming (ILP) which is concerned with the induction of logic programs from examples and background knowledge. In ILP, the shift of attention from program synthesis to knowled ..."
Cited by 6 (1 self)
Abstract:
Computational logic has already significantly influenced (symbolic) machine learning through the field of inductive logic programming (ILP), which is concerned with the induction of logic programs from examples and background knowledge. In ILP, the shift of attention from program synthesis to knowledge discovery resulted in advanced techniques that are practically applicable for discovering knowledge in relational databases. Machine learning, and ILP in particular, has the potential to influence computational logic by providing an application area full of industrially significant problems, thus providing a challenge for other techniques in computational logic. This paper gives a brief introduction to ILP, presents state-of-the-art ILP techniques for relational knowledge discovery as well as some research and organizational directions for further developments in this area.

1 Introduction

Inductive logic programming (ILP) [35, 39, 29] is a research area that has its backgrounds in induct...
Learning in Clausal Logic: A Perspective on Inductive Logic Programming
Computational Logic: Logic Programming and Beyond, volume 2407 of Lecture Notes in Computer Science, 2002
"... Abstract. Inductive logic programming is a form of machine learning from examples which employs the representation formalism of clausal logic. One of the earliest inductive logic programming systems was Ehud Shapiro’s Model Inference System [90], which could synthesise simple recursive programs like ..."
Cited by 5 (0 self)
Abstract:
Inductive logic programming is a form of machine learning from examples which employs the representation formalism of clausal logic. One of the earliest inductive logic programming systems was Ehud Shapiro’s Model Inference System [90], which could synthesise simple recursive programs like append/3. Many of the techniques devised by Shapiro, such as top-down search of program clauses by refinement operators, the use of intensional background knowledge, and the capability of inducing recursive clauses, are still in use today. On the other hand, significant advances have been made regarding dealing with noisy data, efficient heuristic and stochastic search methods, the use of logical representations going beyond definite clauses, and restricting the search space by means of declarative bias. The latter is a general term denoting any form of restrictions on the syntactic form of possible hypotheses. These include the use of types, input/output mode declarations, and clause schemata. Recently, some researchers have started using alternatives to Prolog featuring strong typing and real functions, which alleviate the need for some of the above ad hoc mechanisms. Others have gone beyond Prolog by investigating learning tasks in which the hypotheses are not definite clause programs, but for instance sets of indefinite clauses or denials, constraint logic programs, or clauses representing association rules. The chapter gives an accessible introduction to the above topics. In addition, it outlines the main current research directions which have been strongly influenced by recent developments in data mining and challenging real-life applications.
A First-Order Approach to Unsupervised Learning, 1999
"... . This paper deals with learning firstorder logic rules from data lacking an explicit classification predicate. Consequently, the learned rules are not restricted to predicate definitions as in supervised Inductive Logic Programming. Firstorder logic offers the ability to deal with structured, mul ..."
Cited by 4 (2 self)
Abstract:
This paper deals with learning first-order logic rules from data lacking an explicit classification predicate. Consequently, the learned rules are not restricted to predicate definitions as in supervised Inductive Logic Programming. First-order logic offers the ability to deal with structured, multi-relational knowledge. Possible applications include first-order knowledge discovery, induction of integrity constraints in databases, multiple predicate learning, and learning mixed theories of predicate definitions and integrity constraints. One of the contributions of our work is a heuristic measure of confirmation, trading off satisfaction and novelty of the rule. The approach has been implemented in the Tertius system. The system performs an optimal best-first search, finding the k most confirmed hypotheses. It can be tuned to many different domains by setting its parameters, and it can deal either with individual-based representations as in propositional learning or with general logi...