Results 1–10 of 21
Learning Two-Tiered Descriptions of Flexible Concepts: The POSEIDON System
 Machine Learning
, 1992
"... This paper describes a method for learning flexible concepts. by which are meant concepts that lack precise definition and are contextqlependent. To describe such concepts, the method employs a twotiered represen tation. in which the first tier captures explicitly basic concept properties, and the ..."
Abstract

Cited by 44 (20 self)
 Add to MetaCart
This paper describes a method for learning flexible concepts, by which are meant concepts that lack precise definition and are context-dependent. To describe such concepts, the method employs a two-tiered representation, in which the first tier captures explicitly basic concept properties, and the second tier characterizes allowable concept modifications and context dependency. In the proposed method, the first tier, called the Base Concept Representation (BCR), is created in two phases. In phase 1, the AQ15 rule learning program is applied to induce a complete and consistent concept description from supplied examples. In phase 2, this description is optimized according to a domain-dependent quality criterion. The second tier, called the Inferential Concept Interpretation (ICI), consists of a procedure for flexible matching and a set of inference rules. The proposed method has been implemented in the POSEIDON system and experimentally tested on two real-world problems: learning the concept of an acceptable union contract, and learning voting patterns of Republicans and Democrats in the U.S. Congress. For comparison, a few other learning methods were also applied to the same problems. These methods included simple variants of exemplar-based learning and an ID3-type decision tree learner, implemented in the ASSISTANT program. In the experiments, POSEIDON generated concept descriptions that were both more accurate and substantially simpler than those produced by the other methods.
Adding equations to NU-Prolog
 In Proc. of the 3rd Int. Symposium on Programming Language Implementation and Logic Programming
, 1991
"... This paper describes an extension to NUProlog which allows evaluable functions to be defined using equations. We consider it to be the most pragmatic way of combining functional and relational programming. The implementation consists of several hundred lines of Prolog code and the underlying Prolog ..."
Abstract

Cited by 38 (5 self)
 Add to MetaCart
This paper describes an extension to NU-Prolog which allows evaluable functions to be defined using equations. We consider it to be the most pragmatic way of combining functional and relational programming. The implementation consists of several hundred lines of Prolog code, and the underlying Prolog implementation was not modified at all. Nevertheless, the system is reasonably efficient and supports coroutining, optional lazy evaluation, higher-order functions and parallel execution. Efficiency is gained in several ways. First, we use some new implementation techniques. Second, we exploit some of the unique features of NU-Prolog, though these features are not essential to the implementation. Third, the language is designed so that we can take advantage of implicit mode and determinism information. Although we have not concentrated on the semantics of the language, we believe that our language design decisions and implementation techniques will be useful in the next generation of combined functional and relational languages. Keywords: logic programming, equations, functions, parallelism, indexing, lazy evaluation, higher-order.
Algebra of logic programming
 International Conference on Logic Programming
, 1999
"... At present, the field of declarative programming is split into two main areas based on different formalisms; namely, functional programming, which is based on lambda calculus, and logic programming, which is based on firstorder logic. There are currently several language proposals for integrating th ..."
Abstract

Cited by 20 (3 self)
 Add to MetaCart
At present, the field of declarative programming is split into two main areas based on different formalisms; namely, functional programming, which is based on lambda calculus, and logic programming, which is based on first-order logic. There are currently several language proposals for integrating the expressiveness of these two models of computation. In this thesis we work towards an integration of the methodology from the two research areas. To this end, we propose an algebraic approach to reasoning about logic programs, corresponding to the approach taken in functional programming. In the first half of the thesis we develop and discuss a framework which forms the basis for our algebraic analysis and transformation methods. The framework is based on an embedding of definite logic programs into lazy functional programs in Haskell, such that both the declarative and the operational semantics of the logic programs are preserved. In spite of its conciseness and apparent simplicity, the embedding proves to have many interesting properties, and it gives rise to an algebraic semantics of logic programming. It also allows us to reason about logic programs in a simple calculational style, using rewriting and the algebraic laws of combinators. In the embedding, the meaning of a logic program arises compositionally from the meaning of its constituent subprograms and the combinators that connect them. In the second half of the thesis we explore applications of the embedding to the algebraic transformation of logic programs. A series of examples covers simple program derivations, where our approach simplifies some of the current techniques. Another set of examples explores applications of the more advanced program development techniques from the Algebra of Programming by Bird and de Moor [18], where we expand the techniques currently available for logic program derivation and optimisation.
Multi-Strategy Learning and Theory Revision
, 1993
"... This paper presents the system WHY, which learns and updates a diagnostic knowledge base using domain knowledge and a set of examples. The apriori knowledge consists of a causal model of the domain, stating the relationships among basic phenomena, and a body of phenomenological theory, describing t ..."
Abstract

Cited by 17 (4 self)
 Add to MetaCart
This paper presents the system WHY, which learns and updates a diagnostic knowledge base using domain knowledge and a set of examples. The a priori knowledge consists of a causal model of the domain, stating the relationships among basic phenomena, and a body of phenomenological theory, describing the links between abstract concepts and their possible manifestations in the world. The phenomenological knowledge is used deductively, the causal model is used abductively, and the examples are used inductively. The problems of imperfection and intractability of the theory are handled by allowing the system to make assumptions during its reasoning. In this way, robust knowledge can be learned with limited complexity and a limited number of examples. The system works in a first-order logic environment and has been applied in a real domain. Several authors have advocated the necessity of using deep models of the structure and behaviour of the entities involved in a given doma...
Embedding Prolog in Haskell
 Department of Computer Science, University of Utrecht
, 1999
"... The distinctive merit of the declarative reading of logic programs is the validity ofallthelaws of reasoning supplied by the predicate calculus with equality. Surprisingly many of these laws are still valid for the procedural reading � they can therefore be used safely for algebraic manipulation, pr ..."
Abstract

Cited by 16 (4 self)
 Add to MetaCart
The distinctive merit of the declarative reading of logic programs is the validity of all the laws of reasoning supplied by the predicate calculus with equality. Surprisingly, many of these laws are still valid for the procedural reading; they can therefore be used safely for algebraic manipulation, program transformation and optimisation of executable logic programs. This paper lists a number of common laws, and proves their validity for the standard (depth-first search) procedural reading of Prolog. They also hold for alternative search strategies, e.g. breadth-first search. Our proofs of the laws are based on the standard algebra of functional programming, after the strategies have been given a rather simple implementation in Haskell.
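As a hint of what such a law looks like, here is a minimal sketch, not the paper's actual code (the types and operator names are invented for illustration), using the familiar list-of-answers model of depth-first search, in which left-distribution of disjunction over conjunction can be checked on concrete values:

```haskell
-- Toy model of the procedural reading: a predicate transforms a state
-- into a lazy list of answer states, searched depth-first.
type Pred a = a -> [a]

-- Sequential conjunction: feed every answer of p into q.
pand :: Pred a -> Pred a -> Pred a
pand p q = \s -> concatMap q (p s)

-- Disjunction under depth-first search: append the answer streams.
por :: Pred a -> Pred a -> Pred a
por p q = \s -> p s ++ q s

-- One law from the predicate calculus that survives the procedural
-- reading: (p \/ q) /\ r  =  (p /\ r) \/ (q /\ r),
-- because concatMap distributes over (++).
p, q, r, lhs, rhs :: Pred Int
p n = [n + 1]
q n = [n * 2]
r n = [n, n + 10]
lhs = (p `por` q) `pand` r
rhs = (p `pand` r) `por` (q `pand` r)
```

Note that in this toy model the dual law, distributing a goal from the left over a disjunction, holds only up to reordering of answers whenever the left goal has more than one solution; that is exactly the kind of distinction proofs about the procedural reading must track.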
Logic Programming, Functional Programming, and Inductive Definitions
 In Extensions of Logic Programming, volume 475 of LNCS
, 1991
"... Machine. It is incomplete due to depthfirst search, but presumably there could be a version using iterative deepening. An ORparallel machine such as DelPhi [12] could support such languages in future. Functions make explicit the granularity for ORparallelism: evaluation is deterministic while sea ..."
Abstract

Cited by 10 (0 self)
 Add to MetaCart
Machine. It is incomplete due to depth-first search, but presumably there could be a version using iterative deepening. An OR-parallel machine such as DelPhi [12] could support such languages in the future. Functions make explicit the granularity for OR-parallelism: evaluation is deterministic while search is not.
A Logic Programming and Verification System for Recursive Quantificational Logic
 Proceedings of the 9th International Joint Conference on Artificial Intelligence (IJCAI-85)
, 1985
"... In this paper, we describe a logic programming and program verification system which is based on quantifier elimination techniques and axiomatization rather than on more common method of doing logic programming using the HerbrandPrawitzRobinson unification algorithm without occurcheck. This syste ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
In this paper, we describe a logic programming and program verification system which is based on quantifier elimination techniques and axiomatization rather than on the more common method of doing logic programming using the Herbrand–Prawitz–Robinson unification algorithm without the occur check. This system is shown to have interesting properties for logic programming and includes a number of advanced features. Among these features are user-defined data objects, and user-defined recursive relations and functions, either of which may involve quantifiers in the body of their definitions, together with automatic termination and consistency checking for recursively defined concepts. In addition, it has a correct implementation of negation, in contrast to PROLOG's implementation of negation as failure, a smooth interaction between LISP-like functions and PROLOG-like relations, and a smooth interaction between specifications and programs. Finally, it provides a method of mathematical induction applicable to recursive definitions involving quantifiers.
Functional Reading of Logic Programs
"... We propose an embedding of logic programming into lazy functional programming in which each predicate in a Prolog program becomes a Haskell function, in such a way that both the declarative and the procedural reading of the Prolog predicate are preserved. The embedding computes by means of operation ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
We propose an embedding of logic programming into lazy functional programming in which each predicate in a Prolog program becomes a Haskell function, in such a way that both the declarative and the procedural reading of the Prolog predicate are preserved. The embedding computes by means of operations on lazy lists. The state of each step in the computation is passed on as a stream of answer substitutions, and all the logic operators of Prolog are implemented by explicit Haskell operators on these streams. The search strategy can be changed by altering the basic types of the embedding and the implementation of these operators. This model results in a perspicuous semantics for logic programs, and serves as a good example of modularisation in functional programming.
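A highly simplified sketch of such an embedding (the names and the toy substitution type are inventions of this note, not the paper's actual combinators) might look like:

```haskell
-- Hypothetical, much-simplified sketch: a Prolog predicate becomes a
-- Haskell function from an answer substitution to a lazy stream of
-- answer substitutions.
type Subst = [(String, String)]   -- toy substitution: variable -> constant
type Pred  = Subst -> [Subst]

-- Conjunction threads every answer of the first goal through the second.
conj :: Pred -> Pred -> Pred
conj p q s = concatMap q (p s)

-- Disjunction appends answer streams (depth-first search); a different
-- search strategy would swap in, e.g., an interleaving operator here.
disj :: Pred -> Pred -> Pred
disj p q s = p s ++ q s

-- Binding a variable to a constant, a toy stand-in for real unification.
bindVar :: String -> String -> Pred
bindVar v t s = case lookup v s of
  Nothing           -> [(v, t) : s]
  Just t' | t' == t -> [s]
  _                 -> []

-- The Prolog clauses  colour(red).  colour(blue).  then become:
colour :: String -> Pred
colour v = bindVar v "red" `disj` bindVar v "blue"
```

Running `colour "X" []` produces the two answer substitutions in Prolog's left-to-right clause order, illustrating how the procedural reading survives the translation.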
Conditional Equational Theories and Complete Sets of Transformations
 IN PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON FIFTH GENERATION COMPUTER SYSTEMS
, 1988
"... The idea to combine the advantages of function and logic programming has attracted many researches. Their work ranges from the integration of existing languages over higherorder logic to equational logic languages, where logic programs are augmented with equational theories. Recently, it has been p ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
The idea of combining the advantages of functional and logic programming has attracted many researchers. Their work ranges from the integration of existing languages over higher-order logic to equational logic languages, where logic programs are augmented with equational theories. Recently, it has been proposed to handle these equational theories by complete sets of transformations. These transformations are extensions of the rules introduced by Herbrand and later used by Martelli and Montanari to compute the most general unifier of two expressions. We generalize this idea to complete sets of transformations for arbitrary conditional equational theories, the largest class of equational theories that admits a least Herbrand model. The completeness
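The transformation style that the paper generalizes can be sketched for the purely syntactic case roughly as follows (a simplified illustration of Martelli–Montanari-style unification, not the paper's conditional-theory version; all names are invented here):

```haskell
-- Terms: variables and function applications (constants are 0-ary).
data Term = Var String | App String [Term] deriving (Eq, Show)

type Equations = [(Term, Term)]
type Subst     = [(String, Term)]

-- Apply a single binding v := t throughout a term.
subst :: String -> Term -> Term -> Term
subst v t (Var x) | x == v    = t
                  | otherwise = Var x
subst v t (App f ts)          = App f (map (subst v t) ts)

occurs :: String -> Term -> Bool
occurs v (Var x)    = v == x
occurs v (App _ ts) = any (occurs v) ts

-- Transformations on a set of equations, applied until solved form
-- (or failure): delete, decompose, orient, eliminate.
unify :: Equations -> Maybe Subst
unify [] = Just []
unify ((s, t) : es)
  | s == t = unify es                                            -- delete
unify ((App f ss, App g ts) : es)
  | f == g && length ss == length ts = unify (zip ss ts ++ es)   -- decompose
  | otherwise                        = Nothing                   -- clash
unify ((App f ss, Var x) : es) = unify ((Var x, App f ss) : es)  -- orient
unify ((Var x, t) : es)
  | occurs x t = Nothing                                         -- occur check
  | otherwise  = do                                              -- eliminate
      theta <- unify [ (subst x t l, subst x t r) | (l, r) <- es ]
      return ((x, foldr (\(v, u) acc -> subst v u acc) t theta) : theta)
```

The complete sets of transformations discussed in the abstract extend rules of this shape so that unification is performed modulo a conditional equational theory rather than purely syntactically.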
Learning Relations: an Evaluation of Search Strategies
 Fundamenta Informaticae
, 1993
"... . Inducing concept descriptions in first order logic is inherently a complex task; then, heuristics are needed to keep the problem to manageable size. In this paper we explore the effect of alternative search strategies, including the use of information gain and of apriori knowledge, on the quality ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
Inducing concept descriptions in first-order logic is inherently a complex task; hence, heuristics are needed to keep the problem to a manageable size. In this paper we explore the effect of alternative search strategies, including the use of information gain and of a priori knowledge, on the quality of the acquired relations, understood as the ability to reconstruct the rule used to generate the examples. To this aim, an artificial domain has been created, in which the experimental conditions can be kept under control, the "solution" of the learning problem is known, and a perfect theory is available. Another investigated aspect is the impact of more complex description languages, for instance those including numerical quantifiers. The results show that the information gain criterion is too greedy to be useful when the concepts have a complex internal structure; however, this drawback is more or less shared by any purely statistical evaluation criterion. The addition of parts of the...