Results 1–10 of 24
Automated Deduction by Theory Resolution
Journal of Automated Reasoning, 1985
Cited by 121 (1 self)
Theory resolution constitutes a set of complete procedures for incorporating theories into a resolution theorem-proving program, thereby making it unnecessary to resolve directly upon axioms of the theory. This can greatly reduce the length of proofs and the size of the search space. Theory resolution effects a beneficial division of labor, improving the performance of the theorem prover and increasing the applicability of the specialized reasoning procedures. Total theory resolution utilizes a decision procedure that is capable of determining unsatisfiability of any set of clauses using predicates in the theory. Partial theory resolution employs a weaker decision procedure that can determine potential unsatisfiability of sets of literals. Applications include the building in of both mathematical and special decision procedures, e.g., for the taxonomic information furnished by a knowledge representation system. Theory resolution is a generalization of numerous previously known resolution refinements. Its power is demonstrated by comparing solutions of "Schubert's Steamroller" challenge problem with and without building in axioms through theory resolution.
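The division of labor the abstract describes can be made concrete with a small sketch. The toy Python fragment below is our own illustration, not code from the paper: the taxonomy, the predicate names, and the propositional simplification are all assumptions. It shows the idea of total theory resolution with a taxonomic theory: two literals may be resolved upon when they are jointly unsatisfiable modulo the theory, not only when they are syntactic complements.

```python
# Toy theory resolution with a taxonomic theory (propositional; illustrative names).
# Besides syntactic complements p / -p, the taxonomy makes e.g. "dog" and
# "-animal" jointly unsatisfiable, so the resolver may cut on them directly
# instead of resolving against an explicit axiom dog -> animal.

SUBSUMES = {("dog", "animal"), ("cat", "animal")}  # dog subsumed by animal, ...

def t_complementary(a, b):
    """True if literals a, b are unsatisfiable together modulo the taxonomy."""
    def neg(l):
        return l[1:] if l.startswith("-") else "-" + l
    if a == neg(b):                       # ordinary syntactic complements
        return True
    pos, negl = (a, b) if b.startswith("-") else (b, a)
    return (not pos.startswith("-")) and negl.startswith("-") \
        and (pos, negl[1:]) in SUBSUMES   # e.g. dog(x) with -animal(x)

def resolve(c1, c2):
    """All theory resolvents of two clauses (sets of literals)."""
    out = []
    for a in c1:
        for b in c2:
            if t_complementary(a, b):
                out.append((c1 - {a}) | (c2 - {b}))
    return out

# {dog(fido)} and {-animal(fido)} clash via the taxonomy alone:
print(resolve(frozenset({"dog"}), frozenset({"-animal"})))  # [frozenset()]
```

The empty resolvent is derived in one step; without the built-in theory, the prover would first have to resolve against the taxonomy axiom itself.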
Equational Problems and Disunification
Journal of Symbolic Computation, 1989
Cited by 104 (9 self)
Roughly speaking, an equational problem is a first-order formula whose only predicate symbol is =. We propose some rules for the transformation of equational problems and study their correctness in various models. Then, we give completeness results with respect to some “simple” problems called solved forms. Such completeness results still hold when adding some control which moreover ensures termination. The termination proofs are given for a “weak” control and thus hold for the (large) class of algorithms obtained by restricting the scope of the rules. Finally, it must be noted that a byproduct of our method is a decision procedure for the validity in the Herbrand Universe of any
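The transformation-rule style the abstract alludes to is easiest to see in the purely equational, negation-free case. The sketch below is illustrative only: the term encoding and names are ours, and it covers plain unification rather than full disunification. It applies the classic delete/decompose/orient/eliminate rules, with clash and occurs-check failure, to reduce a set of equations to a solved form, i.e. a most general unifier.

```python
# Rule-based syntactic unification.
# Terms: variables are strings; compound terms are ("f", arg1, ..., argn)
# tuples; constants are 0-ary tuples such as ("a",).

def is_var(t):
    return isinstance(t, str)

def occurs(v, t):
    if is_var(t):
        return v == t
    return any(occurs(v, a) for a in t[1:])

def substitute(t, sigma):
    if is_var(t):
        return sigma.get(t, t)
    return (t[0],) + tuple(substitute(a, sigma) for a in t[1:])

def unify(equations):
    """Reduce a list of equations (s, t) to a solved form (a dict), or None."""
    sigma = {}
    eqs = list(equations)
    while eqs:
        s, t = eqs.pop()
        s, t = substitute(s, sigma), substitute(t, sigma)
        if s == t:
            continue                       # delete: trivial equation
        if is_var(s):
            if occurs(s, t):
                return None                # occurs-check failure
            sigma = {v: substitute(u, {s: t}) for v, u in sigma.items()}
            sigma[s] = t                   # eliminate the variable
        elif is_var(t):
            eqs.append((t, s))             # orient
        elif s[0] == t[0] and len(s) == len(t):
            eqs.extend(zip(s[1:], t[1:]))  # decompose
        else:
            return None                    # clash of head symbols
    return sigma

# f(X, g(a)) = f(b, Y)  reduces to the solved form  X -> b, Y -> g(a)
print(unify([(("f", "X", ("g", ("a",))), ("f", ("b",), "Y"))]))
```

Disunification adds disequations and more models to the picture, but the same transform-until-solved-form discipline carries over.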
Paradigmatic Morphology
1989
Cited by 33 (1 self)
We present a notation for the declarative statement of morphological relationships and lexical rules, based on the traditional notion of Word and Paradigm (cf. Hockett 1954). The phenomenon of blocking arises from a generalized version of Kiparsky's (1973) Elsewhere Condition, stated in terms of ordering by subsumption over paradigms. Orthographic constraints on morphemic alternation are described by means of string equations (Siekmann 1975). We indicate some criticisms to be made of our approach from both linguistic and computational perspectives and relate our approach to others such as Finite-State Morphology (Koskenniemi 1983), DATR (Gazdar and Evans 1989) and object-oriented morphophonemics (de Smedt 1984, Daelemans 1988). Finally, we discuss the question of whether a system involving string equations allows a reduction to finite-state techniques.
A completion-based method for mixed universal and rigid E-unification
Proc. 12th Conference on Automated Deduction (CADE), Nancy, France, LNAI 814, 1994
Cited by 27 (8 self)
We present a completion-based method for handling a new version of E-unification, called “mixed” E-unification, that is a combination of the classical “universal” E-unification and “rigid” E-unification. Rigid E-unification is an important method for handling equality in Gentzen-type first-order calculi, such as free-variable semantic tableaux or matings. The performance of provers using E-unification can be increased considerably if mixed E-unification is used instead of the purely rigid version. We state soundness and completeness results, and describe experiments with an implementation of our method.
Type inference and semi-unification
In Proceedings of the ACM Conference on LISP and Functional Programming (LFP), Snowbird, 1988
Cited by 25 (6 self)
In the last ten years, declaration-free programming languages with a polymorphic typing discipline (ML, B) have been developed to approximate the flexibility and conciseness of dynamically typed languages (LISP, SETL) while retaining the safety and execution efficiency of conventional statically typed languages (Algol 68, Pascal). These polymorphic languages can be type checked at compile time, yet allow functions whose arguments range over a variety of types. We investigate several polymorphic type systems, the most powerful of which, termed the Milner-Mycroft Calculus, extends the so-called let-polymorphism found in, e.g., ML with a polymorphic typing rule for recursive definitions. We show that semi-unification, the problem of solving inequalities over first-order terms, characterizes type checking in the Milner-Mycroft Calculus up to polynomial time, even in the restricted case where nested definitions are disallowed. This permits us to extend some infeasibility results for related combinatorial problems to type inference and to correct several claims and statements in the literature. We prove the existence of unique most general solutions of term inequalities, called most general semi-unifiers, and present an algorithm for computing them that terminates for all known inputs due to a novel “extended occurs check”. We conjecture this algorithm to be
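Semi-unification solves inequalities s ≤ t, asking for a substitution under which each left-hand side can be further instantiated to the corresponding right-hand side. Its simplest ingredient is one-sided matching: deciding whether one term is an instance of another. The minimal sketch below is our own illustration with an ad-hoc term encoding, not the paper's algorithm, and is far weaker than full semi-unification.

```python
# One-sided matching: does some substitution rho send `pattern` to `term`?
# Variables are strings; compound terms are ("f", arg1, ..., argn) tuples
# and constants are 0-ary tuples. This instance relation underlies the
# term inequalities s <= t discussed in the abstract (names are ours).

def match(pattern, term, rho=None):
    """Return a substitution rho with rho(pattern) == term, or None."""
    rho = dict(rho or {})
    if isinstance(pattern, str):              # pattern variable
        if pattern in rho:                    # must stay consistent
            return rho if rho[pattern] == term else None
        rho[pattern] = term
        return rho
    if isinstance(term, str) or pattern[0] != term[0] or len(pattern) != len(term):
        return None                           # clash: term not an instance
    for p, t in zip(pattern[1:], term[1:]):
        rho = match(p, t, rho)
        if rho is None:
            return None
    return rho

# f(X, X) matches f(g(a), g(a)) with rho = {X: g(a)}, but not f(a, b):
print(match(("f", "X", "X"), ("f", ("g", ("a",)), ("g", ("a",)))))
print(match(("f", "X", "X"), ("f", ("a",), ("b",))))
```

Semi-unification asks for a single substitution making all such instance checks succeed simultaneously, which is what makes it so much harder than matching or unification alone.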
Nonstandard Concepts of Similarity in Case-Based Reasoning
Information Systems and Data Analysis: Prospects - Foundations - Applications. Proceedings of the 17th Annual Conference of the GfKl, Univ. of Kaiserslautern, 1993. Studies in Classification, Data Analysis, and Knowledge Organization, 1994
Cited by 24 (6 self)
Introduction The present paper is aimed at propagating new concepts of similarity more flexible and expressive than those underlying most case-based reasoning approaches today. So, it mainly deals with criticizing approaches in use, with motivating and introducing new notions and notations, and with first steps towards future applications. The investigations at hand originate from the author's work in learning theory. In exploring the relationship between inductive learning and case-based learning within a quite formal setting (cf. [Jan92b]), it turned out that both areas almost coincide if sufficiently flexible similarity concepts are taken into account. This provides some formal arguments for the necessity of non-symmetric similarity measures. Encouraged by these first results, the author tried to investigate more structured learning problems from the viewpoint of case-based reasoning. It turned out that an appropriate handling requires formalisms allowing similarity concep
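A toy example of the kind of non-symmetric measure argued for here (our own choice, not the author's definition): score a case by how much of the query's feature set it covers, which is naturally direction-dependent.

```python
# A non-symmetric similarity measure: fraction of a's features covered by b.
# coverage_sim(a, b) and coverage_sim(b, a) generally differ, unlike a
# metric-style distance.

def coverage_sim(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a) if a else 1.0

query = {"red", "round", "small"}
case = {"red", "round", "small", "sweet"}
print(coverage_sim(query, case))  # 1.0  (the case fully covers the query)
print(coverage_sim(case, query))  # 0.75 (but not conversely)
```

The asymmetry matches the retrieval intuition: a richer stored case can fully answer a sparse query even though the query says little about the case.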
Generalized quantifiers and discontinuous type constructors
In Arthur Horck and Wietske Sijtsma, editors, Discontinuous Constituency, Mouton de Gruyter, Berlin, 1992
Cited by 21 (0 self)
1 A sign-based categorial framework This paper investigates discontinuous type constructors within the framework of a sign-based generalization of categorial type calculi. The paper takes its inspiration from Oehrle’s (1988) work on generalized compositionality for multidimensional linguistic objects, and, we hope, may establish a bridge between work in Unification Categorial Grammar or HPSG, and the research that views categorial grammar from the perspective of substructural type logics. Categorial sequents are represented as composed of multidimensional signs, modelled as tuples of the form 〈Type, Semantics, Syntax〉. They simultaneously characterize the semantic and structural properties of linguistic objects in terms of a type assignment labelled with semantic information (a lambda term) and structural, syntactic information. As argued elsewhere (Moortgat 1988), the structural information refers to phonological structuring of linguistic material, rather than to syntactic structure in the conventional sense. For the purposes of this paper, structural information is simplified to a string description.
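The tuple representation of signs can be sketched directly. This is a minimal data sketch with our own naming and string-valued fields; the paper's actual formalization is richer.

```python
from typing import NamedTuple

class Sign(NamedTuple):
    """A multidimensional sign <Type, Semantics, Syntax>, as in the abstract."""
    type: str        # categorial type, e.g. "(np\\s)/np"
    semantics: str   # lambda term, here kept as a string
    syntax: str      # string description of the phonological material

loves = Sign("(np\\s)/np", "\\x.\\y.love(y, x)", "loves")
print(loves.type)  # (np\s)/np
```

Keeping the three dimensions in one record is what lets inference rules act on type, lambda term, and string description in lockstep.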
Open Problems in Rewriting
Proceedings of the Fifth International Conference on Rewriting Techniques and Applications (Montreal, Canada), LNCS 690, 1991
Cited by 19 (2 self)
Introduction Interest in the theory and applications of rewriting has been growing rapidly, as evidenced in part by four conference proceedings (including this one) [15, 26, 41, 66]; three workshop proceedings [33, 47, 77]; five special journal issues [5, 88, 24, 40, 67]; more than ten surveys [2, 7, 27, 28, 44, 56, 57, 76, 82, 81]; one edited collection of papers [1]; four monographs [3, 12, 55, 65]; and seven books (four of them still in progress) [8, 9, 35, 54, 60, 75, 84]. To encourage and stimulate continued progress in this area, we have collected (with the help of colleagues) a number of problems that appear to us to be of interest and regarding which we do not know the answer. Questions on rewriting and other equational paradigms have been included; many have not aged sufficiently to be accorded the appellation “open problem”. We have limited ourselves to theoretical questions, though there are certainly many additional interesting questions relating to applications and implementation
Implementation of Narrowing: The Prolog-Based Approach
Logic programming languages: constraints, functions, and objects, 1993
Cited by 18 (0 self)
We present the problem of integrating functional languages and logic languages. We explain why the narrowing-based techniques have so far prevailed as operational mechanisms for functional logic interpreters. We then discuss various strategies of narrowing. Finally, we explain how to simulate these strategies of narrowing using the leftmost SLD-resolution rule of Prolog, and compare some experimental results with those obtained with direct narrowing implementations.

1. Introduction There has been a flurry of research on the integration of functional programming (FP) and logic programming (LP). A natural framework would be to consider the union of a set H of Horn clauses with a set E of conditional equations as a program. The declarative semantics of a program is then given by first-order logic with equality [26], that is, first-order logic extended with an equality symbol and the standard equality axioms. The operational semantics of a program is usually given by a system of infere...
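The narrowing relation discussed here can be sketched compactly: a narrowing step unifies a non-variable subterm of the goal with the (renamed) left-hand side of a rewrite rule and replaces it by the right-hand side under the computed unifier. The Python fragment below is a naive, generic illustration, not the paper's Prolog-based simulation; the term encoding, the Peano rules, and the depth bound are our assumptions.

```python
# Naive narrowing over Peano numbers. Variables are strings; compound
# terms are ("f", arg1, ..., argn) tuples; constants are 0-ary tuples.

def subst(t, s):
    """Apply substitution s (a dict) to term t, following bindings."""
    if isinstance(t, str):
        return subst(s[t], s) if t in s else t
    return (t[0],) + tuple(subst(a, s) for a in t[1:])

def unify(a, b, s):
    """Extend s to unify a and b, or None. No occurs check, as in Prolog."""
    a, b = subst(a, s), subst(b, s)
    if a == b:
        return s
    if isinstance(a, str):
        return {**s, a: b}
    if isinstance(b, str):
        return {**s, b: a}
    if a[0] != b[0] or len(a) != len(b):
        return None
    for x, y in zip(a[1:], b[1:]):
        s = unify(x, y, s)
        if s is None:
            return None
    return s

# Rewrite rules:  add(0, Y) -> Y ;  add(s(X), Y) -> s(add(X, Y))
RULES = [
    (("add", ("0",), "Y"), "Y"),
    (("add", ("s", "X"), "Y"), ("s", ("add", "X", "Y"))),
]

_fresh = [0]
def rename(rule):
    """Rename a rule's variables apart with a fresh index."""
    _fresh[0] += 1
    def r(t):
        if isinstance(t, str):
            return f"{t}_{_fresh[0]}"
        return (t[0],) + tuple(r(a) for a in t[1:])
    lhs, rhs = rule
    return r(lhs), r(rhs)

def narrow_steps(t, s):
    """Yield (t', s'): one narrowing step at any non-variable subterm of t."""
    if isinstance(t, str):
        return
    for rule in RULES:
        lhs, rhs = rename(rule)
        s2 = unify(t, lhs, dict(s))
        if s2 is not None:
            yield rhs, s2
    for i in range(1, len(t)):
        for sub, s2 in narrow_steps(t[i], s):
            yield t[:i] + (sub,) + t[i + 1:], s2

def solve(goal, target, s=None, depth=0):
    """Enumerate substitutions under which goal narrows to target."""
    s = dict(s or {})
    s2 = unify(goal, target, dict(s))
    if s2 is not None:
        yield s2
    if depth < 6:                      # crude bound to keep the search finite
        for g2, s2 in narrow_steps(subst(goal, s), s):
            yield from solve(g2, target, s2, depth + 1)

# Which N solves add(N, s(0)) = s(s(0))?  Narrowing finds N = s(0), i.e. 1 + 1 = 2.
answer = next(solve(("add", "N", ("s", ("0",))), ("s", ("s", ("0",)))))
print(subst("N", answer))  # ('s', ('0',))
```

The Prolog simulation the abstract refers to obtains the same search behavior by flattening equations into predicates and letting leftmost SLD-resolution supply the unification and backtracking.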
A Strong Complete Schema for Inductive Functional Logic Programming
In Dzeroski, S.; Flach, P. (eds), Inductive Logic Programming, Lecture Notes in Artificial Intelligence (LNAI) series, 1999
Cited by 12 (7 self)
A new IFLP schema is presented as a general framework for the induction of functional logic programs (FLP). Since narrowing (which is the most usual operational semantics of FLP) performs a unification (mgu) followed by a replacement, we introduce two main operators in our IFLP schema: a generalisation and an inverse replacement or intra-replacement, which results in a generic inversion of the transitive property of equality. We prove that this schema is strongly complete in the sense that, given some evidence, it is possible to induce any program which could have generated that evidence. We outline some possible restrictions in order to improve the tractability of the schema. We also show that inverse narrowing is just a special case of our IFLP schema. Finally, a straightforward extension of the IFLP schema to function invention is illustrated.