Results 1–10 of 30
Rewriting Logic as a Logical and Semantic Framework
, 1993
"... Rewriting logic [72] is proposed as a logical framework in which other logics can be represented, and as a semantic framework for the specification of languages and systems. Using concepts from the theory of general logics [70], representations of an object logic L in a framework logic F are und ..."
Abstract

Cited by 163 (55 self)
Rewriting logic [72] is proposed as a logical framework in which other logics can be represented, and as a semantic framework for the specification of languages and systems. Using concepts from the theory of general logics [70], representations of an object logic L in a framework logic F are understood as mappings L → F that translate one logic into the other in a conservative way. The ease with which such maps can be defined for a number of quite different logics of interest, including equational logic, Horn logic with equality, linear logic, logics with quantifiers, and any sequent calculus presentation of a logic for a very general notion of "sequent," is discussed in detail. Using the fact that rewriting logic is reflective, it is often possible to reify inside rewriting logic itself a representation map L → RWLogic for the finitely presentable theories of L. Such a reification takes the form of a map between the abstract data types representing the finitary theories of...
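The rules-as-deduction view described in this abstract can be illustrated with a toy rewrite engine. The sketch below is my own illustration, not the paper's system: the term representation, the rule format, and the Peano-addition object theory are all assumed for the example.

```python
# Minimal term rewriting: terms are strings (constants) or tuples
# (operator, arg1, ...); a rule is a (match, build) pair, where match
# returns captured subterms or None, and build assembles the right-hand
# side. Illustrative sketch only.

def rewrite(term, rules):
    """Normalize a term: rewrite subterms first, then try each rule at the root."""
    if isinstance(term, tuple):
        term = (term[0],) + tuple(rewrite(t, rules) for t in term[1:])
    for match, build in rules:
        captured = match(term)
        if captured is not None:
            return rewrite(build(*captured), rules)
    return term

# Object theory: Peano addition.  add(0, n) -> n ;  add(s(m), n) -> s(add(m, n))
RULES = [
    (lambda t: (t[2],) if isinstance(t, tuple) and t[0] == "add" and t[1] == "0" else None,
     lambda n: n),
    (lambda t: (t[1][1], t[2]) if isinstance(t, tuple) and t[0] == "add"
               and isinstance(t[1], tuple) and t[1][0] == "s" else None,
     lambda m, n: ("s", ("add", m, n))),
]

two, one = ("s", ("s", "0")), ("s", "0")
print(rewrite(("add", two, one), RULES))  # ('s', ('s', ('s', '0'))), i.e. 2 + 1 = 3
```

Each rewrite step is an inference in the object theory; representing the rules themselves as data is the first step toward the kind of reflection the abstract mentions.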
Automated Deduction by Theory Resolution
 Journal of Automated Reasoning
, 1985
"... Theory resolution constitutes a set of complete procedures for incorporating theories into a resolution theoremproving program, thereby making it unnecessary to resolve directly upon axioms of the theory. This can greatly reduce the length of proofs and the size of the search space. Theory resoluti ..."
Abstract

Cited by 126 (1 self)
Theory resolution constitutes a set of complete procedures for incorporating theories into a resolution theorem-proving program, thereby making it unnecessary to resolve directly upon axioms of the theory. This can greatly reduce the length of proofs and the size of the search space. Theory resolution effects a beneficial division of labor, improving the performance of the theorem prover and increasing the applicability of the specialized reasoning procedures. Total theory resolution utilizes a decision procedure that is capable of determining unsatisfiability of any set of clauses using predicates in the theory. Partial theory resolution employs a weaker decision procedure that can determine potential unsatisfiability of sets of literals. Applications include the building in of both mathematical and special decision procedures, e.g., for the taxonomic information furnished by a knowledge representation system. Theory resolution is a generalization of numerous previously known resolution refinements. Its power is demonstrated by comparing solutions of "Schubert's Steamroller" challenge problem with and without building in axioms through theory resolution.
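The division of labor between the resolver and a theory decision procedure can be sketched concretely. The toy Python fragment below is an illustrative assumption, not the paper's formulation: the taxonomy, the predicate names, and the restriction to unit clauses are all chosen for the example. A subsort decision procedure recognizes pairs of literals as unsatisfiable modulo the theory, so the taxonomy axiom Dog(x) → Animal(x) is never resolved upon directly.

```python
# Hypothetical taxonomy for the example: Dog <= Animal, Cat <= Animal.
SUBSORT = {("Dog", "Animal"), ("Cat", "Animal")}

def t_unsat(lit1, lit2):
    """Theory decision procedure: P(t) and -Q(t) clash if P is Q or a subsort of Q."""
    (s1, p1, t1), (s2, p2, t2) = lit1, lit2   # (sign, predicate, term)
    if s1 == s2 or t1 != t2:
        return False
    pos, neg = (p1, p2) if s1 == '+' else (p2, p1)
    return pos == neg or (pos, neg) in SUBSORT

def refute(units):
    """True if some pair of unit clauses is unsatisfiable modulo the taxonomy."""
    return any(t_unsat(a, b) for i, a in enumerate(units) for b in units[i + 1:])

units = [('+', 'Dog', 'fido'), ('-', 'Animal', 'fido')]
print(refute(units))  # True: one theory-resolution step closes the proof
```

Without the theory step, an ordinary resolution prover would first have to resolve against an explicit subsort axiom; here the decision procedure absorbs that inference.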
Efficient implementation of lattice operations
 ACM Transactions on Programming Languages and Systems
, 1989
"... complementation (BUTNOT) are becoming more and more important in programming languages supporting object inheritance. We present a general technique for the efficient implementation of such operations based on an encoding method. The effect of the encoding is to plunge the given ordering into a bool ..."
Abstract

Cited by 122 (9 self)
complementation (BUTNOT) are becoming more and more important in programming languages supporting object inheritance. We present a general technique for the efficient implementation of such operations based on an encoding method. The effect of the encoding is to plunge the given ordering into a Boolean lattice of binary words, leading to an almost constant-time complexity of the lattice operations. A first method is described based on a transitive-closure approach. Then a more space-efficient method minimizing codeword length is described. Finally a powerful grouping technique called modulation is presented, which drastically reduces code space while keeping all three lattice operations highly efficient. This technique takes into account idiosyncrasies of the topology of the poset being encoded that are quite likely to occur in practice. All methods are formally justified. We see this work as an original contribution towards using semantic (viz., in this case, taxonomic) information in the engineering pragmatics of storage and retrieval of (viz., partially or quasi-ordered) information.
A Feature Logic with Subsorts
 LILOG Report 33, IWBS, IBM Deutschland
, 1992
"... This paper presents a set description logic with subsorts, feature selection (the inverse of unary function application), agreement, intersection, union and complement. We define a model theoretic open world semantics and show that sorted feature structures constitute a canonical model, that is, ..."
Abstract

Cited by 78 (4 self)
This paper presents a set description logic with subsorts, feature selection (the inverse of unary function application), agreement, intersection, union and complement. We define a model-theoretic open-world semantics and show that sorted feature structures constitute a canonical model, that is, without loss of generality subsumption and consistency of set descriptions can be considered with respect to feature structures only. We show that deciding consistency of set descriptions is an NP-complete problem. To appear in: J. Wedekind and C. Rohrer (eds.), Unification in Grammar. The MIT Press, 1992. This text is a minor revision of LILOG Report 33, May 1988, IBM Deutschland, IWBS, Postfach 800880, 7000 Stuttgart 80, Germany. The research reported here has been done while the author was with IBM Deutschland. The author's article [23] is a more recent work on feature logics.
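The role of sorts in feature structures can be sketched as follows. This is a hypothetical miniature, not the paper's formalism: the sort lattice, feature names, and (sort, features) pair representation are all illustrative choices. Unifying two structures takes the greatest lower bound of their sorts and recursively unifies shared features.

```python
# Illustrative meet table for a tiny sort lattice; None marks inconsistency.
MEET = {frozenset({"sign", "noun"}): "noun",
        frozenset({"sign", "verb"}): "verb",
        frozenset({"noun", "verb"}): None}

def meet(s1, s2):
    """Greatest lower bound of two sorts, or None if they are inconsistent."""
    return s1 if s1 == s2 else MEET.get(frozenset({s1, s2}))

def unify(fs1, fs2):
    """Unify two (sort, features) pairs; return None on inconsistency."""
    sort = meet(fs1[0], fs2[0])
    if sort is None:
        return None
    feats = dict(fs1[1])
    for f, v in fs2[1].items():
        if f in feats:
            sub = unify(feats[f], v)
            if sub is None:
                return None
            feats[f] = sub
        else:
            feats[f] = v
    return (sort, feats)

x = ("sign", {"AGR": ("sign", {"NUM": ("sg", {})})})
y = ("noun", {"AGR": ("sign", {"PER": ("3rd", {})})})
print(unify(x, y))  # ('noun', {'AGR': ('sign', {'NUM': ..., 'PER': ...})})
```

Consistency checking in the paper's full logic (with union and complement) is NP-complete; this sketch covers only the conjunctive, purely conjunctive-feature fragment where unification succeeds or fails directly.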
Logic Programming over Polymorphically Order-Sorted Types
, 1989
"... This thesis presents the foundations for relational logic programming over polymorphically ordersorted data types. This type discipline combines the notion of parametric polymorphism, which has been developed for higherorder functional programming, with the notion of ordersorted typing, which ha ..."
Abstract

Cited by 59 (0 self)
This thesis presents the foundations for relational logic programming over polymorphically order-sorted data types. This type discipline combines the notion of parametric polymorphism, which has been developed for higher-order functional programming, with the notion of order-sorted typing, which has been developed for equational first-order specification and programming. Polymorphically order-sorted types are obtained as canonical models of a class of specifications in a suitable logic accommodating sort functions. Algorithms for constraint solving, type checking and type inference are given and proven correct.
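The flavor of combining subsorting with parametric polymorphism can be sketched with a subsort test that extends a base ordering covariantly through a sort function such as list(T). The base sorts and the covariance assumption below are illustrative choices for the example, not the thesis's actual system.

```python
# Base subsort relation, given transitively closed: nat <= int <= real.
BASE = {("nat", "int"), ("int", "real"), ("nat", "real")}

def subsort(s, t):
    """Is sort s a subsort of sort t?  Compound sorts are tuples (fn, arg...)."""
    if s == t:
        return True
    if isinstance(s, tuple) and isinstance(t, tuple) and s[0] == t[0]:
        # sort function applied covariantly, e.g. list(nat) <= list(int)
        return all(subsort(a, b) for a, b in zip(s[1:], t[1:]))
    return (s, t) in BASE

print(subsort(("list", "nat"), ("list", "real")))  # True
print(subsort(("list", "real"), ("list", "nat")))  # False
```

A constraint solver in this setting repeatedly decomposes subsort constraints on compound sorts into constraints on their arguments, exactly as the recursive call does here.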
(ML)²: A formal language for KADS models of expertise
, 1993
"... This paper reports on an investigation into a formal language for specifying kads models of expertise. After arguing the need for and the use of such formal representations, we discuss each of the layers of a kads model of expertise in the subsequent sections, and define the formal constructions tha ..."
Abstract

Cited by 35 (9 self)
This paper reports on an investigation into a formal language for specifying KADS models of expertise. After arguing the need for and the use of such formal representations, we discuss each of the layers of a KADS model of expertise in the subsequent sections, and define the formal constructions that we use to represent the KADS entities at every layer: order-sorted logic at the domain layer, meta-logic at the inference layer, and dynamic logic at the task layer. All these constructions together make up (ML)², the language that we use to represent models of expertise. We illustrate the use of (ML)² in a small example model. We conclude by describing our experience to date with constructing such formal models in (ML)², and by discussing some open problems that remain for future work.
Natural Language Based Inference Procedures applied to Schubert's Steamroller
 In AAAI-91
, 1991
"... We have previously argued that the syntactic structure of natural language can be exploited to construct powerful polynomial time inference procedures. This paper supports the earlier arguments by demonstrating that a natural language based polynomial time procedure can solve Schubert's steamro ..."
Abstract

Cited by 23 (8 self)
We have previously argued that the syntactic structure of natural language can be exploited to construct powerful polynomial-time inference procedures. This paper supports the earlier arguments by demonstrating that a natural language based polynomial-time procedure can solve Schubert's steamroller in a single step. This report describes research done at the Artificial Intelligence Laboratory of the Massachusetts Institute of Technology. Support for the work described in this paper was provided in part by Mitsubishi Electric Research Laboratories, Inc. Support for the laboratory's artificial intelligence research is provided in part by the Advanced Research Projects Agency of the Department of Defense under Office of Naval Research contract N00014-85-K-0124. This paper appeared in AAAI-91. A PostScript electronic source for this paper can be found in ftp.ai.mit.edu:/pub/dam/aaaib.ps. A BibTeX reference can be found in ftp.ai.mit.edu:/pub/dam/dam.bib.
The Specification and Implementation of Constraint-Based Unification Grammars
, 1991
"... this paper. The research of Pollard and Franz was supported by a grant from the National Science Foundation (IRI8806913). (Empty Category Principle and Subjacency) and so forth. Patterns of crosslinguistic variation are accounted for by means of the parametrization of these principles. The method ..."
Abstract

Cited by 17 (1 self)
this paper. The research of Pollard and Franz was supported by a grant from the National Science Foundation (IRI-8806913). (Empty Category Principle and Subjacency) and so forth. Patterns of cross-linguistic variation are accounted for by means of the parametrization of these principles. The methodological distinction between these two approaches is widely supposed to be that rules enumerate possibilities, while principles eliminate possibilities. But it is quite difficult to distinguish formally between a parametrized disjunctive principle and a collection of schematic rules only one of which can apply to a given structure. Consider, for example, the distinction between categorial grammar application schemata, basic ID rules of GPSG, and the C-structure constraints of LFG, on the one hand, and the disjunctive clauses of X̄ Theory or the Empty Category Principle on the other. It should also be borne in mind that so-called rule-based approaches often employ not only rules but also global constraints on representations which behave similarly to principles, such as the Head Feature Convention and the Control Agreement Principle of GPSG or the Completeness and Function-Argument Biuniqueness Conditions of LFG. HPSG belongs to the "unification-based" family of linguistic theories, but differs from LFG and GPSG in that grammars are formulated entirely in terms of universal and language-specific principles expressed as constraints on feature structures, which in turn are taken to represent possible linguistic objects. As shown by Pollard and Sag (1987), constraints on feature structures can be used to do the same duty as many of the principles and rules of GPSG, LFG and GB. Unlike rule-based theories, in HPSG, immediate dominance and linear precedence conditions (traditional...
Expressing Generalizations in UnificationBased Grammar Formalisms
 In Proceedings of the 4th Conference of the European Chapter of the Association for Computational Linguistics (EACL-89)
, 1989
"... This paper shows how higher levels of generalizatlon can be introduced into unification grammars by exploiting methods for typing grammatical objects. We discuss the strategy of using global declarations to limit possible linguistic structures, and sketch a few unusual aspects of our typechecking ..."
Abstract

Cited by 16 (0 self)
This paper shows how higher levels of generalization can be introduced into unification grammars by exploiting methods for typing grammatical objects. We discuss the strategy of using global declarations to limit possible linguistic structures, and sketch a few unusual aspects of our type-checking algorithm. We also describe the sort system we use in our semantic representation language and illustrate the expressive power gained by being able to state global constraints over these sorts. Finally, we briefly illustrate the sort system by applying it to some agreement phenomena and to problems of adjunct resolution.
Meta-Level Inference Systems
, 1991
"... 1.1 Goals of this book................................ 13 ..."
Abstract

Cited by 16 (4 self)
1.1 Goals of this book................................ 13