Results 1 – 10 of 58
Multilanguage Hierarchical Logics (or: How We Can Do Without Modal Logics)
, 1994
"... MultiLanguage systems (ML systems) are formal systems allowing the use of multiple distinct logical languages. In this paper we introduce a class of ML systems which use a hierarchy of first order languages, each language containing names for the language below, and propose them as an alternative to ..."
Abstract

Cited by 178 (47 self)
 Add to MetaCart
Multi-Language systems (ML systems) are formal systems allowing the use of multiple distinct logical languages. In this paper we introduce a class of ML systems which use a hierarchy of first-order languages, each language containing names for the language below, and propose them as an alternative to modal logics. The motivations for our proposal are technical, epistemological and implementational. From a technical point of view, we prove, among other things, that the set of theorems of the most common modal logics can be embedded (under the obvious bijective mapping between a modal and a first-order language) into that of the corresponding ML systems. Moreover, we show that ML systems have properties that do not hold for modal logics, and argue that these properties are justified by our intuitions. This claim is motivated by the study of how ML systems can be used in the representation of beliefs (more generally, propositional attitudes) and of provability, two areas where modal logics have been extensively used. Finally, from an implementational point of view, we argue that ML systems closely resemble current practice in the computer representation of propositional attitudes and in metatheoretic theorem proving.
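The flavour of the embedding the abstract mentions can be sketched schematically: each modal formula is mapped to a first-order formula in which a predicate applied to a *name* of a formula replaces the box. The sketch below is illustrative, not the paper's actual notation; the predicate symbol T and the corner quotes for naming are assumptions.

```latex
% Bijective mapping (.)* from a modal language into a first-order
% language with names of object-level formulas; T is a unary
% predicate read as "is believed" / "is provable" (illustrative).
(p)^{*} = p \qquad
(A \to B)^{*} = A^{*} \to B^{*} \qquad
(\Box A)^{*} = T(\ulcorner A^{*} \urcorner)

% Under this mapping the modal K axiom becomes a first-order schema:
\Box (A \to B) \to (\Box A \to \Box B)
\quad\leadsto\quad
T(\ulcorner A \to B \urcorner) \to
  \bigl( T(\ulcorner A \urcorner) \to T(\ulcorner B \urcorner) \bigr)
```

Because the names live one level down in the hierarchy, each language can speak about the language below it without modal operators.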
Contextual Reasoning
 EPISTEMOLOGIA, SPECIAL ISSUE ON I LINGUAGGI E LE MACCHINE
, 1992
"... It is widely agreed on that most cognitive processes are contextual in the sense that they depend on the environment, or context, inside which they are carried on. Even concentrating on the issue of contextuality in reasoning, many different notions of context can be found in the Artificial Intel ..."
Abstract

Cited by 73 (5 self)
 Add to MetaCart
It is widely agreed that most cognitive processes are contextual, in the sense that they depend on the environment, or context, within which they are carried out. Even concentrating on the issue of contextuality in reasoning, many different notions of context can be found in the Artificial Intelligence literature. Our intuition is that reasoning is usually performed on a subset of the global knowledge base. The notion of context is used as a means of formalizing this idea of localization. Roughly speaking, we take a context to be the set of facts used locally to prove a given goal, plus the inference routines used to reason about them (which in general differ for different sets of facts). Our perspective is similar to that proposed in [McC87, McC91]. The goal of this paper is to propose an epistemologically adequate theory of reasoning with contexts. The emphasis is on motivations and intuitions, rather than on technicalities. The two basic definitions are reported i...
Intuitionistic Model Constructions and Normalization Proofs
, 1998
"... We investigate semantical normalization proofs for typed combinatory logic and weak calculus. One builds a model and a function `quote' which inverts the interpretation function. A normalization function is then obtained by composing quote with the interpretation function. Our models are just like ..."
Abstract

Cited by 44 (7 self)
 Add to MetaCart
We investigate semantical normalization proofs for typed combinatory logic and the weak λ-calculus. One builds a model and a function `quote' which inverts the interpretation function. A normalization function is then obtained by composing quote with the interpretation function. Our models are just like the intended model, except that the function space includes a syntactic component as well as a semantic one. We call this a `glued' model because of its similarity with the glueing construction in category theory. Other basic type constructors are interpreted as in the intended model. In this way we can also treat inductively defined types such as natural numbers and Brouwer ordinals. We also discuss how to formalize terms, and show how one model construction can be used to yield normalization proofs for two different typed calculi: one with explicit and one with implicit substitution. The proofs are formalized using Martin-Löf's type theory as a meta-language and mechanized using the A...
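The quote/eval construction described above can be illustrated with a toy normalization-by-evaluation sketch for the untyped λ-calculus (the paper itself treats typed combinatory logic and formalizes the proofs in type theory; the names `eval_`, `quote` and `normalize` here are illustrative):

```python
# Terms: ('var', x) | ('lam', x, body) | ('app', f, a)

def eval_(t, env):
    """Interpret a term into a 'glued' semantic domain: closures for
    functions, neutral terms for stuck variables and applications."""
    tag = t[0]
    if tag == 'var':
        return env[t[1]]
    if tag == 'lam':
        return ('clo', lambda v: eval_(t[2], {**env, t[1]: v}))
    if tag == 'app':
        return apply_(eval_(t[1], env), eval_(t[2], env))

def apply_(f, a):
    if f[0] == 'clo':
        return f[1](a)
    return ('napp', f, a)          # neutral: application of a stuck term

def quote(v, n=0):
    """Invert the interpretation function: read a semantic value back
    into a normal-form term, inventing fresh variables x0, x1, ..."""
    if v[0] == 'clo':
        x = f'x{n}'
        return ('lam', x, quote(v[1](('nvar', x)), n + 1))
    if v[0] == 'nvar':
        return ('var', v[1])
    if v[0] == 'napp':
        return ('app', quote(v[1], n), quote(v[2], n))

def normalize(t):
    """Normalization = quote composed with evaluation, as in the abstract."""
    return quote(eval_(t, {}))
```

For example, `normalize` applied to λx.((λy.y) x) yields the identity in normal form, ('lam', 'x0', ('var', 'x0')).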
(ML)²: A formal language for KADS models of expertise
, 1993
"... This paper reports on an investigation into a formal language for specifying kads models of expertise. After arguing the need for and the use of such formal representations, we discuss each of the layers of a kads model of expertise in the subsequent sections, and define the formal constructions tha ..."
Abstract

Cited by 35 (9 self)
 Add to MetaCart
This paper reports on an investigation into a formal language for specifying KADS models of expertise. After arguing the need for and the use of such formal representations, we discuss each of the layers of a KADS model of expertise in the subsequent sections, and define the formal constructions that we use to represent the KADS entities at every layer: order-sorted logic at the domain layer, meta-logic at the inference layer, and dynamic logic at the task layer. All these constructions together make up (ML)², the language that we use to represent models of expertise. We illustrate the use of (ML)² in a small example model. We conclude by describing our experience to date with constructing such formal models in (ML)², and by discussing some open problems that remain for future work.

1 Introduction

One of the central concerns of "knowledge engineering" is the construction of a model of some problem solving behaviour. This model should eventually lead to the construction of a...
A NATURAL AXIOMATIZATION OF COMPUTABILITY AND PROOF OF CHURCH’S THESIS
"... Abstract. Church’s Thesis asserts that the only numeric functions that can be calculated by effective means are the recursive ones, which are the same, extensionally, as the Turingcomputable numeric functions. The Abstract State Machine Theorem states that every classical algorithm is behaviorally e ..."
Abstract

Cited by 21 (10 self)
 Add to MetaCart
Church’s Thesis asserts that the only numeric functions that can be calculated by effective means are the recursive ones, which are the same, extensionally, as the Turing-computable numeric functions. The Abstract State Machine Theorem states that every classical algorithm is behaviorally equivalent to an abstract state machine. This theorem presupposes three natural postulates about algorithmic computation. Here, we show that augmenting those postulates with an additional requirement regarding basic operations gives a natural axiomatization of computability and a proof of Church’s Thesis, as Gödel and others suggested may be possible. In a similar way, but with a different set of basic operations, one can prove Turing’s Thesis, characterizing the effective string functions, and, in particular, the effectively computable functions on string representations of numbers.
A formal method for the abstract specification of software
 J. ACM
, 1984
"... An intuitive presentation of the trace method for the abstract specification of software contains sample specifications, syntactic and semantic definitions of consistency and totalness, methods for proving specifications consistent and total, and a comparison of the method with the algebraic approac ..."
Abstract

Cited by 16 (0 self)
 Add to MetaCart
An intuitive presentation of the trace method for the abstract specification of software contains sample specifications, syntactic and semantic definitions of consistency and totalness, methods for proving specifications consistent and total, and a comparison of the method with the algebraic approach to specification. This intuitive presentation is underpinned by a formal syntax, semantics, and derivation system for the method. Completeness and soundness theorems establish the correctness of the derivation system vis-à-vis the semantics, the coextensiveness of the syntactic definitions of consistency and totalness with their semantic counterparts, and the correctness of the proof methods presented. Areas for future research are discussed.
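To give the flavour of a trace-style specification, here is a schematic sketch for a stack module (the notation, with L(T) asserting that trace T is legal and V(T) denoting the value returned by T's last call, is a common presentation of the trace method; the specific assertions below are illustrative, not taken from the paper):

```latex
% Illustrative trace assertions for a stack (notation schematic):
L(T) \;\Rightarrow\; L(T.\mathit{push}(a))
  % pushing is legal after any legal trace
L(T) \;\Rightarrow\; V(T.\mathit{push}(a).\mathit{top}) = a
  % top returns the most recently pushed value
L(T) \;\Rightarrow\; T.\mathit{push}(a).\mathit{pop} \equiv T
  % pop cancels the preceding push (trace equivalence)
```

Consistency then amounts to the assertions not forcing contradictory values for any legal trace, and totalness to every legal trace having its observable values determined.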
A Foundation for Metareasoning, Part I: The Proof Theory
, 1997
"... We propose a framework, called OM pairs, for the formalization of metareasoning. OM pairs allow us to generate deductively the object theory and/or the meta theory. This is done by imposing, via appropriate reflection rules, the relation we want to hold between the object theory and the meta theory. ..."
Abstract

Cited by 13 (5 self)
 Add to MetaCart
We propose a framework, called OM pairs, for the formalization of metareasoning. OM pairs allow us to generate deductively the object theory and/or the meta theory. This is done by imposing, via appropriate reflection rules, the relation we want to hold between the object theory and the meta theory. In this paper we concentrate on the proof theory of OM pairs, studying them from three different points of view: we compare the strength of the object and meta theories generated by different OM pairs; for each OM pair we characterize the precise form of the object theory and meta theory; and, finally, we analyze three important case studies.
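Reflection rules of the kind the abstract refers to can be written schematically as follows (the notation is illustrative: T is a meta-level provability predicate, ⌜A⌝ a name of the object-level formula A, and the subscripts O and M mark derivability in the object and meta theory):

```latex
\frac{\vdash_{O} A}{\vdash_{M} T(\ulcorner A \urcorner)}\;(R_{\mathit{up}})
\qquad\qquad
\frac{\vdash_{M} T(\ulcorner A \urcorner)}{\vdash_{O} A}\;(R_{\mathit{dn}})
```

Different OM pairs then arise from the choice of which rules to impose, and possibly how to restrict them, which is what makes the relation between the two theories a design parameter rather than a fixed feature.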
Plan Formation and Execution in a Uniform Architecture of Declarative Metatheories
 Proc. Workshop on Meta-Programming in Logic
, 1990
"... We show how explicit control strategies can be represented in a declarative (classical) metatheory as first order formulae (proof plans). Proof plans can be reasoned about (by metatheoretic theorem proving) to modify the search strategy and "executed" (by suitably "interpreting" them in terms of ..."
Abstract

Cited by 11 (9 self)
 Add to MetaCart
We show how explicit control strategies can be represented in a declarative (classical) metatheory as first-order formulae (proof plans). Proof plans can be reasoned about (by metatheoretic theorem proving) to modify the search strategy, and "executed" (by suitably "interpreting" them in terms of the deductive machinery implementation code) to prove a theorem in the object theory. The resulting architecture is uniform, as it becomes possible to define a tower of metatheories, each using the same deductive machinery, each (but the lowest) being able to represent proof plans with formulae of the same shape. Plan formation at one level can be obtained by plan execution one level up. The realization of these ideas in the GETFOL system is briefly described via the implementation of a simplified version of the Boyer and Moore theorem prover.

1 Introduction

The idea of using metatheories in theorem proving has been extensively studied in the past; a non-exhaustive list is [DS79, Wey8...
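The idea of "executing" a proof plan by interpreting a declarative term over the deductive machinery can be sketched as follows. This is an illustrative reconstruction, not GETFOL code: the plan constructors `then` and `repeat`, the tactic names, and the encoding of goals as strings are all assumptions made for the sketch.

```python
# A proof plan is a first-order-style term: either the name of a basic
# inference routine, or a control constructor applied to sub-plans.
# "Executing" the plan interprets it over a table of basic tactics,
# each mapping a goal to a list of subgoals ([] means proved).

def execute(plan, goal, tactics):
    if isinstance(plan, str):                # a basic inference routine
        return tactics[plan](goal)
    op, *args = plan
    if op == 'then':                         # run sub-plans in sequence
        goals = [goal]
        for sub in args:
            goals = [g2 for g in goals for g2 in execute(sub, g, tactics)]
        return goals
    if op == 'repeat':                       # apply until no progress
        goals = [goal]
        while True:
            new = [g2 for g in goals for g2 in execute(args[0], g, tactics)]
            if new == goals:
                return goals
            goals = new
    raise ValueError(f'unknown plan constructor {op!r}')

# Hypothetical tactics over goals encoded as strings:
tactics = {
    'strip_and': lambda g: g.split(' & ') if ' & ' in g else [g],
    'axiom':     lambda g: [] if g in ('p', 'q') else [g],
}
plan = ('then', 'strip_and', 'axiom')        # a proof plan as a term
```

Because the plan is itself a term, a metatheory one level up can prove properties of it (plan formation) with the same machinery that this interpreter uses to run it (plan execution).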