Results 1–10 of 29
Metatheory and Reflection in Theorem Proving: A Survey and Critique
, 1995
"... One way to ensure correctness of the inference performed by computer theorem provers is to force all proofs to be done step by step in a simple, more or less traditional, deductive system. Using techniques pioneered in Edinburgh LCF, this can be made palatable. However, some believe such an appro ..."
Abstract

Cited by 59 (2 self)
One way to ensure correctness of the inference performed by computer theorem provers is to force all proofs to be done step by step in a simple, more or less traditional, deductive system. Using techniques pioneered in Edinburgh LCF, this can be made palatable. However, some believe such an approach will never be efficient enough for large, complex proofs. One alternative, commonly called reflection, is to analyze proofs using a second layer of logic, a metalogic, and so justify abbreviating or simplifying proofs, making the kinds of shortcuts humans often do or appealing to specialized decision algorithms. In this paper we contrast the fully-expansive LCF approach with the use of reflection. We put forward arguments to suggest that the inadequacy of the LCF approach has not been adequately demonstrated, and neither has the practical utility of reflection (notwithstanding its undoubted intellectual interest). The LCF system with which we are most concerned is the HOL proof ...
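The fully-expansive LCF discipline described in this abstract can be sketched in miniature: theorem objects are constructible only through a small trusted kernel of inference rules, so derived tools are correct by construction. The rules and the token mechanism below are a hypothetical toy, not HOL's actual kernel.

```python
# Toy LCF-style kernel (hypothetical): only the kernel rules below can
# build Thm objects, so any derived tool must expand into primitive steps.

class Thm:
    """A theorem; constructible only via the kernel rules below."""
    def __init__(self, concl, _token=None):
        if _token is not _KERNEL:
            raise ValueError("theorems may only be built by kernel rules")
        self.concl = concl

_KERNEL = object()  # private capability held by the trusted rules

def axiom_refl(term):
    # Primitive rule: |- term = term
    return Thm(("=", term, term), _token=_KERNEL)

def rule_sym(thm):
    # Primitive rule: from |- a = b infer |- b = a
    op, a, b = thm.concl
    assert op == "="
    return Thm(("=", b, a), _token=_KERNEL)

# A derived "tactic" is ordinary code that calls kernel rules, so its
# results are fully expanded into primitive inferences by construction.
def derived_double_sym(term):
    return rule_sym(rule_sym(axiom_refl(term)))
```

Attempting to forge a theorem directly (e.g. `Thm(("=", "x", "y"))`) raises an error, which is the whole point of the discipline.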
Rough sets: some extensions
 Information Sciences 177
, 2007
"... This article was originally published in a journal published by Elsevier, and the attached copy is provided by Elsevier for the author’s benefit and for the benefit of the author’s institution, for noncommercial research and educational use including without limitation use in instruction at your in ..."
Abstract

Cited by 47 (6 self)
This article was originally published in a journal published by Elsevier, and the attached copy is provided by Elsevier for the author’s benefit and for the benefit of the author’s institution, for noncommercial research and educational use including without limitation use in instruction at your institution, sending it to specific colleagues that you know, and providing a copy to your institution’s administrator. All other uses, reproduction and distribution, including without limitation commercial reprints, selling or licensing copies or access, or posting on open internet sites, your personal or institution’s website or repository, are prohibited. For exceptions, permission may be sought for such use through Elsevier’s permissions site at:
A Relation-Algebraic Approach to the Region Connection Calculus
 Fundamenta Informaticae
, 2001
"... We explore the relationalgebraic aspects of the region connection calculus (RCC) of Randell et al. (1992a). In particular, we present a refinement of the RCC8 table which shows that the axioms provide for more relations than are listed in the present table. We also show that each RCC model leads ..."
Abstract

Cited by 21 (0 self)
We explore the relation-algebraic aspects of the region connection calculus (RCC) of Randell et al. (1992a). In particular, we present a refinement of the RCC8 table which shows that the axioms provide for more relations than are listed in the present table. We also show that each RCC model leads to a Boolean algebra. Finally, we prove that a refined version of the RCC5 table has as models all atomless Boolean algebras B with the natural ordering as the "part-of" relation, and that the table is closed under first order definable relations iff B is homogeneous.

1 Introduction

Qualitative reasoning (QR) has its origins in the exploration of properties of physical systems when numerical information is not sufficient, or not present, to explain the situation at hand (Weld and Kleer, 1990). Furthermore, it is a tool to represent the abstractions of researchers who are constructing numerical systems which model the physical world. Thus, it fills a gap in data modeling which often l...
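The five RCC5 base relations mentioned in this abstract can be illustrated over finite sets, reading "part-of" as subset. This is an assumption for illustration only: the paper's models are atomless Boolean algebras, which have no finite instances.

```python
# Illustrative sketch of the five RCC5 base relations over finite point
# sets (assumption: "part of" read as subset; the paper's actual models
# are atomless Boolean algebras).

def rcc5(a, b):
    a, b = set(a), set(b)
    if a == b:
        return "EQ"    # equal regions
    if not (a & b):
        return "DR"    # discrete: no common part
    if a < b:
        return "PP"    # proper part
    if b < a:
        return "PPi"   # proper part, inverse
    return "PO"        # partial overlap
```

For any two non-empty regions exactly one of the five relations holds, which is what makes them a base for the relation algebra.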
Chinese Number-Names, Tree Adjoining Languages, and Mild Context-Sensitivity
 COMPUTATIONAL LINGUISTICS
, 1991
"... ... this paper that the numbername system of Chinese is generated neither by this formalism nor by any other equivalent or weaker ones, suggesting that such a task might require the use of the more powerful Indexed Grammar formalism. Given that our formal results apply only to a proper subset of Ch ..."
Abstract

Cited by 18 (0 self)
... this paper that the number-name system of Chinese is generated neither by this formalism nor by any other equivalent or weaker ones, suggesting that such a task might require the use of the more powerful Indexed Grammar formalism. Given that our formal results apply only to a proper subset of Chinese, we extensively discuss the issue of whether they have any implications for the whole of that natural language. We conclude that our results bear directly either on the syntax of Chinese or on the interface between Chinese and the cognitive component responsible for arithmetic reasoning. Consequently, either Tree Adjoining Grammars, as currently defined, fail to generate the class of natural languages in a way that discriminates between linguistically warranted sublanguages, or formalisms with generative power equivalent to Tree Adjoining Grammar cannot serve as a basis for the interface between the human linguistic and mathematical faculties.
Complexity in Left-Associative Grammar
, 1992
"... This paper presents a mathematical definition of LeftAssociative Grammar, and describes its formal properties. 1 Conceptually, LAgrammar is based on the notion of possible continuations, in contrast to more traditional systems such as Phrase Structure Grammar and Categorial Grammar, which are li ..."
Abstract

Cited by 13 (4 self)
This paper presents a mathematical definition of Left-Associative Grammar, and describes its formal properties. Conceptually, LA-grammar is based on the notion of possible continuations, in contrast to more traditional systems such as Phrase Structure Grammar and Categorial Grammar, which are linguistically motivated in terms of possible substitutions. It is shown that LA-grammar generates all and only the recursive languages. The Chomsky hierarchy of regular, context-free, and context-sensitive languages is reconstructed in LA-grammar by simulating finite state automata, pushdown automata, and linearly bounded automata, respectively. Using alternative restrictions on LA-grammars, the new language hierarchy of A-LAGs, B-LAGs, C-LAGs is proposed. The class of C-LAGs is divided into three subclasses representing different degrees of ambiguity and associated computational complexity. The class of C-LAGs without recursive ambiguities (called the C1-LAGs) parses in linear time, and incl...
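The "possible continuations" idea can be sketched as a machine that attaches one word at a time, strictly left to right, with the current state determining which next categories may continue the sentence. The toy below simulates a finite state automaton, illustrating the abstract's point that FSAs embed directly; the category names and rules are hypothetical, not Hausser's actual definition.

```python
# Toy left-associative recognizer (hypothetical rules): each state maps
# a permissible next category to the successor state, so parsing is a
# sequence of single attachments rather than tree substitutions.

CONTINUATIONS = {
    "start":    {"det": "np_open", "noun": "np_done"},
    "np_open":  {"noun": "np_done"},
    "np_done":  {"verb": "vp_open"},
    "vp_open":  {"det": "obj_open", "noun": "done"},
    "obj_open": {"noun": "done"},
}

def accepts(categories):
    state = "start"
    for cat in categories:
        nxt = CONTINUATIONS.get(state, {})
        if cat not in nxt:
            return False  # no possible continuation for this word
        state = nxt[cat]
    return state == "done"
```

Because each step only consults the current state and the next word, recognition for such unambiguous rule sets runs in linear time, mirroring the C1-LAG result quoted above.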
On computing belief change operations using quantified boolean formulas
 Journal of Logic and Computation
, 2004
"... In this paper, we show how an approach to belief revision and belief contraction can be axiomatised by means of quantified Boolean formulas. Specifically, we consider the approach of belief change scenarios, a general framework that has been introduced for expressing different forms of belief change ..."
Abstract

Cited by 7 (3 self)
In this paper, we show how an approach to belief revision and belief contraction can be axiomatised by means of quantified Boolean formulas. Specifically, we consider the approach of belief change scenarios, a general framework that has been introduced for expressing different forms of belief change. The essential idea is that for a belief change scenario (K, R, C), the set of formulas K, representing the knowledge base, is modified so that the sets of formulas R and C are respectively true in, and consistent with, the result. By restricting the form of a belief change scenario, one obtains specific belief change operators including belief revision, contraction, update, and merging. For both the general approach and for specific operators, we give a quantified Boolean formula such that satisfying truth assignments to the free variables correspond to belief change extensions in the original approach. Hence, we reduce the problem of determining the results of a belief change operation to that of satisfiability. This approach has several benefits. First, it furnishes an axiomatic specification of belief change with respect to belief change scenarios. This then leads to further insight into the belief change framework.
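The satisfiability core that this reduction targets can be illustrated with a brute-force consistency check over clause sets, plus a naive, order-dependent base-revision loop. This is a toy stand-in for the paper's actual QBF encoding, not a reproduction of it.

```python
# Toy illustration: formulas are CNF clause sets over named atoms
# (literal "-p" negates "p"); consistency is checked by enumerating all
# truth assignments. Real encodings hand this job to a SAT/QBF solver.

from itertools import product

def atoms(clauses):
    return sorted({lit.lstrip("-") for clause in clauses for lit in clause})

def satisfied(clauses, assign):
    # a clause holds if some literal agrees with the assignment
    return all(
        any(assign[l.lstrip("-")] != l.startswith("-") for l in clause)
        for clause in clauses
    )

def consistent(clauses):
    names = atoms(clauses)
    return any(
        satisfied(clauses, dict(zip(names, vals)))
        for vals in product([True, False], repeat=len(names))
    )

def revise(K, R):
    # Naive base revision: keep R, then add each K-clause that stays
    # jointly consistent with what has been kept so far (order matters).
    result = list(R)
    for clause in K:
        if consistent(result + [clause]):
            result.append(clause)
    return result
```

For example, revising K = {p, q} by R = {¬p} discards p but retains q, the kind of outcome a belief change extension selects.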
On Ontology and Epistemology of Rough Location
 Spatial Information Theory  Cognitive and Computational Foundations of Geographic Information Science: Proc. COSIT'99, LNCS No. 1661
, 1999
"... . Spatial objects are located in regions of space. In this paper the notions of exact, part, and rough location are discussed. Exact location is the relation between an object and the region of space it occupies. The notion of part location characterizes relations between parts of objects and pa ..."
Abstract

Cited by 6 (0 self)
Spatial objects are located in regions of space. In this paper the notions of exact, part, and rough location are discussed. Exact location is the relation between an object and the region of space it occupies. The notion of part location characterizes relations between parts of objects and parts of regions of space. The notion of rough location characterizes the location of a spatial object within a set of regions which form a regional partition of space. It links parts of spatial objects to parts of partition elements. The relationships between rough location, vaguely defined spatial objects, and indeterminacy of location are discussed. Knowledge about location of spatial objects in physical reality is based on observation and measurement. This paper argues that the observation and measurement of location in physical reality yield knowledge about rough location rather than knowledge about exact location. The underlying regional partitions are created by the observati...
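Rough location as described in this abstract can be sketched with regions as finite point sets (a hypothetical representation chosen for illustration): an object's location relative to a regional partition is given by the cells it fully covers and the cells it merely meets, in the style of lower and upper rough-set approximations.

```python
# Sketch of rough location: `partition` maps cell names to point sets
# forming a regional partition; the object's region is located by the
# cells entirely inside it ("full") and those it only overlaps ("partial").

def rough_location(region, partition):
    region = set(region)
    full = {name for name, cell in partition.items()
            if set(cell) <= region}
    partial = {name for name, cell in partition.items()
               if (set(cell) & region) and not set(cell) <= region}
    return full, partial
```

An object whose exact region is unknown but whose full/partial cell sets are known is located only roughly, which is the paper's claim about what measurement actually delivers.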
Characterising equilibrium logic and nested logic programs: Reductions and complexity
, 2009
"... Equilibrium logic is an approach to nonmonotonic reasoning that extends the stablemodel and answerset semantics for logic programs. In particular, it includes the general case of nested logic programs, where arbitrary Boolean combinations are permitted in heads and bodies of rules, as special kind ..."
Abstract

Cited by 3 (2 self)
Equilibrium logic is an approach to nonmonotonic reasoning that extends the stable-model and answer-set semantics for logic programs. In particular, it includes the general case of nested logic programs, where arbitrary Boolean combinations are permitted in heads and bodies of rules, as special kinds of theories. In this paper, we present polynomial reductions of the main reasoning tasks associated with equilibrium logic and nested logic programs into quantified propositional logic, an extension of classical propositional logic where quantifications over atomic formulas are permitted. Thus, quantified propositional logic is a fragment of second-order logic, and its formulas are usually referred to as quantified Boolean formulas (QBFs). We provide reductions not only for decision problems, but also for the central semantical concepts of equilibrium logic and nested logic programs. In particular, our encodings map a given decision problem into some QBF such that the latter is valid precisely in case the former holds. The basic tasks we deal with here are the consistency problem, brave reasoning, and skeptical reasoning. Additionally, we also provide encodings for testing equivalence of theories or programs under different notions
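Quantified Boolean formulas of the kind such encodings produce can, for tiny instances, be evaluated by a recursive interpreter that tries both truth values at each quantifier. Real encodings are handed to QBF solvers, so the evaluator below (with its tuple-based formula syntax, an assumption of this sketch) is illustration only.

```python
# Toy QBF evaluator. Formulas are nested tuples: ("var", x), ("not", f),
# ("and", f, g), ("or", f, g), ("exists", x, f), ("forall", x, f).
# A closed formula is valid iff qbf_eval returns True.

def qbf_eval(f, env=None):
    env = env or {}
    tag = f[0]
    if tag == "var":
        return env[f[1]]
    if tag == "not":
        return not qbf_eval(f[1], env)
    if tag == "and":
        return qbf_eval(f[1], env) and qbf_eval(f[2], env)
    if tag == "or":
        return qbf_eval(f[1], env) or qbf_eval(f[2], env)
    if tag in ("exists", "forall"):
        # expand the quantifier over both truth values of its variable
        vals = [qbf_eval(f[2], {**env, f[1]: b}) for b in (True, False)]
        return any(vals) if tag == "exists" else all(vals)
    raise ValueError(f"unknown connective: {tag}")
```

Quantifier alternation is what gives QBFs their expressive power: forall-x exists-y (x XOR y) is valid, while exists-x forall-y (x XOR y) is not.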
Variable-Binders As Functors
 The Heritage of Kazimierz Ajdukiewicz, Amsterdam and Atlanta (GA): Rodopi
, 1995
"... this paper I argue that this is by no means a necessary course. I argue that a fairly general semantic framework can be developed where the only relevant distinction is indeed the functor/argument distinction, and where the only structural operation for generating expressions is functional applicati ..."
Abstract

Cited by 2 (2 self)
this paper I argue that this is by no means a necessary course. I argue that a fairly general semantic framework can be developed where the only relevant distinction is indeed the functor/argument distinction, and where the only structural operation for generating expressions is functional application, with no need to resort to functional abstraction as well. I shall not prove any general results to the effect that such a framework is universally applicable. However, the overall apparatus is illustrated in connection with some concrete examples, notably quantificational and full categorial languages, which should suffice to support my point.
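The functor/argument idea for quantification can be modelled over a finite domain: a quantifier is itself a functor that takes a predicate as its argument, so a quantified sentence arises by functional application alone, with no object-language abstraction. The domain and predicate below are hypothetical examples.

```python
# Sketch: quantifiers as functors over a hypothetical finite domain.
# Each quantifier maps a predicate (a one-place function) to a truth
# value, so "every x is even" is just the application forall(is_even).

DOMAIN = [0, 1, 2, 3]

def forall(pred):
    return all(pred(x) for x in DOMAIN)

def exists(pred):
    return any(pred(x) for x in DOMAIN)

def is_even(x):
    return x % 2 == 0
```

Nothing here binds a variable in the object language; the binder's work is done entirely by the host function that the quantifier-functor consumes.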