Results 1–10 of 69
The Well-Founded Semantics Is the Principle of Inductive Definition
 Logics in Artificial Intelligence
, 1998
Abstract

Cited by 44 (26 self)
Existing formalisations of (transfinite) inductive definitions in constructive mathematics are reviewed and strong correspondences with LP under least model and perfect model semantics become apparent. I point to fundamental restrictions of these existing formalisations and argue that the well-founded semantics (wfs) overcomes these problems and hence provides a superior formalisation of the principle of inductive definition. The contribution of this study for LP is that it (re)introduces the knowledge-theoretic interpretation of LP as a logic for representing definitional knowledge. I point to fundamental differences between this knowledge-theoretic interpretation of LP and the more commonly known interpretations of LP as default theories or autoepistemic theories. The relevance is that differences in knowledge-theoretic interpretation have a strong impact on knowledge representation methodology and on extensions of the LP formalism, for example for representing uncertainty. ...
Derivation of Data Intensive Algorithms by Formal Transformation: The Schorr-Waite Graph Marking Algorithm
, 1996
Abstract

Cited by 36 (25 self)
In this paper we consider a particular class of algorithms which present certain difficulties to formal verification. These are algorithms which use a single data structure for two or more purposes, which combine program control information with other data structures, or which are developed as a combination of a basic idea with an implementation technique. Our approach is based on applying proven semantics-preserving transformation rules in a wide spectrum language. Starting with a set-theoretical specification of "reachability" we are able to derive iterative and recursive graph marking algorithms using the "pointer switching" idea of Schorr and Waite. There have been several proofs of correctness of the Schorr-Waite algorithm, and a small number of transformational developments of the algorithm. The great advantage of our approach is that we can derive the algorithm from its specification using only general-purpose transformational rules, without the need for complicated induction arg...
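The "pointer switching" idea mentioned in the abstract can be sketched as follows. This is a minimal illustration of the classic Schorr-Waite marking scheme, not the paper's transformational derivation; the `Node` class and field names are hypothetical. Instead of a stack, the path back to the root is stored by temporarily reversing the child pointer that was followed, and every pointer is restored on retreat.

```python
class Node:
    def __init__(self, left=None, right=None):
        self.left = left        # child pointers (temporarily reversed during the walk)
        self.right = right
        self.marked = False     # mark bit: node has been reached
        self.flag = False       # False: descended via left; True: via right

def schorr_waite(root):
    """Mark every node reachable from root using O(1) extra space."""
    p, q = root, None           # p: current node, q: head of the reversed path
    if p is None:
        return
    p.marked = True
    while True:
        if p.left is not None and not p.left.marked:
            p.flag = False                     # remember we left via the left pointer
            p.left, q, p = q, p, p.left        # reverse left pointer and descend
            p.marked = True
        elif p.right is not None and not p.right.marked:
            p.flag = True                      # remember we left via the right pointer
            p.right, q, p = q, p, p.right      # reverse right pointer and descend
            p.marked = True
        else:
            # Retreat: unwind nodes whose right subtree is already finished.
            while q is not None and q.flag:
                q.right, p, q = p, q, q.right  # restore the right pointer
            if q is None:
                return                         # back above the root, all pointers restored
            q.left, p, q = p, q, q.left        # restore left; the loop tries right next
```

Because descent and retreat both run in constant extra space, this marking phase works even when no stack is available, which is why the algorithm matters for garbage collection; cycles are handled by the mark-bit checks.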
Computability and recursion
 BULL. SYMBOLIC LOGIC
, 1996
Abstract

Cited by 32 (0 self)
We consider the informal concept of “computability” or “effective calculability” and two of the formalisms commonly used to define it, “(Turing) computability” and “(general) recursiveness.” We consider their origin, exact technical definition, concepts, history, general English meanings, how they became fixed in their present roles, how they were first and are now used, their impact on nonspecialists, how their use will affect the future content of the subject of computability theory, and its connection to other related areas. After a careful historical and conceptual analysis of computability and recursion we make several recommendations in §7 about preserving the intensional differences between the concepts of “computability” and “recursion.” Specifically we recommend that: the term “recursive” should no longer carry the additional meaning of “computable” or “decidable”; functions defined using Turing machines, register machines, or their variants should be called “computable” rather than “recursive”; we should distinguish the intensional difference between Church’s Thesis and Turing’s Thesis, and use the latter particularly in dealing with mechanistic questions; and the name of the subject should be “Computability Theory,” or simply Computability, rather than ...
A logic of nonmonotone inductive definitions
 ACM transactions on computational logic
, 2007
Abstract

Cited by 28 (16 self)
Well-known principles of induction include monotone induction and different sorts of nonmonotone induction such as inflationary induction, induction over well-founded sets, and iterated induction. In this work, we define a logic formalizing induction over well-founded sets and monotone and iterated induction. Just as the principle of positive induction has been formalized in FO(LFP), and the principle of inflationary induction has been formalized in FO(IFP), this paper formalizes the principle of iterated induction in a new logic for Nonmonotone Inductive Definitions (ID-logic). The semantics of the logic is strongly influenced by the well-founded semantics of logic programming. This paper discusses the formalisation of different forms of (non)monotone induction by the well-founded semantics and illustrates the use of the logic for formalizing mathematical and commonsense knowledge. To model different types of induction found in mathematics, we define several subclasses of definitions, and show that they are correctly formalized by the well-founded semantics. We also present translations into classical first- or second-order logic. We develop modularity and totality results and demonstrate their use to analyze and simplify complex definitions. We illustrate the use of the logic for temporal reasoning. The logic formally extends Logic Programming, Abductive Logic Programming and Datalog, and thus formalizes the view of these formalisms as logics of (generalized) inductive definitions.
'Computing' as information compression by multiple alignment, unification and search
 Journal of Universal Computer Science
, 1999
Abstract

Cited by 28 (14 self)
This paper argues that the operations of a `Universal Turing Machine' (UTM) and equivalent mechanisms such as the `Post Canonical System' (PCS), which are widely accepted as definitions of the concept of `computing', may be interpreted as information compression by multiple alignment, unification and search (ICMAUS). The motivation for this interpretation is that it suggests ways in which the UTM/PCS model may be augmented in a proposed new computing system designed to exploit the ICMAUS principles as fully as possible. The provision of a relatively sophisticated search mechanism in the proposed `SP' system appears to open the door to the integration and simplification of a range of functions including unsupervised inductive learning, best-match pattern recognition and information retrieval, probabilistic reasoning, planning and problem solving, and others. Detailed consideration of how the ICMAUS principles may be applied to these functions is outside the scope of this article, but relevant sources are cited.
On the Power of Circular Splicing Systems and DNA Computability
 Proc. of IEEE Intern. Conf. on Evol. Comput. (ICEC'97)
, 1997
Abstract

Cited by 23 (5 self)
From a biological motivation of interactions between linear and circular DNA sequences, we propose a new type of splicing model called circular H systems and show that they have the same computational power as Turing machines. It is also shown that there effectively exists a universal circular H system which can simulate any circular H system with the same terminal alphabet, which strongly suggests a feasible design for a DNA computer based on circular splicing.

1 Introduction

Since Adleman's breathtaking paper on molecular (DNA) computing ([1]), there have already been quite a few papers on this challenging topic: [10] shows how to solve NP-complete problems using DNA, while [3] discusses a design method for simulating a Turing machine by molecular biological techniques and shows how to compute PSPACE, and [4] gives a methodology for breaking the DES using techniques in genetic engineering. In response to the rapid stream of experimental research on this new computation paradigm ...
Pigs from Sausages? Reengineering from Assembler to C via FermaT Transformations
 Science of Computer Programming, Special Issue on Program Transformation 52
, 2004
Abstract

Cited by 19 (5 self)
Software reengineering has been described as being "about as easy as reconstructing a pig from a sausage" [11]. But the development of program transformation theory, as embodied in the FermaT transformation system, has made this miraculous feat into a practical possibility. This paper describes the theory...
Heterogeneous Simulation: Mixing Discrete-Event Models with Dataflow
, 1996
Abstract

Cited by 17 (4 self)
This paper relates to system-level design of signal processing systems, which are often heterogeneous in implementation technologies and design styles. The heterogeneous approach, by combining small, specialized models of computation, achieves generality and also lends itself to automatic synthesis and formal verification. Key to the heterogeneous approach is to define interaction semantics that resolve the ambiguities when different models of computation are brought together. For this purpose, we introduce a tagged signal model as a formal framework within which the models of computation can be precisely described and unambiguously differentiated, and their interactions can be understood. In this paper, we will focus on the interaction between dataflow models, which have partially ordered events, and discrete-event models, with their notion of time that usually defines a total order of events. A variety of interaction semantics, mainly in handling the different notions of time in the two models, are explored to illustrate the subtleties involved. An implementation based on the Ptolemy system from U.C. Berkeley is described and critiqued.
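The contrast between partially and totally ordered events can be sketched as follows. This is a minimal illustration with hypothetical names, not the authors' formalism: in the tagged signal view an event is a (tag, value) pair, discrete-event signals carry totally ordered time stamps as tags, and dataflow signals only need the order given by per-channel firing indices. The time stamps a dataflow token receives when it crosses into a discrete-event domain are exactly the kind of choice an interaction semantics must pin down.

```python
from typing import NamedTuple, Union

class Event(NamedTuple):
    tag: Union[float, int]   # time stamp (discrete-event) or firing index (dataflow)
    value: object

def de_signal(timed_values):
    """Discrete-event signal: events totally ordered by their time-stamp tags."""
    return sorted((Event(t, v) for t, v in timed_values), key=lambda e: e.tag)

def dataflow_signal(values):
    """Dataflow signal: tags are just positions in the token stream."""
    return [Event(i, v) for i, v in enumerate(values)]

def to_discrete_event(df_events, timestamps):
    """One possible interaction semantics: entering a DE domain, each dataflow
    token must be assigned a time stamp (here supplied explicitly); choosing
    this assignment is the ambiguity the tagged signal model makes explicit."""
    return de_signal((t, e.value) for t, e in zip(timestamps, df_events))
```

Note that reordering can occur at the boundary: tokens that were consecutive in the dataflow stream may interleave arbitrarily once time stamps impose a total order.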
Experience with FS0 as a framework theory
, 1993
Abstract

Cited by 16 (4 self)
Feferman has proposed a system, FS0, as an alternative framework for encoding logics and also for reasoning about those encodings. We have implemented a version of this framework and performed experiments that show that it is practical. Specifically, we describe a formalisation of predicate calculus and the development of an admissible rule that manipulates formulae with bound variables. This application will be of interest to researchers working with frameworks that use mechanisms based on substitution in the lambda calculus to implement variable binding and substitution in the declared logic directly. We suggest that metatheoretic reasoning, even for a theory using bound variables, is not as difficult as is often supposed, and leads to more powerful ways of reasoning about the encoded theory.

§1 Introduction: why metamathematics?

A logical framework is a formal theory that is designed for the purpose of describing other formal theories in a uniform way, and for making the work ...