Results 11–20 of 87
Triune Continuum Paradigm: a paradigm for General System Modeling and its applications for UML and RM-ODP
, 2002
Abstract

Cited by 15 (3 self)
d previously: a single consistent formalization of the RM-ODP standard conceptual framework. This formalization presents a concrete example of a formal ontology for general system modeling. The paradigm is also applied to UML. This application allowed the presentation of theoretical foundations that are necessary for the understanding and definition of the UML metamodel. This thesis is useful to readers who are interested in the fundamentals of system analysis. It can be particularly interesting to UML semantics specialists, to RM-ODP experts, and to ontological engineers. ABRIDGED VERSION (translated from French): This thesis presents the structural organization, the theoretical foundations and the basic principles for the applications of the Triune Continuum Paradigm, an original paradigm applicable to object-oriented modeling. The paradigm defines a metamodeling structure applicable to general system modeling and in particular to object-oriented systems. This
Using Reflection to Explain and Enhance Type Theory
 Proof and Computation, volume 139 of NATO Advanced Study Institute, International Summer School held in Marktoberdorf, Germany, July 20–August 1, NATO Series F
, 1994
Abstract

Cited by 11 (5 self)
The five lectures at Marktoberdorf on which these notes are based were about the architecture of problem-solving environments that use theorem provers. Experience with these systems over the past two decades has shown that the prover must be extensible, yet it must be kept safe. We examine a way to safely add new decision procedures to the Nuprl prover. It relies on a reflection mechanism and is applicable to any tactic-oriented prover with sufficient reflection. The lectures explain reflection in the setting of constructive type theory, the core logic of Nuprl.
A Formal Foundation of the RM-ODP Conceptual Framework
, 2001
Abstract

Cited by 10 (6 self)
This paper presents an approach for formalizing the Reference Model for Open Distributed Processing (RM-ODP), an ISO and ITU standard for the modeling of distributed systems. The goals of this formalization are to clarify the RM-ODP modeling framework, thus making it more accessible to modelers such as system architects, designers and implementers, and to open the way for the formal verification of RM-ODP models (either within an ODP viewpoint or across multiple ODP viewpoints). RM-ODP officially declared as one of its goals the creation of a formal representation of Part 2: Foundations. The result of our work is a complete and truly consistent formal representation, until now nonexistent, of clauses 5, 6, 8 and 9 of Part 2 of RM-ODP in their interrelations.
Types in logic and mathematics before 1940
 Bulletin of Symbolic Logic
, 2002
Abstract

Cited by 10 (5 self)
Abstract. In this article, we study the prehistory of type theory up to 1910 and its development between Russell and Whitehead's Principia Mathematica ([71], 1910–1912) and Church's simply typed λ-calculus of 1940. We first argue that the concept of types has always been present in mathematics, though it was not explicitly incorporated as such before the end of the 19th century. Then we proceed by describing how the logical paradoxes entered the formal systems of Frege, Cantor and Peano, concentrating on Frege's Grundgesetze der Arithmetik, to which Russell applied his famous paradox, and how this led him to introduce the first theory of types, the Ramified Type Theory (rtt). We present rtt formally using the modern notation for type theory, and we discuss how Ramsey, Hilbert and Ackermann removed the orders from rtt, leading to the simple theory of types (stt). We present stt and Church's own simply typed λ-calculus (λ→C), and we finish by comparing rtt, stt and λ→C. §1. Introduction. Nowadays, type theory has many applications and is used in many different disciplines. Even within logic and mathematics, there are many different type systems. They serve several purposes, and are formulated in various ways. But, before 1903 when Russell first introduced
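The simply typed λ-calculus (λ→) that this abstract mentions can be made concrete with a tiny type checker; the term encoding below is an illustrative sketch of our own, not notation taken from the article:

```python
# Minimal type checker for Church's simply typed λ-calculus (λ→).
# Terms (illustrative encoding, not from the article):
#   ('var', name) | ('lam', name, argtype, body) | ('app', f, x)
# Types: 'o' (a base type) or ('->', t1, t2).

def typecheck(term, env=None):
    """Return the type of `term` under the typing environment `env`."""
    env = env or {}
    kind = term[0]
    if kind == 'var':
        return env[term[1]]            # variable: look up its declared type
    if kind == 'lam':
        _, name, argtype, body = term
        # abstraction: body is checked with the bound variable in scope
        return ('->', argtype, typecheck(body, {**env, name: argtype}))
    if kind == 'app':
        ftype = typecheck(term[1], env)
        # application: function type's domain must match the argument's type
        assert ftype[0] == '->' and ftype[1] == typecheck(term[2], env)
        return ftype[2]
    raise ValueError(f"unknown term kind: {kind}")

# Identity at base type o:  λx:o. x  has type  o → o
print(typecheck(('lam', 'x', 'o', ('var', 'x'))))  # → ('->', 'o', 'o')
```

Ill-typed applications (e.g. applying a base-type term as a function) fail the assertion, which is the whole content of simple typing: every well-typed term has exactly one type derivable from its structure.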
A Metamodel for the Unified Modeling Language
, 2002
Abstract

Cited by 9 (2 self)
Nowadays models, rather than code, are becoming the key artifacts of software development. Consequently, this raises the level of requirements for the modeling languages on which modeling practitioners rely in their work. A minor inconsistency in a modeling language metamodel may cause major problems in the language's applications; thus, with model-driven systems development, the soundness of modeling-language metamodels becomes particularly important. In its current state, the UML metamodel leaves significant room for improvement. We present an alternative metamodel that was inspired by the RM-ODP standard and that solves the problems of UML. RM-ODP was mentioned in the UML specifications as a framework that has already influenced UML.
On the Logic and Learning of Language
, 2002
Abstract

Cited by 7 (2 self)
algebra
  3.1.1 Homomorphisms and free generators
  3.1.2 Quotient algebras
  3.1.3 Reducts
3.2 Algebras of languages
  3.2.1 The algebra of formulae
  3.2.2 Substitutions
  3.2.3 Associated algebras
  3.2.4 Valuations
  3.2.5 Lindenbaum-Tarski quotient algebras
3.3 Algebras of deductive systems
  3.3.1 Determining a class of algebras
  3.3.2 Algebra of a sequent calculus
  3.3.3 Completeness
3.4 Subsuming special cases: an example
  3.4.1 The sequent system GL
  3.4.2 The equivalent system t(GL)
  3.4.3 Algebraic models for GL
3.5 Kripke semantics
4 Categorial type logics
4.1 The typed lambda calculus
4.2 Categorial grammar
4.3 Forms of Lambek's calculus
  4.3.1 Classical CG revisited
  4.3.2 The non-associative product-free system
  4.3.3 Addin...
Propositional, first-order and higher-order logics: Basic definitions, rules of inference, and examples
 Natural Language Processing and Knowledge Representation: Language for Knowledge and Knowledge for Language. Menlo Park, CA: AAAI Press/The
, 1995
Abstract

Cited by 5 (3 self)
Logic is the study of correct reasoning. It is not a particular KRR language. Thus, it is not proper to say "We are using (or not using) logic as our KRR language." There are, indeed, many different logics. For more details on logics, see (Haack, 1978), (McCawley, 1981), and the various articles on logic in (Shapiro, 1992), beginning with (Rapaport, 1992).

2 Requirements to Define a Logic

A logic consists of two parts, a language and a method of reasoning. The logical language, in turn, has two aspects, syntax and semantics. Thus, to specify or define a particular logic, one needs to specify three things: Syntax: the atomic symbols of the logical language, and the rules for constructing well-formed, non-atomic expressions (symbol structures) of the logic. Semantics: the meanings of the atomic symbols of the logic, and the rules for determining the meanings of non-atomic expressions of the logic. Syntactic inference method: the rules for determining a subset of logical expressions, called the theorems of the logic.

3 CarPool World

We will use CarPool World as a simple example. In CarPool World, Tom and Betty carpool to work. On any day, either Tom drives Betty or Betty drives Tom. In the former case, Tom is the driver and Betty is the passenger. In the latter case, Betty is the driver and Tom is the passenger.
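The CarPool World setup described above can be checked mechanically by brute-force model enumeration; the sketch below (atom names and constraint phrasing are our own, not the paper's) finds exactly the two intended situations:

```python
from itertools import product

# Propositional atoms of CarPool World (our own illustrative encoding).
ATOMS = ["Driver(Tom)", "Driver(Betty)", "Passenger(Tom)", "Passenger(Betty)"]

def models():
    """Enumerate truth assignments satisfying the CarPool World rules:
    exactly one person drives, and the other is the passenger."""
    for values in product([False, True], repeat=len(ATOMS)):
        v = dict(zip(ATOMS, values))
        tom_drives = (v["Driver(Tom)"] and v["Passenger(Betty)"]
                      and not v["Driver(Betty)"] and not v["Passenger(Tom)"])
        betty_drives = (v["Driver(Betty)"] and v["Passenger(Tom)"]
                        and not v["Driver(Tom)"] and not v["Passenger(Betty)"])
        if tom_drives or betty_drives:
            yield v

# Of the 16 possible assignments, only two are models:
# one where Tom drives and one where Betty drives.
print(len(list(models())))  # → 2
```

This is the semantic side of the syntax/semantics/inference split the abstract describes: the atoms are the syntax, the assignments are interpretations, and the rules of the domain pick out the admissible models.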
Naïve computational type theory
 Proof and SystemReliability, Proceedings of International Summer School Marktoberdorf, July 24 to August 5, 2001, volume 62 of NATO Science Series III
, 2002
Abstract

Cited by 5 (1 self)
The basic concepts of type theory are fundamental to computer science, logic and mathematics. Indeed, the language of type theory connects these regions of science. It plays a role in computing and information science akin to that of set theory in pure mathematics. There are many excellent accounts of the basic ideas of type theory, especially at the interface of computer science and logic — specifically, in the literature of programming languages, semantics, formal methods and automated reasoning. Most of these are very technical, dense with formulas, inference rules, and computation rules. Here we follow the example of the mathematician Paul Halmos, who in 1960 wrote a 104-page book called Naïve Set Theory intended to make the subject accessible to practicing mathematicians. His book served many generations well. This article follows the spirit of Halmos' book and introduces type theory without recourse to precise axioms and inference rules, and with a minimum of formalism. I start by paraphrasing the preface to Halmos' book. The sections of this article follow his chapters closely. Every computer scientist agrees that every computer scientist must know some type theory; the disagreement begins in trying to decide how much is some. This article contains my partial answer to that question. The purpose of the article is to tell the beginning student of advanced computer science the basic type-theoretic facts of life, and to do so with a minimum of philosophical discourse and logical formalism. The point throughout is that of a prospective computer scientist eager to study programming languages, or database systems, or computational complexity theory, or distributed systems or information discovery. In type theory, "naïve" and "formal" are contrasting words. The present treatment might best be described as informal type theory from a naïve point of view. The concepts are very general and very abstract; therefore they may
Information theory, evolutionary computation, and Dembski's "complex specified information", Synthese 128(2): 237–270
, 2011
Abstract

Cited by 5 (0 self)
Intelligent design advocate William Dembski has introduced a measure of information called "complex specified information", or CSI. He claims that CSI is a reliable marker of design by intelligent agents. He puts forth a "Law of Conservation of Information" which states that chance and natural laws are incapable of generating CSI. In particular, CSI cannot be generated by evolutionary computation. Dembski asserts that CSI is present in intelligent causes and in the flagellum of Escherichia coli, and concludes that neither has a natural explanation. In this paper we examine Dembski's claims, point out significant errors in his reasoning, and conclude that there is no reason to accept his assertions.
On the Study of Data Modelling Languages using Chisholm's Ontology
 Information Modelling and Knowledge Bases XIII. Kangassalo, H. et al. (Eds.). Amsterdam: IOS Press
, 2002
"... ..."