Results 1–10 of 16
A Calculus for Overloaded Functions with Subtyping

, 1992
Abstract

Cited by 141 (28 self)
We present a simple extension of typed λ-calculus where functions can be overloaded by putting different "branches of code" together. When the function is applied, the branch to execute is chosen according to a particular selection rule which depends on the type of the argument. The crucial feature of the present approach is that the branch selection depends on the "run-time type" of the argument, which may differ from its compile-time type because of the existence of a subtyping relation among types. Hence overloading cannot be eliminated by a static analysis of code, but is an essential feature to be dealt with during computation. We obtain in this way a type-dependent calculus, which differs from the various λ-calculi where types do not play any role during computation. We prove Confluence and a generalized Subject-Reduction theorem for this calculus. We prove Strong Normalization for a "stratified" sub-calculus. The definition of this calculus is guided by the understand...
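The branch-selection mechanism the abstract describes can be mimicked concretely. The sketch below (plain Python; all names hypothetical, not from the paper) represents an overloaded function as a list of (type, branch) pairs and selects the most specific branch matching the argument's run-time type, which may be a strict subtype of its declared type:

```python
class Overloaded:
    """An overloaded function: a list of (guard type, branch) pairs."""

    def __init__(self):
        self.branches = []

    def add(self, t, f):
        self.branches.append((t, f))
        return self

    def __call__(self, x):
        # Among the branches applicable to the *run-time* type of x,
        # pick the one with the most specific guard type (late binding).
        applicable = [(t, f) for t, f in self.branches if isinstance(x, t)]
        if not applicable:
            raise TypeError(f"no branch for {type(x).__name__}")
        t_best, f_best = applicable[0]
        for t, f in applicable[1:]:
            if issubclass(t, t_best):
                t_best, f_best = t, f
        return f_best(x)


class Shape: pass
class Circle(Shape): pass

describe = Overloaded()
describe.add(Shape, lambda s: "some shape")
describe.add(Circle, lambda s: "a circle")

x: Shape = Circle()   # compile-time type Shape, run-time type Circle
describe(x)           # -> "a circle": selection uses the run-time type
```

Because selection happens at application time, no static analysis of the call site could have resolved the branch: exactly the point the abstract makes.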
Equilogical Spaces
, 1998
Abstract

Cited by 31 (12 self)
It is well known that one can build models of full higher-order dependent type theory (also called the calculus of constructions) using partial equivalence relations (PERs) and assemblies over a partial combinatory algebra (PCA). But the idea of categories of PERs and ERs (total equivalence relations) can be applied to other structures as well. In particular, we can easily define the category of ERs and equivalence-preserving continuous mappings over the standard category Top0 of topological T0 spaces; we call these spaces (a topological space together with an ER) equilogical spaces and the resulting category Equ. We show that this category, in contradistinction to Top0, is a cartesian closed category. The direct proof outlined here uses the equivalence of the category Equ to the category PEqu of PERs over algebraic lattices (a full subcategory of Top0 that is well known to be cartesian closed from domain theory). In another paper with Carboni and Rosolini (cited herein) a more abstract categorical generalization shows why many such categories are cartesian closed. The category Equ obviously contains Top0 as a full subcategory, and it naturally contains many other well-known subcategories. In particular, we show why, as a consequence of work of Ershov, Berger, and others, the Kleene-Kreisel hierarchy of countable functionals of finite types can be naturally constructed in Equ from the natural numbers object N by repeated use in Equ of exponentiation and binary products. We also develop for Equ notions of modest sets (a category equivalent to Equ) and assemblies, to explain why a model of dependent type theory is obtained. We make some comparisons of this model to other, known models.
On functors expressible in the polymorphic typed lambda calculus
 Logical Foundations of Functional Programming
, 1990
Abstract

Cited by 16 (1 self)
This is a preprint of a paper that has been submitted to Information and Computation.
A Logic of Subtyping
, 1996
Abstract

Cited by 14 (4 self)
The relation of inclusion between types has been suggested by the practice of programming, as it enriches the polymorphism of functional languages. We propose a simple (and linear) calculus of sequents for subtyping as logical entailment. This allows us to derive a complete and coherent approach to subtyping from a few, logically meaningful, sequents. In particular, transitivity and antisymmetry will be derived from elementary logical principles, which stresses the power of sequents and Gentzen-style proof methods. Proof techniques based on cut-elimination will be at the core of our results.

1 Introduction

1.1 Motivations, Theories and Models

In recent years, several extensions of core functional languages have been proposed to deal with the notion of subtyping; see, for example, [CW85, Mit88, BL90, BCGS91, CMMS91, CG92, PS94, Tiu96, TU96]. These extensions were suggested by the practice of programming in computer science. In particular, they were inspired by the notion of inheritance...
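For concreteness, the kind of subtyping theory such a calculus axiomatizes — a base-type order plus the contravariant/covariant rule for function types — can be sketched as a small checker. This is an illustrative Python sketch, not the paper's sequent calculus; the base order Int ≤ Real is an assumed example. Note that transitivity holds without being taken as a primitive rule, the structural analogue of deriving it via cut-elimination:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Base:
    name: str

@dataclass(frozen=True)
class Arrow:
    dom: object
    cod: object

# Assumed base-type order (reflexive, with Int <= Real).
BASE_LEQ = {("Int", "Int"), ("Real", "Real"), ("Int", "Real")}

def subtype(a, b):
    """a <= b: arrows are contravariant in the domain, covariant in the codomain."""
    if isinstance(a, Base) and isinstance(b, Base):
        return (a.name, b.name) in BASE_LEQ
    if isinstance(a, Arrow) and isinstance(b, Arrow):
        return subtype(b.dom, a.dom) and subtype(a.cod, b.cod)
    return False

Int, Real = Base("Int"), Base("Real")

# Real -> Int  <=  Int -> Real: the domain shrinks, the codomain grows.
assert subtype(Arrow(Real, Int), Arrow(Int, Real))

# Transitivity is admissible rather than primitive:
a, b, c = Arrow(Real, Int), Arrow(Int, Int), Arrow(Int, Real)
assert subtype(a, b) and subtype(b, c) and subtype(a, c)
```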
Coherence and Transitivity of Subtyping as Entailment
, 1996
Abstract

Cited by 10 (3 self)
The relation of inclusion between types has been suggested by the practice of programming as it enriches the polymorphism of functional languages. We propose a simple (and linear) sequent calculus for subtyping as logical entailment. This allows us to derive a complete and coherent approach to subtyping from a few, logically meaningful sequents. In particular, transitivity and antisymmetry will be derived from elementary logical principles.
Type Theory via Exact Categories (Extended Abstract)
 In Proceedings of the 13th Annual IEEE Symposium on Logic in Computer Science LICS '98
, 1998
Abstract

Cited by 7 (0 self)
Partial equivalence relations (and categories of these) are a standard tool in the semantics of type theories and programming languages, since they often provide a cartesian closed category with extended definability. Using the theory of exact categories, we give a category-theoretic explanation of why the construction of a category of partial equivalence relations often produces a cartesian closed category. We show how several familiar examples of categories of partial equivalence relations fit into the general framework.

1 Introduction

Partial equivalence relations (and categories of these) are a standard tool in the semantics of programming languages, see e.g. [2, 5, 7, 9, 15, 17, 20, 22, 35] and [6, 29] for extensive surveys. They are usefully applied to give proofs of correctness and adequacy, since they often provide a cartesian closed category with additional properties. Take for instance a partial equivalence relation on the set of natural numbers: a binary relation R ⊆ N × N on th...
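A minimal concrete instance of the definition the abstract starts from (a Python sketch; the particular relation is chosen purely for illustration): the relation on a finite initial segment of N that relates two numbers exactly when both are even is symmetric and transitive but not reflexive, hence a PER whose domain is the even numbers:

```python
def is_per(R):
    """Check that a finite binary relation R (a set of pairs) is a partial
    equivalence relation: symmetric and transitive; reflexivity not required."""
    symmetric = all((b, a) in R for (a, b) in R)
    transitive = all((a, d) in R
                     for (a, b) in R for (c, d) in R if b == c)
    return symmetric and transitive

N = range(8)
R = {(m, n) for m in N for n in N if m % 2 == 0 and n % 2 == 0}

assert is_per(R)
assert (1, 1) not in R   # 1 is outside the domain: a PER, not a total ER
```

The elements related to themselves form the domain of the PER; the equivalence classes on that domain are what a PER model interprets as the values of a type.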
An Introduction to Polymorphic Lambda Calculus
 Logical Foundations of Functional Programming
, 1994
Abstract

Cited by 4 (0 self)
Introduction to the Polymorphic Lambda Calculus. John C. Reynolds, Carnegie Mellon University, December 23, 1994. The polymorphic (or second-order) typed lambda calculus was invented by Jean-Yves Girard in 1971 [11, 10], and independently reinvented by myself in 1974 [24]. It is extraordinary that essentially the same programming language was formulated independently by the two of us, especially since we were led to the language by entirely different motivations. In my own case, I was seeking to extend conventional typed programming languages to permit the definition of "polymorphic" procedures that could accept arguments of a variety of types. I started with the ordinary typed lambda calculus and added the ability to pass types as parameters (an idea that was "in the air" at the time, e.g. [4]). For example, as in the ordinary typed lambda calculus one can write λf int→int. λx int. f(f(x)) to denote the "doubling" function for the type int, which accepts a function from integers
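The doubling example transfers directly to any language with parametric polymorphism, where the type parameter that the polymorphic lambda calculus passes explicitly is abstracted by a type variable. A small Python sketch using typing generics (the name `double` is ours, not Reynolds's):

```python
from typing import Callable, TypeVar

T = TypeVar("T")

def double(f: Callable[[T], T], x: T) -> T:
    """The doubling function, abstracted over the type T: apply f twice."""
    return f(f(x))

# Two instantiations of the single polymorphic definition:
assert double(lambda n: n + 1, 0) == 2            # T = int
assert double(lambda s: s + "!", "hi") == "hi!!"  # T = str
```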
A Generic Normalisation Proof for Pure Type Systems
, 1996
Abstract

Cited by 4 (0 self)
We prove strong normalisation for any PTS, provided a certain set A*(s) exists for every sort s of the system. The properties verified by the A*(s)'s depend on the axioms and rules of the type system.

1 Introduction

1.1 Brief History

This work is an attempt to deal with the structure of complex Type Theories. Historically, once Girard had transposed the Burali-Forti paradox to type theory, Martin-Löf replied by suppressing the guilty Type : Type rule and remedied the resulting loss of expressiveness by introducing a new concept of stratified universes [10]. Today this notion can be found, in different forms and variants, in most Type Theories, especially the ones with foundational ambitions. For example, it appears in the theories used in actually implemented proof-checkers (NuPRL, Coq, Lego, ...). The main idea is that all types are no longer equal. Each one inhabits a certain universe (Martin-Löf) or sort (Pure Type Systems). In general, universes are emb...
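The stratification the abstract mentions can be seen concretely in a modern proof assistant; a minimal Lean 4 sketch (illustrative, not from the paper):

```lean
-- Universes are stratified: each `Type u` inhabits `Type (u + 1)`,
-- so the inconsistent rule `Type : Type` is never derivable.
#check (Type : Type 1)        -- Type lives in Type 1
#check (Type 1 : Type 2)      -- Type 1 lives in Type 2

universe u
#check (Type u : Type (u + 1))
```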
A Simple Model Construction for the Calculus of Constructions
 Types for Proofs and Programs, International Workshop TYPES'95
, 1996
Abstract

Cited by 3 (0 self)
We present a model construction for the Calculus of Constructions (CC) where all dependencies are carried out in a set-theoretical setting. The Soundness Theorem is proved, and as a consequence of it Strong Normalization for CC is obtained. Some other applications of our model constructions are: showing that CC + Classical Logic is consistent (by constructing a model for it) and showing that the Axiom of Choice is not derivable in CC (by constructing a model in which the type that represents the Axiom of Choice is empty).

1 Introduction

In the literature there are many investigations of the semantics of polymorphic λ-calculi with dependent types (see for example [12, 11, 10, 1, 5, 13]). Most of the existing models present a semantics for systems in which the inhabitants of the impredicative universe (types) are "lifted" to inhabitants of the predicative universe (kinds) (see [16]). Such systems are conveniently modeled by locally Cartesian-closed categories having small Cartesia...
Carnap's remarks on Impredicative Definitions and the Genericity Theorem
 IN LOGIC, METHODOLOGY AND PHILOSOPHY OF SCIENCE: LOGIC IN FLORENCE
, 1997
Abstract

Cited by 1 (1 self)
In a short but relevant paper [Car31], Rudolf Carnap summarizes the logicist foundation of mathematics, largely following Frege and Russell's view. Carnap moves away, though, from Russell's approach on a crucial aspect: a detailed justification of impredicative definitions (a formal version of Russell's "vicious circle"), which he accepts. In this note we revisit Carnap's justification of impredicativity, within the frame of impredicative Type Theory. More precisely, we recall the treatment of impredicativity given in Girard's System F and justify it by reference to a recent result, the Genericity Theorem in [LMS93], which may help to set Carnap's informal remark on mathematical grounds. We then discuss the logical complexity of (the proof of) that theorem. Finally, the role of the Genericity Theorem in understanding the surprising "uniformities" of the consistency proof of Arithmetic, via System F, is hinted at.

The problem

A definition is said to be impredicative if it defines a concep...