Results 11–20 of 22
An operational domain-theoretic treatment of recursive types
in: Twenty-Second Mathematical Foundations of Programming Semantics, 2006
Abstract

Cited by 2 (2 self)
We develop a domain theory for treating recursive types with respect to contextual equivalence. The principal approach taken here deviates from classical domain theory in that we do not produce the recursive types via the usual inverse limit constructions; we have them for free by working directly with the operational semantics. By extending type expressions to endofunctors on a ‘syntactic’ category, we establish algebraic compactness. To do this, we rely on an operational version of the minimal invariance property. In addition, we apply techniques developed herein to reason about FPC programs. Key words: operational domain theory, recursive types, FPC, realisable functor, algebraic compactness, generic approximation lemma, denotational semantics
A Formal Semantics for Isorecursive and Equirecursive State Abstractions
Abstract

Cited by 2 (0 self)
Most methodologies for static program verification support recursively-defined predicates in specifications, in order to reason about recursive data structures. Intuitively, a predicate instance represents the complete unrolling of its definition; this is the equirecursive interpretation. However, this semantics is unsuitable for static verification when the recursion becomes unbounded. For this reason, most static verifiers supporting recursive definitions employ explicit folding and unfolding of recursive definitions (specified using ghost commands, or inferred). Such a semantics differentiates between, e.g., a predicate instance and its corresponding body, while providing a facility to map between the two; this is the isorecursive semantics. While this latter interpretation is usually implemented in practice, only the equirecursive semantics is typically treated in theoretical work. In this paper we provide both an isorecursive and an equirecursive formal semantics for recursive definitions in the context of Chalice, a verification methodology based on implicit dynamic frames. We extend these assertion semantics to appropriate Hoare logics, and prove the soundness of our definitions. The development of such formalisations requires addressing several subtle issues, regarding both the possibility of infinitely-recursive definitions and the need for the isorecursive semantics to correctly reflect the restrictions that make it readily implementable. These questions are made more challenging still in the context of implicit dynamic frames, where the use of heap-dependent expressions provides further pitfalls for a correct formal treatment.
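The iso-/equi-recursive distinction the abstract draws can be seen in miniature in any language with nominal recursive types. A minimal sketch (our own example, not Chalice syntax): in Haskell, a recursive definition and its one-step unrolling are distinct types related by explicit fold/unfold maps, the analogue of the verifier's fold/unfold ghost commands; under the equirecursive reading the two types would simply be equal.

```haskell
-- Iso-recursive reading: List a and its unrolling Maybe (a, List a)
-- are distinct types, mediated by the explicit maps Fold and unfold.
newtype List a = Fold { unfold :: Maybe (a, List a) }

nil :: List a
nil = Fold Nothing

cons :: a -> List a -> List a
cons x xs = Fold (Just (x, xs))

-- Every recursive traversal must unfold explicitly, one step at a time,
-- mirroring the explicit unfolding used by the static verifiers above.
len :: List a -> Int
len xs = case unfold xs of
  Nothing     -> 0
  Just (_, t) -> 1 + len t
```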
A Relational Realizability Model for Higher-order Stateful ADTs, 2010
Abstract

Cited by 2 (1 self)
We present a realizability model for reasoning about contextual equivalence of higher-order programs with impredicative polymorphism, recursive types, and higher-order mutable state. The model combines the virtues of two recent earlier models: (1) Ahmed, Dreyer, and Rossberg’s step-indexed logical relations model, which was designed to facilitate proofs of representation independence for “state-dependent” ADTs, and (2) Birkedal, Støvring, and Thamsborg’s realizability logical relations model, which was designed to facilitate abstract proofs without tedious proofs of representation independence for “state-dependent” ADTs.
Semantic Orthogonality of Type Disciplines, 1997
Abstract

Cited by 1 (0 self)
We consider a version of PCF, and prove, using both syntactic and semantic means, that the operational equivalences of the base language are preserved when the language is extended with sum and product types, with polymorphic types, and with recursive types. These theorems show that the additions to the type systems are orthogonal to the original language.

1 Introduction

Type systems for programming languages are rarely monolithic: although a type system may be composed of many parts, each part can usually be understood on its own. Consider, for instance, the programming language Standard ML (SML) [22]. SML's type system includes base types of integers, reals, strings, and characters, and type constructors for lists, functions, tuples, references, exceptions, user-defined recursive datatypes, and polymorphism. On a syntactic level, the type rules of the parts do not interfere with one another: the type-checking rule for application, for example, uses only the fact that the operator is...
Declarative coinductive axiomatization of regular expression containment and its computational interpretation (preliminary version)
In Proc. 38th ACM SIGACT-SIGPLAN Symposium on Principles of Programming Languages (POPL), 2011
Abstract

Cited by 1 (1 self)
We present a new sound and complete coinductive axiomatization of regular expression containment. It consists of the conventional axiomatization of concatenation, alternation, empty set and (the singleton set containing the) empty string as an idempotent semiring, with E* = 1 + E × E* for Kleene star, and the general coinduction rule

    Γ, E ≤ F ⊢ E ≤ F
    ----------------- (side condition for soundness)
    Γ ⊢ E ≤ F

as the only additional rules. The axiomatization admits a natural Curry-Howard-style computational interpretation where regular expressions are straightforwardly interpreted as types containing not the strings themselves, but proofs of string membership in a regular expression. It turns out that this coincides with interpreting concatenation, alternation, empty set, empty string and Kleene star as product, sum, empty, unit and list types, respectively. A proof of containment between regular expressions can then be interpreted computationally as a total function mapping a proof of string membership in one regular expression to a proof of membership in the containing regular expression for the same string. Computationally, the coinduction rule corresponds to definition by recursion, and its side condition in its most general form simply stipulates that the thus-defined function be total, yielding soundness of our axiomatization. We provide a syntactically easily checkable totality criterion and show that Kozen's rules can be coded using it, yielding completeness. We show how to synthesize containment proofs and how to construct and transform regular-expression-specific bit representations of strings. Since membership proofs correspond to syntax trees, our computational interpretation of regular expressions becomes a refinement type theory for strings represented intensionally by their syntax trees.
Regular expressions are mostly used for substring matching, a form of parsing, not just membership testing, and containment proofs retain syntactic information thrown away by automata constructions.
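The types-for-regular-expressions correspondence described in this abstract can be sketched concretely. The following is our own minimal rendering (all names — Re, Proof, flatten, starL — are ours, not the paper's): membership proofs are syntax trees, flattening a proof recovers the string it witnesses, and a containment proof is a flatten-preserving function on proofs.

```haskell
-- Regular expressions: empty set, empty string, character,
-- concatenation, alternation, Kleene star.
data Re = Emp | Eps | Chr Char | Cat Re Re | Alt Re Re | Star Re
  deriving Show

-- Untyped membership-proof terms; a dependently typed language would
-- index Proof by the Re it witnesses. The constructors realize the
-- correspondence: Cat ~ product, Alt ~ sum, Eps ~ unit, Star ~ list.
data Proof = PEps | PChr Char | PCat Proof Proof
           | PInl Proof | PInr Proof | PStar [Proof]
  deriving Show

-- flatten recovers the string a proof witnesses membership for;
-- membership proofs are exactly syntax trees of the matched string.
flatten :: Proof -> String
flatten PEps       = ""
flatten (PChr c)   = [c]
flatten (PCat p q) = flatten p ++ flatten q
flatten (PInl p)   = flatten p
flatten (PInr p)   = flatten p
flatten (PStar ps) = concatMap flatten ps

-- A containment proof E <= F is a total, flatten-preserving function
-- on proofs; e.g. a* <= (a|b)*, injecting each letter on the left.
starL :: Proof -> Proof
starL (PStar ps) = PStar (map PInl ps)
starL p          = p
```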
Proceedings of the Third ACM SIGPLAN Workshop on Continuations (CW'01), 2001
Abstract
Local CPS conversion is a compiler transformation for improving the code generated for nested loops by a direct-style compiler. The transformation consists of a combination of CPS conversion and lightweight closure conversion, which allows the compiler to merge the environments of nested recursive functions. This merging, in turn, allows the backend to use a single machine-level procedure to implement the nested loops. Preliminary experiments with the Moby compiler show the potential for significant reductions in loop overhead as a result of Local CPS conversion.

1 Introduction

Most compilers for functional languages use a λ-calculus-based intermediate representation (IR) for their optimization phases. The λ-calculus is a good match for this purpose because, on the one hand, it models surface-language features like higher-order functions and lexical scoping, while, on the other hand, it can be transformed into a form that is quite close to the machine model. To make analysis and opt...
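The core idea can be illustrated with a toy example (ours, not the Moby compiler's actual transformation): after a local CPS conversion, the inner loop returns to the outer loop by calling a continuation rather than by an ordinary function return, so a backend can compile both loops as one machine-level procedure.

```haskell
-- sumTo n computes 1*1 + 2*2 + ... + n*n with an explicitly nested
-- loop, written in the locally CPS-converted style sketched above.
sumTo :: Int -> Int
sumTo n = outer 1 0
  where
    outer i acc
      | i > n     = acc
      | otherwise = inner i acc (\acc' -> outer (i + 1) acc')
    -- inner adds i to the accumulator i times, then jumps to its
    -- continuation k. Because inner never returns normally, its frame
    -- can share an environment with outer's: the merging described above.
    inner i acc k = go 0 acc
      where
        go j a | j >= i    = k a
               | otherwise = go (j + 1) (a + i)
```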
Type Systems for Programming Languages, 2001
"... These notes belong to the course Type Systems for Programming Languages, given to fourth year students in Computing and Joint Mathematics and Computing with some experience in reasoning and logic, and students in the Advanced Masters programme at the Department of Computing, Imperial College, London ..."
Abstract
These notes belong to the course Type Systems for Programming Languages, given to fourth year students in Computing and Joint Mathematics and Computing with some experience in reasoning and logic, and students in the Advanced Masters programme at the Department of Computing, Imperial College, London. The course is intended for students interested in theoretical computer science, who possess some knowledge of logic. No prior knowledge on type systems or proof techniques is assumed, other than being familiar with the principle of induction.

Aims
• To lay out in detail the design of type assignment systems for programming languages.
• To focus on the importance of a sound theoretical framework, in order to be able to reason about properties of a typed program.
• To understand the concepts of: type checking, type reconstruction, polymorphism, type derivation, typeability, typing of recursive functions, termination in the context of typeability, and undecidable systems.
• To study various systems and various languages, and to compare those and to select.
Computational Soundness and Adequacy for Typed Object Calculus
Abstract
By giving a translation from typed object calculus into Plotkin's FPC, we demonstrate that every computationally sound and adequate model of FPC (with eager operational semantics) is also a sound and adequate model of typed object calculus. This establishes that denotational equality is contained in operational equivalence, i.e. that for any such model of typed object calculus, if two terms have equal denotations, then no program (or rather program context) can distinguish between those two terms. Hence we show that FPC models can be used in the study of program transformations (program algebra) for typed object calculus. Our treatment is based on the self-application interpretation, and subtyping is not considered. The typed object calculus under consideration is a variation of Abadi and Cardelli's first-order calculus with sum and product types, recursive types, and the usual method update and method invocation in a form where the object types have assimilated the recursive types. As an additional result, we prove subject reduction for this calculus.
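The self-application interpretation mentioned in this abstract can be sketched in a few lines (our own toy example): an object is a record of methods, each taking the object itself as an argument, so method invocation is self-application, and the object type is a recursive type — which is why the translation targets FPC's recursive types.

```haskell
-- A counter object: its type assimilates the recursive type
-- Counter = (Counter -> Int) × (Counter -> Counter),
-- i.e. a "get" method and an "inc" method, each expecting self.
newtype Counter = Counter { methods :: (Counter -> Int, Counter -> Counter) }

-- Method invocation is self-application: extract the method and
-- apply it to the object itself.
getM :: Counter -> Int
getM o = fst (methods o) o

incM :: Counter -> Counter
incM o = snd (methods o) o

-- An object constructor; "inc" is implemented by rebuilding the
-- record, the flavour of method update in this first-order setting.
mkCounter :: Int -> Counter
mkCounter n = Counter ( \_ -> n
                      , \_ -> mkCounter (n + 1) )
```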
External Examiner, 2006
Abstract
The results reported in Part III consist of joint work with Martín Escardó [14]. All the other results reported in this thesis are due to the author, except for background results, which are clearly stated as such. Some of the results in Part IV have already appeared as [28].

Note: This version of the thesis, produced on October 31, 2006, is the result of completing all the minor modifications suggested by both examiners in the viva report (Ref: CLM/AC/497773).

We develop an operational domain theory to reason about programs in sequential functional languages. The central idea is to export domain-theoretic techniques of the Scott denotational semantics directly to the study of contextual preorder and equivalence. We investigate to what extent this can be done for two deterministic functional programming languages: PCF (Programming language for Computable Functionals) and FPC (Fixed Point Calculus).
Subtyping, Recursion and Parametric Polymorphism in Kernel Fun, 2003
Abstract
We study subtype checking for recursive types in system kernel Fun, a typed λ-calculus with subtyping and bounded second-order polymorphism. Along the lines of [AC93], we define a subtype relation over kernel Fun recursive types, and prove it to be transitive. We then show that the natural extension of the algorithm introduced in [AC93] to compare first-order recursive types yields an incomplete algorithm. Finally, we prove the completeness and correctness of a different algorithm, which lends itself to efficient implementations.
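The first-order algorithm this abstract builds on can be sketched as follows. This is our own simplified rendering of the [AC93]-style procedure, without bounded quantification or Top-comparisons specific to kernel Fun: to decide S ≤ T for μ-types, carry a set of already-assumed goals; a goal in the set succeeds coinductively, and otherwise μ-types are unfolded.

```haskell
import qualified Data.Set as Set

-- First-order recursive types: Top, variables, arrows, mu-types.
data Ty = Top | Var String | Arrow Ty Ty | Mu String Ty
  deriving (Eq, Ord, Show)

-- Capture-naive substitution; adequate here because we only ever
-- substitute closed mu-types for their own bound variable.
subst :: String -> Ty -> Ty -> Ty
subst x s t = case t of
  Var y  | y == x -> s
  Arrow a b       -> Arrow (subst x s a) (subst x s b)
  Mu y b | y /= x -> Mu y (subst x s b)
  _               -> t

-- sub carries the set of assumed subtyping goals. Each recursive call
-- records the current goal, so revisiting a goal succeeds immediately;
-- this is what makes the procedure terminate on regular mu-types.
sub :: Set.Set (Ty, Ty) -> Ty -> Ty -> Bool
sub a s t
  | Set.member (s, t) a = True                 -- coinductive assumption
  | otherwise = case (s, t) of
      (_, Top)                   -> True
      (Var x, Var y)             -> x == y
      (Arrow s1 s2, Arrow t1 t2) ->
        let a' = Set.insert (s, t) a
        in sub a' t1 s1 && sub a' s2 t2        -- contravariant domain
      (Mu x b, _) -> sub (Set.insert (s, t) a) (subst x (Mu x b) b) t
      (_, Mu x b) -> sub (Set.insert (s, t) a) s (subst x (Mu x b) b)
      _           -> False

subtype :: Ty -> Ty -> Bool
subtype = sub Set.empty
```

Note that alpha-equivalent μ-types compare equal here because unfolding substitutes the closed μ-type itself, so bound variable names never meet in the `Var` case.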