Results 1–10 of 35
A Linearly Typed Assembly Language
 In Workshop on Types in Compilation
"... Today's typesafe lowlevel languages rely on garbage collection to recycle heapallocated objects safely. We present LTAL, a safe, lowlevel, yet simple language that "stands on its own": it guarantees safe execution within a fixed memory space, without relying on external runtime s ..."
Abstract

Cited by 146 (35 self)
Today's type-safe low-level languages rely on garbage collection to recycle heap-allocated objects safely. We present LTAL, a safe, low-level, yet simple language that "stands on its own": it guarantees safe execution within a fixed memory space, without relying on external runtime support. We demonstrate the expressiveness of LTAL by giving a type-preserving compiler for the functional core of ML. But this independence comes at a steep price: LTAL's type system imposes a draconian discipline of linearity that ensures that memory can be reused safely, but prohibits any useful kind of sharing. We present the results of experiments with a prototype LTAL system that show just how high the price of linearity can be.
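The linear discipline this abstract describes can be sketched in an ownership-typed language. The Rust fragment below is a minimal illustration, not LTAL itself; the names `Cell` and `overwrite` are ours. The point is that a heap value consumed exactly once lets its storage be reused with no collector, while any attempt to share it is a type error.

```rust
// Sketch of a linear discipline: every heap value is consumed exactly
// once, so its storage can be reused without a garbage collector.
// `Cell` and `overwrite` are illustrative names, not LTAL's.

struct Cell(i64); // a heap cell: deliberately neither Copy nor Clone

// Consuming the old cell transfers ownership of its storage, which is
// then reinitialized in place; no collector is ever consulted.
fn overwrite(cell: Box<Cell>, v: i64) -> Box<Cell> {
    let mut cell = cell;
    cell.0 = v;
    cell // the caller gets the same allocation back
}
```

After `let b = overwrite(a, 2);` any further use of `a` is rejected at compile time, which is exactly the "prohibits any useful kind of sharing" restriction the abstract laments.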
Alias Types for Recursive Data Structures
, 2000
"... Linear type systems permit programmers to deallocate or explicitly recycle memory, but they are severly restricted by the fact that they admit no aliasing. This paper describes a pseudolinear type system that allows a degree of aliasing and memory reuse as well as the ability to define complex recu ..."
Abstract

Cited by 136 (14 self)
Linear type systems permit programmers to deallocate or explicitly recycle memory, but they are severely restricted by the fact that they admit no aliasing. This paper describes a pseudo-linear type system that allows a degree of aliasing and memory reuse as well as the ability to define complex recursive data structures. Our type system can encode conventional linear data structures such as linear lists and trees as well as more sophisticated data structures including cyclic and doubly-linked lists and trees. In the latter cases, our type system is expressive enough to represent pointer aliasing and yet safely permit destructive operations such as object deallocation. We demonstrate the flexibility of our type system by encoding two common compiler optimizations: destination-passing style and Deutsch-Schorr-Waite or "link-reversal" traversal algorithms.
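The destructive reuse this abstract licenses can be illustrated with an in-place "link-reversal" list reverse: every cons cell of the input is recycled as a cell of the output, with no allocation. This Rust sketch uses our own names and Rust's ownership rules as a stand-in for the paper's pseudo-linear system.

```rust
// In-place ("link-reversal") list reverse: each cell taken off the
// input is reused, via its own Box, as the new head of the output.

#[derive(Debug, PartialEq)]
enum List { Nil, Cons(i32, Box<List>) }

fn reverse(mut xs: List) -> List {
    let mut acc = List::Nil;
    while let List::Cons(head, mut tail) = xs {
        // steal the rest of the input out of the cell, then reuse the
        // very same Box as the spine of the accumulator
        xs = std::mem::replace(&mut *tail, acc);
        acc = List::Cons(head, tail);
    }
    acc
}
```

Safe Rust forbids the aliased variants (cyclic and doubly-linked structures) that the paper's alias types can also track; that extra expressiveness is the abstract's main claim.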
A Type System for Bounded Space and Functional In-Place Update
, 2000
"... We show how linear typing can be used to obtain functional programs which modify heapallocated data structures in place. We present this both as a "design pattern" for writing Ccode in a functional style and as a compilation process from linearly typed firstorder functional programs int ..."
Abstract

Cited by 85 (15 self)
We show how linear typing can be used to obtain functional programs which modify heap-allocated data structures in place. We present this both as a "design pattern" for writing C code in a functional style and as a compilation process from linearly typed first-order functional programs into malloc()-free C code. The main technical result is the correctness of this compilation. The crucial innovation over previous linear typing schemes consists of the introduction of a resource type ◊ which controls the number of constructor symbols such as cons in recursive definitions and ensures linear space while restricting expressive power surprisingly little. While the space efficiency brought about by the new typing scheme and the compilation into C can also be realised with state-of-the-art optimising compilers for functional languages such as Ocaml [16], the present method provides guaranteed bounds on heap space which will be of use for applications such as languages for embedded ...
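The resource-type idea can be sketched as follows, with `Diamond` as our stand-in for the paper's ◊: applying `cons` costs one token, and the only way to obtain a token is to take an existing cell apart, so a well-typed program cannot grow the heap. (Rust will still free and reallocate the Box here; the paper's compiler instead reuses each cell directly.)

```rust
// `Diamond` models the resource type ◊: a capability to build exactly
// one cons cell. These names are ours, not the paper's notation.

struct Diamond;

enum List { Nil, Cons(Box<(Diamond, i32, List)>) }

// The constructor demands payment up front.
fn cons(d: Diamond, x: i32, xs: List) -> List {
    List::Cons(Box::new((d, x, xs)))
}

// Doubling every element needs no net space: each cell consumed yields
// the Diamond that pays for the cell constructed.
fn double_all(xs: List) -> List {
    match xs {
        List::Nil => List::Nil,
        List::Cons(cell) => {
            let (d, x, rest) = *cell;
            cons(d, 2 * x, double_all(rest))
        }
    }
}

// Diamonds are minted only when the program's input is first built.
fn from_slice(xs: &[i32]) -> List {
    xs.iter().rev().fold(List::Nil, |acc, &x| cons(Diamond, x, acc))
}

fn to_vec(xs: &List) -> Vec<i32> {
    match xs {
        List::Nil => Vec::new(),
        List::Cons(cell) => {
            let mut v = vec![cell.1];
            v.extend(to_vec(&cell.2));
            v
        }
    }
}
```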
A Brief Guide to Linear Logic
, 1993
"... An overview of linear logic is given, including an extensive bibliography and a simple example of the close relationship between linear logic and computation. ..."
Abstract

Cited by 53 (8 self)
An overview of linear logic is given, including an extensive bibliography and a simple example of the close relationship between linear logic and computation.
A lambda calculus for quantum computation
In SIAM Journal on Computing
"... The classical lambda calculus may be regarded both as a programming language and as a formal algebraic system for reasoning about computation. It provides a computational model equivalent to the Turing machine, and continues to be of enormous benefit in the classical theory of computation. We propos ..."
Abstract

Cited by 49 (1 self)
The classical lambda calculus may be regarded both as a programming language and as a formal algebraic system for reasoning about computation. It provides a computational model equivalent to the Turing machine, and continues to be of enormous benefit in the classical theory of computation. We propose that quantum computation, like its classical counterpart, may benefit from a version of the lambda calculus suitable for expressing and reasoning about quantum algorithms. In this paper we develop a quantum lambda calculus as an alternative model of quantum computation, which combines some of the benefits of both the quantum Turing machine and the quantum circuit models. The calculus turns out to be closely related to the linear lambda calculi used in the study of Linear Logic. We set up a computational model and an equational proof system for this calculus, and we argue that it is equivalent to the quantum Turing machine.
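The connection to linear lambda calculi comes from the no-cloning property: a qubit cannot be duplicated, which is exactly what a linear type forbids. A toy Rust sketch (our names throughout; the state is a classical amplitude pair, not a real quantum simulation) makes the resource reading concrete: operations consume their qubit, so use-after-measure and duplication are compile-time errors.

```rust
// Toy single-qubit state; neither Copy nor Clone, so the type system
// enforces no-cloning just as a linear lambda calculus would.
struct Qubit { alpha: f64, beta: f64 }

fn fresh() -> Qubit { Qubit { alpha: 1.0, beta: 0.0 } } // |0>

// Takes ownership: the pre-gate state is no longer accessible.
fn hadamard(q: Qubit) -> Qubit {
    let s = std::f64::consts::FRAC_1_SQRT_2;
    Qubit { alpha: s * (q.alpha + q.beta), beta: s * (q.alpha - q.beta) }
}

// Consumes the qubit: measuring twice is a type error, not a runtime bug.
// Returns the more likely outcome and the probability of measuring 1.
fn measure(q: Qubit) -> (bool, f64) {
    let p1 = q.beta * q.beta;
    (p1 >= 0.5, p1)
}
```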
Applications of Linear Logic to Computation: An Overview
, 1993
"... This paper is an overview of existing applications of Linear Logic (LL) to issues of computation. After a substantial introduction to LL, it discusses the implications of LL to functional programming, logic programming, concurrent and objectoriented programming and some other applications of LL, li ..."
Abstract

Cited by 41 (3 self)
This paper is an overview of existing applications of Linear Logic (LL) to issues of computation. After a substantial introduction to LL, it discusses the implications of LL for functional programming, logic programming, concurrent and object-oriented programming and some other applications of LL, like semantics of negation in LP, non-monotonic issues in AI planning, etc. Although the overview covers pretty much the state-of-the-art in this area, by necessity many of the works are only mentioned and referenced, but not discussed in any considerable detail. The paper does not presuppose any previous exposure to LL, and is addressed more to computer scientists (probably with a theoretical inclination) than to logicians. The paper contains over 140 references, of which some 80 are about applications of LL. 1 Linear Logic Linear Logic (LL) was introduced in 1987 by Girard [62]. From the very beginning it was recognized as relevant to issues of computation (especially concurrency and stat...
On Regions and Linear Types
"... We explore how two different mechanisms for reasoning about state, linear typing and the type, region and effect discipline, complement one another in the design of a strongly typed functional programming language. The basis for our language is a simple lambda calculus containing firstclass regions ..."
Abstract

Cited by 36 (2 self)
We explore how two different mechanisms for reasoning about state, linear typing and the type, region and effect discipline, complement one another in the design of a strongly typed functional programming language. The basis for our language is a simple lambda calculus containing first-class regions, which are explicitly passed as arguments to functions, returned as results and stored in user-defined data structures. In order to ensure appropriate memory safety properties, we draw upon the literature on linear type systems to help control access to and deallocation of regions. In fact, we use two different interpretations of linear types, one in which multiple-use values are freely copied and discarded and one in which multiple-use values are explicitly reference-counted, and show that both interpretations give rise to interesting invariants for manipulating regions. We also explore new programming paradigms that arise by mixing first-class regions and conventional linear data stru...
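The region discipline can be shown in miniature: everything allocated into a region is freed at once when the region dies, and a linear or borrow discipline is what stops a region being used after deallocation. This Rust sketch (`Region`, `alloc`, and index-valued handles are our simplifications, not the paper's calculus) leans on the borrow checker to play that role.

```rust
// A toy region: allocation appends to a backing vector, and the whole
// region is deallocated wholesale when it goes out of scope.
struct Region { cells: Vec<i32> }

impl Region {
    fn new() -> Region { Region { cells: Vec::new() } }
    // The returned handle is only meaningful while the region is alive.
    fn alloc(&mut self, v: i32) -> usize {
        self.cells.push(v);
        self.cells.len() - 1
    }
    fn get(&self, h: usize) -> i32 { self.cells[h] }
}

fn region_demo() -> i32 {
    let sum;
    {
        let mut r = Region::new(); // region created
        let a = r.alloc(1);
        let b = r.alloc(41);
        sum = r.get(a) + r.get(b);
    } // region deallocated in one step; `r` is unusable past this point
    sum
}
```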
A Type Theory for Memory Allocation and Data Layout (Extended Version)
In Proceedings of the 30th ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages
, 2002
"... Ordered type theory is an extension of linear type theory in which variables in the context may be neither dropped nor reordered. This restriction gives rise to a natural notion of adjacency. We show that a language based on ordered types can use this property to give an exact account of the layout ..."
Abstract

Cited by 27 (3 self)
Ordered type theory is an extension of linear type theory in which variables in the context may be neither dropped nor reordered. This restriction gives rise to a natural notion of adjacency. We show that a language based on ordered types can use this property to give an exact account of the layout of data in memory. The fuse constructor from ordered logic describes adjacency of values in memory, and the mobility modal describes pointers into the heap. We choose a particular allocation model based on a common implementation scheme for copying garbage collection and show how this permits us to separate out the allocation and initialization of memory locations in such a way as to account for optimizations such as the coalescing of multiple calls to the allocator.
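The adjacency that the fuse constructor asserts has a concrete systems-language analogue: a struct with a fixed layout whose field offsets are exact, so "the second value starts where the first ends" is checkable. A small Rust sketch (`Fused` and `offsets` are our names; `offset_of!` needs Rust 1.77+):

```rust
// #[repr(C)] pins the layout: `second` is laid out immediately after
// `first`, the memory-level reading of ordered logic's fuse connective.
#[repr(C)]
struct Fused {
    first: u32,
    second: u32,
}

// Report both field offsets so adjacency can be checked directly.
fn offsets() -> (usize, usize) {
    (
        std::mem::offset_of!(Fused, first),
        std::mem::offset_of!(Fused, second),
    )
}
```

Here `second` sits at offset 4, exactly the size of `first`, so the two values are adjacent; a pointer to a separately allocated value would be the analogue of the mobility modal.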
Operational Interpretations of Linear Logic
, 1998
"... Two different operational interpretations of intuitionistic linear logic have been proposed in the literature. The simplest interpretation recomputes nonlinear values every time they are required. It has good memorymanagement properties, but is often dismissed as being too inefficient. Alternative ..."
Abstract

Cited by 27 (0 self)
Two different operational interpretations of intuitionistic linear logic have been proposed in the literature. The simplest interpretation recomputes non-linear values every time they are required. It has good memory-management properties, but is often dismissed as being too inefficient. Alternatively, one can memoize the results of evaluating non-linear values. This avoids any recomputation, but has weaker memory-management properties. Using a novel combination of type-theoretic and operational techniques we give a concise formal comparison of the two interpretations. Moreover, we show that there is a subset of linear logic where the two operational interpretations coincide. In this subset, which is sufficiently expressive to encode call-by-value lambda-calculus, we can have the best of both worlds: a simple and efficient implementation, and good memory-management properties. Keywords: linear logic, operational semantics, call-by-value lambda calculus, memory management. 1 Introductio...
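The two operational readings of a non-linear (reusable) value can be put side by side in code: re-run the suspension at every use, or evaluate once and memoize. In this Rust sketch (our construction, with counters to make the difference visible) the recomputing thunk runs twice for two uses, while the memoized one runs once.

```rust
use std::cell::{Cell, OnceCell};

// Returns (times the recomputing thunk ran, times the memoized one ran)
// after two uses of each.
fn demo() -> (u32, u32) {
    let runs_recompute = Cell::new(0);
    let runs_memo = Cell::new(0);

    // Recomputation: each use re-runs the suspension. Memory management
    // is simple because no shared result is ever retained.
    let thunk = || {
        runs_recompute.set(runs_recompute.get() + 1);
        21
    };
    let _ = thunk() + thunk();

    // Memoization: the first use evaluates, later uses reuse the cached
    // result, which must now be kept alive for all potential users.
    let memo: OnceCell<u32> = OnceCell::new();
    let force = || {
        *memo.get_or_init(|| {
            runs_memo.set(runs_memo.get() + 1);
            21
        })
    };
    let _ = force() + force();

    (runs_recompute.get(), runs_memo.get())
}
```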
The Logical Approach to Stack Typing
, 2003
"... We develop a logic for reasoning about adjacency and separation of memory blocks, as well as aliasing of pointers. We provide a memory model for our logic and present a sound set of natural deductionstyle inference rules. We deploy the logic in a simple type system for a stackbased assembly langu ..."
Abstract

Cited by 22 (4 self)
We develop a logic for reasoning about adjacency and separation of memory blocks, as well as aliasing of pointers. We provide a memory model for our logic and present a sound set of natural deduction-style inference rules. We deploy the logic in a simple type system for a stack-based assembly language. The connectives for the logic provide a flexible yet concise mechanism for controlling allocation, deallocation and access to both heap-allocated and stack-allocated data.