Results 11–20 of 72
A modular, polyvariant, and type-based closure analysis
 In ICFP ’97
Abstract

Cited by 57 (1 self)
We observe that the principal typing property of a type system is the enabling technology for modularity and separate compilation [10]. We use this technology to formulate a modular and polyvariant closure analysis, based on the rank 2 intersection types annotated with control-flow information. Modularity manifests itself in a syntax-directed, annotated-type inference algorithm that can analyse program fragments containing free variables: a principal typing property is used to formalise it. Polyvariance manifests itself in the separation of different behaviours of the same function at its different uses: this is formalised via the rank 2 intersection types. As the rank 2 intersection type discipline types at least all (core) ML programs, our analysis can be used in the separate compilation of such programs.
Checking and Inferring Local Non-Aliasing
, 2003
Abstract

Cited by 48 (11 self)
In prior work [15] we studied a language construct restrict that allows programmers to specify that certain pointers are not aliased to other pointers used within a lexical scope. Among other applications, programming with these constructs helps program analysis tools locally recover strong updates, which can improve the tracking of state in flow-sensitive analyses. In this paper we continue the study of restrict and introduce the construct confine. We present a type and effect system for checking the correctness of these annotations, and we develop efficient constraint-based algorithms implementing these type-checking systems. To make it easier to use restrict and confine in practice, we show how to automatically infer such annotations without programmer assistance. In experiments on locking in 589 Linux device drivers, confine inference can automatically recover strong updates to eliminate 95% of the type errors resulting from weak updates.
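
The non-aliasing guarantee that restrict asserts in C has a loose analogue in Rust's `&mut` references, which the borrow checker enforces statically. The following sketch (names and the lock scenario are illustrative, not from the paper) shows why uniqueness enables a strong update: with no possible alias, the abstract state after each call is known exactly.

```rust
// Illustrative sketch: a unique (`&mut`) reference plays the role of a
// `restrict`-annotated pointer, so the state transition below is a "strong
// update" -- the new state is known exactly, not merged with an alias's view.

#[derive(Debug, PartialEq, Clone, Copy)]
enum LockState { Locked, Unlocked }

struct Lock { state: LockState }

// Because `lock` is unique, no alias can change `state` behind our back.
fn acquire(lock: &mut Lock) {
    assert_eq!(lock.state, LockState::Unlocked, "double acquire");
    lock.state = LockState::Locked;
}

fn release(lock: &mut Lock) {
    assert_eq!(lock.state, LockState::Locked, "double release");
    lock.state = LockState::Unlocked;
}

fn main() {
    let mut l = Lock { state: LockState::Unlocked };
    acquire(&mut l);
    release(&mut l);
    assert_eq!(l.state, LockState::Unlocked);
    println!("ok");
}
```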
Programming Languages and Dimensions
, 1996
Abstract

Cited by 42 (3 self)
Scientists and engineers must ensure that the equations and formulae which they use are dimensionally consistent, but existing programming languages treat all numeric values as dimensionless. This thesis investigates the extension of programming languages to support the notion of physical dimension. A type system is presented similar to that of the programming language ML but extended with polymorphic dimension types. An algorithm which infers most general dimension types automatically is then described and proved correct. The semantics of the language is given by a translation into an explicitly-typed language in which dimensions are passed as arguments to functions. The operational semantics of this language is specified in the usual way by an evaluation relation defined by a set of rules. This is used to show that if a program is well-typed then no dimension errors can occur during its evaluation. More abstract properties of the language are investigated using a denotational semantics: these include a notion of invariance under changes in the units of measure used, analogous to parametricity in the polymorphic lambda calculus. Finally the dissertation is summarised and many possible directions for future research in dimension types and related type systems are described.
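
A flavour of dimension typing can be simulated in languages with parametric polymorphism using phantom type parameters. This is only a rough sketch of the idea, not the thesis's ML-based inference system; all names are illustrative.

```rust
use std::marker::PhantomData;
use std::ops::Add;

// Phantom type parameters stand in for dimensions: a `Qty<Metres>` and a
// `Qty<Seconds>` carry the same runtime representation (f64) but are
// distinct types, so dimensionally inconsistent code is rejected at
// compile time.
struct Metres;
struct Seconds;

struct Qty<Dim>(f64, PhantomData<Dim>);

fn qty<Dim>(v: f64) -> Qty<Dim> { Qty(v, PhantomData) }

// Addition is only defined between quantities of the same dimension.
impl<Dim> Add for Qty<Dim> {
    type Output = Qty<Dim>;
    fn add(self, rhs: Self) -> Self { Qty(self.0 + rhs.0, PhantomData) }
}

fn main() {
    let d: Qty<Metres> = qty(3.0) + qty(4.0);
    assert_eq!(d.0, 7.0);
    // let bad = qty::<Metres>(1.0) + qty::<Seconds>(1.0); // rejected: mismatched dimensions
    println!("{}", d.0);
}
```

Unlike the thesis's system, this encoding has no dimension polymorphism or inference of most general dimension types; it only illustrates the consistency check.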
Formally optimal boxing
 In POPL ’94
Abstract

Cited by 41 (0 self)
An important implementation decision in polymorphically typed functional programming languages is whether to represent data in boxed or unboxed form and when to transform them from one representation to the other. Using a language with explicit representation types and boxing/unboxing operations, we axiomatize equationally the set of all explicitly boxed versions, called completions, of a given source program. In a two-stage process we give some of the equations a rewriting interpretation that captures eliminating boxing/unboxing operations without relying on a specific implementation or even semantics of the underlying language. The resulting reduction systems operate on congruence classes of completions defined by the remaining equations E, which can be understood as moving boxing/unboxing operations along data-flow paths in the source program. We call a completion e_opt formally optimal if every other completion for the same program (and at the same representation type) reduces to e_opt under this two-stage reduction. We show that every source program has formally optimal completions, which are unique modulo E. This is accomplished by first “polarizing” the equations in E and orienting them to obtain two canonical (confluent and strongly normalizing) rewriting systems. The completions produced by Leroy’s and Poulsen’s algorithms are generally not formally optimal in our sense. The rewriting systems have been implemented and applied to some simple Standard ML programs. Our results show that the number of boxing and unboxing operations is also in practice substantially reduced in comparison to Leroy’s completions. This analysis is intended to be integrated into Tofte’s region-based implementation of Standard ML currently underway at DIKU.
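
The core objects of the paper, explicit box/unbox coercions and the elimination of matching pairs along a data-flow path, can be made concrete with a small sketch. This is not the paper's formal system; `box_`, `unbox`, and `first` are illustrative names.

```rust
// A "boxed" value is a heap-allocated, uniform representation; `box_` and
// `unbox` are the explicit coercions whose placement a completion chooses.
fn box_(n: i64) -> Box<i64> { Box::new(n) }
fn unbox(b: Box<i64>) -> i64 { *b }

// A polymorphic-style consumer that requires the uniform (boxed)
// representation, forcing boxing at its call sites.
fn first(xs: Vec<Box<i64>>) -> Box<i64> { xs.into_iter().next().unwrap() }

fn main() {
    // A naive completion boxes at each producer and unboxes at each consumer.
    let n = unbox(first(vec![box_(1), box_(2)]));
    // The rewriting systems of the paper eliminate matching coercion pairs:
    // `unbox(box_(n))` reduces to `n`, both semantically and by rewriting.
    assert_eq!(unbox(box_(n)), n);
    println!("{n}");
}
```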
Type Qualifiers: Lightweight Specifications to Improve Software Quality
, 2002
Polymorphic Recursion and Subtype Qualifications: Polymorphic Binding-Time Analysis in Polynomial Time
 In Static Analysis, Second International Symposium, number 983 in Lecture Notes in Computer Science
Abstract

Cited by 28 (2 self)
Abstract. The combination of parametric polymorphism, subtyping extended to qualified and polymorphic types, and polymorphic recursion is useful in standard type inference and gives expressive type-based program analyses, but raises difficult algorithmic problems. In a program analysis context we show how Mycroft’s iterative method of computing principal types for a type system with polymorphic recursion can be generalized and adapted to work in a setting with subtyping. This not only yields a proof of existence of principal types (most general properties), but also an algorithm for computing them. The punchline of the development is that a very simple modification of the basic algorithm reduces its computational complexity from exponential time to polynomial time relative to the size of the given, explicitly typed program. This solves the open problem of finding an inference algorithm for polymorphic binding-time analysis [7].
Macroscopic Data Structure Analysis and Optimization
, 2005
Abstract

Cited by 26 (4 self)
Providing high performance for pointer-intensive programs on modern architectures is an increasingly difficult problem for compilers. Pointer-intensive programs are often bound by memory latency and cache performance, but traditional approaches to these problems usually fail: pointer-intensive programs are often highly irregular and the compiler has little control over the layout of heap-allocated objects. This thesis presents a new class of techniques named “Macroscopic Data Structure Analyses and Optimizations”, which is a new approach to the problem of analyzing and optimizing pointer-intensive programs. Instead of analyzing individual load/store operations or structure definitions, this approach identifies, analyzes, and transforms entire memory structures as a unit. The foundation of the approach is an analysis named Data Structure Analysis and a transformation named Automatic Pool Allocation. Data Structure Analysis is a context-sensitive pointer analysis which identifies data structures on the heap and their important properties (such as type safety). Automatic Pool Allocation uses the results of Data Structure Analysis to segregate dynamically allocated objects on the heap, giving control over the layout of the data structure in memory to the compiler. Based on these two foundation techniques, this thesis describes several performance-improving …
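
The payoff of segregating one data structure into its own pool, contiguous layout under the compiler's control, can be sketched in a few lines. This is a toy illustration of the general idea, not the thesis's LLVM-based transformation; all names are illustrative.

```rust
// A minimal pool: every node of one linked structure is bump-allocated into
// the same vector, so the whole structure is contiguous in memory and links
// are small indices rather than scattered heap pointers.
struct Node { value: i32, next: Option<usize> } // `next` indexes into the pool

struct Pool { nodes: Vec<Node> }

impl Pool {
    fn new() -> Pool { Pool { nodes: Vec::new() } }

    // Allocation is a push into the pool's vector: O(1), and it keeps all
    // nodes of this data structure together (good cache behaviour).
    fn alloc(&mut self, value: i32, next: Option<usize>) -> usize {
        self.nodes.push(Node { value, next });
        self.nodes.len() - 1
    }

    // Traversal touches one contiguous region instead of chasing pointers
    // across the heap.
    fn sum_list(&self, mut cur: Option<usize>) -> i32 {
        let mut total = 0;
        while let Some(i) = cur {
            total += self.nodes[i].value;
            cur = self.nodes[i].next;
        }
        total
    }
}

fn main() {
    let mut pool = Pool::new();
    let tail = pool.alloc(2, None);
    let head = pool.alloc(1, Some(tail));
    assert_eq!(pool.sum_list(Some(head)), 3);
    println!("{}", pool.sum_list(Some(head)));
}
```

The thesis's contribution is doing this segregation automatically, per data structure, guided by the pointer analysis; the sketch only shows the target layout.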
Monadic Regions
, 2004
Abstract

Cited by 23 (2 self)
Region-based type systems provide programmer control over memory management without sacrificing type safety. However, the type systems for region-based languages, such as the ML Kit or Cyclone, are relatively complicated, so proving their soundness is nontrivial. This paper shows that the complication is in principle unnecessary. In particular, we show that plain old parametric polymorphism, as found in Haskell, is all that is needed. We substantiate this claim by giving a type- and meaning-preserving translation from a region-based language based on core Cyclone to a monadic variant of System F with region primitives whose types and operations are inspired by (and generalize) the ST monad.
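
The ST-monad trick the paper generalizes, a rank-2 quantifier that prevents region handles from escaping the region's scope, has a direct analogue in Rust's higher-ranked lifetimes. The sketch below is illustrative (it is neither the paper's System F target nor Haskell's `runST`); all names are assumptions.

```rust
use std::marker::PhantomData;

// A region of i32 cells. The phantom lifetime 'r brands both the region and
// its handles, playing the role of runST's rank-2 type variable.
struct Region<'r> { cells: Vec<i32>, _marker: PhantomData<&'r ()> }

#[derive(Clone, Copy)]
struct Handle<'r>(usize, PhantomData<&'r ()>);

impl<'r> Region<'r> {
    fn alloc(&mut self, v: i32) -> Handle<'r> {
        self.cells.push(v);
        Handle(self.cells.len() - 1, PhantomData)
    }
    fn read(&self, h: Handle<'r>) -> i32 { self.cells[h.0] }
}

// `for<'r>` quantifies the brand inside the function type, so the result R
// cannot mention 'r: handles cannot escape, and the region's memory is
// freed on return -- the analogue of `letregion` / `runST`.
fn with_region<R>(f: impl for<'r> FnOnce(&mut Region<'r>) -> R) -> R {
    let mut region = Region { cells: Vec::new(), _marker: PhantomData };
    f(&mut region)
}

fn main() {
    let sum = with_region(|r| {
        let a = r.alloc(40);
        let b = r.alloc(2);
        r.read(a) + r.read(b)
    });
    assert_eq!(sum, 42);
    println!("{sum}");
}
```

Attempting to return a `Handle` from the closure fails to type-check, which is exactly the encapsulation property the paper's translation relies on.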
Syntactic Type Soundness Results for the Region Calculus
 In Information and Computation
, 2001
Abstract

Cited by 18 (5 self)
The region calculus of Tofte and Talpin is a polymorphically typed lambda calculus with annotations that make memory allocation and deallocation explicit. It is intended as an intermediate language for implementing ML without garbage collection. Static region and effect inference can be used to generate annotations from a given ML program. Soundness of the calculus with respect to the region and effect system is crucial to guarantee safe deallocation of regions, i.e., deallocation should only take place for objects which are provably dead. The original soundness proof by Tofte and Talpin requires a complex coinductive safety relation. In this paper, we present two small-step operational semantics for the region calculus and prove their soundness. Following the syntactic approach of Wright, Felleisen, and Harper, we obtain simple inductive proofs. The first semantics is storeless. It is simple and elegant and gives rise to perspicuous proofs. The second semantics provides a store-based model for the region calculus. It is slightly more complicated, but includes operations on references with destructive update. We prove (the pure fragment of) both semantics equivalent to the original evaluation-style formulation by Tofte and Talpin.
A Generic Type-and-Effect System
Abstract

Cited by 16 (0 self)
Type-and-effect systems are a natural approach for statically reasoning about a program’s execution. They have been used to track a variety of computational effects, for example memory manipulation, exceptions, and locking. However, each type-and-effect system is typically implemented as its own monolithic type system that hard-codes a particular syntax of effects along with particular rules to track and control those effects. We present a generic type-and-effect system, which is parameterized by the syntax of effects to track and by two functions that together specify the effect discipline to be statically enforced. We describe how a standard form of type soundness is ensured by requiring these two functions to obey a few natural monotonicity requirements. We demonstrate that several effect systems from the literature can be viewed as instantiations of our generic type system. Finally, we describe the implementation of our type-and-effect system and mechanically checked type soundness proof in the Twelf proof assistant.
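
The shape of such a generic system, an effect computation parameterized by a couple of pluggable functions, can be rendered as a toy checker. This sketch is not the paper's Twelf formalization; the expression language, the two parameters (`adjust` and `check`), and the bitset effect representation are all illustrative choices.

```rust
// Effects as a bitset: bit 0 = reads, bit 1 = writes, bit 2 = throws.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Effect(u8);

const PURE: Effect = Effect(0);
const READS: Effect = Effect(1);
const WRITES: Effect = Effect(2);
const THROWS: Effect = Effect(4);

impl Effect {
    fn join(self, other: Effect) -> Effect { Effect(self.0 | other.0) }
}

enum Expr {
    Lit,
    Read,
    Write,
    Seq(Box<Expr>, Box<Expr>),
    Mask(Box<Expr>), // a construct that may transform its body's effect
}

// The generic traversal: effects are computed bottom-up, and the two
// parameter functions shape the discipline being enforced -- `adjust`
// transforms the effect at the `Mask` construct, `check` says which
// effects are permitted at all.
fn infer(
    e: &Expr,
    adjust: &dyn Fn(Effect) -> Effect,
    check: &dyn Fn(Effect) -> bool,
) -> Result<Effect, String> {
    let eff = match e {
        Expr::Lit => PURE,
        Expr::Read => READS,
        Expr::Write => WRITES,
        Expr::Seq(a, b) => infer(a, adjust, check)?.join(infer(b, adjust, check)?),
        Expr::Mask(body) => adjust(infer(body, adjust, check)?),
    };
    if check(eff) { Ok(eff) } else { Err(format!("effect {:?} not permitted", eff)) }
}

fn main() {
    // One instantiation: `Mask` discards write effects; throwing is banned.
    let adjust = |e: Effect| Effect(e.0 & !WRITES.0);
    let check = |e: Effect| e.0 & THROWS.0 == 0;
    let prog = Expr::Seq(Box::new(Expr::Read), Box::new(Expr::Mask(Box::new(Expr::Write))));
    let eff = infer(&prog, &adjust, &check).unwrap();
    assert_eq!(eff, READS); // the masked write does not surface
    println!("{:?}", eff);
}
```

Swapping in different `adjust`/`check` pairs yields different disciplines over the same traversal, which is the paper's point; its soundness conditions correspond to monotonicity requirements on these two parameters.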