Results 1–10 of 33
Deriving algorithms from type inference systems: Application to strictness analysis
, 1994
Abstract
Cited by 26 (8 self)
The role of nonstandard type inference in static program analysis has been much studied recently. Early work emphasised the efficiency of type inference algorithms and paid little attention to the correctness of the inference system. Recently more powerful inference systems have been investigated but the connection with efficient inference algorithms has been obscured. The contribution of this paper is twofold: first we show how to transform a program logic into an algorithm and, second, we introduce the notion of lazy types and show how to derive an efficient algorithm for strictness analysis. 1 Introduction Two major formal frameworks have been proposed for static analysis of functional languages: abstract interpretation and type inference. A lot of work has been done to characterise formally the correctness and the power of abstract interpretation. However the development of algorithms has not kept pace with the theoretical developments. This is now a major barrier that is preven...
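The strictness property this abstract analyses can be illustrated with a minimal Haskell sketch (the function names here are hypothetical, not from the paper): a function f is strict when f ⊥ = ⊥, and only for strict functions may a compiler evaluate the argument before the call without changing termination behaviour.

```haskell
-- succInt is strict: its result cannot be computed without its argument,
-- so a compiler may evaluate the argument to WHNF and pass it by value.
succInt :: Int -> Int
succInt x = x + 1

-- constFive ignores its argument, hence is not strict: evaluating the
-- argument eagerly could turn a terminating call into a diverging one.
constFive :: Int -> Int
constFive _ = 5
```

Applied to a diverging argument, `constFive undefined` still yields 5, while `succInt undefined` diverges; this is exactly the distinction a strictness analyser must establish statically.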
Compilation of Functional Languages Using Flow Graph Analysis
, 1994
Abstract
Cited by 19 (13 self)
[Figure captions: syntax, and syntactic and semantic domains of a flow graph; Figure 9, semantic equations Def and Exp of a flow graph.] The first argument to the functions Def and Exp specifies a set of nodes that represent a flow graph, from which the element(s) of current interest are selected by pattern matching.
The Impact of seq on Free Theorems-Based Program Transformations
 Fundamenta Informaticae
, 2006
Abstract
Cited by 14 (5 self)
Parametric polymorphism constrains the behavior of pure functional programs in a way that allows the derivation of interesting theorems about them solely from their types, i.e., virtually for free. Unfortunately, standard parametricity results — including so-called free theorems — fail for non-strict languages supporting a polymorphic strict evaluation primitive such as Haskell’s seq. A folk theorem maintains that such results hold for a subset of Haskell corresponding to a Girard-Reynolds calculus with fixpoints and algebraic datatypes even when seq is present, provided the relations which appear in their derivations are required to be bottom-reflecting and admissible. In this paper we show that this folklore is incorrect, but that parametricity results can be recovered in the presence of seq by restricting attention to left-closed, total, and admissible relations instead. The key novelty of our approach is the asymmetry introduced by left-closedness, which leads to “inequational” versions of standard parametricity results together with preconditions guaranteeing their validity even when seq is present. We use these results to derive criteria ensuring that both equational and inequational versions of short cut fusion and related program transformations based on free theorems hold in the presence of seq.
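Why seq is problematic can be seen in a short Haskell sketch (hypothetical names, not from the paper): seq distinguishes an undefined function from its eta-expansion, a distinction no seq-free context can make, so equations justified by unrestricted parametricity can break.

```haskell
-- An undefined function and its eta-expansion.
bottomFun :: Int -> Int
bottomFun = undefined

etaFun :: Int -> Int
etaFun x = bottomFun x  -- a lambda, hence already in WHNF

-- seq forces its first argument to WHNF before returning the second:
--   bottomFun `seq` 0   diverges
--   etaFun    `seq` 0   returns 0
observe :: (Int -> Int) -> Int
observe f = f `seq` 0
```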
Single Assignment C: Design and Implementation of a C Variant with Special Support for Shape-Invariant Array Operations
, 1996
TEA: Automatically proving termination of programs in a non-strict higher-order functional language
 In Proc. 4th Int. Static Analysis Symp.
, 1997
Abstract
Cited by 12 (0 self)
We present TEA, a tool that is able to detect termination of functions written in a non-strict high-level functional programming language like Haskell. Since almost every compiler for lazy functional languages transforms programs into a functional core language, we use such a core language as the source language for the analysis. TEA is able to detect two kinds of termination: nf-termination and lazy termination. Intuitively, nf-termination of f means that given arguments ai in normal form, the normal order evaluation of the expression f a1 ... an terminates with a normal form. If an expression evaluates to an infinite normal form, we will call it lazy terminating. In order to prove nf-termination of an expression, TEA tries to generate a pre-closed tableau using abstract reduction and path analysis. The pre-closed tableau issues a set of ordering constraints, which have to be satisfied by some Noetherian ordering. A subsystem is called to generate this Noetherian ordering, and to v...
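The two termination notions can be made concrete in Haskell (illustrative definitions, not taken from TEA itself):

```haskell
-- fact is nf-terminating on naturals: for arguments in normal form,
-- normal-order evaluation reaches a normal form.
fact :: Integer -> Integer
fact 0 = 1
fact n = n * fact (n - 1)

-- ones has the infinite normal form 1 : 1 : 1 : ...; it never reaches
-- a finite normal form, but every finite prefix of its normal form is
-- computable in finite time, which is what the abstract calls lazy
-- termination.
ones :: [Integer]
ones = 1 : ones
```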
FUNDIO: A Lambda-Calculus with a letrec, case, Constructors, and an IO-Interface: Approaching a Theory of unsafePerformIO
 Frank report 16, Institut für Informatik, J.W. Goethe-Universität Frankfurt, September 2003
Abstract
Cited by 8 (0 self)
A non-deterministic call-by-need lambda-calculus λndlr with case, constructors, letrec and a (non-deterministic) erratic choice, based on rewriting rules, is investigated. A standard reduction is defined as a variant of leftmost outermost reduction. The semantics is defined by contextual equivalence of expressions instead of using αβ(η)-equivalence. It is shown that several program transformations are correct, for example all (deterministic) rules of the calculus, and in addition the rules for garbage collection, removing indirections and unique copy. This shows that the combination of a context lemma and a meta-rewriting on reductions using complete sets of commuting (forking, resp.) diagrams is a useful and successful method for providing a semantics of a functional programming language and proving correctness of program transformations.
On the Safety of Nöcker’s Strictness Analysis
 Frankfurt am Main, Germany
Abstract
Cited by 8 (7 self)
This paper proves correctness of Nöcker’s method of strictness analysis, implemented for Clean, which is an effective way for strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt, which addresses correctness of the abstract reduction rules. Our method also addresses the cycle detection rules, which are the main strength of Nöcker’s strictness analysis. We reformulate Nöcker’s strictness analysis algorithm in a higher-order lambda-calculus with case, constructors, letrec, and a non-deterministic choice operator ⊕ used as a union operator. Furthermore, the calculus is expressive enough to represent abstract constants like Top or Inf. The operational semantics is a small-step semantics and equality of expressions is defined by a contextual semantics that observes termination of expressions. The correctness of several reductions is proved using a context lemma and complete sets of forking and commuting diagrams. The
CLEAN: a programming environment based on term graph rewriting
 Proc. of Joint COMPUGRAPH/SEMAGRAPH Workshop on Graph Rewriting and Computation (SEGRAGRA’95)
, 1995
Lazy Type Inference for the Strictness Analysis of Lists
, 1994
Abstract
Cited by 4 (1 self)
We present a type inference system for the strictness analysis of lists and we show that it can be used as the basis for an efficient algorithm. The algorithm is as accurate as the usual abstract interpretation technique. One distinctive advantage of this approach is that it is not necessary to impose an abstract domain of a particular depth prior to the analysis: the lazy type algorithm will instead explore the part of a potentially infinite domain required to prove the strictness property. 1 Introduction Simple strictness analysis returns information about the fact that the result of a function application is undefined when some of the arguments are undefined. This information can be used in a compiler for a lazy functional language because the argument of a strict function can be evaluated (up to weak head normal form) and passed by value. However a more sophisticated property might be useful in the presence of lists or other recursive data structures which are pervasive in functio...
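The list properties that go beyond simple strictness can be illustrated in Haskell (an example invented here, not taken from the paper): length forces the spine of its argument but none of its elements, whereas sum forces both, and a fixed-depth abstract domain chosen before the analysis may be unable to separate such behaviours on deeper structures.

```haskell
-- length is spine-strict but element-lazy: the undefined elements
-- are never evaluated, only the list structure is.
spineDemo :: Int
spineDemo = length [undefined :: Int, undefined]

-- sum is both spine- and element-strict: every element is forced,
-- so passing an undefined element would diverge.
elementDemo :: Int
elementDemo = sum [1, 2, 3]
```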
Higher Order Demand Propagation
 Lecture Notes in Computer Science
, 1998
Abstract
Cited by 4 (0 self)
A new denotational semantics is introduced for realistic non-strict functional languages, which have a polymorphic type system and support higher-order functions and user-definable algebraic data types. It maps each function definition to a demand propagator, which is a higher-order function that propagates context demands to function arguments. The relation of this "higher order demand propagation semantics" to the standard semantics is explained, and it is used to define a backward strictness analysis. The strictness information deduced by this analysis is very accurate, because demands can actually be constructed during the analysis. These demands conform better to the analysed functions than abstract values, which are constructed solely with respect to types as in other existing strictness analyses. The richness of the semantic domains of higher order demand propagation makes it possible to express generalised strictness information for higher order functions even ac...
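The idea of a demand propagator can be sketched as a toy Haskell fragment (a simplification invented for illustration, not the paper's actual semantics; all names are hypothetical): each function is mapped to a function that propagates the demand placed on its result backward to demands on its arguments.

```haskell
-- A two-point demand domain: L = not needed, S = needed at least to WHNF.
data Demand = L | S deriving (Eq, Show)

-- Propagator for (+): if the result is demanded, so are both arguments.
plusProp :: Demand -> (Demand, Demand)
plusProp S = (S, S)
plusProp L = (L, L)

-- Propagator for const: the second argument is never demanded, which a
-- backward analysis reads as: const is strict only in its first argument.
constProp :: Demand -> (Demand, Demand)
constProp d = (d, L)
```

A real analysis would work over richer demand lattices (e.g. head- versus spine-demands on lists) and compose propagators along the call structure of the program.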