Results 1–10 of 14
Deriving algorithms from type inference systems: Application to strictness analysis
, 1994
"... The role of nonstandard type inference in static program analysis has been much studied recently. Early work emphasised the efficiency of type inference algorithms and paid little attention to the correctness of the inference system. Recently more powerful inference systems have been investigated b ..."
Abstract

Cited by 26 (8 self)
The role of nonstandard type inference in static program analysis has been much studied recently. Early work emphasised the efficiency of type inference algorithms and paid little attention to the correctness of the inference system. Recently, more powerful inference systems have been investigated, but the connection with efficient inference algorithms has been obscured. The contribution of this paper is twofold: first, we show how to transform a program logic into an algorithm and, second, we introduce the notion of lazy types and show how to derive an efficient algorithm for strictness analysis.

1 Introduction

Two major formal frameworks have been proposed for the static analysis of functional languages: abstract interpretation and type inference. A lot of work has been done to formally characterise the correctness and the power of abstract interpretation. However, the development of algorithms has not kept pace with the theoretical developments. This is now a major barrier that is preven...
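The strictness idea underlying this line of work can be sketched concretely. The following is a toy illustration only, not the paper's inference system, and every name in it is invented: strictness analysis as abstract interpretation over the two-point domain, where Bot means "definitely undefined" and Top means "possibly defined".

```haskell
-- Toy sketch (not the paper's algorithm): the two-point abstract domain.
data Abs = Bot | Top deriving (Eq, Show)

-- Least upper bound and greatest lower bound on the two-point domain.
lub, glb :: Abs -> Abs -> Abs
lub Bot x = x
lub Top _ = Top
glb Top x = x
glb Bot _ = Bot

-- f is strict in argument i iff feeding Bot there (and Top everywhere
-- else) forces the abstract result down to Bot.
strictIn :: ([Abs] -> Abs) -> Int -> Int -> Bool
strictIn f arity i =
  f [ if j == i then Bot else Top | j <- [0 .. arity - 1] ] == Bot

-- Abstract semantics of \x y -> x + y: defined only if both args are.
absPlus :: [Abs] -> Abs
absPlus [x, y] = glb x y
absPlus _      = Top

-- Abstract semantics of \p x y -> if p then x else y: needs p, and
-- (conservatively) the join of the two branches.
absIf :: [Abs] -> Abs
absIf [p, x, y] = glb p (lub x y)
absIf _         = Top

main :: IO ()
main = do
  print (map (strictIn absPlus 2) [0, 1])   -- strict in both arguments
  print (map (strictIn absIf 3) [0, 1, 2])  -- strict only in the condition
```

Running the checks reports that addition is strict in both arguments, while a conditional is strict only in its condition, since either branch alone may go unused.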
Proving the Correctness of Compiler Optimisations Based on a Global Analysis: A Study of Strictness Analysis
, 1992
"... A substantial amount of work has been devoted to the proof of correctness of various program analyses but much less attention has been paid to the correctness of compiler optimisations based on these analyses. In this paper we tackle the problem in the context of strictness analysis for lazy functio ..."
Abstract

Cited by 17 (3 self)
A substantial amount of work has been devoted to the proof of correctness of various program analyses, but much less attention has been paid to the correctness of compiler optimisations based on these analyses. In this paper we tackle the problem in the context of strictness analysis for lazy functional languages. We show that compiler optimisations based on strictness analysis can be expressed formally in the functional framework using continuations. This formal presentation has two benefits: it allows us to give a rigorous correctness proof of the optimised compiler; and it exposes the various optimisations made possible by a strictness analysis.
Real-time Signal Processing: Dataflow, Visual, and Functional Programming
, 1995
"... This thesis presents and justifies a framework for programming realtime signal processing systems. The framework extends the existing "blockdiagram" programming model; it has three components: a very highlevel textual language, a visual language, and the dataflow process network model o ..."
Abstract

Cited by 14 (1 self)
This thesis presents and justifies a framework for programming real-time signal processing systems. The framework extends the existing "block-diagram" programming model; it has three components: a very high-level textual language, a visual language, and the dataflow process network model of computation. The dataflow process network model, although widely used, lacks a formal description, and I provide a semantics for it. The formal work leads into a new form of actor. Having established the semantics of dataflow processes, the functional language Haskell is layered above this model, providing powerful features (notably polymorphism, higher-order functions, and algebraic program transformation) absent in block-diagram systems. A visual equivalent notation for Haskell, Visual Haskell, ensures that this power does not exclude the "intuitive" appeal of visual interfaces; with some intelligent layout and suggestive icons, a Visual Haskell program can be made to look very like a block dia...
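The "Haskell layered above dataflow" idea can be rendered in miniature. This is my own toy encoding, not the thesis's formal model: channels become lazy lists, actors become stream functions, and feedback loops come out as ordinary recursive definitions.

```haskell
-- Toy dataflow-in-Haskell sketch (encoding is mine, not the thesis's).
-- A channel carrying tokens of type a is just a lazy list.
type Stream a = [a]

-- A two-input adder actor: consumes one token from each input per firing.
add2 :: Num a => Stream a -> Stream a -> Stream a
add2 = zipWith (+)

-- A unit-delay actor with initial token i (one token of initial state).
delay :: a -> Stream a -> Stream a
delay i s = i : s

-- A feedback network computing running sums: the output is fed back
-- through a delay and added to the input. Laziness resolves the cycle.
runningSum :: Num a => Stream a -> Stream a
runningSum xs = out
  where out = add2 xs (delay 0 out)

main :: IO ()
main = print (take 5 (runningSum [1 ..]))  -- [1,3,6,10,15]
```

Because the lists are lazy, the feedback loop needs no explicit scheduler: each output token demands exactly the input and state tokens it depends on.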
A Type-based Framework for Program Analysis
, 1994
"... . In this paper we present a general framework for typebased analyses of functional programs. Our framework is a generalisation of our earlier work on strictness analysis and was inspired by Burn's logical framework. The framework is parameterised by a set of types to represent properties and ..."
Abstract

Cited by 10 (2 self)
In this paper we present a general framework for type-based analyses of functional programs. Our framework is a generalisation of our earlier work on strictness analysis and was inspired by Burn's logical framework. The framework is parameterised by a set of types to represent properties and interpretations for constants in the language. To construct a new analysis, the user needs only to supply a model for the types (which properties they denote) and sound rules for the constants. We identify the local properties that must be proven to guarantee the correctness of a specific analysis and algorithm. We illustrate the approach by recasting Hunt and Sands' binding time analysis in our framework. Furthermore we report on experimental results suggesting that our generic inference algorithm can provide the basis for an efficient program analyser.

1 Introduction

The first explicit use of types in program analysis was by Kuo and Mishra [14]. They presented a type system for inferring stric...
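The "parameterised framework" idea can be hinted at with a generic skeleton. The names and structure below are mine, not the paper's: an analysis is instantiated by choosing a finite lattice of properties, and a shared fixed-point engine does the rest.

```haskell
-- Hedged sketch of a framework parameterised by a property lattice
-- (names are invented, not the paper's).
class Eq a => Lattice a where
  bot :: a
  lub :: a -> a -> a

-- Kleene iteration: least fixed point of a monotone function on a
-- finite lattice, shared by every instantiation of the framework.
lfp :: Lattice a => (a -> a) -> a
lfp f = go bot
  where go x = let x' = f x in if x' == x then x else go x'

-- One instantiation: strictness properties as the two-point lattice.
data Str = Undef | Def deriving (Eq, Show)

instance Lattice Str where
  bot = Undef
  lub Undef x = x
  lub Def   _ = Def

main :: IO ()
main = do
  print (lfp id :: Str)                    -- identity stays at bottom
  print (lfp (\x -> lub x Def) :: Str)     -- iteration climbs to Def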
Lazy Type Inference for the Strictness Analysis of Lists
, 1994
"... We present a type inference system for the strictness analysis of lists and we show that it can be used as the basis for an efficient algorithm. The algorithm is as accurate as the usual abstract interpretation technique. One distinctive advantage of this approach is that it is not necessary to impo ..."
Abstract

Cited by 4 (1 self)
We present a type inference system for the strictness analysis of lists and we show that it can be used as the basis for an efficient algorithm. The algorithm is as accurate as the usual abstract interpretation technique. One distinctive advantage of this approach is that it is not necessary to impose an abstract domain of a particular depth prior to the analysis: the lazy type algorithm will instead explore the part of a potentially infinite domain required to prove the strictness property.

1 Introduction

Simple strictness analysis returns information about the fact that the result of a function application is undefined when some of the arguments are undefined. This information can be used in a compiler for a lazy functional language because the argument of a strict function can be evaluated (up to weak head normal form) and passed by value. However, a more sophisticated property might be useful in the presence of lists or other recursive data structures, which are pervasive in functio...
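The compilation point in the introduction above can be made explicit in a few lines. This is a toy illustration (GHC Haskell assumed), not the paper's construction: if a function is known strict, its argument may safely be evaluated to weak head normal form (WHNF) before the call, which `seq` expresses directly.

```haskell
-- Toy illustration of the strict-argument optimisation: `seq` forces
-- the argument to WHNF before the call, i.e. pass-by-value.
callByValue :: (a -> b) -> a -> b
callByValue f x = x `seq` f x

main :: IO ()
main = do
  -- Safe for a strict function: same result as the lazy call.
  print (callByValue (+ 1) (2 + 3 :: Int))
  -- For a non-strict function the transformation would be wrong:
  -- const 1 undefined is 1 under lazy evaluation, but forcing the
  -- argument first would diverge. So we only call it lazily here.
  print (const (1 :: Int) (undefined :: Int))
```

The soundness condition is exactly what the analysis certifies: `callByValue f` and `f` agree whenever `f` is strict, because then `f undefined` is already undefined.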
Proving the Correctness of Compiler Optimisations Based on Strictness Analysis
In Proceedings of the 5th Int. Symp. on Programming Language Implementation and Logic Programming, LNCS 714
, 1993
"... . We show that compiler optimisations based on strictness analysis can be expressed formally in the functional framework using continuations. This formal presentation has two benefits: it allows us to give a rigorous correctness proof of the optimised compiler; and it exposes the various optimisatio ..."
Abstract

Cited by 4 (2 self)
We show that compiler optimisations based on strictness analysis can be expressed formally in the functional framework using continuations. This formal presentation has two benefits: it allows us to give a rigorous correctness proof of the optimised compiler; and it exposes the various optimisations made possible by a strictness analysis.

1 Introduction

Realistic compilers for imperative or functional languages include a number of optimisations based on non-trivial global analyses. Proving the correctness of such optimising compilers can be done in three steps:

1. proving the correctness of the original (unoptimised) compiler;
2. proving the correctness of the analysis; and
3. proving the correctness of the modifications of the simple-minded compiler to exploit the results of the analysis.

A substantial amount of work has been devoted to steps (1) and (2) but there have been surprisingly few attempts at tackling step (3). In this paper we show how to carry out this third step in the...
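The continuation-based phrasing of the optimisation can be sketched in miniature. This is a toy rendering, not the paper's construction, and every name in it is invented: in CPS, a call-by-name application passes the suspended argument computation, while the variant for a function proven strict runs the argument's continuation first.

```haskell
-- Toy CPS sketch (not the paper's construction) of the optimisation.
type Cont r a = (a -> r) -> r

-- Call-by-name application: hand the unevaluated computation over.
appN :: (Cont r a -> Cont r b) -> Cont r a -> Cont r b
appN f arg = f arg

-- Optimised application for a function known strict: evaluate the
-- argument first, then pass the resulting value, already computed.
appS :: (Cont r a -> Cont r b) -> Cont r a -> Cont r b
appS f arg k = arg (\v -> f (\k' -> k' v) k)

-- A CPS-level doubling function to apply either way.
double :: Cont r Int -> Cont r Int
double xc k = xc (\x -> k (x + x))

main :: IO ()
main = do
  print (appN double (\k -> k 21) id :: Int)  -- 42
  print (appS double (\k -> k 21) id :: Int)  -- 42
```

Strictness of `double` is what makes `appN` and `appS` agree; the correctness proof the abstract describes establishes exactly such agreements for the compiler's optimised code.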
Projection-based Program Analysis
, 1994
"... Projectionbased program analysis techniques are remarkable for their ability togive highly detailed and useful information not obtainable by other methods. The rst proposed projectionbased analysis techniques were those of Wadler and Hughes for strictness analysis, and Launchbury for bindingtime ..."
Abstract

Cited by 3 (0 self)
Projection-based program analysis techniques are remarkable for their ability to give highly detailed and useful information not obtainable by other methods. The first proposed projection-based analysis techniques were those of Wadler and Hughes for strictness analysis, and Launchbury for binding-time analysis; both techniques are restricted to the analysis of first-order monomorphic languages. Hughes and Launchbury generalised the strictness analysis technique, and Launchbury the binding-time analysis technique, to handle polymorphic languages, again restricted to first order. Other than a general approach to higher-order analysis suggested by Hughes, and an ad hoc implementation of higher-order binding-time analysis by Mogensen, neither of which had any formal notion of correctness, there has been no successful generalisation to higher-order analysis. We present a complete redevelopment of monomorphic projection-based program analysis from first principles, starting by considering the analysis of functions (rather than programs) to establish bounds on the intrinsic power of projection-based analysis, showing also that projection-based analysis can capture interesting termination ...
Fast Strictness Analysis Based on Demand Propagation
 ACM Transactions on Programming Languages and Systems
, 1995
"... Interpretation versus Demand Propagation Wadler [1987] uses abstract interpretation over a fourpoint domain for reasoning about strictness on lists. The four points correspond to undefined list (represented by value 0), infinite lists and lists with some tail undefined (value 1), lists with at lea ..."
Abstract

Cited by 2 (0 self)
Interpretation versus Demand Propagation. Wadler [1987] uses abstract interpretation over a four-point domain for reasoning about strictness on lists. The four points correspond to the undefined list (represented by value 0), infinite lists and lists with some tail undefined (value 1), lists with at least one head undefined (value 2), and all lists (value 3). Burn's work [Burn 1987] on evaluation transformers also uses abstract interpretation on the above-mentioned domain for strictness analysis. He defines four evaluators that correspond to the four points mentioned above, in the sense that the i-th evaluator will fail to produce an answer when given a list with the abstract value i − 1.
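The four-point domain described above admits a small executable rendering. Treat the following as a sketch: the encodings are mine, and the abstract cons follows Wadler's definition as commonly presented (consing onto an undefined or infinite tail stays tail-undefined; an undefined head pulls "all lists" down to "some head undefined").

```haskell
-- Sketch of Wadler's four-point list domain (encodings are mine).
-- Elements live in the flat two-point domain; lists in points 0..3:
--   0 = undefined list, 1 = infinite or tail-undefined,
--   2 = some head undefined, 3 = all lists.
data Elem = BotE | TopE deriving (Eq, Show)
newtype L4 = L4 Int deriving (Eq, Show)

-- Abstract cons over the two domains.
consA :: Elem -> L4 -> L4
consA _    (L4 0) = L4 1   -- tail undefined: list is tail-undefined
consA _    (L4 1) = L4 1   -- tail infinite/tail-undefined: unchanged
consA _    (L4 2) = L4 2   -- some head already undefined: unchanged
consA BotE (L4 3) = L4 2   -- undefined head spoils a fully defined list
consA TopE (L4 3) = L4 3   -- defined head on defined list: still top
consA _    l      = l      -- values outside 0..3 are not used

main :: IO ()
main = do
  print (consA TopE (L4 0))  -- L4 1
  print (consA BotE (L4 3))  -- L4 2
```

Burn's four evaluators then line up with these points: the i-th evaluator demands enough of the list that it diverges exactly on inputs whose abstraction is at most i − 1.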
CPS-Translation and the Correctness of Optimising Compilers
, 1992
"... We show that compiler optimisations based on strictness analysis can be expressed formally in the functional framework using continuations. This formal presentation has two benefits: it allows us to give a rigorous correctness proof of the optimised compiler; and it exposes the various optimisations ..."
Abstract

Cited by 1 (0 self)
We show that compiler optimisations based on strictness analysis can be expressed formally in the functional framework using continuations. This formal presentation has two benefits: it allows us to give a rigorous correctness proof of the optimised compiler; and it exposes the various optimisations made possible by a strictness analysis. These benefits are especially significant in the presence of partially evaluated data structures.

1 Introduction

Realistic compilers for imperative or functional languages include a number of optimisations based on non-trivial global analyses. Proving the correctness of such optimising compilers should involve three steps:

1. proving the correctness of the original (unoptimised) compiler;
2. proving the correctness of the analysis; and
3. proving the correctness of the modifications of the simple-minded compiler to exploit the results of the analysis.

A substantial amount of work has been devoted to steps (1) and (2) but there has been surprisingly ...