Results 1–10 of 40
Projections for Strictness Analysis
, 1987
Cited by 98 (4 self)
Contexts have been proposed as a means of performing strictness analysis on non-flat domains. Roughly speaking, a context describes how much a subexpression will be evaluated by the surrounding program. This paper shows how contexts can be represented using the notion of projection from domain theory. This is clearer than the previous explanation of contexts in terms of continuations. In addition, this paper describes finite domains of contexts over the non-flat list domain. This means that recursive context equations can be solved using standard fixpoint techniques, instead of the algebraic manipulation previously used. Praises of lazy functional languages have been widely sung, and so have some curses. One reason for praise is that laziness supports programming styles that are inconvenient or impossible otherwise [Joh87, Hug84, Wad85a]. One reason for cursing is that laziness hinders efficient implementation. Still, acceptable efficiency for lazy languages is at last being achieved...
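The abstract's closing point, that recursive context equations can be solved by standard fixpoint techniques, can be illustrated with a minimal sketch. This is not Wadler's projection machinery; it is a cruder two-point (Mycroft-style) strictness abstraction solved by Kleene iteration, and the function analysed and all names are our own:

```python
from itertools import product

def lfp(step, bottom):
    """Kleene iteration: least fixpoint of a monotone step on a finite lattice."""
    cur = bottom
    while (nxt := step(cur)) != cur:
        cur = nxt
    return cur

# Strictness analysis of   f x y = if x == 0 then y else f (x - 1) (y + 1)
# over the two-point domain: 0 = definitely bottom, 1 = possibly defined.
# Abstractly, the test forces x and the result is either y or the recursive call:
#   f#(x, y) = x & (y | f#(x, y))
def step(tab):
    return {(x, y): x & (y | tab[(x, y)]) for x, y in product((0, 1), repeat=2)}

bottom = {xy: 0 for xy in product((0, 1), repeat=2)}
f_sharp = lfp(step, bottom)

strict_in_x = f_sharp[(0, 1)] == 0   # bottom in => bottom out
strict_in_y = f_sharp[(1, 0)] == 0
```

The least fixpoint of the recursive equation shows f is strict in both arguments; the same iteration scheme applies to finite domains of contexts.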
Context-Sensitive Computations in Functional and Functional Logic Programs
 JOURNAL OF FUNCTIONAL AND LOGIC PROGRAMMING
, 1998
Analysis and Efficient Implementation of Functional Programs
, 1991
Cited by 49 (1 self)
... machines and implementations 7
1.4 Optimized implementation 9
1.5 Plan of the report 10
2 Semantics-Based Program Analysis 12
2.1 Semantics and program analysis 12
2.2 Abstract interpretation 12
2.3 Semantic analysis information 13
2.4 Instrumented semantics 14
2.5 Correctness of analyses 15
3 A Lazy Example Language 19
3.1 The lazy language L 19
3.2 The call-by-name Krivine machine 21
3.3 Adding data structures ...
Context-Sensitive Rewriting Strategies
, 1997
Cited by 43 (30 self)
Context-sensitive rewriting is a simple restriction of rewriting which is formalized by imposing fixed restrictions on replacements. Such a restriction is given on a purely syntactic basis: it is (explicitly or automatically) specified on the arguments of symbols of the signature and inductively extended to arbitrary positions of terms built from those symbols. Termination is not only preserved but usually improved, and several methods have been developed to formally prove it. In this paper, we investigate the definition, properties, and use of context-sensitive rewriting strategies, i.e., particular, fixed sequences of context-sensitive rewriting steps. We study how to define them in order to obtain efficient computations and to ensure that context-sensitive computations terminate whenever possible. We give conditions enabling the use of these strategies for root-normalization, normalization, and infinitary normalization. We show that this theory is suitable for formalizing ...
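A hedged sketch of the idea: a replacement map mu assigns to each symbol the argument positions where rewriting is allowed, and a step may occur only at the root or inside a mu-allowed argument. The term encoding and example rule below are our own toy, not the paper's formalism:

```python
# Terms are (symbol, [args]); bare strings are pattern variables.
def match(pat, term, s):
    if isinstance(pat, str):                      # pattern variable
        if pat in s:
            return s if s[pat] == term else None
        return {**s, pat: term}
    if isinstance(term, str) or pat[0] != term[0] or len(pat[1]) != len(term[1]):
        return None
    for p, t in zip(pat[1], term[1]):
        s = match(p, t, s)
        if s is None:
            return None
    return s

def subst(t, s):
    return s[t] if isinstance(t, str) else (t[0], [subst(a, s) for a in t[1]])

def step(term, rules, mu):
    """One context-sensitive step: try the root, then only mu-allowed arguments."""
    sym, args = term
    for lhs, rhs in rules:
        s = match(lhs, term, {})
        if s is not None:
            return subst(rhs, s)
    for i in mu.get(sym, range(len(args))):       # default: all positions active
        new = step(args[i], rules, mu)
        if new is not None:
            return (sym, args[:i] + [new] + args[i + 1:])
    return None

def normalize(term, rules, mu, limit=100):
    for _ in range(limit):
        new = step(term, rules, mu)
        if new is None:
            return term                           # a mu-normal form
        term = new
    raise RuntimeError("step limit reached (rewriting may diverge)")

# from(X) -> cons(X, from(s(X))): unrestricted rewriting diverges, but freezing
# cons's second argument (mu(cons) = {1st position}) makes computation terminate.
RULES = [(('from', ['X']), ('cons', ['X', ('from', [('s', ['X'])])]))]
MU = {'cons': [0]}
```

With MU as given, `normalize` stops after one step with the frozen tail unexpanded, which is exactly the termination-improving effect the abstract describes.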
Abstract Interpretation of Functional Languages: From Theory to Practice
, 1991
Cited by 25 (0 self)
Abstract interpretation is the name applied to a number of techniques for reasoning about programs by evaluating them over nonstandard domains whose elements denote properties over the standard domains. This thesis is concerned with higher-order functional languages and abstract interpretations with a formal semantic basis. It is known how abstract interpretation for the simply typed lambda calculus can be formalised by using binary logical relations. This has the advantage of making correctness and other semantic concerns straightforward to reason about. Its main disadvantage is that it enforces the identification of properties as sets. This thesis shows how the known formalism can be generalised by the use of ternary logical relations, and in particular how this allows abstract values to deno...
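The opening sentence, abstract values denoting properties over the standard domains, can be made concrete with a first-order sketch that is deliberately far simpler than the thesis's logical-relations treatment: sign analysis, where an abstract value is a set of sign tags and soundness can be checked exhaustively on samples. The encoding is ours:

```python
def sign(n):
    return '-' if n < 0 else ('0' if n == 0 else '+')

def mul_sign(a, b):                  # sign of a product, tag by tag
    if a == '0' or b == '0':
        return '0'
    return '+' if a == b else '-'

# An abstract value is a *set* of sign tags, denoting the property
# "the result's sign is one of these". Abstract multiplication is pointwise.
def amul(A, B):
    return frozenset(mul_sign(a, b) for a in A for b in B)
```

Soundness here is the usual abstract-interpretation condition: for all concrete x, y, the sign of x * y lies in amul of their abstractions.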
Annotated Type Systems for Program Analysis
, 1995
Cited by 20 (2 self)
Table 1.2: Annotations in the Thesis. In Chapter 2 we present a combined strictness and totality analysis. We specify the analysis as an annotated type system. The type system allows conjunctions of annotated types, but only at the top-level. The analysis is somewhat more powerful than the strictness analysis by Kuo and Mishra [KM89] due to the conjunctions and in that we also consider totality. The analysis is shown sound with respect to a natural-style operational semantics. The analysis is not immediately extendable to full conjunction. The analysis of Chapter 3 is also a combined strictness and totality analysis, however with "full" conjunction. Soundness of the analysis is shown with respect to a denotational semantics. The analysis is more powerful than the strictness analyses by Jensen [Jen92a] and Benton [Ben93] in that it considers totality in addition to strictness. So far we have only specified the analyses; for the analyses to be practically useful we need an algorithm for inferring the annotated types. In Chapter 4 we construct an algorithm for the analysis of Chapter 2, where conjunctions are only allowed at the "top-level", using the lazy type approach by Hankin and Le Metayer [HM94a]. The reason for choosing the analysis from Chapter 2 is that the approach is not applicable to the analysis from Chapter 3. In Chapter 5 we study a binding-time analysis. We take the analysis specified by Nielson and Nielson [NN92] and construct a more efficient algorithm than the one proposed in [NN92]. The algorithm collects constraints in a structural manner, as does the algorithm T [Dam85]. Afterwards the minimal solution to the set of constraints is found. The analysis in Chapter 6 is specified by abstract interp...
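The Chapter 5 strategy described above, collect constraints structurally and then take the minimal solution, can be sketched for a two-point binding-time domain with S (static) below D (dynamic). The constraint forms and variable names below are hypothetical, not those of [NN92]:

```python
# Constraints: ('dyn', x) forces bt(x) = D; ('le', x, y) means bt(x) <= bt(y),
# i.e. if x turns out dynamic then y must be dynamic too.
def minimal_solution(variables, constraints):
    bt = {v: 'S' for v in variables}      # start at the least element: all static
    changed = True
    while changed:                        # propagate D until stable
        changed = False
        for c in constraints:
            if c[0] == 'dyn' and bt[c[1]] != 'D':
                bt[c[1]] = 'D'
                changed = True
            elif c[0] == 'le' and bt[c[1]] == 'D' and bt[c[2]] != 'D':
                bt[c[2]] = 'D'
                changed = True
    return bt
```

Starting from the bottom of the lattice and propagating upward until stable is what makes the solution minimal: nothing is marked dynamic unless some constraint forces it.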
Compilation of Functional Languages Using Flow Graph Analysis
, 1994
Cited by 17 (13 self)
... syntax, and syntactic and semantic domains of a flow graph. Figure 9. Semantic equations Def and Exp of a flow graph.
The first argument to the functions Def and Exp specifies a set of nodes that represent a flow graph, from which the element(s) of current interest are selected by pattern matching.
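A rough sketch of that selection-by-pattern-matching style, with an encoding that is entirely ours (Def and Exp are borrowed only as names from the excerpt): nodes are tuples, and each function picks the node of interest out of the set by matching on its shape:

```python
def Def(nodes, var):
    """Select, by matching on node shape, the node that defines `var`."""
    for n in nodes:
        if n[0] == 'def' and n[1] == var:
            return n
    return None

def Exp(nodes, var):
    """Evaluate the expression defining `var`, chasing uses through the graph."""
    _tag, _name, rhs = Def(nodes, var)
    if isinstance(rhs, int):
        return rhs
    _op, a, b = rhs                   # only '+' exists in this toy language
    return Exp(nodes, a) + Exp(nodes, b)

NODES = {('def', 'x', 1), ('def', 'y', 2), ('def', 'z', ('+', 'x', 'y'))}
```

The point of passing the whole node set as the first argument is that any node can be reached from any semantic equation, with pattern matching doing the selection.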
Proving the Correctness of Compiler Optimisations Based on a Global Analysis: A Study of Strictness Analysis
, 1992
Cited by 15 (3 self)
A substantial amount of work has been devoted to the proof of correctness of various program analyses, but much less attention has been paid to the correctness of compiler optimisations based on these analyses. In this paper we tackle the problem in the context of strictness analysis for lazy functional languages. We show that compiler optimisations based on strictness analysis can be expressed formally in the functional framework using continuations. This formal presentation has two benefits: it allows us to give a rigorous correctness proof of the optimised compiler; and it exposes the various optimisations made possible by a strictness analysis.
1 Introduction
Realistic compilers for imperative or functional languages include a number of optimisations based on non-trivial global analyses. Proving the correctness of such optimising compilers can be done in three steps:
1. proving the correctness of the original (unoptimised) compiler; ...
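One optimisation the analysis enables can be sketched operationally (this is not the paper's continuation-based formalisation): when a function is known to be strict, its argument may be evaluated eagerly instead of being suspended in a thunk, without changing the answer. The STRICT table below stands in for a hypothetical analysis result:

```python
class Thunk:
    """A memoised suspension, as in a lazy (call-by-need) implementation."""
    def __init__(self, fn):
        self.fn, self.done, self.val = fn, False, None
    def force(self):
        if not self.done:
            self.val, self.done = self.fn(), True
        return self.val

def inc(x):                 # strict: always forces its argument
    return x.force() + 1

def const7(_x):             # non-strict: never forces its argument
    return 7

STRICT = {'inc': True, 'const7': False}   # stand-in for a strictness analysis

def apply_opt(name, f, arg_fn):
    if STRICT[name]:
        v = arg_fn()                      # safe to evaluate eagerly: f is strict
        return f(Thunk(lambda: v))
    return f(Thunk(arg_fn))               # otherwise keep the call lazy
```

The correctness obligation the paper proves is exactly that `apply_opt` agrees with the fully lazy call whenever the analysis marks a function strict.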
Strictness and Totality Analysis
 In Static Analysis, LNCS 864
, 1994
Cited by 12 (1 self)
We define a novel inference system for strictness and totality analysis for the simply-typed lazy lambda-calculus with constants and fixpoints. Strictness information identifies those terms that definitely denote bottom (i.e. do not evaluate to WHNF), whereas totality information identifies those terms that definitely do not denote bottom (i.e. do evaluate to WHNF). The analysis is presented as an annotated type system allowing conjunctions only at "top-level". We give examples of its use and prove the correctness with respect to a natural-style operational semantics.
1 Introduction
Strictness analysis has proved useful in the implementation of lazy functional languages such as Miranda, Lazy ML and Haskell: when a function is strict, it is safe to evaluate its argument before performing the function call. Totality analysis is equally useful but has not been adopted so widely: if the argument to a function is known to terminate, then it is safe to evaluate it before performing the function call [1...
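The two kinds of annotation can be mimicked by a toy three-valued analysis over a tiny expression language: 'b' (definitely bottom, i.e. never reaches WHNF), 't' (definitely reaches WHNF), 'n' (no information). The rules below are loosely inspired by, and much cruder than, the paper's inference system:

```python
def join(a, b):             # least upper bound: disagreement means no information
    return a if a == b else 'n'

def analyse(e):
    tag = e[0]
    if tag == 'lit':        # a constant is already in WHNF
        return 't'
    if tag == 'bot':        # a diverging term definitely denotes bottom
        return 'b'
    if tag == 'add':        # strict primitive: bottom in either argument is fatal
        a1, a2 = analyse(e[1]), analyse(e[2])
        if 'b' in (a1, a2):
            return 'b'
        return 't' if (a1, a2) == ('t', 't') else 'n'
    if tag == 'if':
        c = analyse(e[1])
        if c == 'b':        # forcing a bottom condition diverges
            return 'b'
        r = join(analyse(e[2]), analyse(e[3]))
        return r if c == 't' else ('b' if r == 'b' else 'n')
    raise ValueError(tag)
```

Note how 'b' and 't' information interact at the conditional: a total condition lets the branches' joined annotation flow through, while an unknown condition degrades everything except definite bottom.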
The Evaluation Transformer Model of Reduction and Its Correctness
 in TAPSOFT 91
, 1991
Cited by 11 (4 self)
Lazy evaluation of functional programs incurs time and memory overheads, and restricts parallelism compared with programs that are evaluated strictly. A number of analysis techniques, such as abstract interpretation and projection analysis, have been developed to find out information that can alleviate these overheads. This paper formalises an evaluation model, the evaluation transformer model of reduction, which can use information from these analysis techniques, and proves that the resulting reduction strategies produce the same answers as those obtained using lazy evaluation.
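The evaluation transformer idea, evaluating an argument by a given amount ranging from not at all up to full evaluation, can be sketched on thunked lists. The amounts 0-3 echo the usual E0-E3 list evaluators from this line of work, but the implementation is our own illustration:

```python
class Thunk:
    def __init__(self, fn):
        self.fn, self.done, self.val = fn, False, None
    def force(self):
        if not self.done:
            self.val, self.done = self.fn(), True
        return self.val

def evaluate(xi, t):
    """Evaluate thunk t by amount xi: 0 = not at all, 1 = to WHNF,
    2 = the whole spine, 3 = spine and elements."""
    if xi == 0:
        return t
    cell = t.force()                      # WHNF: ('nil',) or ('cons', head, tail)
    if cell[0] == 'cons':
        if xi == 3:
            cell[1].force()               # force the element too
        if xi >= 2:
            evaluate(xi, cell[2])         # recurse down the spine
    return t

forced = []                               # records which elements get forced

def from_list(xs):
    t = Thunk(lambda: ('nil',))
    for x in reversed(xs):
        head = Thunk(lambda x=x: (forced.append(x), x)[1])
        t = Thunk(lambda h=head, t=t: ('cons', h, t))
    return t
```

A length-like consumer would demand amount 2 of its argument (spine only) and a sum-like consumer amount 3; the correctness claim of the paper is that driving evaluation by such demands yields the same answers as plain lazy evaluation.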