Results 1–3 of 3
Oostrom, Uniform normalisation beyond orthogonality
 Proceedings of the Twelfth International Conference on Rewriting Techniques and Applications (RTA ’01), Lecture Notes in Computer Science, 2001
Abstract

Cited by 4 (0 self)
Abstract. A rewrite system is called uniformly normalising if all its steps are perpetual, i.e. are such that if s → t and s has an infinite reduction, then t has one too. For such systems termination (SN) is equivalent to normalisation (WN). A well-known fact is uniform normalisation of orthogonal non-erasing term rewrite systems, e.g. the λI-calculus. In the present paper both restrictions are analysed. Orthogonality is seen to pertain to the linear part and non-erasingness to the non-linear part of rewrite steps. Based on this analysis, a modular proof method for uniform normalisation is presented which allows one to go beyond orthogonality. The method is shown applicable to biclosed first- and second-order term rewrite systems as well as to a λ-calculus with explicit substitutions.
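The non-erasingness restriction in the abstract's "well-known fact" is essential, and a minimal sketch can show why. The following Python snippet models a hypothetical two-rule TRS (not taken from the paper): the rules loop → loop and erase(x) → ok are orthogonal, yet the erasing step erase(loop) → ok is not perpetual, since the source admits an infinite reduction inside the erased argument while the target is a normal form.

```python
# Hypothetical two-rule TRS (illustration only, not from the paper):
#   loop     -> loop      (non-terminating rule)
#   erase(x) -> ok        (erasing rule: the argument x is discarded)
# Terms are nested tuples: ("loop",), ("ok",), ("erase", t).

def steps(t):
    """Yield all one-step rewrites of term t under the two rules."""
    head, *args = t
    if head == "loop":
        yield ("loop",)                       # loop -> loop
    if head == "erase":
        yield ("ok",)                         # erase(x) -> ok, erasing x
    for i, a in enumerate(args):              # also rewrite inside subterms
        for a2 in steps(a):
            yield (head, *args[:i], a2, *args[i + 1:])

src = ("erase", ("loop",))
assert ("ok",) in set(steps(src))             # the erasing step reaches a
assert list(steps(("ok",))) == []             # normal form (ok), yet ...
assert ("erase", ("loop",)) in set(steps(src))  # ... src can also loop forever
```

Here the source term has an infinite reduction but one of its reducts does not, so the system is WN at this term without every step being perpetual; this is exactly the situation the non-erasing hypothesis rules out.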
Two Applications of Standardization and Evaluation in Combinatory Reduction Systems
 2000
Abstract

Cited by 1 (1 self)
We present two worked applications of a general framework that can be used to support reasoning about the operational equality relation defined by a programming language semantics. The framework, based on Combinatory Reduction Systems, facilitates the proof of standardization theorems for programming calculi. The importance of standardization theorems to programming language semantics was shown by Plotkin [Plo75]: standardization together with confluence guarantees that two terms equated in the calculus are semantically equal. We apply the framework to the λν-calculus and to an untyped version of the λ^CIL-calculus. The latter is a basis for an intermediate language being used in a compiler.
Type- and Flow-Directed Compilation for Specialized Data Representations
 2002
Abstract
The combination of intersection and union types with flow types gives the compiler writer unprecedented flexibility in choosing data representations in the context of a typed intermediate language. We present the design of such a language and the design of a framework for exploiting the type system to support multiple representations of the same data type in a single program. The framework can transform the input term, in a type-safe way, so that different data representations can be used in the transformed term, even if they share a use site in the pre-transformed term. We have implemented a compiler using the typed intermediate language and instantiated the framework to allow specialized function representations. We test the compiler on a set of benchmarks and show that the compile-time performance is reasonable. We further show that the compiled code does indeed benefit from specialized function representations.
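The idea of multiple representations of one data type sharing a use site can be sketched loosely in Python (a hypothetical illustration of the general technique, not the paper's typed intermediate language): a statically known function gets a tag-only specialized representation, a general function keeps a closure representation, and a single call site dispatches on the representation tag.

```python
# Hypothetical sketch (illustration only, not the paper's IL):
# two concrete representations of one source-level function type,
# both reaching the same use site.

# Generic representation: carry an actual closure.
generic = ("closure", lambda x: x * 2)

# Specialized representation: a bare tag for the statically known
# function x + 1; no closure needs to be allocated for it.
specialized = ("inc",)

def apply_fun(f, x):
    """Shared use site: dispatch on the representation tag."""
    return f[1](x) if f[0] == "closure" else x + 1

assert apply_fun(specialized, 41) == 42
assert apply_fun(generic, 21) == 42
```

In the paper's setting this kind of per-representation dispatch is justified and checked by the intersection/union/flow type system rather than by a runtime tag convention as in this sketch.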