Results 11–20 of 26
Short cut fusion: Proved and improved
Pages 47–71 of: Semantics, Applications, and Implementation of Program Generation, 2001
Abstract

Cited by 3 (1 self)
Abstract. Short cut fusion is a particular program transformation technique which uses a single, local transformation — called the foldr-build rule — to remove certain intermediate lists from modularly constructed functional programs. Arguments that short cut fusion is correct typically appeal either to intuition or to “free theorems” — even though the latter have not been known to hold for the languages supporting higher-order polymorphic functions and fixed point recursion in which short cut fusion is usually applied. In this paper we use Pitts’ recent demonstration that contextual equivalence in such languages is relationally parametric to prove that programs in them which have undergone short cut fusion are contextually equivalent to their unfused counterparts. The same techniques in fact yield a much more general result. For each algebraic data type we define a generalization augment of build which constructs substitution instances of its associated data structures. Together with the well-known generalization cata of foldr to arbitrary algebraic data types, this allows us to formulate and prove correct for each a contextual equivalence-preserving cata-augment fusion rule. These rules optimize compositions of functions that uniformly consume algebraic data structures with functions that uniformly produce substitution instances of them.
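The foldr-build rule mentioned above can be sketched in Haskell. GHC's actual `build` lives in GHC.Exts; the definitions and example names below (`upto`, `sumUpto`) are ours, for illustration only:

```haskell
{-# LANGUAGE RankNTypes #-}

-- build abstracts a list producer over its constructors.
-- (GHC defines an equivalent build in GHC.Exts; this local copy keeps
-- the sketch self-contained.)
build :: (forall b. (a -> b -> b) -> b -> b) -> [a]
build g = g (:) []

-- A producer in build form: the list [1..n].
upto :: Int -> [Int]
upto n = build (\cons nil ->
  let go i = if i > n then nil else cons i (go (i + 1))
  in  go 1)

-- A consumer written as a foldr.
sumList :: [Int] -> Int
sumList = foldr (+) 0

-- The foldr/build rule:  foldr k z (build g) = g k z.
-- Applying it to sumList (upto n) yields this definition, in which the
-- intermediate list never exists:
sumUpto :: Int -> Int
sumUpto n =
  let go i = if i > n then (0 :: Int) else i + go (i + 1)
  in  go 1
```

Both sides compute the same sum; the fused version simply replaces the list constructors `(:)` and `[]` in the producer with the consumer's `(+)` and `0`.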
Lumberjack Summer Camp: A Cross-Institutional Undergraduate Research Experience in Computer Science
Computer Science Education, 2001
Abstract

Cited by 2 (0 self)
This paper describes our experiences in leading Lumberjack Summer Camp, a ten-week undergraduate research experience in compiler-based optimization techniques for functional programs, held during the summer of 2000. Like many undergraduate research experiences, Lumberjack Summer Camp was designed to provide an opportunity for students and faculty to work closely together toward a common research goal. But Lumberjack Summer Camp was designed around an additional aim as well: to bring together a critical mass of researchers from two small liberal arts colleges to pursue individual research projects situated within one overarching, collaborative, cutting-edge research endeavor. We explore some important consequences of this design choice, ultimately offering Lumberjack Summer Camp as an unusual, but very workable, model for undergraduate research experiences.
Calculation Rules for Warming-up in Fusion Transformation
Abstract

Cited by 1 (0 self)
Warm-up transformation is an important preprocess for short cut fusion. In this paper, we formalize the warm-up transformation by proposing a set of general and powerful calculation rules that can be directly implemented with higher-order pattern matching. The newly formalized warm-up transformation can deal with programs on which existing methods may fail, and has been efficiently implemented with the Yicho calculation system. One important advantage of our calculational approach is its compatibility with other calculations, such as fusion, tupling, accumulation, and parallelization. Therefore, the warm-up transformation in this form can coexist well with many other calculations in the same system.
List-Processing Optimizations in Curry
In Proceedings of the 9th Intl. Workshop on Functional and Logic Programming, 2000
Abstract

Cited by 1 (1 self)
The multi-paradigm language Curry integrates features from functional, logic, and concurrent programming. In this work, we consider two well-known list-processing optimizations: short cut deforestation (from functional programming) and difference-lists (from logic programming), and study their adaptation to our integrated setting. While short cut deforestation adapts smoothly, the use of difference-lists in Curry is impractical due to the absence of non-strict equality in the language. Despite this, we have developed a novel transformation which achieves a similar effect over functional logic programs. Both transformations combined yield a simple and practical method for optimizing list-processing programs.
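The difference-list idea referred to above can be rendered in a functional setting as functions from lists to lists. The paper works in Curry; this Haskell sketch, with names of our own choosing, is only an approximation of the logic-programming original:

```haskell
-- A difference list represents a list as "the list it prepends".
type DList a = [a] -> [a]

fromList :: [a] -> DList a
fromList xs = (xs ++)

toList :: DList a -> [a]
toList d = d []

-- Appending is function composition: constant time, versus the repeated
-- left-nested (++) traversals that difference lists are meant to avoid.
appendD :: DList a -> DList a -> DList a
appendD = (.)
```

The payoff is that a chain of `appendD` calls builds the final list in one right-nested pass when `toList` is finally applied.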
Theoretical Foundations for Practical ‘Totally Functional Programming’
2007
Abstract
Interpretation is an implicit part of today’s programming; it has great power but is overused and has significant costs. For example, interpreters are typically hard to understand and hard to reason about. The methodology of “Totally Functional Programming” (TFP) is a reasoned attempt to redress the problem of interpretation. It incorporates an awareness of the undesirability of interpretation with observations that definitions and a certain style of programming appear to offer alternatives to it. Application of TFP is expected to lead to a number of significant outcomes, theoretical as well as practical. Primary among these are novel programming languages to lessen or eliminate the use of interpretation in programming, leading to better-quality software. However, TFP contains a number of lacunae in its current formulation, which hinder development of these outcomes. Among others, formal semantics and type systems for TFP languages are yet to be discovered, the means to reduce interpretation in programs are yet to be determined, and a detailed explication is needed of interpretation, definition, and the differences between the two. Most important of all, however, is the need to develop a complete understanding of the nature of interpretation. In this work, suitable type systems for TFP languages are identified, and guidance is given regarding the construction of appropriate formal semantics. Techniques, based around the ‘fold’ operator, are identified and developed for modifying programs so as to reduce the amount of interpretation they contain. Interpretation as a means of language-extension is also investigated.

Finally, the nature of interpretation is considered. Numerous hypotheses relating to it are considered in detail. Combining the results of those analyses with discoveries from elsewhere in this work leads to the proposal that interpretation is not, in fact, symbol-based computation, but something more fundamental: computation that varies with input. We discuss in detail various implications of this characterisation, including its practical application. An often more useful property, ‘inherent interpretiveness’, is also motivated and discussed in depth. Overall, our inquiries act to give conceptual and theoretical foundations for practical TFP.
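The fold-based replacement of interpretation that TFP advocates can be glimpsed in a Church-style encoding, where a value is represented by its own fold and "using" it is mere application rather than case analysis. This minimal sketch is our own, not taken from the thesis:

```haskell
{-# LANGUAGE RankNTypes #-}

-- A boolean represented by its own fold: no data declaration, no
-- interpreter that pattern-matches on constructors.
type CBool = forall a. a -> a -> a

ctrue, cfalse :: CBool
ctrue  t _ = t
cfalse _ f = f

-- "Interpretation" collapses to function application.
ifC :: CBool -> a -> a -> a
ifC b t f = b t f

-- Conversion back to the interpreted representation, for comparison.
toBool :: CBool -> Bool
toBool b = b True False
```

Here `ifC` does no symbolic dispatch at all; the branching behaviour is already built into the definition of the value itself.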
On Deforesting Parameters of Accumulating Maps (Extended Abstract)
2001
Abstract
Deforestation is a well-known program transformation technique which eliminates intermediate data structures that are passed between functions. One of its weaknesses is the inability to deforest programs using accumulating parameters. We show how intermediate lists built by a selected class of functional programs, namely ‘accumulating maps’, can be deforested using a single composition rule. For this we introduce a new function dmap, a symmetric extension of the familiar function map. While the associated composition rule cannot capture all deforestation problems, it can handle accumulator fusion of functions defined in terms of dmap in a surprisingly simple way. The rule for accumulator fusion presented here can also be viewed as a restricted composition scheme for attribute grammars, which in turn may help us to bridge the gap between the attribute and functional world.
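The paper's dmap is not reproduced here; as a rough illustration of the kind of accumulator fusion meant, composing map with an accumulating reversal can be fused by hand into a single pass. The names `twoPass` and `onePass` are ours:

```haskell
-- Unfused: map builds an intermediate list that reverse then consumes.
twoPass :: (a -> b) -> [a] -> [b]
twoPass f xs = reverse (map f xs)

-- Fused: one traversal with an accumulating parameter; the intermediate
-- list produced by map never exists.
onePass :: (a -> b) -> [a] -> [b]
onePass f = go []
  where
    go acc []       = acc
    go acc (x : xs) = go (f x : acc) xs
```

Classical short cut deforestation cannot reach this fusion because the result list grows in the accumulating parameter rather than through a uniform producer; that gap is exactly what the dmap composition rule addresses.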
The Impact of seq on Free Theorems-Based Program Transformations (IOS Press)
Abstract
Abstract. Parametric polymorphism constrains the behavior of pure functional programs in a way that allows the derivation of interesting theorems about them solely from their types, i.e., virtually for free. Unfortunately, standard parametricity results — including so-called free theorems — fail for non-strict languages supporting a polymorphic strict evaluation primitive such as Haskell’s seq. A folk theorem maintains that such results hold for a subset of Haskell corresponding to a Girard-Reynolds calculus with fixpoints and algebraic datatypes even when seq is present, provided the relations which appear in their derivations are required to be bottom-reflecting and admissible. In this paper we show that this folklore is incorrect, but that parametricity results can be recovered in the presence of seq by restricting attention to left-closed, total, and admissible relations instead. The key novelty of our approach is the asymmetry introduced by left-closedness, which leads to “inequational” versions of standard parametricity results together with preconditions guaranteeing their validity even when seq is present. We use these results to derive criteria ensuring that both equational and inequational versions of short cut fusion and related program transformations based
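A standard way to see the failure the abstract describes (this minimal example is ours, not necessarily the paper's): the free theorem for any f :: [a] -> [a] promises map g . f = f . map g, but seq lets f force a value of the abstract type a, breaking the equation:

```haskell
-- f forces the head of its argument with seq before returning the list.
-- Without seq, a polymorphic function could not be strict in an element
-- of the abstract type a in this way.
f :: [a] -> [a]
f []       = []
f (y : ys) = y `seq` (y : ys)

-- With g = const 1 and xs = [undefined]:
--   f (map g xs)  yields [1]   (mapping replaces the head before f forces it)
--   map g (f xs)  diverges     (f forces the undefined head first)
checkDefinedSide :: [Int]
checkDefinedSide = f (map (const 1) [undefined])
```

Only the defined side is evaluated here; the other side of the supposed equation is bottom, which is exactly why the paper moves to inequational, left-closed formulations.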
Research Summary
2001
Abstract
My main research area is the design, analysis, and implementation of expressive programming languages. I also work on pedagogical aspects and applications of programming and of programming languages. This document summarizes my research and publications in these areas. Much of the research described here was undertaken as part of the Church Project, a group of programming language researchers investigating applications of formal systems in programming language design, analysis, and implementation. I was a co-founder of the Church Project in September,
Deforestation of Functional Programs through Type Inference
Olaf Chitil
Abstract
Abstract. Deforestation optimises a functional program by transforming it into another one that does not create certain intermediate data structures. Short cut deforestation is a deforestation method which is based on a single, local transformation rule. In return, short cut deforestation expects both producer and consumer of the intermediate structure in a certain form. Starting from the fact that short cut deforestation is based on a parametricity theorem of the second-order typed λ-calculus, we show how the required form of a list producer can be derived through the use of type inference. Type inference can also indicate which function definitions need to be inlined. Because only limited inlining across module boundaries is practically feasible, we develop a scheme for splitting a function definition into a worker definition and a wrapper definition. For deforestation we only need to inline the small wrapper definition.
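The worker/wrapper split described above can be sketched as follows. The producer `range` and its worker are hypothetical names of ours; GHC's real `build` lives in GHC.Exts and is redefined here to keep the sketch self-contained:

```haskell
{-# LANGUAGE RankNTypes #-}

build :: (forall b. (a -> b -> b) -> b -> b) -> [a]
build g = g (:) []

-- Worker: carries the real recursion, abstracted over the list
-- constructors. It may stay un-inlined in its defining module.
rangeWorker :: Int -> Int -> (Int -> b -> b) -> b -> b
rangeWorker lo hi cons nil = go lo
  where
    go i | i > hi    = nil
         | otherwise = cons i (go (i + 1))

-- Wrapper: tiny, so it can be inlined across module boundaries; once
-- inlined, a consuming foldr fuses with the worker via foldr/build.
range :: Int -> Int -> [Int]
range lo hi = build (rangeWorker lo hi)
```

After inlining the wrapper, `foldr k z (range lo hi)` rewrites to `rangeWorker lo hi k z`, and the intermediate list disappears without ever inlining the recursive worker.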