Results 1–10 of 17
Finally Tagless, Partially Evaluated: Tagless Staged Interpreters for Simpler Typed Languages
Under consideration for publication in J. Functional Programming
"... We have built the first family of tagless interpretations for a higherorder typed object language in a typed metalanguage (Haskell or ML) that require no dependent types, generalized algebraic data types, or postprocessing to eliminate tags. The statically typepreserving interpretations include an ..."
Abstract

Cited by 53 (9 self)
 Add to MetaCart
(Show Context)
We have built the first family of tagless interpretations for a higher-order typed object language in a typed metalanguage (Haskell or ML) that require no dependent types, generalized algebraic data types, or post-processing to eliminate tags. The statically type-preserving interpretations include an evaluator, a compiler (or staged evaluator), a partial evaluator, and call-by-name and call-by-value CPS transformers. Our principal technique is to encode de Bruijn or higher-order abstract syntax using combinator functions rather than data constructors. In other words, we represent object terms not in an initial algebra but using the coalgebraic structure of the λ-calculus. Our representation also simulates inductive maps from types to types, which are required for typed partial evaluation and CPS transformations. Our encoding of an object term abstracts uniformly over the family of ways to interpret it, yet statically assures that the interpreters never get stuck. This family of interpreters thus demonstrates again that it is useful to abstract over higher-kinded types.
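A minimal sketch of the encoding this abstract describes (illustrative Haskell, not the paper's code; the class and instance names are assumptions): object terms are written against a type class of combinator functions, so one term can be run by several tag-free interpreters.

```haskell
-- Object terms abstract over an interpreter 'repr' via a type class,
-- so no tags, GADTs, or dependent types are needed.
class Symantics repr where
  lit :: Int -> repr Int
  add :: repr Int -> repr Int -> repr Int
  lam :: (repr a -> repr b) -> repr (a -> b)
  app :: repr (a -> b) -> repr a -> repr b

-- Interpretation 1: a direct evaluator that can never get stuck.
newtype Eval a = Eval { eval :: a }

instance Symantics Eval where
  lit n = Eval n
  add (Eval x) (Eval y) = Eval (x + y)
  lam f = Eval (eval . f . Eval)
  app (Eval f) (Eval x) = Eval (f x)

-- Interpretation 2: a pretty-printer (the Int is a fresh-name counter).
newtype Pretty a = Pretty { pretty :: Int -> String }

instance Symantics Pretty where
  lit n = Pretty (\_ -> show n)
  add x y = Pretty (\c -> "(" ++ pretty x c ++ " + " ++ pretty y c ++ ")")
  lam f = Pretty (\c -> let v = "x" ++ show c
                        in "(\\" ++ v ++ " -> "
                           ++ pretty (f (Pretty (\_ -> v))) (c + 1) ++ ")")
  app f x = Pretty (\c -> "(" ++ pretty f c ++ " " ++ pretty x c ++ ")")

-- One term, interpreted two ways:
term :: Symantics repr => repr Int
term = app (lam (\x -> add x (lit 1))) (lit 41)

main :: IO ()
main = do
  print (eval term)       -- 42
  putStrLn (pretty term 0)
```

Because `term` is polymorphic in `repr`, adding a new interpreter (compiler, partial evaluator, CPS transformer) needs only a new instance, not a change to existing terms.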
Optimizing data structures in high-level programs: New directions for extensible compilers based on staging
In POPL, 2013
"... High level data structures are a cornerstone of modern programming and at the same time stand in the way of compiler optimizations. In order to reason about user or librarydefined data structures, compilers need to be extensible. Common mechanisms to extend compilers fall into two categories. Front ..."
Abstract

Cited by 20 (6 self)
 Add to MetaCart
(Show Context)
High-level data structures are a cornerstone of modern programming and at the same time stand in the way of compiler optimizations. In order to reason about user- or library-defined data structures, compilers need to be extensible. Common mechanisms to extend compilers fall into two categories. Front-end macros, staging, or partial evaluation systems can be used to programmatically remove abstraction and specialize programs before they enter the compiler. Alternatively, some compilers allow extending the internal workings by adding new transformation passes at different points in the compile chain or adding new intermediate representation (IR) types. None of these mechanisms alone is sufficient to handle the challenges posed by high-level data structures. This paper shows a novel way to combine them to yield benefits that are greater than the sum of the parts.
Shifting the Stage: Staging with Delimited Control
, 2009
"... It is often hard to write programs that are efficient yet reusable. For example, an efficient implementation of Gaussian elimination should be specialized to the structure and known static properties of the input matrix. The most profitable optimizations, such as choosing the best pivoting or memoiz ..."
Abstract

Cited by 16 (7 self)
 Add to MetaCart
It is often hard to write programs that are efficient yet reusable. For example, an efficient implementation of Gaussian elimination should be specialized to the structure and known static properties of the input matrix. The most profitable optimizations, such as choosing the best pivoting or memoization, cannot be expected of even an advanced compiler because they are specific to the domain, but expressing these optimizations directly makes for ungainly source code. Instead, a promising and popular way to reconcile efficiency with reusability is for a domain expert to write code generators. Two pillars of this approach are types and effects. Typed multi-level languages such as MetaOCaml ensure safety: a well-typed code generator neither goes wrong nor generates code that goes wrong. Side effects such as state and control ease correctness: an effectful generator can resemble the textbook presentation of an algorithm, as is familiar to domain experts, yet insert let for memoization and if for bounds-checking, as is necessary for efficiency. However, adding effects blindly renders multi-level types unsound. We introduce the first two-level calculus with control effects and a sound type system. We give small-step operational semantics as well as a continuation-passing style (CPS) translation. For soundness, our calculus restricts the code generator's effects to the scope of generated binders. Even with this restriction, we can finally write efficient code generators for dynamic programming and numerical methods in direct style, like in algorithm textbooks, rather than in CPS or monadic style.
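A much-simplified sketch of the "insert let" effect the abstract mentions (illustrative Haskell; strings stand in for typed code values, and the names `genLet`, `pivot`, and `m` are assumptions, not the paper's calculus): the generator runs in a monad that accumulates bindings, so an expensive expression is named once and shared in the generated code.

```haskell
import Control.Monad.State

-- Generator state: a fresh-name counter plus accumulated let-bindings.
type Gen = State (Int, [(String, String)])

-- Name a right-hand side and continue generating with the name.
genLet :: String -> Gen String
genLet rhs = do
  (n, bs) <- get
  let v = "t" ++ show n
  put (n + 1, bs ++ [(v, rhs)])
  return v

-- Wrap the generated body in the accumulated bindings.
runGen :: Gen String -> String
runGen g =
  let (body, (_, bs)) = runState g (0, [])
  in concat [ "let " ++ v ++ " = " ++ rhs ++ " in " | (v, rhs) <- bs ] ++ body

-- A generator that memoizes a subexpression instead of duplicating it:
example :: Gen String
example = do
  x <- genLet "pivot m i"   -- illustrative domain expression
  return (x ++ " * " ++ x)

main :: IO ()
main = putStrLn (runGen example)
```

The paper's contribution is precisely what this sketch glosses over: doing such effectful generation with control operators while the type system keeps generated binders correctly scoped.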
Accumulating bindings
"... We give a Haskell implementation of Filinski’s normalisation by evaluation algorithm for the computational lambdacalculus with sums. Taking advantage of extensions to the GHC compiler, our implementation represents object language types as Haskell types and ensures that type errors are detected sta ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
(Show Context)
We give a Haskell implementation of Filinski's normalisation by evaluation algorithm for the computational lambda-calculus with sums. Taking advantage of extensions to the GHC compiler, our implementation represents object language types as Haskell types and ensures that type errors are detected statically. Following Filinski, the implementation is parameterised over a residualising monad. The standard residualising monad for sums is a continuation monad. Defunctionalising the uses of the continuation monad we present the binding tree monad as an alternative.
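For orientation, a deliberately simplified sketch of normalisation by evaluation (illustrative Haskell for the pure untyped lambda-calculus, with no sums and no residualising monad, so it omits everything this paper actually contributes; fresh-name handling is also elided): terms are evaluated into a semantic domain and then read back into normal form.

```haskell
import qualified Data.Map as Map

data Term = Var String | Lam String Term | App Term Term
  deriving (Eq, Show)

-- Semantic domain: functions are interpreted as Haskell functions;
-- stuck (neutral) terms are kept as syntax.
data Sem = SLam (Sem -> Sem)
         | SNeu Term

eval :: Map.Map String Sem -> Term -> Sem
eval env (Var x)   = Map.findWithDefault (SNeu (Var x)) x env
eval env (Lam x b) = SLam (\v -> eval (Map.insert x v env) b)
eval env (App f a) = case eval env f of
  SLam g -> g (eval env a)
  SNeu n -> SNeu (App n (reify (eval env a)))

-- Read a semantic value back into a beta-normal term.
reify :: Sem -> Term
reify (SNeu n) = n
reify (SLam f) = Lam "x" (reify (f (SNeu (Var "x"))))  -- capture-naive

normalise :: Term -> Term
normalise = reify . eval Map.empty

main :: IO ()
main = print (normalise (App (Lam "y" (Var "y")) (Lam "z" (Var "z"))))
```

The paper's setting replaces this direct evaluation with one parameterised over a residualising monad, which is what makes sums (and the binding tree alternative) tractable.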
Partial evaluation and residual theorems in computer algebra
In: Ranise and Bigatti [28]
"... We have implemented a partial evaluator for Maple. One of the applications of this partial evaluator is to find, in Maple, what is the difference between generic or symbolic evaluation, and complete evaluation. More precisely, when asked degree(a*xˆ2+3,x), Maple replies 2, which is generically true. ..."
Abstract

Cited by 2 (2 self)
 Add to MetaCart
(Show Context)
We have implemented a partial evaluator for Maple. One of the applications of this partial evaluator is to find, in Maple, what is the difference between generic or symbolic evaluation, and complete evaluation. More precisely, when asked degree(a*x^2+3,x), Maple replies 2, which is generically true. However, we are interested in the residual formula ¬(a = 0) which, as a guard, makes the answer 2 correct. While special algorithms have been derived in the past for this particular situation, we show how we can derive many of these algorithms as special cases of partially evaluating Maple code. Key words: Maple, symbolic computation, partial evaluation, specialization problem
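A toy sketch of the specialization problem in the abstract (illustrative Haskell, not the authors' Maple partial evaluator; `Coef`, `Answer`, and `degreeOf` are invented names): a degree computation over a polynomial whose leading coefficient is a symbolic parameter cannot return a plain number, so the result carries a residual guard.

```haskell
-- A coefficient is either statically known or a symbolic parameter.
data Coef = Known Double | Param String

-- The answer is either exact, or correct only under a residual guard.
data Answer = Exact Int | Guarded String Int
  deriving (Eq, Show)

-- Degree of c_n*x^n + ... + c_0, given coefficients [c_0 .. c_n].
-- A Known 0 leading coefficient is dropped statically; a Param
-- coefficient yields the degree guarded by "p /= 0".
degreeOf :: [Coef] -> Answer
degreeOf cs = go (reverse (zip [0 ..] cs))
  where
    go [] = Exact 0
    go ((n, Known c) : rest)
      | c /= 0    = Exact n
      | otherwise = go rest              -- statically known to vanish
    go ((n, Param p) : _) = Guarded (p ++ " /= 0") n

main :: IO ()
main = do
  -- a*x^2 + 3: answer 2 under the residual guard a /= 0
  print (degreeOf [Known 3, Known 0, Param "a"])
```

The point of the paper is that such guarded answers need not be hand-derived per operation: they fall out of partially evaluating the ordinary Maple code.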
Partial Evaluation of Maple Programs
, 2006
"... Partial Evaluation (PE) is a program transformation technique that generates a specialized version of a program with respect to a subset of its inputs. PE is an automatic approach to program generation and metaprogramming. This thesis presents a method of partially evaluating Maple programs using ..."
Abstract

Cited by 2 (2 self)
 Add to MetaCart
Partial Evaluation (PE) is a program transformation technique that generates a specialized version of a program with respect to a subset of its inputs. PE is an automatic approach to program generation and metaprogramming. This thesis presents a method of partially evaluating Maple programs using a fully online methodology. We present an implementation called MapleMIX, and use it towards two goals. Firstly we show how MapleMIX can be used to generate optimized versions of generic programs written in Maple. Secondly we use MapleMIX to mine symbolic computation code for residual theorems, which we present as precise solutions to parametric problems encountered in Computer Algebra Systems. The implementation of MapleMIX has been modularized using a high-level intermediate language called Mform. Several syntax transformations from Maple to Mform make it an ideal representation for performing program specialization. Many specialization techniques have been explored including a novel online approach to handle
Partial Evaluation of Maple
"... Having been convinced of the potential benefits of partial evaluation, we wanted to apply these techniques to code written in Maple, our Computer Algebra System of choice. Maple is a very large language, with a number of nonstandard features. When we tried to implement a partial evaluator for it, w ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
Having been convinced of the potential benefits of partial evaluation, we wanted to apply these techniques to code written in Maple, our Computer Algebra System of choice. Maple is a very large language, with a number of non-standard features. When we tried to implement a partial evaluator for it, we ran into a number of difficulties for which we could find no solution in the literature. Undaunted, we persevered and ultimately implemented a working partial evaluator with which we were able to very successfully conduct our experiments, first on small codes, and now on actual routines taken from Maple's own library. Here, we document the techniques we had to invent or adapt to achieve these results. Key words: Maple, symbolic computation, partial evaluation, residual theorems
Staging Dynamic Programming Algorithms
, 2005
"... Applications of dynamic programming (DP) algorithms are numerous, and include genetic engineering and operations research problems. At a high level, DP algorithms are specified as a system of recursive equations implemented using memoization. The recursive nature of these equations suggests that the ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
Applications of dynamic programming (DP) algorithms are numerous, and include genetic engineering and operations research problems. At a high level, DP algorithms are specified as a system of recursive equations implemented using memoization. The recursive nature of these equations suggests that they can be written naturally in a functional language. However, the requirement for memoization poses a subtle challenge: memoization can be implemented using monads, but a systematic treatment introduces several layers of abstraction that can have a prohibitive runtime overhead. Inspired by other researchers' experience with automatic specialization (partial evaluation), this paper investigates the feasibility of explicitly staging DP algorithms in the functional setting. We find that the key challenge is code duplication (which is automatically handled by partial evaluators), and show that a key source of code duplication can be isolated and addressed once and for all. The result is a simple combinator library. We use this library to implement several standard DP algorithms including ones in standard algorithm textbooks (e.g.
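A small sketch of the monadic memoization the abstract discusses (illustrative Haskell, not the paper's combinator library; the `memo` combinator and the use of Fibonacci as a stand-in DP recurrence are assumptions): the recursive equations are written in their natural form, and a memo table is threaded through a State monad.

```haskell
import qualified Data.Map as Map
import Control.Monad.State

-- Wrap a monadic recurrence so that each key is computed only once,
-- threading the memo table through the State monad.
memo :: Ord k => (k -> State (Map.Map k v) v) -> k -> State (Map.Map k v) v
memo f k = do
  table <- get
  case Map.lookup k table of
    Just v  -> return v
    Nothing -> do
      v <- f k
      modify (Map.insert k v)
      return v

-- A textbook recurrence, written naturally; recursive calls re-enter memo.
fib :: Int -> State (Map.Map Int Integer) Integer
fib = memo $ \n ->
  if n < 2 then return (toInteger n)
           else (+) <$> fib (n - 1) <*> fib (n - 2)

main :: IO ()
main = print (evalState (fib 30) Map.empty)
```

The layers of abstraction visible even here (monad, table, lifting) are the runtime overhead the paper proposes to remove by staging, while handling the code duplication that staging introduces.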
Computational Effects across Generated Binders: Maintaining future-stage lexical scope
"... Code generation is the leading approach to making highperformance software reusable. Effects are indispensable in code generators, whether to report failures or to insert letstatements and ifguards. Extensive painful experience shows that unrestricted effects interact with generated binders in und ..."
Abstract
 Add to MetaCart
Code generation is the leading approach to making high-performance software reusable. Effects are indispensable in code generators, whether to report failures or to insert let-statements and if-guards. Extensive painful experience shows that unrestricted effects interact with generated binders in undesirable ways to produce unexpectedly unbound variables, or worse, unexpectedly bound ones. These subtleties prevent experts in the application domain, not in programming languages, from using and extending the generator. A pressing problem is thus to express the desired effects while regulating them so that the generated code is correct, or at least correctly scoped, by construction. In an imminently practical code-generation framework, we show how to express arbitrary effects, including mutable references and delimited control, that move open code across generated binders. The static types of our generator expressions not only ensure that a well-typed generator produces well-typed and well-scoped code, but also express the lexical scopes of generated binders and prevent mixing up variables with different scopes. This precise notion of lexical scope subsumes the complaints about intuitively wrong example generators in the literature. For the first time, we demonstrate statically safe let-insertion across an arbitrary number of binders. Our framework is implemented as a Haskell library that embeds an extensible typed higher-order domain-specific language. It may be regarded as ‘staged Haskell’. The library is convenient to use thanks to the maturity of Haskell, higher-order abstract syntax, and polymorphism over generated type environments.