Results 1 - 9 of 9
The TAMPR Program Transformation System: Design and Applications
 Modern Software Tools for Scientific Computing, Birkhäuser
, 1997
"... TAMPR is a fully automatic, rewriterule based program transformation system. From its initial implementation in 1970, TAMPR has evolved into a powerful tool for generating correct and efficient programs from specifications. The TAMPR approach to program transformation is distinguished by ffl A res ..."
Abstract

Cited by 7 (0 self)
TAMPR is a fully automatic, rewrite-rule-based program transformation system. From its initial implementation in 1970, TAMPR has evolved into a powerful tool for generating correct and efficient programs from specifications. The TAMPR approach to program transformation is distinguished by:
- a restricted repertoire of constructs for expressing transformations;
- a declarative semantics for transformations;
- application of transformations to exhaustion;
- an emphasis on sequences of canonical forms;
- completely automatic operation; and
- the ability to effortlessly "replay" the application of transformations.
We describe some of the applications of the TAMPR system to document the power and practicality of its approach. We then discuss the TAMPR approach to program transformation. This approach manifests itself in TAMPR's high-level language for expressing transformations; we discuss some aspects of both the design of this language and the philosophy of transformatio...
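The core mechanism the abstract describes, rewrite rules applied "to exhaustion" until a canonical form is reached, can be sketched as follows. This is a minimal illustration in the spirit of the approach, not TAMPR's actual notation or implementation; the expression encoding and the two rules are invented for the example.

```python
# Expressions are nested tuples such as ("add", x, y) or ("mul", x, y); atoms
# (strings, numbers) are leaves. Each rule returns a rewritten term or None.

def rule_add_zero(e):
    # x + 0  ->  x
    if isinstance(e, tuple) and e[0] == "add" and e[2] == 0:
        return e[1]
    return None

def rule_mul_one(e):
    # x * 1  ->  x
    if isinstance(e, tuple) and e[0] == "mul" and e[2] == 1:
        return e[1]
    return None

RULES = [rule_add_zero, rule_mul_one]

def rewrite_once(e):
    """Rewrite subterms bottom-up, then apply the first matching rule here."""
    if isinstance(e, tuple):
        e = (e[0],) + tuple(rewrite_once(c) for c in e[1:])
    for rule in RULES:
        out = rule(e)
        if out is not None:
            return out
    return e

def rewrite_to_exhaustion(e):
    """Reapply the rule set until a fixed point (a canonical form) is reached."""
    while True:
        nxt = rewrite_once(e)
        if nxt == e:
            return e
        e = nxt

expr = ("add", ("mul", "x", 1), 0)   # (x * 1) + 0
print(rewrite_to_exhaustion(expr))   # x
```

The declarative flavor comes from the rules being independent equations; the fixed-point loop, not the rule author, decides application order.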
Parametric Fortran – A Program Generator for Customized Generic Fortran Extensions
 Proceedings Practical Aspects of Declarative Languages (PADL 2004), LNCS
, 2004
"... Abstract. We describe the design and implementation of a program generator that can produce extensions of Fortran that are specialized to support the programming of particular applications. Extensions are specified through parameter structures that can be referred to in Fortran programs to specify t ..."
Abstract

Cited by 6 (2 self)
We describe the design and implementation of a program generator that can produce extensions of Fortran that are specialized to support the programming of particular applications. Extensions are specified through parameter structures that can be referred to in Fortran programs to specify the dependency of program parts on these parameters. By providing parameter values, a parameterized Fortran program can be translated into a regular Fortran program. As a real-world application of this program generator, we describe the implementation of a generic inverse ocean modeling tool. The program generator is implemented in Haskell and makes use of sophisticated features, such as multi-parameter type classes, existential types, and generic programming extensions, and thus represents the application of an advanced applicative language to a real-world problem.
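The translation step the abstract describes, supplying parameter values to specialize a parameterized program into regular Fortran, can be sketched roughly as below. This is a hypothetical illustration only: Parametric Fortran itself is implemented in Haskell, and the `{dims}` template syntax here is invented, not the tool's actual notation.

```python
# A Fortran fragment whose array rank/extent depends on a parameter `dims`.
TEMPLATE = """\
real :: x({dims})
do i = 1, {dims}
  x(i) = 0.0
end do"""

def specialize(template, params):
    """Translate a parameterized program into plain Fortran by substituting
    concrete parameter values for each parameter reference."""
    return template.format(**params)

# Providing dims=3 yields an ordinary Fortran program.
print(specialize(TEMPLATE, {"dims": 3}))
```

The real system goes well beyond textual substitution (program parts can depend structurally on parameter values), but the input/output relationship is the same: parameterized source plus parameter values in, regular Fortran out.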
A New Architecture for TransformationBased Generators
 IEEE Trans. on Software Engineering
, 2004
"... A serious problem of many transformationbased generators is that they are trying to achieve three mutually antagonistic goals simultaneously: 1) deeply factored operators and operands to gain the combinatorial programming leverage provided by composition, 2) high performance code in the generated p ..."
Abstract

Cited by 5 (3 self)
A serious problem of many transformation-based generators is that they are trying to achieve three mutually antagonistic goals simultaneously: 1) deeply factored operators and operands, to gain the combinatorial programming leverage provided by composition; 2) high-performance code in the generated program; and 3) small (i.e., practical) generation search spaces. The hypothesis of this paper is that current generator structures are inadequate to fully achieve these goals because they often induce an explosion of the generation search space. Therefore, new architectures are required. A generator has been implemented in Common LISP to explore architectural variations needed to address this quandary. It is called the Anticipatory Optimization Generator (AOG) because it allows programmers to anticipate optimization opportunities and to prepare an abstract, distributed plan that attempts to achieve them. More generally, AOG uses several strategies to prevent generation search spaces from becoming an explosion of choices, but the fundamental principle underlying all of them is to solve separate, narrow, and specialized generation problems by strategies tailored to each individual problem rather than attempting to solve all problems by a single, general strategy. A second fundamental notion is the preservation and use of domain-specific information as a way to gain leverage on generation problems. This paper will focus on two specific mechanisms: 1) Localization: the generation and merging of implicit control structures; and 2) Tag-Directed Transformations: a new control structure for transformation-based optimization that allows differing kinds of domain knowledge (e.g., optimization knowledge) to be anticipated, affixed to the component parts in the reuse library, and trig...
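The tag-directed mechanism, attaching anticipated optimization knowledge to reuse-library components and triggering it at the right moment, can be sketched as follows. Everything here (the event names, the stand-in "optimization") is a hypothetical illustration of the control structure, not AOG's actual Common LISP implementation.

```python
class Component:
    """A reuse-library component carrying tags that affix domain knowledge:
    each tag maps a generation event to a transformation to fire then."""
    def __init__(self, code, tags=None):
        self.code = code
        self.tags = tags or {}   # event name -> transformation function

def unroll_by_two(code):
    # Stand-in "optimization": a textual marker in place of real loop unrolling.
    return code.replace("LOOP", "UNROLLED_LOOP")

def fire(component, event):
    """Run the transformation the component anticipated for `event`, if any."""
    transform = component.tags.get(event)
    if transform:
        component.code = transform(component.code)
    return component.code

# The component's author anticipated an unrolling opportunity at inline time.
comp = Component("LOOP over pixels", tags={"after-inlining": unroll_by_two})
fire(comp, "after-inlining")
print(comp.code)   # UNROLLED_LOOP over pixels
```

The point of the structure is search-space control: instead of a general optimizer searching for where unrolling applies, the knowledge travels with the component and fires only at its anticipated trigger.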
The Data Field Model
, 2001
"... Indexed data structures are prevalent in many programming applications. Collectionoriented languages provide means to operate directly on these structures, rather than having to loop or recurse through them. This style of programming will often yield clear and succinct programs. However, these prog ..."
Abstract

Cited by 3 (2 self)
Indexed data structures are prevalent in many programming applications. Collection-oriented languages provide means to operate directly on these structures, rather than having to loop or recurse through them. This style of programming will often yield clear and succinct programs. However, these programming languages will often provide only a limited choice of indexed data types and primitives, and the exact semantics of these primitives will sometimes vary with the data type and language. In this paper we develop a unifying semantical model for indexed data structures. The purpose is to support the construction of abstract data types and language features for such structures from first principles, such that they are largely generic over many kinds of data structures. The use of these abstract data types can make programs and their semantics less dependent on the actual data structure. This makes programs more portable across different architectures and facilitates the early design phase. The model is a generalisation of arrays, which we call data fields: these are functions with explicit information about their domains. This information can be conventional array bounds, but it could also define other shapes, for instance sparse ones. Data fields can be interpreted as partial functions, and we define a metalanguage for partial functions. In this language we define abstract versions of collection-oriented operations, and we show a number of identities for them. This theory is used to guide the design of data fields and their operations so they correspond closely to the more abstract notion of partial functions. We define phi-abstraction, a lambda-like syntax for defining data fields in a shape-independent manner, and prove a theorem which relates phi-abstraction and lambda-abstraction semantically. We also define a small data field language whose semantics is given by formal data fields, and give examples of data field programming for parallel algorithms with arrays and sparse structures, database querying and computing, and specification of symbolic drawings.
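The central definition, a data field as a function paired with explicit domain information, can be sketched in a few lines. The class below is an informal illustration of the idea, not the paper's formal metalanguage; the method names are invented.

```python
class DataField:
    """A data field: a (partial) function together with explicit information
    about its domain, generalising an array and its bounds."""
    def __init__(self, func, domain):
        self.func = func                  # index -> value
        self.domain = frozenset(domain)   # explicit domain information

    def __call__(self, i):
        # Partiality is explicit: indices outside the domain are errors.
        if i not in self.domain:
            raise KeyError(f"{i} is outside the data field's domain")
        return self.func(i)

    def map(self, f):
        """A collection-oriented elementwise operation; the domain carries over
        unchanged, whatever its shape."""
        return DataField(lambda i: f(self.func(i)), self.domain)

# A dense "array": the domain is a conventional bounds-style index range.
dense = DataField(lambda i: i * i, range(5))

# A sparse structure: the same abstraction, just a differently shaped domain.
sparse = DataField(lambda i: 1.0, {0, 7, 42})

print(dense.map(lambda v: v + 1)(3))   # 10
print(sorted(sparse.domain))           # [0, 7, 42]
```

Because `map` is defined once over the abstract domain, the same operation works for dense and sparse shapes alike, which is the genericity the abstract argues for.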
Development and Verification of Parallel Algorithms in the Data Field Model
 Proc. 2nd Int. Workshop on Constructive Methods for Parallel Programming
, 2000
"... . Data fields are partial functions provided with explicit domain information. They provide a very general, formal model for collections of data. Algorithms computing data collections can be described in this formalism at various levels of abstraction: in particular, explicit data distributions a ..."
Abstract

Cited by 1 (1 self)
Data fields are partial functions provided with explicit domain information. They provide a very general, formal model for collections of data. Algorithms computing data collections can be described in this formalism at various levels of abstraction: in particular, explicit data distributions are easy to model. Parallel versions of algorithms can then be formally verified against algorithm specifications in the model. Functions computing data fields can be directly programmed in the language Data Field Haskell. In this paper we give a brief introduction to the data field model. We then describe Data Field Haskell and make a small case study of how an algorithm and a parallel version of it can both be specified in the language. We then verify the correctness of the parallel version in the data field model.
1 Introduction
Many computing applications require indexed data structures. In many applications the indexing capability provides an important part of the model. On the ...
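The case-study pattern the abstract describes, a specification, a parallel version over an explicit data distribution, and a verification that the two agree, has roughly the following shape. The paper does this formally in the data field model and Data Field Haskell; the Python below is only an illustrative sketch with invented function names, and the "verification" is a test on one input rather than a proof.

```python
def spec_sum(field, domain):
    """Specification: sum a partial function over its explicit domain."""
    return sum(field(i) for i in domain)

def distribute(domain, blocks):
    """Make the data distribution explicit: split an index domain into
    `blocks` contiguous pieces."""
    d = sorted(domain)
    size = -(-len(d) // blocks)  # ceiling division
    return [d[k * size:(k + 1) * size] for k in range(blocks)]

def parallel_sum(field, domain, blocks=4):
    """Parallel version: reduce each block locally, then combine the
    per-block partial results."""
    partials = [sum(field(i) for i in block)
                for block in distribute(domain, blocks)]
    return sum(partials)

field = lambda i: i * i
domain = range(10)

# Check the parallel version against the specification on this input.
assert parallel_sum(field, domain) == spec_sum(field, domain)
print(spec_sum(field, domain))   # 285
```

In the data field model the correctness argument is algebraic (associativity of the combining operator makes the blockwise reduction equal to the specified one) rather than a per-input check.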
Array Form Transformations: Proofs of Correctness
, 1995
"... A number of program transformations are proved to preserve the meaning of programs. The transformations convert array operations expressed using a small number of generalpurpose functions into applications of a large number of functions suited to efficient implementation on an array processor. ..."
Abstract
A number of program transformations are proved to preserve the meaning of programs. The transformations convert array operations expressed using a small number of general-purpose functions into applications of a large number of functions suited to efficient implementation on an array processor.
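A classic instance of this kind of meaning-preserving array-form transformation is map fusion, sketched below. This is an illustrative example of the general idea, not necessarily one of the paper's actual transformations; the rule rewrites two traversals into one, the form an array processor can execute more efficiently, and the assertion checks meaning preservation on a sample input.

```python
def compose(f, g):
    return lambda x: f(g(x))

def map_general(f, xs):
    """General-purpose form: one elementwise map."""
    return [f(x) for x in xs]

def fuse_maps(f, g, xs):
    """Transformed form, by the rule  map f (map g xs) -> map (f . g) xs:
    a single traversal instead of two."""
    return map_general(compose(f, g), xs)

f = lambda x: x + 1
g = lambda x: 2 * x
xs = [1, 2, 3]

# Meaning preservation: the transformed program equals the original here.
assert fuse_maps(f, g, xs) == map_general(f, map_general(g, xs))
print(fuse_maps(f, g, xs))   # [3, 5, 7]
```

A proof of correctness for such a rule argues the equality for all `f`, `g`, and `xs`, typically by induction on the list, which is what the paper's proofs establish for its rules.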
A Case Study on Proving Transformations Correct: . . .
"... The issue of correctness in the context of a certain style of program transformation is investigated. This style is characterised by the fully automated application of large numbers of simple transformation rules to a representation of a functional program (serving as a specification) to produce an ..."
Abstract
The issue of correctness in the context of a certain style of program transformation is investigated. This style is characterised by the fully automated application of large numbers of simple transformation rules to a representation of a functional program (serving as a specification) to produce an equivalent, efficient imperative program. The simplicity of the transformation rules ensures that the proofs of their correctness are straightforward. A selection of transformations appropriate for use in a particular context is shown to preserve program meaning. The transformations convert array operations expressed as the application of a small number of general-purpose functions into applications of a large number of functions which are amenable to efficient implementation on an array processor.
Multilevel Neural Modelling: . . .
"... NeuModel, an ongoing project aimed at developing a neural simulation environment that is extremely computationally powerful and flexible, is described. It is shown that the use of good Software Engineering techniques in NeuModel’s design and implementation is resulting in a high performance syste ..."
Abstract
NeuModel, an ongoing project aimed at developing a neural simulation environment that is extremely computationally powerful and flexible, is described. It is shown that the use of good Software Engineering techniques in NeuModel’s design and implementation is resulting in a high performance system that is powerful and flexible enough to allow rigorous exploration of brain function at a variety of conceptual levels.
1.1 The General Problem
"... Abstract A challenge of many transformationbased generators is that they are trying to achieve three mutually antagonistic goals simultaneously: 1) deeply factored operators and operands to gain the combinatorial programming leverage provided by composition, 2) high performance code in the generat ..."
Abstract
A challenge of many transformation-based generators is that they are trying to achieve three mutually antagonistic goals simultaneously: 1) deeply factored operators and operands, to gain the combinatorial programming leverage provided by composition; 2) high-performance code in the generated program; and 3) small (i.e., practical) generation search spaces. The Anticipatory Optimization Generator (AOG) has been built to explore architectures and strategies that address this challenge. The fundamental principle underlying all of AOG's strategies is to solve separate, narrow, and specialized generation problems by strategies that are narrowly tailored to specific problems rather than a single, universal strategy aimed at all problems. A second fundamental notion is the preservation and use of domain-specific information as a way to gain extra leverage on generation problems. This paper will focus on two specific mechanisms: 1) Localization: the generation and merging of implicit control structures, and 2) Tag-Directed Transformations: a new control structure for transformation-based optimization that allows differing kinds of retained domain knowledge (e.g., optimization knowledge) to be anticipated, affixed to the component parts in the reuse library, and triggered when the time is right for its use.