Results 1 - 7 of 7
Program Derivation With Verified Transformations - A Case Study
, 1995
Abstract

Cited by 13 (3 self)
A program development methodology based on verified program transformations is described and illustrated through derivations of a high-level bisimulation algorithm and an improved minimum-state DFA algorithm. Certain doubts that were raised about the correctness of an initial paper-and-pencil derivation of the DFA minimization algorithm were laid to rest by machine-checked formal proofs of the most difficult derivational steps. Although the protracted labor involved in designing and checking these proofs was almost overwhelming, the expense was somewhat offset by a successful reuse of major portions of these proofs. In particular, the DFA minimization algorithm is obtained by specializing and then extending the last step in the derivation of the high-level bisimulation algorithm. Our experience suggests that a major focus of future research should be aimed towards improving the technology of machine-checkable proofs: their construction, presentation, and reuse. This paper demonstrat...
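The abstract does not reproduce the derivation itself, but the end product it targets, minimum-state DFA computation by partition refinement, can be sketched generically. The following Python is a Moore-style refinement illustration, not the paper's derived algorithm; all names are ours.

```python
# Minimal sketch of DFA minimization by partition refinement (Moore-style):
# start from the accepting / non-accepting split and keep splitting blocks
# until no input symbol distinguishes two states in the same block.

def minimize_dfa(states, alphabet, delta, accepting):
    """delta maps (state, symbol) -> state; returns blocks of equivalent states."""
    partition = [set(accepting), set(states) - set(accepting)]
    partition = [blk for blk in partition if blk]
    changed = True
    while changed:
        changed = False
        new_partition = []
        for block in partition:
            # Group states by which block each symbol sends them to.
            groups = {}
            for s in block:
                key = tuple(
                    next(i for i, blk in enumerate(partition) if delta[(s, a)] in blk)
                    for a in alphabet
                )
                groups.setdefault(key, set()).add(s)
            new_partition.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = new_partition
    return partition

# Usage: a 3-state DFA over {0,1} in which states 1 and 2 are equivalent.
delta = {(0, '0'): 1, (0, '1'): 2, (1, '0'): 1, (1, '1'): 2, (2, '0'): 1, (2, '1'): 2}
blocks = minimize_dfa([0, 1, 2], ['0', '1'], delta, accepting={1, 2})
print(sorted(sorted(b) for b in blocks))  # -> [[0], [1, 2]]
```

Refinement terminates because each pass either splits a block or changes nothing, and there are at most as many blocks as states.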
Compiler Optimizations for Low-level Redundancy Elimination: An Application of Meta-level Prolog Primitives
 Proc. Third Workshop on Metaprogramming in Logic (META'92)
, 1992
Abstract

Cited by 1 (1 self)
Much of the work on applications of meta-level primitives in logic programs focuses on high-level aspects such as source-level program transformation, interpretation, and partial evaluation. In this paper, we show how meta-level primitives can be used in a very simple way for low-level code optimization in compilers. The resulting code optimizer is small, simple, efficient, and easy to modify and retarget. An optimizer based on these ideas is currently being used in a compiler that we have developed for Janus [6]. 1 Introduction Much of the work on applications of meta-level primitives in logic programs focuses on high-level aspects such as source-level program transformation, interpretation, and partial evaluation. In this paper, we consider instead the use of meta-level Prolog primitives in low-level code optimization in compilers. We show how such primitives can be used in a very simple way for a low-level code optimization called common subexpression elimination. The resu...
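The paper's optimizer is built from Prolog meta-level primitives; as a language-neutral illustration of the optimization itself, here is a sketch of local common subexpression elimination over hypothetical three-address code. The instruction format, opcodes, and variable names are all illustrative, and the sketch assumes a single basic block with no redefinitions.

```python
# Minimal sketch of local common subexpression elimination over three-address
# code: each instruction is (dest, op, arg1, arg2). A value table maps each
# already-computed (op, arg1, arg2) to the variable holding its result, and
# a canonicalization map lets later uses of a copy match the original value.

def local_cse(instructions):
    canon = {}      # variable -> canonical variable holding the same value
    available = {}  # (op, canonical arg1, canonical arg2) -> canonical dest
    out = []
    for dest, op, a1, a2 in instructions:
        c1, c2 = canon.get(a1, a1), canon.get(a2, a2)
        key = (op, c1, c2)
        if key in available:
            # Reuse the earlier result instead of recomputing it.
            canon[dest] = available[key]
            out.append((dest, 'copy', available[key], None))
        else:
            available[key] = dest
            canon[dest] = dest
            out.append((dest, op, c1, c2))
    return out

code = [('t1', 'add', 'a', 'b'),
        ('t2', 'mul', 't1', 'c'),
        ('t3', 'add', 'a', 'b'),   # recomputes t1
        ('t4', 'mul', 't3', 'c')]  # recomputes t2 once t3 is canonicalized
for instr in local_cse(code):
    print(instr)
```

With the canonicalization map, the second multiplication is caught even though it is written in terms of `t3` rather than `t1`.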
Multiset Discrimination - a Method for Implementing Programming Language Systems Without Hashing
Abstract

Cited by 1 (0 self)
It is generally assumed that hashing is essential to many algorithms related to efficient compilation; e.g., symbol table formation and maintenance, grammar manipulation, basic block optimization, and global optimization. This paper questions this assumption, and initiates development of an efficient alternative compiler methodology without hashing or sorting. Underlying this methodology are several generic algorithmic tools, among which special importance is given to Multiset Discrimination, which partitions a multiset into blocks of duplicate elements. We show how multiset discrimination, together with other tools, can be tailored to rid compilation of hashing without loss in asymptotic performance. Because of the simplicity of these tools, our results may be of practical as well as theoretical interest. The various applications presented culminate with a new algorithm to solve iterated strength reduction folded with useless code elimination that runs in worst case asympto...
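The core operation, partitioning a multiset into blocks of duplicates, can be sketched as position-by-position bucketing. Note that this Python sketch uses a dict (itself a hash table) purely for brevity; the paper's point is that the per-position bucketing can be done with a directly indexed table (e.g. 256 entries for byte strings, reset lazily), avoiding hashing altogether.

```python
# Minimal sketch of multiset discrimination over equal-length byte strings:
# bucket on one position at a time, recursing within each bucket on the next
# position, until every position has been examined. Elements that survive
# together in one bucket through all positions are duplicates.

def discriminate(items):
    """Partition a multiset of equal-length byte strings into duplicate blocks."""
    def refine(block, pos):
        if pos == len(block[0]):
            return [block]            # all positions agree: one duplicate block
        buckets = {}                  # stands in for a directly indexed table
        order = []                    # preserve first-seen bucket order
        for s in block:
            byte = s[pos]
            if byte not in buckets:
                buckets[byte] = []
                order.append(byte)
            buckets[byte].append(s)
        result = []
        for byte in order:            # recurse on the next position
            result.extend(refine(buckets[byte], pos + 1))
        return result

    return refine(list(items), 0)

print(discriminate([b"cat", b"dog", b"cat", b"cow"]))
# -> [[b'cat', b'cat'], [b'cow'], [b'dog']]
```

Each element is examined once per position, so the total work is linear in the total input size, matching the asymptotic claim without any hash function.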
Computational Divided Differencing and Divided-Difference Arithmetics
, 2000
Abstract

Cited by 1 (0 self)
Tools for computational differentiation transform a program that computes a numerical function F(x) into a related program that computes F'(x) (the derivative of F). This paper describes how techniques similar to those used in computational-differentiation tools can be used to implement other program transformations, in particular, a variety of transformations for computational divided differencing. The specific technical contributions of the paper are as follows: It presents a program transformation that, given a numerical function F(x) defined by a program, creates a program that computes F[x0, x1], the first divided difference of F(x), where F[x0, x1] is defined as (F(x0) - F(x1)) / (x0 - x1) if x0 != x1, and as (d/dz)F(z) evaluated at z = x0 if x0 = x1. It shows how computational first divided differencing generalizes computational differentiation. It presents a second program transformation that permits the creation of higher-order divided differences of a numerical function de ...
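The two-sided definition can be mechanized much like forward-mode differentiation: carry F(x0), F(x1), and F[x0, x1] together, propagating all three through sum and product rules so the divided difference is never formed by the cancellation-prone subtraction. The following Python sketch is our illustration of this idea, not the paper's transformation.

```python
# Minimal sketch of a first divided-difference arithmetic. A DD value holds
# F(x0), F(x1), and F[x0, x1]; sums add componentwise, and products use the
# divided-difference product rule:
#   (F*G)[x0, x1] = F(x0) * G[x0, x1] + F[x0, x1] * G(x1)
# This avoids computing (F(x0) - F(x1)) / (x0 - x1) explicitly, and it also
# covers the confluent case x0 == x1, where F[x0, x0] = F'(x0).

class DD:
    def __init__(self, v0, v1, dd):
        self.v0, self.v1, self.dd = v0, v1, dd  # F(x0), F(x1), F[x0, x1]

    def _lift(self, other):
        return other if isinstance(other, DD) else DD(other, other, 0.0)

    def __add__(self, other):
        other = self._lift(other)
        return DD(self.v0 + other.v0, self.v1 + other.v1, self.dd + other.dd)

    __radd__ = __add__

    def __mul__(self, other):
        other = self._lift(other)
        return DD(self.v0 * other.v0, self.v1 * other.v1,
                  self.v0 * other.dd + self.dd * other.v1)

    __rmul__ = __mul__

def variable(x0, x1):
    return DD(x0, x1, 1.0)  # the identity function has divided difference 1

# F(x) = x*x + 3*x; F[2, 5] = (F(5) - F(2)) / (5 - 2) = (40 - 10) / 3 = 10
x = variable(2.0, 5.0)
F = x * x + 3 * x
print(F.dd)  # -> 10.0
```

Setting x0 == x1 makes the same code compute the derivative: with `variable(2.0, 2.0)` the divided difference of x*x + 3*x comes out as 7.0, which is F'(2).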
Binary Decision Diagrams and Applications for Reliability Analysis
, 2000
Abstract
This thesis investigates practical and theoretical concerns for the use of Binary Decision Diagrams (BDDs) for qualitative and quantitative risk assessments of complex systems. Boolean models describing failure relationships between components, and fault trees in particular, are boolean formulas whose variables are individual component failures; assessment of these models can be performed by analysis of the boolean function induced by the formula. Resource consumption for BDD computations, which is determined by the form of the boolean formula and the order imposed on its variables, is in many cases exponentially smaller than the truth table for the function. The use of Binary Decision Diagrams has made possible orders-of-magnitude increases in the complexity of systems that can be assessed efficiently. Nonetheless, the practical limits of straightforward use of BDDs for reliability analysis are often surpassed by real-world systems. Understanding why this happens is the first subject...
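The quantitative computation underneath can be sketched as plain Shannon decomposition over an ordered variable list: branch on each component failing or not, weighting the branches by the component's failure probability. BDDs make this same recursion tractable by sharing identical subfunctions, which the brute-force sketch below deliberately omits; the fault tree and probabilities are hypothetical.

```python
# Minimal sketch of top-event probability by Shannon decomposition: for each
# component in order, split on "failed" vs "working" and weight the two
# subproblems by the component's failure probability. Without subfunction
# sharing (the thing a BDD provides) this enumerates all assignments.

def top_event_probability(formula, variables, prob, env=None):
    """formula maps a dict of component -> failed? to the system-failure bool."""
    env = env if env is not None else {}
    if not variables:
        return 1.0 if formula(env) else 0.0
    v, rest = variables[0], variables[1:]
    p = prob[v]
    return (p * top_event_probability(formula, rest, prob, {**env, v: True}) +
            (1 - p) * top_event_probability(formula, rest, prob, {**env, v: False}))

# Hypothetical fault tree: the system fails if A fails, or both B and C fail.
fails = lambda env: env['A'] or (env['B'] and env['C'])
p = top_event_probability(fails, ['A', 'B', 'C'],
                          {'A': 0.01, 'B': 0.1, 'C': 0.2})
print(round(p, 6))  # 0.01 + 0.99 * (0.1 * 0.2) -> 0.0298
```

A BDD evaluates the same sum, but each distinct residual function is visited once, which is where the exponential savings mentioned above come from.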
Compiler Optimizations Should Pay for Themselves - Applying the Spirit of Oberon to Code Optimization by Compiler
Abstract
Optimizing compilers tend to be much larger and much slower than their straightforward counterparts. Their designers usually do not follow Oberon's maxim of making things "as simple as possible", but are inclined to completely disregard cost (in terms of compiler size, compilation speed, and maintainability) in favor of code-quality benefits that often turn out to be relatively marginal. Trying to make an optimizing compiler as simple as possible and yet as powerful as necessary requires, before all else, a measurement standard, by which both simplicity and power can be judged. For a compiler that is written in the language it compiles, two such standards are easily found by considering first the time required for self-compilation, and then the size of the resulting object program. With the help of these benchmarks, one may pit simplicity against power, requiring that every new capability added to the compiler "pays its own way" by creating more benefit than cost on account of at leas...
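The two yardsticks, self-compilation time and object-program size, are straightforward to automate. The harness below is a hypothetical sketch; the compiler command and output path are placeholders for whatever self-hosting compiler is being measured.

```python
# Minimal sketch of the two-yardstick measurement for a self-hosting compiler:
# wall-clock time of a compilation run, and size of the object file it emits.
# Comparing the pair before and after adding an optimization shows whether
# the optimization "pays its own way" on the compiler's own source.

import os
import subprocess
import time

def benchmark(compile_cmd, output_path):
    """Run compile_cmd, returning (elapsed seconds, size in bytes of output_path)."""
    start = time.perf_counter()
    subprocess.run(compile_cmd, check=True)   # raises if the compile fails
    elapsed = time.perf_counter() - start
    size = os.path.getsize(output_path)
    return elapsed, size

# Hypothetical usage (command and paths are placeholders):
#   elapsed, size = benchmark(["oc", "Compiler.Mod"], "Compiler.obj")
#   print(f"self-compilation: {elapsed:.2f}s, object size: {size} bytes")
```

An optimization passes the test when the post-change pair dominates the pre-change pair, or at least wins on whichever of the two measures the change was meant to improve without unduly hurting the other.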