Results 1–6 of 6
From interpreting to compiling binding times
Proceedings of the 3rd European Symposium on Programming, 1990
Abstract

Cited by 26 (8 self)
The key to realistic self-applicable partial evaluation is to analyze binding times in the source program, i.e., whether the result of partially evaluating a source expression is static or dynamic, given a static/dynamic division of the input. Source programs are specialized with respect to the static part of their input. When a source expression depends on the concrete result of specializing another expression, the binding time of this other expression is first interpreted. A safe approximation of these abstract values is computed by binding-time analysis. This paper points out that this value-based information can be compiled into control-based directives driving the specializer as to what to do for each expression, instead of how to use the result of partially evaluating an expression. This compilation is achieved by a non-standard interpretation of the specialization semantics, based on the observation that a source expression is either reduced or rebuilt. The result is an action tree isomorphic to the abstract syntax tree of the source program. This approach suggests reorganizing the specializer so that it is driven first...
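The core idea above, classifying each expression as static or dynamic given a division of the inputs, can be sketched in a few lines. This is a minimal illustration, not the paper's analysis: the tuple-based expression representation (`'const'`, `'var'`, `'add'`) and the `bta` function are hypothetical names chosen for the sketch.

```python
def bta(expr, division):
    """Return 'S' (static) or 'D' (dynamic) for expr under a variable
    division such as {'x': 'S', 'y': 'D'}.

    Expressions are tuples: ('const', v), ('var', name), ('add', e1, e2).
    """
    kind = expr[0]
    if kind == 'const':
        return 'S'                      # literals are always static
    if kind == 'var':
        return division[expr[1]]        # look up the input division
    if kind == 'add':
        # an operation is dynamic as soon as any operand is dynamic
        tags = [bta(e, division) for e in expr[1:]]
        return 'D' if 'D' in tags else 'S'
    raise ValueError(f'unknown expression kind: {kind}')
```

In the paper's terms, the `'S'`/`'D'` tags are the value-based information that can then be compiled into per-expression directives (reduce vs. rebuild) driving the specializer.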
Semantics-Directed Compilation of Non-Linear Patterns
Information Processing Letters, 1990
Abstract

Cited by 18 (4 self)
This paper describes the automatic derivation of compiled patterns and of a pattern compiler by partial evaluation. Compiling a pattern is achieved by specializing a pattern-matching program with respect to the pattern. Generating a pattern compiler is achieved by specializing the specializer with respect to the pattern-matching program, i.e., by self-applying the partial evaluator. The compiled patterns and the compiler are semantics-based because they are obtained using meaning-preserving transformations upon the definitional pattern-matching program and the partial evaluator. The results are unexpectedly good: not only are all the operations depending on the pattern (syntax analysis, resolution of cross-references due to the non-linearity) performed at compile time, but whereas the general pattern matcher builds the substitution environment incrementally, wasting that work in case of failure, compiled patterns perform all the structural and equality tests first, and build the result on...
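The payoff described above, structural and equality tests first, substitution built only on success, can be mimicked by hand. This sketch is not the paper's self-applied partial evaluator; `compile_pattern`, the tuple-based terms, and the uppercase-string convention for variables are assumptions made for illustration.

```python
def fetch(term, path):
    """Follow a tuple of indices down into a nested-tuple term."""
    for i in path:
        term = term[i]
    return term

def compile_pattern(pat):
    """Compile a (possibly non-linear) pattern into a matcher that runs
    all structural/equality tests first and builds the substitution last.

    Patterns are nested tuples; uppercase strings are variables, so a
    repeated variable such as 'X' in ('f', 'X', 'X') induces an equality test.
    """
    tests, binds = [], {}               # tests to run; var -> first path seen
    def walk(p, path):
        if isinstance(p, str) and p.isupper():          # pattern variable
            if p in binds:                              # non-linear occurrence
                tests.append(lambda t, a=binds[p], b=path:
                             fetch(t, a) == fetch(t, b))
            else:
                binds[p] = path
        elif isinstance(p, tuple):                      # structural test
            tests.append(lambda t, a=path, n=len(p):
                         isinstance(fetch(t, a), tuple) and len(fetch(t, a)) == n)
            for i, sub in enumerate(p):
                walk(sub, path + (i,))
        else:                                           # constant test
            tests.append(lambda t, a=path, c=p: fetch(t, a) == c)
    walk(pat, ())
    def matcher(term):
        if all(test(term) for test in tests):
            return {v: fetch(term, p) for v, p in binds.items()}
        return None                                     # nothing built on failure
    return matcher
```

For example, `compile_pattern(('f', 'X', 'X'))` yields a matcher that checks arity, the constant `'f'`, and the equality of the two argument positions before allocating any environment.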
Automatic generation of efficient string matching algorithms by generalized partial computation
In Chin [8], 2002
Abstract

Cited by 3 (0 self)
This paper shows that Generalized Partial Computation (GPC) can automatically generate efficient string matching algorithms. GPC is a program transformation method utilizing partial information about input data and auxiliary functions as well as the logical structure of a source program. GPC uses both a classical partial evaluator and an inference engine such as a theorem prover to optimize programs. First, we show that a Boyer-Moore (BM) type pattern matcher without the bad-character heuristic can be generated from a simple non-linear backward matcher by GPC. This sort of problem has already been discussed in the literature using offline partial evaluators. However, there was no proof that every generated matcher runs in the same way as the BM. In this paper we prove that the problem can be solved starting from a simple non-linear pattern matcher as a source program. We also prove that a Knuth-Morris-Pratt (KMP) type linear string matcher can be generated from a naive non-linear forward matcher by GPC.
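To make the KMP claim concrete, here is the kind of linear matcher such a transformation derives. This is not GPC itself: it is a hand-written KMP matcher whose failure table plays the role of the pattern-specific knowledge that a specializer effectively precomputes from the naive forward matcher.

```python
def kmp_table(pattern):
    """Failure table: fail[i] is the length of the longest proper
    border of pattern[:i+1]. This is what specialization on the
    pattern bakes into the residual matcher."""
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    return fail

def kmp_search(text, pattern):
    """Return the index of the first occurrence of pattern in text,
    or -1; runs in O(len(text)) because the text pointer never backs up."""
    fail, k = kmp_table(pattern), 0
    for i, c in enumerate(text):
        while k and c != pattern[k]:
            k = fail[k - 1]             # reuse the partial match
        if c == pattern[k]:
            k += 1
        if k == len(pattern):
            return i - k + 1
    return -1
```

The naive forward matcher re-scans text characters after a mismatch; the table lets the derived matcher skip that re-scan, which is exactly the improvement the paper proves GPC obtains automatically.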
Computing Types During Partial Evaluation
1990
Abstract

Cited by 1 (0 self)
We have developed techniques for obtaining and using type information during program specialization (partial evaluation). Computed along with every residual expression and every specialized program is type information that bounds the possible values that the specialized program will compute at run time. The three keystones of this research are symbolic values that represent both the set of values that might be computed at run time and the code for creating the run-time value, generalization of symbolic values, and the use of online fixed-point iterations for computing the type of values returned by specialized recursive functions. This work differs from previous specializers in computing type information for all residual expressions, including residual if expressions and residual calls to specialized user functions. The specializer exploits the type information it computes to increase the efficiency of specialized functions. In particular, this research allows the class of recursive d...
, A. Cristóbal-Salas
Abstract
In this paper, a partial evaluation technique to reduce communication costs of distributed image processing is presented. It combines application of incomplete structures and partial evaluation together with classical program optimizations such as constant propagation, loop unrolling, and dead-code elimination. Through a detailed performance analysis, we establish conditions under which the technique is beneficial.
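The classical optimizations the abstract lists compose naturally when a kernel is partially known. This sketch, with hypothetical function names and a made-up 3-tap kernel, shows what specializing a generic 1-D convolution on the kernel `[1, 0, -1]` produces: the tap loop unrolled and the zero-weight tap dead-code-eliminated.

```python
def conv_generic(signal, kernel):
    """Generic 1-D convolution: loops over every tap for every output."""
    n, k = len(signal), len(kernel)
    return [sum(kernel[j] * signal[i + j] for j in range(k))
            for i in range(n - k + 1)]

def conv_specialized(signal):
    """Residual program for kernel = [1, 0, -1]: constant propagation
    fixed the weights, loop unrolling removed the inner loop, and
    dead-code elimination dropped the zero-weight middle tap."""
    return [signal[i] - signal[i + 2] for i in range(len(signal) - 2)]
```

In the distributed setting the abstract targets, the same specialization also shrinks what must be communicated: only the data the residual program actually touches needs to move between nodes.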