Results 1 – 7 of 7
Lazy Code Motion
1992
Abstract

Cited by 157 (20 self)
We present a bitvector algorithm for the optimal and economical placement of computations within flow graphs, which is as efficient as standard unidirectional analyses. The point of our algorithm is the decomposition of the bidirectional structure of the known placement algorithms into a sequence of a backward and a forward analysis, which directly implies the efficiency result. Moreover, the new compositional structure opens the algorithm for modification: two further unidirectional analysis components exclude any unnecessary code motion. This laziness of our algorithm minimizes the register pressure, which has drastic effects on the runtime behaviour of the optimized programs in practice, where an economical use of registers is essential.
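The backward-then-forward decomposition the abstract describes can be illustrated with a toy sketch. This is not the paper's full lazy-code-motion algorithm (which adds two further unidirectional passes to delay placements and minimise register pressure); it shows only the first half, busy-code-motion style: a backward anticipability analysis followed by a forward availability analysis, for a single expression tracked as one bit per node. The diamond-shaped CFG, node numbering, and use/kill sets are assumptions invented for the example.

```python
# Hypothetical diamond CFG: 1 -> {2, 3} -> 4. Nodes 2, 3, 4 compute the
# same expression; no node redefines its operands. All sets are assumptions.
succ = {1: [2, 3], 2: [4], 3: [4], 4: []}
pred = {1: [], 2: [1], 3: [1], 4: [2, 3]}
use = {1: False, 2: True, 3: True, 4: True}    # node computes the expression
kill = {1: False, 2: False, 3: False, 4: False}  # node redefines an operand

def backward_anticipability(succ, use, kill):
    """Backward pass: expression is computed on every path from here on."""
    ant_in = {n: False for n in succ}
    ant_out = {n: False for n in succ}
    changed = True
    while changed:
        changed = False
        for n in succ:
            out = all(ant_in[s] for s in succ[n]) if succ[n] else False
            inn = use[n] or (out and not kill[n])
            if (inn, out) != (ant_in[n], ant_out[n]):
                ant_in[n], ant_out[n], changed = inn, out, True
    return ant_in, ant_out

ant_in, ant_out = backward_anticipability(succ, use, kill)

# Earliest placement points: anticipable here, but not on exit from
# every predecessor (or we are at the entry node).
earliest = {n for n in succ
            if ant_in[n] and (not pred[n]
                              or any(kill[p] or not ant_out[p] for p in pred[n]))}

def forward_availability(pred, use, kill, placed):
    """Forward pass: expression already computed (originally or hoisted)
    on every path reaching this node."""
    avail_in = {n: False for n in pred}
    changed = True
    while changed:
        changed = False
        for n in pred:
            inn = (all((p in placed or use[p] or avail_in[p]) and not kill[p]
                       for p in pred[n])
                   if pred[n] else False)
            if inn != avail_in[n]:
                avail_in[n], changed = inn, True
    return avail_in

avail_in = forward_availability(pred, use, kill, earliest)
# Original computations that become references to the hoisted value.
replaced = {n for n in pred if use[n] and (n in earliest or avail_in[n])}
```

On this CFG the expression is hoisted to node 1 and the three original computations become redundant; each pass is a plain unidirectional bitvector iteration, which is the efficiency point of the abstract.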
The Interprocedural Coincidence Theorem
In Int. Conf. on Comp. Construct, 1992
Abstract

Cited by 87 (11 self)
We present an interprocedural generalization of the well-known (intraprocedural) Coincidence Theorem of Kam and Ullman, which provides a sufficient condition for the equivalence of the meet over all paths (MOP) solution and the maximal fixed point (MFP) solution to a data flow analysis problem. This generalization covers arbitrary imperative programs with recursive procedures, global and local variables, and formal value parameters. In the absence of procedures, it reduces to the classical intraprocedural version. In particular, our stack-based approach generalizes the coincidence theorems of Barth and Sharir/Pnueli for the same setup, which do not properly deal with local variables of recursive procedures.

1 Motivation. Data flow analysis is a classical method for the static analysis of programs that supports the generation of efficient object code by "optimizing" compilers (cf. [He, MJ]). For imperative languages, it provides information about the program states that may occur at s...
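For intuition about the intraprocedural setting the theorem generalizes, here is a hedged sketch: in a distributive framework (a reaching-definitions-style analysis, with sets as the lattice and union as meet), the MFP solution computed by worklist iteration coincides with the MOP solution obtained by joining over all paths, as Kam and Ullman's theorem guarantees. The CFG, gen/kill sets, and definition names are illustrative assumptions, and path enumeration assumes the CFG is acyclic.

```python
# Toy distributive framework: lattice = sets of definitions, meet = union,
# transfer f(X) = gen ∪ (X − kill). CFG and gen/kill sets are assumptions.
succ = {"entry": ["a", "b"], "a": ["exit"], "b": ["exit"], "exit": []}
pred = {"entry": [], "a": ["entry"], "b": ["entry"], "exit": ["a", "b"]}
gen = {"entry": {"d0"}, "a": {"d1"}, "b": {"d2"}, "exit": set()}
kill = {"entry": set(), "a": {"d0"}, "b": set(), "exit": set()}

def transfer(n, x):
    return gen[n] | (x - kill[n])

def mfp():
    """Maximal fixed point via worklist iteration."""
    out = {n: set() for n in succ}
    work = list(succ)
    while work:
        n = work.pop()
        inn = set().union(*(out[p] for p in pred[n])) if pred[n] else set()
        new = transfer(n, inn)
        if new != out[n]:
            out[n] = new
            work.extend(succ[n])   # re-examine successors
    return out

def mop(target):
    """Meet (here: union) over all entry-to-target paths; assumes acyclic CFG."""
    def paths(n):
        if n == "entry":
            yield ["entry"]
        for p in pred[n]:
            for path in paths(p):
                yield path + [n]
    result = set()
    for path in paths(target):
        x = set()
        for n in path:
            x = transfer(n, x)
        result |= x
    return result
```

Because the transfer functions distribute over union, `mfp()["exit"]` and `mop("exit")` agree; for non-distributive frameworks (e.g. constant propagation) MFP may be strictly less precise than MOP, which is why a sufficient condition for coincidence matters.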
Automatic Derivation of Path and Loop Annotations in Object-Oriented Real-Time Programs
Journal of Parallel and Distributed Computing Practices, 1998
Abstract

Cited by 17 (2 self)
This paper will present a new method, based on the notion of abstract interpretation, that can be used to derive path and loop annotations automatically for object-oriented real-time programs. Normally, these annotations, necessary to get a tight calculation of the worst case execution time (WCET), must be given manually by the programmer. The method is illustrated by showing the analysis of an example in Smalltalk. Keywords: real-time software, object-oriented programming, execution time analysis.

1 Introduction. The execution time of most programs varies, depending on the input data and the system state. For programs with some complexity, it can be hard to find the input data that causes the actual worst case execution time, WCET_A. Therefore, measurement is not considered a feasible method in the general case. Instead, static analysis, which from the source code derives WCET_C (the calculated worst case execution time), has been proposed by many researchers [PK89, PS90, CBW94, LM95, ...
Towards a Tool Kit for the Automatic Generation of Interprocedural Data Flow Analyses
1996
Abstract

Cited by 15 (5 self)
this article, the classical application of DFA. In this context, designers of a DFA are typically faced with the problem of how to construct an algorithm that determines the set of program points of an argument program which satisfy a certain property of interest. Though this problem has been studied in detail for the intraprocedural case, the construction of interprocedural analyses is still
Abstract Interpretation using Attribute Grammars
Volume 461 of LNCS, 1990
Abstract

Cited by 2 (2 self)
This paper deals with the correctness proofs of attribute grammars using methods from abstract interpretation. The technique will be described by defining a live-variable analysis for a small flowchart language and proving it correct with respect to a continuation-style semantics. The proof technique is based on fixpoint induction and introduces an extended class of attribute grammars so as to express a standard semantics.
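The live-variable analysis used as the running example can be sketched as a plain backward dataflow computation: a variable is live at a point if it may be read before being redefined. The three-statement flowchart program and its use/def sets below are assumptions made up for illustration; the paper's attribute-grammar formulation and its correctness proof against a continuation-style semantics are not reproduced here.

```python
# Flowchart program (assumed): 1: x := 1;  2: y := x + 1;  3: print(y)
succ = {1: [2], 2: [3], 3: []}
use = {1: set(), 2: {"x"}, 3: {"y"}}    # variables read at each node
defs = {1: {"x"}, 2: {"y"}, 3: set()}   # variables written at each node

def live_variables(succ, use, defs):
    """Backward analysis: LiveIn[n] = use[n] ∪ (LiveOut[n] − defs[n]),
    with LiveOut[n] the union of LiveIn over n's successors."""
    live_in = {n: set() for n in succ}
    changed = True
    while changed:
        changed = False
        for n in succ:
            out = set().union(*(live_in[s] for s in succ[n])) if succ[n] else set()
            new = use[n] | (out - defs[n])
            if new != live_in[n]:
                live_in[n], changed = new, True
    return live_in

live = live_variables(succ, use, defs)
```

So `x` is live on entry to statement 2 and `y` on entry to statement 3, while nothing is live before statement 1.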
Abstract Interpretation and Attribute Grammars
1992
Abstract

Cited by 1 (1 self)
The objective of this thesis is to explore the connections between abstract interpretation and attribute grammars as frameworks in program analysis. Abstract interpretation is a semantics-based program analysis method. A large class of data flow analysis problems can be expressed as non-standard semantics where the “meaning” contains information about the runtime behaviour of programs. In an abstract interpretation the analysis is proved correct by relating it to the usual semantics for the language. Attribute grammars provide a method and notation to specify code generation and program analysis directly from the syntax of the programming language. They are especially used for describing compilation of programming languages, and very efficient evaluators have been developed for subclasses of attribute grammars. By relating abstract interpretation and attribute grammars we obtain a closer connection between the specification and implementation of abstract interpretations, which at the same time facilitates the correctness proofs of interpretations. Implementation and specification of abstract interpretations using circular attribute grammars is realised with an evaluator system for a class of domain-theoretic attribute grammars. In this system the circularity of attribute grammars is resolved by fixpoint iteration. The use of finite lattices in abstract interpretations requires automatic generation of specialised fixpoint iterators. This is done using a technique called lazy fixpoint iteration, which is presented in the thesis. Methods from abstract interpretation can also be used in correctness proofs of attribute grammars. This proof technique introduces a new class of attribute grammars based on domain theory. This method is illustrated with examples.
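The idea of resolving a circular attribute by fixpoint iteration over a finite lattice can be sketched with the classic "nullable" attribute of a grammar: whether a nonterminal can derive the empty string depends circularly on itself for recursive grammars, but iterating from the bottom element (False) of the boolean lattice until nothing changes yields the least fixed point, and termination is guaranteed because the lattice is finite. The toy grammar below is an assumption for illustration; the thesis's evaluator system and lazy fixpoint iteration technique are not reproduced.

```python
# Productions: nonterminal -> list of alternatives; each alternative is a
# sequence of symbols (keys of `grammar` are nonterminals, anything else
# is a terminal). The grammar itself is a made-up example.
grammar = {
    "S": [["A", "B"], ["x"]],
    "A": [["a", "S"], []],      # [] is the empty production
    "B": [["b", "B"], []],
    "C": [["c"]],               # never derives the empty string
}

def nullable(grammar):
    """Circular attribute evaluated by fixpoint iteration on the
    finite lattice {False < True}, starting from bottom."""
    null = {nt: False for nt in grammar}
    changed = True
    while changed:
        changed = False
        for nt, alts in grammar.items():
            # An alternative is nullable iff every symbol in it is a
            # nullable nonterminal; terminals are never nullable.
            value = any(all(null.get(sym, False) for sym in alt)
                        for alt in alts)
            if value != null[nt]:
                null[nt] = value
                changed = True
    return null
```

Here `A` and `B` are nullable via their empty productions, `S` becomes nullable one iteration later through `A B`, and `C` stays at the bottom element.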
Program Timing Analysis
1994
Abstract

Cited by 1 (0 self)
This report is submitted as a first year qualifying dissertation. The field of program timing analysis is surveyed in detail. The review includes details of relevant static analysis techniques, high-level program analysis, low-level timing analysis, and example languages and tools. Each of the techniques used in timing analysis is considered in detail, followed by a look at six contemporary tools; the strengths and weaknesses of these are highlighted. From these conclusions, a plan for future research is proposed. Finally, the Ada language and the SPARK system are also considered.