Results 1 - 10 of 649
Program Analysis and Specialization for the C Programming Language
, 1994
"... Software engineers are faced with a dilemma. They want to write general and wellstructured programs that are flexible and easy to maintain. On the other hand, generality has a price: efficiency. A specialized program solving a particular problem is often significantly faster than a general program. ..."
Abstract
-
Cited by 629 (0 self)
- Add to MetaCart
Software engineers are faced with a dilemma. They want to write general and well-structured programs that are flexible and easy to maintain. On the other hand, generality has a price: efficiency. A specialized program solving a particular problem is often significantly faster than a general program. However, the development of specialized software is time-consuming and is likely to exceed what today's programmers can produce. New techniques are required to solve this so-called software crisis. Partial evaluation is a program specialization technique that reconciles the benefits of generality with efficiency. This thesis presents an automatic partial evaluator for the ANSI C programming language. The subject of the thesis is the analysis and transformation of C programs. We develop several analyses that support the transformation of a program into its generating extension. A generating extension is a program that produces specialized programs when executed on parts of the input. The thesis contains the following main results.
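The abstract's key notion, a generating extension, can be made concrete with a small sketch (in Haskell rather than the thesis's C setting; the power example and the names genPower and powerN are illustrative assumptions, not taken from the thesis): run on only the static part of the input, the generating extension emits the text of a program specialized to that part.

    -- A general program: tests and recurses on the exponent at every call.
    power :: Int -> Int -> Int
    power 0 _ = 1
    power n x = x * power (n - 1) x

    -- A generating extension for power: given only the static input n,
    -- it does not compute a number but emits the text of a residual
    -- program in which the recursion on n has been unfolded away.
    genPower :: Int -> String
    genPower n = "powerN :: Int -> Int\npowerN x = " ++ body n ++ "\n"
      where
        body 0 = "1"
        body k = "x * " ++ body (k - 1)

    main :: IO ()
    main = putStr (genPower 3)
    -- Output:
    --   powerN :: Int -> Int
    --   powerN x = x * x * x * 1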
Refactoring Object-Oriented Frameworks
, 1992
"... This thesis defines a set of program restructuring operations (refactorings) that support the design, evolution and reuse of object-oriented application frameworks. The focus of the thesis is on automating the refactorings in a way that preserves the behavior of a program. The refactorings are defin ..."
Abstract
-
Cited by 489 (4 self)
- Add to MetaCart
(Show Context)
This thesis defines a set of program restructuring operations (refactorings) that support the design, evolution and reuse of object-oriented application frameworks. The focus of the thesis is on automating the refactorings in a way that preserves the behavior of a program. The refactorings are defined to be behavior preserving, provided that their preconditions are met. Most of the refactorings are simple to implement and it is almost trivial to show that they are behavior preserving. However, for a few refactorings, one or more of their preconditions are in general undecidable. Fortunately, for some cases it can be determined whether these refactorings can be applied safely. Three of the most complex refactorings are defined in detail: generalizing the inheritance hierarchy, specializing the inheritance hierarchy and using aggregations to model the relationships among classes. These operations are decomposed into more primitive parts, and the power of these operations is discussed from the perspectives of automatability and usefulness in supporting design. Two design constraints needed in refactoring are class invariants and exclusive components. These constraints are needed to ensure that behavior is preserved across some refactorings. This thesis gives some conservative algorithms for determining whether a program satisfies these constraints, and describes how to use this design information to refactor a program.
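The thesis's refactorings target object-oriented class hierarchies, so any Haskell rendering is only a loose analogue; still, the shape of one refactoring named above (generalizing the hierarchy by pulling a common protocol upward) and the role of its precondition can be sketched as follows, with all types and names invented for the illustration.

    -- Before: two concrete types with structurally similar operations.
    data Circle = Circle Double   -- radius
    data Square = Square Double   -- side length

    circleArea :: Circle -> Double
    circleArea (Circle r) = pi * r * r

    squareArea :: Square -> Double
    squareArea (Square s) = s * s

    -- After "generalizing the hierarchy": the common protocol is pulled
    -- up into an interface.  Precondition (cf. the thesis's emphasis on
    -- preconditions): the pulled-up operations must have compatible
    -- signatures and must not depend on another subclass's internals.
    class Shape a where
      area :: a -> Double

    instance Shape Circle where area = circleArea
    instance Shape Square where area = squareArea

    -- Client code written against the generalized interface preserves
    -- behavior: each call still dispatches to the original definition.
    totalArea :: Shape a => [a] -> Double
    totalArea = sum . map area

    main :: IO ()
    main = print (totalArea [Circle 1, Circle 2] + totalArea [Square 3])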
Deforestation: Transforming Programs to Eliminate Trees
- Theoretical Computer Science
, 1988
"... ..."
(Show Context)
An Efficient Unification Algorithm
- ACM Transactions on Programming Languages and Systems (TOPLAS)
, 1982
"... The unification problem in first-order predicate calculus is described in general terms as the solution of a system of equations, and a nondeterministic algorithm is given. A new unification algorithm, characterized by having the acyclicity test efficiently embedded into it, is derived from the nond ..."
Abstract
-
Cited by 371 (1 self)
- Add to MetaCart
The unification problem in first-order predicate calculus is described in general terms as the solution of a system of equations, and a nondeterministic algorithm is given. A new unification algorithm, characterized by having the acyclicity test efficiently embedded into it, is derived from the nondeterministic one, and a PASCAL implementation is given. A comparison with other well-known unification algorithms shows that the algorithm described here performs well in all cases.
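For readers unfamiliar with the problem, the equational view of unification can be sketched as follows. This is the straightforward substitution-based algorithm with an explicit occurs (acyclicity) check, not the paper's efficient algorithm in which that test is embedded; the term representation and names are assumptions made for the sketch.

    import qualified Data.Map as M

    -- First-order terms: variables, or function symbols applied to arguments.
    data Term = Var String | Fun String [Term] deriving (Eq, Show)

    type Subst = M.Map String Term

    -- Apply a substitution, chasing variable bindings.
    apply :: Subst -> Term -> Term
    apply s t@(Var v)  = maybe t (apply s) (M.lookup v s)
    apply s (Fun f ts) = Fun f (map (apply s) ts)

    -- The occurs (acyclicity) check: a variable may not be bound to a
    -- term containing itself, or the "solution" would be an infinite term.
    occurs :: String -> Term -> Bool
    occurs v (Var w)    = v == w
    occurs v (Fun _ ts) = any (occurs v) ts

    -- Unify two terms under an accumulated substitution.
    unify :: Subst -> Term -> Term -> Maybe Subst
    unify s a b = go (apply s a) (apply s b)
      where
        go (Var v) t
          | t == Var v = Just s
          | occurs v t = Nothing                  -- cyclic equation: no unifier
          | otherwise  = Just (M.insert v t s)
        go t u@(Var _) = go u t
        go (Fun f as) (Fun g bs)
          | f == g && length as == length bs = unifyAll s (zip as bs)
          | otherwise                        = Nothing

    -- Solve a whole system of equations, threading the substitution through.
    unifyAll :: Subst -> [(Term, Term)] -> Maybe Subst
    unifyAll s []             = Just s
    unifyAll s ((a, b) : eqs) = unify s a b >>= \s' -> unifyAll s' eqs

    main :: IO ()
    main = do
      -- f(X, g(Y)) = f(h(Y), g(a))  has most general unifier  {X -> h(a), Y -> a}
      let lhs = Fun "f" [Var "X", Fun "g" [Var "Y"]]
          rhs = Fun "f" [Fun "h" [Var "Y"], Fun "g" [Fun "a" []]]
      print (fmap (\s -> apply s (Var "X")) (unify M.empty lhs rhs))
      -- X = f(X) is rejected by the occurs check
      print (unify M.empty (Var "X") (Fun "f" [Var "X"]))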
Higher-Order Abstract Syntax
"... We describe motivation, design, use, and implementation of higher-order abstract syntax as a central representation for programs, formulas, rules, and other syntactic objects in program manipulation and other formal systems where matching and substitution or syntax incorporates name binding informat ..."
Abstract
-
Cited by 357 (17 self)
- Add to MetaCart
We describe motivation, design, use, and implementation of higher-order abstract syntax as a central representation for programs, formulas, rules, and other syntactic objects in program manipulation and other formal systems where matching and substitution or unification are central operations. Higher-order abstract syntax incorporates name binding information in a uniform and language-generic way. Thus it acts as a powerful link integrating diverse tools in such formal environments. We have implemented higher-order abstract syntax, a supporting matching and unification algorithm, and some clients in Common Lisp.
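A minimal rendering of the idea (in Haskell; the paper's own setting and implementation differ, and the datatype and printer below are assumptions made for the sketch): binders in the object language are represented by functions of the metalanguage, so substitution and renaming need not be re-implemented.

    -- Object-language terms.  The binder case (Lam) reuses metalanguage
    -- functions to carry the binding structure: that is the "higher-order"
    -- part of higher-order abstract syntax.
    data Exp
      = Const String
      | App Exp Exp
      | Lam (Exp -> Exp)

    -- Beta reduction at the root needs no environments, renaming, or
    -- de Bruijn indices: the metalanguage performs the substitution.
    betaReduce :: Exp -> Exp
    betaReduce (App (Lam f) a) = f a
    betaReduce e               = e

    -- A printer; generating fresh names is the one place where the
    -- metalanguage functions are turned back into concrete binders.
    pretty :: Int -> Exp -> String
    pretty _ (Const c) = c
    pretty n (App f a) = "(" ++ pretty n f ++ " " ++ pretty n a ++ ")"
    pretty n (Lam f)   = "(\\" ++ v ++ ". " ++ pretty (n + 1) (f (Const v)) ++ ")"
      where v = "x" ++ show n

    main :: IO ()
    main = do
      let constK = Lam (\x -> Lam (\_ -> x))       -- \x. \y. x
      putStrLn (pretty 0 (App constK (Const "a")))
      putStrLn (pretty 0 (betaReduce (App constK (Const "a"))))
    -- Prints:  ((\x0. (\x1. x0)) a)   and then   (\x0. a)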
A real-time garbage collector based on the lifetimes of objects
- Communications of the ACM
, 1983
"... ABSTRACT: In previous heap storage systems, the cost of creating objects and garbage collection is independent of the lifetime of the object. Since objects with short lifetimes account for a large portion of storage use, it is worth optimizing a garbage collector to reclaim storage for these objects ..."
Abstract
-
Cited by 268 (1 self)
- Add to MetaCart
(Show Context)
In previous heap storage systems, the cost of creating objects and garbage collection is independent of the lifetime of the object. Since objects with short lifetimes account for a large portion of storage use, it is worth optimizing a garbage collector to reclaim storage for these objects more quickly. The garbage collector should spend proportionately less effort reclaiming objects with longer lifetimes. We present a garbage collection algorithm that (1) makes storage for short-lived objects cheaper than storage for long-lived objects, (2) operates in real time (object creation and access times are bounded), (3) increases locality of reference for better virtual memory performance, and (4) works well with multiple processors and a large address space.
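The policy the abstract describes (concentrate collection effort on young objects, touch long-lived ones rarely) can be illustrated with a toy, purely functional model. This is not the paper's collector; the object model, promotion threshold, and lifetime distribution below are assumptions invented for the sketch.

    import Data.List (partition)

    -- A heap object in the model: when it dies, and how many nursery
    -- scans it has survived so far.
    data Obj = Obj { diesAt :: Int, survivals :: Int } deriving Show

    promotionAge :: Int
    promotionAge = 2    -- survive two minor collections => promoted to old space

    -- One minor collection at time t: scan only the nursery, discard dead
    -- objects, promote long-lived survivors.  The old generation is left
    -- untouched, so its objects cost nothing here.
    minorGC :: Int -> ([Obj], [Obj]) -> (([Obj], [Obj]), Int)
    minorGC t (nursery, old) = ((young, promoted ++ old), length nursery)
      where
        live              = [ o { survivals = survivals o + 1 }
                            | o <- nursery, diesAt o > t ]
        (promoted, young) = partition ((>= promotionAge) . survivals) live

    -- Per tick: allocate a batch in which most objects die almost at once
    -- (the observation the design rests on) and one lives a long time.
    allocate :: Int -> [Obj]
    allocate t = [ Obj (t + life) 0 | life <- [1, 1, 1, 1, 100] ]

    step :: Int -> (([Obj], [Obj]), Int) -> (([Obj], [Obj]), Int)
    step t (heap, work) =
      let ((nursery, old), scanned) = minorGC t (allocate t ++ fst heap, snd heap)
      in  ((nursery, old), work + scanned)

    main :: IO ()
    main = do
      let ((nursery, old), work) = foldl (flip step) (([], []), 0) [0 .. 99]
      putStrLn ("objects scanned by minor collections: " ++ show work)
      putStrLn ("left in nursery / promoted to old:    "
                ++ show (length nursery, length old))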
KIDS: A Semi-Automatic Program Development System
- IEEE Transactions on Software Engineering
, 1990
"... The Kestrel Interactive Development System (KIDS) provides automated sup- port for the development of correct and efficient programs from formal specifications. ..."
Abstract
-
Cited by 268 (26 self)
- Add to MetaCart
(Show Context)
The Kestrel Interactive Development System (KIDS) provides automated support for the development of correct and efficient programs from formal specifications.
Tutorial Notes on Partial Evaluation
- Proceedings of the Twentieth Annual ACM Symposium on Principles of Programming Languages
, 1993
"... The last years have witnessed a flurry of new results in the area of partial evaluation. These tutorial notes survey the field and present a critical assessment of the state of the art. 1 Introduction Partial evaluation is a source-to-source program transformation technique for specializing program ..."
Abstract
-
Cited by 248 (59 self)
- Add to MetaCart
(Show Context)
Recent years have witnessed a flurry of new results in the area of partial evaluation. These tutorial notes survey the field and present a critical assessment of the state of the art. Partial evaluation is a source-to-source program transformation technique for specializing programs with respect to parts of their input. In essence, partial evaluation removes layers of interpretation. In the most general sense, an interpreter can be defined as a program whose control flow is determined by its input data. As Abelson points out [43, Foreword], even programs that are not themselves interpreters have important interpreter-like pieces. These pieces contain both compile-time and run-time constructs. Partial evaluation identifies and eliminates the compile-time constructs. As a complete example, we consider a function producing formatted text. Such functions exist in most programming languages (e.g., format in Lisp and printf in C). Figure 1 displays a formatting function...
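The notes' running example specializes a text-formatting function with respect to its format string; a tiny Haskell rendering of that idea (the mini format language with only %s, and the names format and specialize, are assumptions made for this sketch) shows the binding-time separation: the static string is handled when the specializer runs, and only concatenation is left for run time.

    -- The general program: re-interprets the format string on every call.
    -- "%s" consumes one argument; every other character is literal text.
    format :: String -> [String] -> String
    format ('%' : 's' : fmt) (a : args) = a ++ format fmt args
    format (c : fmt)         args       = c : format fmt args
    format []                _          = []

    -- Specialization with respect to a static format string: the string
    -- is traversed at most once, when the chain of residual functions is
    -- built; the per-call dispatch on '%' is gone.
    specialize :: String -> ([String] -> String)
    specialize ('%' : 's' : fmt) = let rest = specialize fmt
                                   in \(a : args) -> a ++ rest args
    specialize (c : fmt)         = let rest = specialize fmt
                                   in \args -> c : rest args
    specialize []                = \_ -> []

    main :: IO ()
    main = do
      let greet = specialize "Hello %s, welcome to %s.\n"
      putStr (format "Hello %s, welcome to %s.\n" ["world", "POPL"])
      putStr (greet ["world", "POPL"])   -- same output, no format-string scan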
A Short Cut to Deforestation
, 1993
"... Lists are often used as "glue" to connect separate parts of a program together. We propose an automatic technique for improving the efficiency of such programs, by removing many of these intermediate lists, based on a single, simple, local transformation. We have implemented the method in ..."
Abstract
-
Cited by 229 (17 self)
- Add to MetaCart
Lists are often used as "glue" to connect separate parts of a program together. We propose an automatic technique for improving the efficiency of such programs, by removing many of these intermediate lists, based on a single, simple, local transformation. We have implemented the method in the Glasgow Haskell compiler.
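The single, simple, local transformation the abstract refers to is the paper's foldr/build rule, foldr k z (build g) = g k z. A self-contained rendering (the helpers upTo and sumUpTo are names assumed for the sketch) shows the intermediate list disappearing once the rule is applied.

    {-# LANGUAGE RankNTypes #-}

    -- build names a list abstractly: g is handed the "cons" and "nil" to
    -- use, instead of applying (:) and [] directly.
    build :: (forall b. (a -> b -> b) -> b -> b) -> [a]
    build g = g (:) []

    -- A producer written with build, and a consumer written with foldr.
    upTo :: Int -> Int -> [Int]
    upTo lo hi = build (\cons nil ->
      let go i = if i > hi then nil else cons i (go (i + 1)) in go lo)

    sumList :: [Int] -> Int
    sumList = foldr (+) 0

    -- The whole method is the one rewrite  foldr k z (build g) = g k z.
    -- Applied to  sumList (upTo lo hi)  it yields a loop that adds the
    -- numbers directly, never constructing the intermediate list:
    sumUpTo :: Int -> Int -> Int
    sumUpTo lo hi =
      (\cons nil -> let go i = if i > hi then nil else cons i (go (i + 1)) in go lo)
        (+) 0

    main :: IO ()
    main = print (sumList (upTo 1 100), sumUpTo 1 100)   -- (5050,5050)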
The Concept of a Supercompiler
- ACM Transactions on Programming Languages and Systems
, 1986
"... A supercompiler is a program transformer of a certain type. It traces the possible generalized histories of computation by the original program, and compiles an equivalent program, reducing in the process the redundancy that could be present in the original program. The nature of the redundancy that ..."
Abstract
-
Cited by 194 (3 self)
- Add to MetaCart
(Show Context)
A supercompiler is a program transformer of a certain type. It traces the possible generalized histories of computation by the original program, and compiles an equivalent program, reducing in the process the redundancy that could be present in the original program. The nature of the redundancy that can be eliminated by supercompilation may be various, e.g., some variables might have predefined values (as in partial evaluation), or the structure of control transfer could be made more efficient (as in lazy evaluation), or it could simply be the fact that the same variable is used more than once. The general principles of supercompilation are described and compared with the usual approach to program transformation as a stepwise application of a number of equivalence rules. It is argued that the language Refal serves the needs of supercompilation best. Refal is formally defined and compared with Prolog and other languages. Examples are given of the operation of a Refal supercompiler implemented at CCNY on an IBM/370.
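The paper works in Refal, but the kind of residual program supercompilation produces can be shown on a classic example, hand-derived here in Haskell (the names append and appendThree are assumptions for the sketch): driving the generalized computation history of the composed definition and folding the repeated configuration removes the intermediate list.

    -- A general, compositional definition: the inner append builds an
    -- intermediate list that the outer append immediately traverses again.
    append :: [a] -> [a] -> [a]
    append []       ys = ys
    append (x : xs) ys = x : append xs ys

    appendThree :: [a] -> [a] -> [a] -> [a]
    appendThree xs ys zs = append (append xs ys) zs

    -- The sort of residual program a supercompiler arrives at: xs is
    -- traversed only once and the intermediate list is never built.
    appendThree' :: [a] -> [a] -> [a] -> [a]
    appendThree' []       ys zs = append ys zs
    appendThree' (x : xs) ys zs = x : appendThree' xs ys zs

    main :: IO ()
    main = do
      print (appendThree  "ab" "c" "de")   -- "abcde"
      print (appendThree' "ab" "c" "de")   -- "abcde"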