Results 1 – 6 of 6
Parametric Prediction of Heap Memory Requirements
Abstract
Cited by 23 (5 self)
This work presents a technique to compute symbolic polynomial approximations of the amount of dynamic memory required to safely execute a method without running out of memory, for Java-like imperative programs. We consider object allocations and deallocations made by the method and the methods it transitively calls. More precisely, given an initial configuration of the stack and the heap, the peak memory consumption is the maximum space occupied by newly created objects in all states along a run from it. We over-approximate the peak memory consumption using a scoped-memory management where objects are organized in regions associated with the lifetime of methods. We model the problem of computing the maximum memory occupied by any region configuration as a parametric polynomial optimization problem over a polyhedral domain and resort to the Bernstein basis to solve it. We apply the developed tool to several benchmarks.
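The region discipline described above can be illustrated with a small simulation (hypothetical class and sizes; the paper computes such bounds symbolically, not by running the program): each method call opens a region, allocations are charged to the innermost region, and a region is reclaimed when its method returns. The peak is the maximum total occupancy of all live regions along the run.

```python
class RegionStack:
    """Minimal sketch of scoped-memory management (illustrative only)."""

    def __init__(self):
        self.regions = []  # one byte count per active method's region
        self.peak = 0      # maximum total occupancy seen so far

    def enter_method(self):
        self.regions.append(0)

    def allocate(self, size):
        self.regions[-1] += size
        self.peak = max(self.peak, sum(self.regions))

    def exit_method(self):
        self.regions.pop()  # region reclaimed when the method returns

# Example run: main allocates 10, calls a helper that allocates 5,
# then allocates 3 more after the helper returns.
rs = RegionStack()
rs.enter_method()   # main
rs.allocate(10)
rs.enter_method()   # helper
rs.allocate(5)
rs.exit_method()    # helper's region freed
rs.allocate(3)
rs.exit_method()
print(rs.peak)      # 15: reached while the helper was still active
```

The peak (15) exceeds the final occupancy (13) because the helper's region was live at the high-water mark; it is exactly this maximum over region configurations that the paper bounds parametrically.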
Live Heap Space Analysis for Languages with Garbage Collection
 In ISMM '09: Proceedings of the 8th International Symposium on Memory Management
, 2009
Abstract
Cited by 12 (4 self)
The peak heap consumption of a program is the maximum size of the live data on the heap during the execution of the program, i.e., the minimum amount of heap space needed to run the program without exhausting the memory. It is well-known that garbage collection (GC) makes the problem of predicting the memory required to run a program difficult. This paper presents, to the best of our knowledge, the first live heap space analysis for garbage-collected languages which infers accurate upper bounds on the peak heap usage of a program’s execution that are not restricted to any complexity class, i.e., we can infer exponential, logarithmic, polynomial, etc., bounds. Our analysis is developed for a (sequential) object-oriented bytecode language with a scoped-memory manager that reclaims unreachable memory when methods return. We also show how our analysis can accommodate other GC schemes which are closer to the ideal GC, which collects objects as soon as they become unreachable. The practicality of our approach is experimentally evaluated on a prototype implementation. We demonstrate that it is fully automatic, reasonably accurate and efficient by inferring live heap space bounds for a standardized set of benchmarks, the JOlden suite.
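The "ideal GC" notion mentioned above can be made concrete with a small sketch (hypothetical trace format and sizes; the paper infers such peaks statically, without executing the program): an object's space counts only while it is reachable, so the peak live heap is the maximum reachable size along the trace.

```python
def peak_live_heap(trace):
    """Peak live heap under an ideal GC (illustrative model only).

    trace: list of events, either ('alloc', name, size) or ('drop', name),
    where 'drop' marks the point an object becomes unreachable and an
    ideal collector would reclaim it immediately.
    """
    live, total, peak = {}, 0, 0
    for ev in trace:
        if ev[0] == 'alloc':
            _, name, size = ev
            live[name] = size
            total += size
            peak = max(peak, total)
        else:                      # ('drop', name)
            total -= live.pop(ev[1])
    return peak

print(peak_live_heap([('alloc', 'a', 8), ('alloc', 'b', 16),
                      ('drop', 'a'), ('alloc', 'c', 8)]))   # 24
```

Here the high-water mark of 24 bytes is reached twice; a scoped-memory manager that frees only on method return would generally report a larger (but still safe) peak.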
Symbolic polynomial maximization over convex sets and its application to memory requirement estimation
, 2009
Abstract
Cited by 8 (5 self)
Memory requirement estimation is an important issue in the development of embedded systems, since memory directly influences performance, cost and power consumption. It is therefore crucial to have tools that automatically compute accurate estimates of the memory requirements of programs to better control the development process and avoid some catastrophic execution exceptions. Many important memory issues can be expressed as the problem of maximizing a parametric polynomial defined over a parametric convex domain. Bernstein expansion is a technique that has been used to compute upper bounds on polynomials defined over intervals and parametric “boxes”. In this paper, we propose an extension of this theory to more general parametric convex domains and illustrate its applicability to the resolution of memory issues with several application examples.
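The core property of Bernstein expansion used here can be shown on the simplest case, a univariate polynomial on the interval [0,1] (a minimal sketch; the paper's contribution is the extension to general parametric convex domains): the Bernstein coefficients of a polynomial enclose its range, so their maximum is a safe upper bound.

```python
from math import comb

def bernstein_coeffs(power_coeffs):
    """Bernstein coefficients on [0,1] of a polynomial in the power basis.

    power_coeffs[j] is the coefficient of x**j. Using the standard
    basis conversion b_i = sum_{j<=i} C(i,j)/C(n,j) * a_j, the values
    of the polynomial on [0,1] lie between min(b) and max(b).
    """
    n = len(power_coeffs) - 1
    return [sum(comb(i, j) / comb(n, j) * power_coeffs[j]
                for j in range(i + 1))
            for i in range(n + 1)]

# p(x) = x - x^2, whose true maximum on [0,1] is 0.25.
b = bernstein_coeffs([0.0, 1.0, -1.0])
print(b)          # [0.0, 0.5, 0.0] -> upper bound 0.5, a safe over-estimate
```

The bound 0.5 over-approximates the true maximum 0.25, which is the expected behavior: Bernstein bounds are conservative but cheap, and they tighten under degree elevation or domain subdivision.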
Checklist of information for inclusion in reports of clinical trials. The Asilomar Working Group on Recommendations for Reporting of Clinical Trials in the Biomedical Literature. Ann Intern Med
, 2007
Abstract
Cited by 1 (1 self)
This work presents a technique to compute symbolic nonlinear approximations of the amount of dynamic memory required to safely run a method in (Java-like) imperative programs. We do that for scoped-memory management where objects are organized in regions associated with the lifetime of methods. Our approach resorts to a symbolic nonlinear optimization problem which is solved using the Bernstein basis.
Parametric Heap Usage Analysis . . .
, 2009
Abstract
This paper presents an analysis that derives a formula describing the worst-case live heap space usage of programs in a functional language with automated memory management (garbage collection). First, the given program is automatically transformed into bound functions that describe upper bounds on the live heap space usage and other related space metrics in terms of the sizes of function arguments. The bound functions are simplified and rewritten to obtain recurrences, which are then solved to obtain the desired formulas characterizing the worst-case space usage. These recurrences may be difficult to solve due to uses of the maximum operator. We give methods to automatically solve categories of such recurrences. Our analysis determines and exploits monotonicity and monotonicity-like properties of bound functions to derive upper bounds on heap usage, without considering behaviors of the program that cannot lead to maximal space usage.
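The role of monotonicity in eliminating the maximum operator can be seen on a toy recurrence (hypothetical bound function, chosen only to illustrate the shape of the problem): if peak(n) = max(n, peak(n-1)) and peak is monotone in n, the max is always attained by its first argument, giving the closed form peak(n) = n.

```python
from functools import lru_cache

# Hypothetical bound function in the style described in the abstract:
#   peak(0) = 0
#   peak(n) = max(n, peak(n-1))   # max over two program points
# Monotonicity of peak implies peak(n-1) <= n, so the max simplifies
# to n, i.e. the recurrence has the closed form peak(n) = n.

@lru_cache(maxsize=None)
def peak(n):
    return 0 if n == 0 else max(n, peak(n - 1))

# Check the closed form against the recurrence on a range of inputs.
assert all(peak(n) == n for n in range(50))
print(peak(49))   # 49
```

The analysis in the paper performs this kind of simplification symbolically; the executable check here only confirms the closed form on sample inputs.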
Components
, 2007
Abstract
We identify an abstract language for component software based on primitives for instantiating components and for deleting instances of components, as well as for sequential, alternative and parallel composition. We define an operational semantics for our language and give a type system in which the types express quantitative information on the components involved in the execution of the expressions of the language. This information includes, for each component, the maximum number of instances that are simultaneously active during the execution of the expression. The main contribution is a type inference algorithm which runs in time quadratic in the size of the input, whereas executing the expression according to the operational semantics can result in exponentially many runs of exponential length. We consider extensions of the language with loops and tail recursion, and with a scope mechanism. We illustrate the approach with some examples, one on UML diagram refinement and one on counting objects on the free store in C++.
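The quantitative idea behind the type system can be sketched by composing (net, peak) pairs bottom-up over the expression structure instead of executing the expression (hypothetical operator names and composition rules, simplified to a single component kind): net is the change in the number of live instances, peak the maximum simultaneously active.

```python
# Illustrative sketch only: compose (net, peak) pairs over the
# expression tree, avoiding enumeration of the (exponentially many) runs.

def new():    return (1, 1)    # instantiate one component
def delete(): return (-1, 0)   # delete one instance

def seq(a, b):   # run a, then b: b's peak sits on top of a's net effect
    return (a[0] + b[0], max(a[1], a[0] + b[1]))

def par(a, b):   # interleaved execution: peaks may coincide (upper bound)
    return (a[0] + b[0], a[1] + b[1])

def alt(a, b):   # either branch may run: take a coarse upper bound
    return (max(a[0], b[0]), max(a[1], b[1]))

# (new; new; delete) in parallel with (new)
e = par(seq(seq(new(), new()), delete()), new())
print(e)   # (2, 3): at most 3 instances simultaneously active
```

Each rule is applied once per subexpression, which is what makes a polynomial-time inference plausible even though the operational semantics admits exponentially many runs.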