Results 11-20 of 42
Parametric Inference of Memory Requirements for Garbage Collected Languages
Abstract

Cited by 11 (4 self)
The accurate prediction of a program's memory requirements is a critical component in software development. Existing heap space analyses either do not take deallocation into account or adopt specific models of garbage collectors which do not necessarily correspond to the actual memory usage. We present a novel approach to inferring upper bounds on the memory requirements of Java-like programs which is parametric on the notion of object lifetime, i.e., on when objects become collectible. If object lifetimes are inferred by a reachability analysis, then our analysis infers accurate upper bounds on the memory consumption for a reachability-based garbage collector. Interestingly, if object lifetimes are inferred by a heap liveness analysis, then we approximate the program's minimal memory requirement, i.e., the peak memory usage when using an optimal garbage collector which frees objects as soon as they become dead. The key idea is to integrate information on object lifetimes into the process of generating the recurrence equations which capture the memory usage at the different program states. If the heap size limit is set to the memory requirement inferred by our analysis, execution is guaranteed not to exceed the memory limit, under the sole assumption that garbage collection is performed when the limit is reached. Experiments on Java bytecode programs provide evidence of the feasibility and accuracy of our analysis.
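The abstract's contrast between reachability-based and liveness-based object lifetimes can be illustrated with a small sketch. This is not the paper's analysis, which works symbolically via recurrence equations; the trace format and death times below are illustrative assumptions.

```python
# Illustrative sketch only: peak heap usage over a recorded allocation
# trace, parametric on when each object becomes collectible.

def peak_heap(alloc_sizes, death_times):
    """alloc_sizes[i] = bytes of object i, allocated at time i.
    death_times[i] = time at which object i becomes collectible
    (depends on the lifetime notion: reachability vs. liveness)."""
    peak = 0
    for t, size in enumerate(alloc_sizes):
        # heap live at time t: earlier objects whose lifetime covers t
        live = sum(alloc_sizes[j] for j in range(t)
                   if death_times[j] > t) + size
        peak = max(peak, live)
    return peak

sizes = [8, 8, 8]   # three 8-byte objects, allocated at t = 0, 1, 2
reach = [3, 3, 3]   # reachability: all stay reachable until the end
live_ = [1, 2, 3]   # liveness: each object dies right after its last use

print(peak_heap(sizes, reach))  # 24: all three live simultaneously
print(peak_heap(sizes, live_))  # 8: an optimal GC keeps at most one alive
```

The two death-time vectors play the role of the two lifetime notions the abstract parametrizes over.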
User-Definable Resource Usage Bounds Analysis for Java Bytecode
 BYTECODE 2009
, 2009
Abstract

Cited by 10 (2 self)
Automatic cost analysis of programs has traditionally concentrated on a reduced number of resources such as execution steps, time, or memory. However, the increasing relevance of analysis applications such as static debugging and/or certification of user-level properties (including for mobile code) makes it interesting to develop analyses for resource notions that are actually application-dependent. These may include, for example, bytes sent or received by an application, number of files left open, number of SMSs sent or received, number of accesses to a database, money spent, energy consumption, etc. We present a fully automated analysis for inferring upper bounds on the usage that a Java bytecode program makes of a set of application-programmer-definable resources. In our context, a resource is defined by programmer-provided annotations which state the basic consumption that certain program elements make of that resource. From these definitions our analysis derives functions which return an upper bound on the usage that the whole program (and individual blocks) makes of that resource for any given set of input data sizes. The proposed analysis is independent of the particular resource. We also present some experimental results from a prototype implementation of the approach covering a significant set of interesting resources.
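The annotation-based scheme the abstract describes can be sketched roughly as follows. The decorator name, resource names, and bound computation are all illustrative assumptions, not the paper's actual interface, and a real analysis derives the bound statically rather than from a runtime registry.

```python
# Illustrative sketch: programmer-provided annotations state the basic
# consumption of certain operations; a bound function is derived from them.

COST = {}

def cost(resource, amount):
    """Annotate a basic operation with its consumption of a resource."""
    def wrap(f):
        COST[f.__name__] = (resource, amount)
        return f
    return wrap

@cost("sms_sent", 1)
def send_sms(msg):
    pass

@cost("bytes_sent", 140)
def transmit(msg):
    pass

def notify_all(n):
    # the program under analysis: one SMS and one packet per recipient
    for _ in range(n):
        send_sms("hi")
        transmit("hi")

def upper_bound(resource, n):
    """Bound for notify_all: each annotated call runs at most n times."""
    return sum(a for (r, a) in COST.values() if r == resource) * n

print(upper_bound("sms_sent", 10))    # 10
print(upper_bound("bytes_sent", 10))  # 1400
```

The point mirrored here is resource independence: the same derivation works for SMSs, bytes, or any other annotated notion.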
Collected Size Semantics for Functional Programs
 Implementation and Application of Functional Languages: 20th International Workshop, IFL 2008, Hertfordshire
Abstract

Cited by 9 (3 self)
Size analysis is an important prerequisite for heap consumption analysis. This paper is part of ongoing work on typing support for checking output-on-input size dependencies for function definitions in a strict functional language. A significant restriction of our earlier results is that inner data structures (e.g. in a list of lists) must all have the same size. Here, we take a big step forward by overcoming this limitation via the introduction of higher-order size annotations, such that variate sizes of inner data structures can be expressed.
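The move from first-order to higher-order size annotations can be illustrated as follows; the function names are illustrative, not the paper's notation. A first-order annotation maps input sizes to an output size, while a higher-order one takes a function giving the (variate) size of each inner structure.

```python
# Illustrative sketch: size annotations modelled as Python functions.

# first-order: append's output size from its two input sizes
append_size = lambda n, m: n + m

# higher-order: concat over a list of `outer` inner lists, where the
# i-th inner list has size inner(i); the output sums the variate sizes
def concat_size(outer, inner):
    return sum(inner(i) for i in range(outer))

# a list of 4 inner lists of sizes 1, 2, 3, 4 flattens to 10 elements
print(append_size(3, 5))                # 8
print(concat_size(4, lambda i: i + 1))  # 10
```

With a first-order annotation, concat's output size could only be expressed when all inner lists share one size; the size function argument lifts exactly that restriction.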
Test-Based Inference of Polynomial Loop-Bound Functions
, 2010
Abstract

Cited by 9 (4 self)
This paper presents an interpolation-based method of inferring arbitrary-degree loop-bound functions for Java programs. Given a loop, by its "loop-bound function" we mean a function over the numeric program variables that bounds the number of loop iterations. Using our analysis, loop-bound functions that are polynomials with natural, rational, or real coefficients can be found. Analysis of loop bounds is important in several different areas, including worst-case execution time (WCET) and heap consumption analysis, optimising compilers, and termination analysis. While several other methods exist to infer numerical loop bounds, we know of no other research on the inference of non-linear loop-bound functions. Additionally, the inferred bounds are provable using external tools, e.g. KeY.
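A minimal sketch of the test-based idea, assuming (as the abstract states) interpolation through iteration counts observed on test runs; the verification step with an external tool such as KeY is omitted, and the loop analysed is illustrative.

```python
# Illustrative sketch: run the loop on small inputs, count iterations,
# and interpolate a polynomial loop-bound function through the counts.

def iterations(n):
    count = 0
    for i in range(n):          # the (triangular) loop under analysis
        for j in range(i):
            count += 1
    return count                # exact value: n*(n-1)/2

def lagrange(points, x):
    """Evaluate the interpolating polynomial through `points` at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# three test runs suffice to pin down a degree-2 loop-bound function
pts = [(n, iterations(n)) for n in (0, 1, 2)]
print(lagrange(pts, 10))   # 45.0, matching the exact count 10*9/2
```

As in the paper's setting, the interpolated function is only a candidate; soundness comes from proving it afterwards.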
Constraint solving for high-level WCET analysis
 In Proceedings of the 18th Workshop on Logic-based Methods in Programming Environments (WLPE 2008)
Abstract

Cited by 8 (3 self)
The safety of our day-to-day life depends crucially on the correct functioning of embedded software systems, which control more and more technical devices. Many of these software systems are time-critical. Hence, computations must not only be correct but must also be delivered in a timely fashion. Worst-case execution time (WCET) analysis is concerned with computing tight upper bounds on the execution time of a system in order to provide formal guarantees for its proper timing behaviour. Central to this is computing safe and tight bounds for loops and recursion depths. In this paper, we highlight the TuBound approach to this challenge, at whose heart is a constraint-logic-based loop analysis.
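As a hedged illustration of constraint-based loop-bound reasoning (not TuBound's actual constraint solver), a counted loop's bound can be derived from its linear progression constraint alone:

```python
# Illustrative sketch: for a counted loop "for (i = lo; i < hi; i += step)"
# the iteration count is the least k with lo + k*step >= hi, which is
# ceil((hi - lo) / step); this is the kind of fact a loop analysis derives.

import math

def loop_bound(lo, hi, step):
    if step <= 0:
        return None                       # no bound derivable this way
    return max(0, math.ceil((hi - lo) / step))

def run(lo, hi, step):
    """Concrete execution, for cross-checking the derived bound."""
    i, k = lo, 0
    while i < hi:
        i += step
        k += 1
    return k

for lo, hi, step in [(0, 10, 1), (0, 10, 3), (5, 5, 2)]:
    assert run(lo, hi, step) == loop_bound(lo, hi, step)
print("derived bounds are exact on these loops")
```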
Resource Usage Analysis and its Application to Resource Certification
Abstract

Cited by 8 (4 self)
Resource usage is one of the most important characteristics of programs. Automatically generated information about resource usage can be used in multiple ways, both during program development and deployment. In this paper we discuss, and present examples of, how such information is obtained in COSTA, a state-of-the-art static analysis system. COSTA obtains safe symbolic upper bounds on the resource usage of a large class of general-purpose programs written in a mainstream programming language such as Java (bytecode). We also discuss the application of resource-usage information to code certification, whereby code not guaranteed to run within certain user-specified bounds is rejected.
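The certification use described at the end can be sketched as a comparison of an inferred symbolic bound against a user-specified policy. The bound shapes and the brute-force check over input sizes are illustrative assumptions; a real certifier would compare the functions symbolically.

```python
# Illustrative sketch: accept code only if its inferred resource bound
# stays within the user's limit for every admissible input size.

def certify(inferred_bound, policy_limit, max_input_size):
    return all(inferred_bound(n) <= policy_limit(n)
               for n in range(max_input_size + 1))

heap_bound = lambda n: 8 * n + 64   # e.g. an inferred bound: 8n + 64 bytes
policy     = lambda n: 10 * n + 64  # user policy allows 10n + 64 bytes

print(certify(heap_bound, policy, 1000))        # True: code is accepted
print(certify(lambda n: n * n, policy, 1000))   # False: quadratic, rejected
```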
Monotonicity constraints for termination in the integer domain
Abstract

Cited by 8 (3 self)
Size-Change Termination (SCT) is a method of proving program termination based on the impossibility of infinite descent. To this end we use a program abstraction in which transitions are described by monotonicity constraints over (abstract) variables. When only constraints of the form x > y′ and x ≥ y′ are allowed, we have size-change graphs. In the last decade, both theory and practice have evolved significantly in this restricted framework. The crucial underlying assumption of most of the past work is that the domain of the variables is well-founded. In a recent paper I showed how to extend and adapt some theory from the domain of size-change graphs to general monotonicity constraints, thus complementing previous work, but remaining in the realm of well-founded domains. However, monotonicity constraints are, interestingly, capable of proving termination also in the integer domain, which is not well-founded. The purpose of this paper is to explore the application of monotonicity constraints in this domain. We lay the necessary theoretical foundation, present precise decision procedures for termination, and finally provide a procedure to construct explicit global ranking functions from monotonicity constraints in singly-exponential time, of optimal worst-case size and dimension (ordinal).
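Why the integer domain needs more than descent can be made concrete: a ranking function must be bounded below as well as strictly decreasing. The sampling-based check below is only an illustration, not the paper's decision procedure, which works symbolically on the constraints.

```python
# Illustrative sketch: the loop "while x > y: x -= 1" over the integers
# satisfies the monotonicity constraints x > x', y' = y, x > y. Over a
# non-well-founded domain, descent of x alone does not prove termination.

import itertools

def check_ranking(rank, transitions):
    """rank must strictly decrease on every transition and stay >= 0."""
    return all(rank(s) > rank(t) and rank(t) >= 0 for s, t in transitions)

def step(x, y):
    return (x - 1, y) if x > y else None

# sample concrete transitions from many integer start states
samples = []
for x, y in itertools.product(range(-5, 6), repeat=2):
    t = step(x, y)
    if t is not None:
        samples.append(((x, y), t))

rank_x  = lambda s: s[0]         # decreases, but unbounded below: rejected
rank_xy = lambda s: s[0] - s[1]  # decreases AND bounded below: accepted

print(check_ranking(rank_xy, samples))  # True
print(check_ranking(rank_x,  samples))  # False
```

The constraint x > y is what supplies the lower bound, which is exactly the extra information monotonicity constraints can express beyond size-change graphs' descent relations.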
Amortized resource analysis with polymorphic recursion and partial big-step operational semantics
 In Proc. 8th APLAS, volume 6461 of LNCS
, 2010
Abstract

Cited by 7 (6 self)
This paper studies the problem of statically determining upper bounds on the resource consumption of first-order functional programs. Previous work approached the problem with an automatic type-based amortized analysis for polynomial resource bounds. The analysis is parametric in the resource and can be instantiated to heap space, stack space, or clock cycles. Experiments with a prototype implementation have shown that programs are analyzed efficiently and that the computed bounds exactly match the measured worst-case resource behavior for many functions. This paper describes the inference algorithm used in the implementation of the system. It can deal with resource-polymorphic recursion, which is required in the type derivation of many functions. The computation of the bounds is fully automatic if a maximal degree of the polynomials is given. The soundness of the inference is proved with respect to a novel operational semantics for partial evaluations, showing that the inferred bounds hold for terminating as well as non-terminating computations. A corollary is that run-time bounds also establish the termination of programs.
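The potential-method intuition behind type-based amortized analysis can be illustrated with the classic binary-counter example; this illustrates amortization in general, not the paper's type system, where potential is carried by type annotations.

```python
# Illustrative sketch of the potential method: actual cost plus change in
# potential is constant per operation, so the total actual cost is bounded
# without any per-call case analysis.

def increment(bits):
    """Increment a little-endian binary counter; cost = bits flipped."""
    cost, i = 0, 0
    while i < len(bits) and bits[i] == 1:
        bits[i] = 0
        cost += 1
        i += 1
    if i == len(bits):
        bits.append(1)
    else:
        bits[i] = 1
    cost += 1
    return cost

phi = lambda bits: sum(bits)    # potential = number of 1-bits

bits, total = [0] * 8, 0
for _ in range(100):
    before = phi(bits)
    c = increment(bits)
    # amortized cost = actual cost + (phi_after - phi_before) <= 2
    assert c + phi(bits) - before <= 2
    total += c
print(total <= 2 * 100)   # True: 100 increments cost at most 200 flips
```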
On Decidable Growth-Rate Properties of Imperative Programs (invited talk paper)
Abstract

Cited by 5 (3 self)
In 2008, Ben-Amram, Jones and Kristiansen showed that for a simple "core" programming language, an imperative language with bounded loops and arithmetic limited to addition and multiplication, it is possible to decide precisely whether a program has certain growth-rate properties, namely polynomial (or linear) bounds on computed values or on the running time. This work emphasized the role of the core language in mitigating the notorious undecidability of program properties, so that one deals with decidable problems, while allowing the application of the technique to programs in a more realistic language. This is done by over-approximating the semantics of the concrete program. A natural and intriguing problem was whether more elements can be added to the core language, improving the precision of approximation, while keeping the growth-rate properties decidable. In particular, the method presented could not handle a command that resets a variable to zero. This paper shows how to handle resets. The analysis is given in a logical style (proof rules), and the complexity of the algorithm is shown to be PSPACE. The problem is shown PSPACE-complete (in contrast, without resets, the problem was PTIME). The analysis algorithm evolved from the previous solution in an interesting way: the focus was shifted from proving a bound to disproving it, and the algorithm works top-down rather than bottom-up.
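Why resets matter for precision can be seen in a tiny concrete example: an abstraction that ignores the reset must soundly report an exponential bound on x, while modelling the reset yields a constant one. The loop below is illustrative only.

```python
# Illustrative sketch: inside the loop x doubles, so an analysis that
# cannot model "x := 0" must assume x grows like 2**n; the reset keeps
# the real value constant across iterations.

def run(n):
    x = 1
    for _ in range(n):
        x = x + x        # doubling: reset-blind abstraction sees x <= 2**n
        x = 0            # reset: the actual value is wiped each iteration
        x = x + 1
    return x

naive_bound = lambda n: 2 ** n
print(run(30), naive_bound(30))   # 1 vs 1073741824
```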
A space consumption analysis by abstract interpretation
 In Proceedings of the 1st International Workshop on Foundational and Practical Aspects of Resource Analysis, FOPARA’09
, 2010
Abstract

Cited by 5 (0 self)
Safe is a first-order functional language with an implicit region-based memory system and explicit destruction of heap cells. Its static analysis for inferring regions, and a type system guaranteeing the absence of dangling pointers, have been presented elsewhere. In this paper we present a new analysis aimed at inferring upper bounds on heap and stack consumption. It is based on abstract interpretation, the abstract domain being the set of all n-ary monotonic functions from real non-negative numbers to a real non-negative result. This domain turns out to be a complete lattice under the usual ⊑ relation on functions. Our interpretation is monotonic in this domain, and the solution we seek is the least fixpoint of the interpretation. We first explain the abstract domain and some correctness properties of the interpretation rules with respect to the language semantics, then present the inference algorithms for recursive functions, and finally illustrate the approach with the upper bounds obtained by our implementation for some case studies.
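The least-fixpoint computation the abstract outlines can be sketched by tabulating candidate bound functions on a finite range of sizes and iterating Kleene-style from the bottom of the lattice. The analysed recursive function and the tabulation are illustrative assumptions; the actual analysis works symbolically over the full function domain.

```python
# Illustrative sketch: one abstract-interpretation step for a recursive
# function that allocates one heap cell and recurses on n-1; iterating
# from bottom reaches the least fixpoint, the bound n + 1 cells.

N = 20                      # tabulate bound functions on sizes 0..N

def interpret(bound):
    """One step of the interpretation applied to the current bound."""
    new = list(bound)
    for n in range(N + 1):
        new[n] = 1 if n == 0 else 1 + bound[n - 1]
    return new

bound = [0] * (N + 1)       # bottom of the lattice
while True:
    nxt = interpret(bound)
    if nxt == bound:        # least fixpoint reached
        break
    bound = nxt

print(bound[10])   # 11: the inferred heap bound is n + 1 cells
```

Monotonicity of `interpret` on this finite tabulated lattice is what guarantees the iteration converges to the least fixpoint.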