## Improvements to a Resource Analysis for Hume (2009)

Venue: Proc. FOPARA ’09: Intl. Workshop on Foundational and Practical Aspects of Resource Analysis, LNCS 6324

Citations: 2 (1 self)

### BibTeX

@INPROCEEDINGS{Loidl09improvementsto,

author = {Hans-Wolfgang Loidl and Steffen Jost},

title = {Improvements to a Resource Analysis for Hume},

booktitle = {Proc. FOPARA ’09: Intl. Workshop on Foundational and Practical Aspects of Resource Analysis, LNCS 6324},

year = {2009},

publisher = {Springer}

}

### Abstract

The core of our resource analysis for the embedded systems language Hume is a resource-generic, type-based inference engine that employs the concept of amortised costs to statically infer resource bounds. In this paper we present extensions and improvements of this resource analysis in several ways. We develop and assess a call count analysis for higher-order programs, as a specific instance of our inference engine. We address usability aspects in general and in particular discuss an improved presentation of the inferred resource bounds together with the possibility of interactively tuning these bounds. Finally, we demonstrate improvements in the performance of our analysis.

### Citations

8530 | Introduction to Algorithms
- Cormen, Leiserson, et al.
- 1990
Citation Context ...ed by a balancing operation. Note that, due to the above invariants, this cannot occur for a well-formed red-black tree: any insertion into the tree will trigger at most two balancing operations (see [8][Chapter 13]). As expected, these (semantic) constraints are not captured by our analysis: our analysis must account for the worst-case of all well-typed programs. However, the type of red-black trees...

227 | Purely Functional Data Structures
- Okasaki
- 1998
Citation Context ...a y (ins x b) else (Node col a y b); rbInsert :: num -> tree -> tree; rbInsert x t = case ins x t of (Node _ a y b) -> Node Black a y b; Fig. 1. Example rbInsert: insertion into a red-black tree book [24]. A red-black tree is a binary search tree, in which nodes are coloured red or black. With the help of these colours, invariants can be formulated that guarantee that a tree is roughly balanced. The i...
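The `rbInsert` fragment quoted above follows the functional insertion scheme popularised by Okasaki [24]. As a minimal Haskell sketch of that scheme (the `ins` and `balance` helpers are reconstructed here for illustration and are not taken from the paper's Hume source):

```haskell
data Color = Red | Black deriving (Show, Eq)
data Tree  = Leaf | Node Color Tree Int Tree deriving (Show, Eq)

-- Insert, then recolour the root black to restore the root invariant.
rbInsert :: Int -> Tree -> Tree
rbInsert x t = case ins x t of
  Node _ a y b -> Node Black a y b
  Leaf         -> Leaf  -- unreachable: ins always returns a Node

-- Ordinary BST insertion, rebalancing on the way back up.
ins :: Int -> Tree -> Tree
ins x Leaf = Node Red Leaf x Leaf
ins x t@(Node col a y b)
  | x < y     = balance col (ins x a) y b
  | x > y     = balance col a y (ins x b)
  | otherwise = t

-- Resolve a red-red violation directly below a black node.
balance :: Color -> Tree -> Int -> Tree -> Tree
balance Black (Node Red (Node Red a x b) y c) z d =
  Node Red (Node Black a x b) y (Node Black c z d)
balance Black (Node Red a x (Node Red b y c)) z d =
  Node Red (Node Black a x b) y (Node Black c z d)
balance Black a x (Node Red (Node Red b y c) z d) =
  Node Red (Node Black a x b) y (Node Black c z d)
balance Black a x (Node Red b y (Node Red c z d)) =
  Node Red (Node Black a x b) y (Node Black c z d)
balance col a x b = Node col a x b
```

The four `balance` clauses are the rebalancing operations the excerpt refers to; as the quoted context notes, at most two of them fire on any insertion into a well-formed tree.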

138 | Amortized computational complexity
- Tarjan
- 1985
Citation Context ...verview of the general cost-behaviour. Since the number of concrete inputs may be large or even infinite, this is generally infeasible. The original amortised analysis technique as proposed by Tarjan [27], being very powerful, may generally produce such a precise “black box” cost oracle. This is not a hindrance for a manual technique, as the mathematician performing the method has direct control over ...

130 | Static prediction of heap space usage for firstorder functional programs
- Hofmann, Jost
- 2003
Citation Context ...is is achieved by simply copying the constraints associated with a function for each of its applications, using fresh variable names throughout. Since the generated LPs are sparse and easily solvable [17], this blow-up of constraints is of little concern. More information on this mechanism for resource parametricity can be found in [21]. This once more illustrates that the result for analysing a funct...

121 | Proving the Correctness of Reactive Systems Using Sized Types
- Hughes, Pareto, et al.
Citation Context ...l not clear which bound is preferable. For a simple example, we consider the standard list zipping, such as adding two lists of numerical values. Using a Haskell-style syntax we have: zipWith add [ ] [10, 20] = [ ] zipWith add [1, 2, 3, 4] [10, 20] = [11, 22] zipWith add [1, 2, 3, 4] [10, 20, 30, 40, 50, 60] = [11, 22, 33, 44] We immediately see that the resource consumption, be it time or space, depends ...
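The behaviour the excerpt relies on can be made concrete with the standard definition of `zipWith`; a minimal Haskell sketch (shown only to make the cost argument explicit, with `add` supplied here as the numeric addition the excerpt assumes):

```haskell
-- zipWith recurses only while *both* lists are non-empty, so both its
-- running time and its result length are bounded by the shorter input.
zipWith' :: (a -> b -> c) -> [a] -> [b] -> [c]
zipWith' f (x:xs) (y:ys) = f x y : zipWith' f xs ys
zipWith' _ _      _      = []

add :: Int -> Int -> Int
add = (+)
```

This reproduces the three equations in the excerpt: `zipWith' add [] [10, 20]` is `[]`, `zipWith' add [1, 2, 3, 4] [10, 20]` is `[11, 22]`, and `zipWith' add [1, 2, 3, 4] [10, 20, 30, 40, 50, 60]` is `[11, 22, 33, 44]`.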

116 | Resource Bound Certification
- Crary, Weirich
- 2000
Citation Context ...analysis by Tarjan [27]. Hofmann and Jost were the first to develop an automatic amortised analysis for heap consumption [17], exploiting a difference metric similar to that used by Crary and Weirich [9]. The latter work, however, only checks bounds, and does not infer them, as we do. Apart from inference, a notable difference of our work to the work of Tarjan [27] is that credits are associated on a...

100 | Reliable and Precise WCET Determination for a Real-Life Processor
- Ferdinand, Heckmann, et al.
- 2001
Citation Context ...nds are necessarily less accurate, since the costs for the basic machine instructions are already worst-case bounds, which we obtained through analysis of the generated machine code with the aiT tool [10]. In general we aim for bounds within 30% of the observed costs, which might not be the worst case. We achieve this goal for three of the four test programs. The results for the call counts show an ex...

76 | Recursion and dynamic data-structures in bounded space: Towards embedded ML programming
- Hughes, Pareto
- 1999
Citation Context ...lying data structure, whereas our weights are factors of a linear bound on resource consumption. The original work was limited to type checking, but subsequent work has developed inference mechanisms [6,19,29]. Vasconcelos’ PhD thesis [28] extended these previous approaches by using abstract interpretation techniques to automatically infer linear approximations of the sizes of recursive data types and the ...

71 | Hume: a Domain-Specific Language for RealTime Embedded Systems
- Hammond, Michaelson
Citation Context ...e performance of our analysis. 1 Introduction In the past [22] we have developed an amortised cost based resource analysis for a higher-order, strict functional language, namely expression-level Hume [15]. Salient features of this analysis are its strong formal foundations, building on amortised program complexity and type systems, high performance due to employing efficient linear program solvers, an...

52 | Calculating sized types
- Chin, Khoo
- 2001
Citation Context ...lying data structure, whereas our weights are factors of a linear bound on resource consumption. The original work was limited to type checking, but subsequent work has developed inference mechanisms [6,19,29]. Vasconcelos’ PhD thesis [28] extended these previous approaches by using abstract interpretation techniques to automatically infer linear approximations of the sizes of recursive data types and the ...

41 | Speed: precise and efficient static estimation of program computational complexity
- Gulwani, Mehra, et al.
- 2009
Citation Context ...otherwise mirrors the normal program execution. Unlike our type-based analysis, the cost of this analysis therefore depends directly on the complexity of the input data. Gulwani et al.’s SPEED system [13] uses a symbolic evaluation approach to calculate non-linear complexity bounds for C/C++ procedures using an abstract interpretation-based invariant generation tool. Precise loop bounds are calculated...

25 | Inferring Cost Equations for Recursive, Polymorphic and Higher-Order Functional Programs
- Vasconcelos, Hammond
- 2003
Citation Context ...lying data structure, whereas our weights are factors of a linear bound on resource consumption. The original work was limited to type checking, but subsequent work has developed inference mechanisms [6,19,29]. Vasconcelos’ PhD thesis [28] extended these previous approaches by using abstract interpretation techniques to automatically infer linear approximations of the sizes of recursive data types and the ...

23 | Parametric Prediction of Heap Memory Requirements
- Braberman, Fernández, et al.
- 2008
Citation Context ...preferable. For a simple example, we consider the standard list zipping, such as adding two lists of numerical values. Using a Haskell-style syntax we have: zipWith add [ ] [10, 20] = [ ] zipWith add [1, 2, 3, 4] [10, 20] = [11, 22] zipWith add [1, 2, 3, 4] [10, 20, 30, 40, 50, 60] = [11, 22, 33, 44] We immediately see that the resource consumption, be it time or space, depends on the length of the shorter in...

23 | Type-Based Amortised Heap-Space Analysis
- Hofmann, Jost
- 2006
Citation Context ...ithin the memory. Okasaki [24] also noted this as a problem, resorting to the use of lazy evaluation. In contrast, per-reference credits can be directly applied to strict evaluation. Hofmann and Jost [18] have extended their method to cover a comprehensive subset of Java, including imperative updates, inheritance and type casts. Shkaravska et al. [26] subsequently considered the inference of heap cons...

22 | lp_solve, an Open Source (Mixed-Integer) Linear Programming System. http://lpsolve.sourceforge.net/5.5/
- Berkelaar, Eikland, Notebaert
Citation Context ...preferable. For a simple example, we consider the standard list zipping, such as adding two lists of numerical values. Using a Haskell-style syntax we have: zipWith add [ ] [10, 20] = [ ] zipWith add [1, 2, 3, 4] [10, 20] = [11, 22] zipWith add [1, 2, 3, 4] [10, 20, 30, 40, 50, 60] = [11, 22, 33, 44] We immediately see that the resource consumption, be it time or space, depends on the length of the shorter in...

22 | The Worst-Case Execution-Time Problem – An Overview of Methods and Survey of Tools
- Wilhelm, et al.
- 2008
Citation Context ... list zipping, such as adding two lists of numerical values. Using a Haskell-style syntax we have: zipWith add [ ] [10, 20] = [ ] zipWith add [1, 2, 3, 4] [10, 20] = [11, 22] zipWith add [1, 2, 3, 4] [10, 20, 30, 40, 50, 60] = [11, 22, 33, 44] We immediately see that the resource consumption, be it time or space, depends on the length of the shorter input list. Therefore, we have the following admissible annotated types ...

21 | Costa: Design and implementation of a cost and termination analyzer for java bytecode
- Albert, Arenas, et al.
- 2007
Citation Context ...preferable. For a simple example, we consider the standard list zipping, such as adding two lists of numerical values. Using a Haskell-style syntax we have: zipWith add [ ] [10, 20] = [ ] zipWith add [1, 2, 3, 4] [10, 20] = [11, 22] zipWith add [1, 2, 3, 4] [10, 20, 30, 40, 50, 60] = [11, 22, 33, 44] We immediately see that the resource consumption, be it time or space, depends on the length of the shorter in...

19 | Analysing Memory Resource Bounds for Low-Level Programs
- Chin, Nguyen, et al.
- 2008
Citation Context ... al. [4] infer polynomial bounds on the live heap usage for a Java-like language with automatic memory management. However, unlike our system, they do not cover general recursive methods. Chin et al. [7] present a heap and a stack analysis for a low-level (assembler) language with explicit (de-)allocation. By inferring path-sensitive information and using symbolic evaluation they are able to infer ex...

17 | Polynomial Size Analysis for First-Order Functions
- Shkaravska, van Kesteren, van Eekelen
- 2007
Citation Context ...tly applied to strict evaluation. Hofmann and Jost [18] have extended their method to cover a comprehensive subset of Java, including imperative updates, inheritance and type casts. Shkaravska et al. [26] subsequently considered the inference of heap consumption for first-order polymorphic lists, and are currently studying extensions to non-linear bounds. Hoffmann and Hofmann [16] have recently presen...

11 | Amortized Resource Analysis with Polynomial Potential
- Hoffmann, Hofmann
- 2010
Citation Context .... Namely, they are linear in the sizes of the input. This restriction to linearly dependent bounds is our chosen trade-off to obtain an automated inference for the amortised analysis. Recent research [16] shows how this restriction of the inference to linear bounds may be lifted. This design guarantees that we can easily divide all possible inputs into large classes having a similar cost. For example,...

10 | Amortised memory analysis using the depth of data structures
- Campbell
- 2009
Citation Context ...tly not possible to express the fact that in the tree traversal the number of nodes visited on each path is at most log n. In the extension of the amortised cost based analysis, developed by Campbell [5], such information on the depth of data structures is available, and his system is able to infer logarithmic bounds on space consumption for such examples. 3 Usability Improvements 3.1 Elaboration Mod...

10 | Static Determination of Quantitative Resource Usage for Higher-Order Programs
- Jost, Hammond, et al.
- 2010
Citation Context ...hroughout. Since the generated LPs are sparse and easily solvable [17], this blow-up of constraints is of little concern. More information on this mechanism for resource parametricity can be found in [21]. This once more illustrates that the result for analysing a function is the set of all admissible annotations, rather than any single annotation. ...

9 | “Carbon Credits” for Resource-Bounded Computations using Amortised Analysis
- Jost, Loidl, et al.
- 2009
Citation Context ...of the inferred resource bounds together with the possibility of interactively tuning these bounds. Finally, we demonstrate improvements in the performance of our analysis. 1 Introduction In the past [22] we have developed an amortised cost based resource analysis for a higher-order, strict functional language, namely expression-level Hume [15]. Salient features of this analysis are its strong formal ...

8 | Cost Inference and Analysis for Recursive Functional Programs
- Vasconcelos
- 2006
Citation Context ...eights are factors of a linear bound on resource consumption. The original work was limited to type checking, but subsequent work has developed inference mechanisms [6,19,29]. Vasconcelos’ PhD thesis [28] extended these previous approaches by using abstract interpretation techniques to automatically infer linear approximations of the sizes of recursive data types and the stack and heap costs of recurs...

6 | Exploiting Purely Functional Programming to Obtain Bounded Resource Behaviour: the Hume Approach
- Hammond
- 2005
Citation Context ...ounts, heap- and stack-space consumption, and worst-case execution time (measured in clock cycles). The cost model results have been obtained from an instrumented version of the Hume abstract machine [14]. The cost model simply counts resource usage according to the cost table during an execution on some test input. The lists and trees used as test input for the cost model execution had a size of 10 e...

3 | Certification Using the Mobius Base Logic
- Beringer, Hofmann, et al.
- 2007
Citation Context ...d for by the mobile network provider. In this scenario the “costs” of a function call are very real and measurable in pounds. Therefore, this particular example has been studied in the Mobius project [2], where Java bytecode has been analysed. Our cost table for the call coun...

3 | A Sharing Analysis for Safe
- Peña, Segura, et al.
- 2006
Citation Context ...fer linear approximations of the sizes of recursive data types and the stack and heap costs of recursive functions. A combination of sized types and regions is also being developed by Peña and Segura [25], building on information provided by ancillary analyses on termination and safe destruction. Amortised Costs: The concept of amortised costs has first been developed in the context of complexity anal...

2 | Worst-Case Execution Time Analysis through Types
- Jost, Loidl, et al.
- 2009
Citation Context ...source consumption. This analysis has been successfully used to infer upper bounds on the heapand stack-space consumption and on the worst-case execution time of several embedded systems applications [23]. One of the main strengths of our analysis is its flexible design, which permits easy adaptation to model other quantitative resources. In essence, only a cost table, mapping abstract machine instruc...
