Results 1–10 of 43
Dimensions in program synthesis
Cited by 54 (20 self)

Abstract: Program synthesis, which is the task of discovering programs that realize user intent, can be useful in several scenarios: enabling people with no programming background to develop utility programs, helping regular programmers automatically handle tricky or mundane details, program understanding, discovery of new algorithms, and even teaching. This paper describes three key dimensions in program synthesis: expression of user intent, the space of programs over which to search, and the search technique. These concepts are illustrated by brief descriptions of various program synthesis projects that target a wide variety of programs, such as standard undergraduate textbook algorithms (e.g., sorting, dynamic programming), program inverses (e.g., decoders, deserializers), bit-vector manipulation routines, deobfuscated programs, graph algorithms, text-manipulating routines, and mutual exclusion algorithms.
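The three dimensions the abstract names can be made concrete with a toy sketch (this is an invented illustration, not any of the listed systems): user intent expressed as input/output examples, a small expression grammar as the program space, and brute-force enumeration as the search technique.

```python
from itertools import product

# Dimension 1: user intent, expressed as input/output examples.
examples = [(3, 5, 5), (7, 2, 7), (4, 4, 4)]  # target behavior: max(x, y)

# Dimension 2: the space of programs -- expressions over x and y,
# built from +, -, and a conditional, up to a given depth.
def programs(depth):
    leaves = [("x", lambda x, y: x), ("y", lambda x, y: y)]
    if depth == 0:
        return leaves
    ops = [("+", lambda a, b: a + b), ("-", lambda a, b: a - b)]
    exprs = list(leaves)
    sub = programs(depth - 1)
    for (ns, f), (na, a), (nb, b) in product(ops, sub, sub):
        exprs.append((f"({na} {ns} {nb})",
                      lambda x, y, a=a, b=b, f=f: f(a(x, y), b(x, y))))
    for (na, a), (nb, b) in product(sub, sub):
        exprs.append((f"(if x>=y then {na} else {nb})",
                      lambda x, y, a=a, b=b: a(x, y) if x >= y else b(x, y)))
    return exprs

# Dimension 3: the search technique -- exhaustive enumeration.
def synthesize(depth=1):
    for name, prog in programs(depth):
        if all(prog(x, y) == out for x, y, out in examples):
            return name
    return None

print(synthesize())  # first expression consistent with all examples
```

Real synthesizers replace the enumeration with smarter search (version spaces, SAT/SMT solving, stochastic search), but the decomposition into intent, space, and search is the same.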
Multivariate Amortized Resource Analysis, 2010
Cited by 47 (8 self)

Abstract: We study the problem of automatically analyzing the worst-case resource usage of procedures with several arguments. Existing automatic analyses based on amortization or sized types bound the resource usage or result size of such a procedure by a sum of unary functions of the sizes of the arguments. In this paper we generalize this to arbitrary multivariate polynomial functions, thus allowing bounds of the form m·n, which previously had to be grossly overestimated by m² + n². Our framework even encompasses bounds like Σ_{i,j≤n} m_i·m_j, where the m_i are the sizes of the entries of a list of length n. This allows us for the first time to derive useful resource bounds for operations on matrices that are represented as lists of lists, and to considerably improve bounds on other superlinear list operations such as longest common subsequence and removal of duplicates from lists of lists. Furthermore, resource bounds are now closed under composition, which improves the accuracy of the analysis of composed programs when some or all of the components exhibit superlinear resource or size behavior. The analysis is based on a novel multivariate amortized resource analysis. We present it in the form of a type system for a simple first-order functional language with lists and trees, prove soundness, and describe automatic type inference based on linear programming. We have experimentally validated the automatic analysis on a wide range of examples from functional programming with lists and trees. The obtained bounds were compared with actual resource consumption. All bounds were asymptotically tight, and the constants were close or even identical to the optimal ones.
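The gap between multivariate and unary-sum bounds can be seen on a hypothetical example (not the paper's type system): an all-pairs operation over two lists performs exactly m·n steps, while the best bound expressible as a sum of unary functions f(m) + g(n) must over-approximate it, e.g. by m² + n².

```python
def count_common(xs, ys):
    """Counts matching cross-pairs; performs exactly len(xs)*len(ys) comparisons."""
    steps = 0
    common = 0
    for x in xs:
        for y in ys:
            steps += 1           # one unit of "resource" per comparison
            common += (x == y)
    return common, steps

m, n = 3, 5
_, steps = count_common(list(range(m)), list(range(n)))
assert steps == m * n            # the multivariate bound m*n is exact
assert steps <= m**2 + n**2      # the unary-sum bound is sound but loose
```

When m and n are of similar magnitude, the unary bound is roughly twice the true cost; the ratio grows without limit when one list is much longer than the other.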
Probabilistically accurate program transformations. In SAS, 2011
Cited by 38 (14 self)

Abstract: The standard approach to program transformation involves the use of discrete logical reasoning to prove that the transformation does not change the observable semantics of the program. We propose a new approach that, in contrast, uses probabilistic reasoning to justify the application of transformations that may change, within probabilistic accuracy bounds, the result that the program produces. Our new approach produces probabilistic guarantees of the form P(|D| ≥ B) ≤ ε, ε ∈ (0, 1), where D is the difference between the results that the transformed and original programs produce, B is an acceptability bound on the absolute value of D, and ε is the maximum acceptable probability of observing a large D. We show how to use our approach to justify the application of loop perforation (which transforms loops to execute fewer iterations) to a set of computational patterns.
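Loop perforation, the transformation the paper reasons about, can be sketched as follows (the code and names are illustrative, not the paper's implementation): execute only every k-th iteration of a reduction loop, trading accuracy for a roughly k-fold reduction in work.

```python
import random

def mean_full(xs):
    return sum(xs) / len(xs)

def mean_perforated(xs, k=4):
    # Perforated loop: keep every k-th iteration, skip the rest.
    kept = xs[::k]
    return sum(kept) / len(kept)

random.seed(0)
data = [random.random() for _ in range(10_000)]

# D is the difference between transformed and original results; the paper's
# guarantee has the shape P(|D| >= B) <= eps. Here we observe one realization.
D = abs(mean_perforated(data) - mean_full(data))
print(f"~4x less work, observed |D| = {D:.4f}")
```

For averaging-style patterns like this one, the perforated result concentrates around the original, which is what makes the probabilistic bound P(|D| ≥ B) ≤ ε attainable for small B and ε.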
Path-based inductive synthesis for program inversion. In PLDI, 2011
Bound Analysis of Imperative Programs with the Size-change Abstraction. In: 18th Int. Static Analysis Symposium, 2011
Cited by 19 (4 self)

Abstract: The size-change abstraction (SCA) is an important program abstraction for termination analysis, which has been successfully implemented in many tools for functional and logic programs. In this paper, we demonstrate that SCA is also a highly effective abstract domain for the bound analysis of imperative programs. We have implemented a bound analysis tool based on SCA for imperative programs. We abstract programs in a path-wise and context-dependent manner, which enables our tool to analyze real-world programs effectively. Our work shows that SCA captures many of the essential ideas of previous termination and bound analyses, and goes beyond them in a conceptually simpler framework.
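The size-change idea behind such bound analyses can be illustrated with a hypothetical two-phase loop (this sketch is not the paper's tool): if some norm of the program state strictly decreases on every iteration and is bounded below by 0, the initial value of that norm bounds the number of iterations.

```python
def loop(x, y):
    """A two-phase loop: first x counts down by 1, then y counts down by 2."""
    iters = 0
    while x > 0 or y > 0:
        if x > 0:
            x -= 1          # size-change: the norm of x decreases
        else:
            y -= 2          # size-change: the norm of y decreases
        iters += 1
    return iters

# The norm ||(x, y)|| = x + ceil(y / 2) decreases by at least 1 per
# iteration, so the initial norm bounds the iteration count.
x0, y0 = 7, 10
bound = x0 + (y0 + 1) // 2
assert loop(x0, y0) <= bound
```

Tracking such decreases path-wise, rather than globally, is what lets the approach handle loops whose different paths decrease different norms.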
Continuity and Robustness of Programs. doi:10.1145/2240236.2240262
Cited by 15 (4 self)

Abstract: Computer scientists have long believed that software is different from physical systems in one fundamental way: while the latter have continuous dynamics, the former do not. In this paper, we argue that notions of continuity from mathematical analysis are relevant and interesting even for software. First, we demonstrate that many everyday programs are continuous (i.e., arbitrarily small changes to their inputs only cause arbitrarily small changes to their outputs) or Lipschitz continuous (i.e., when their inputs change, their outputs change at most proportionally). Second, we give a mostly-automatic framework for verifying that a program is continuous or Lipschitz, showing that traditional, discrete approaches to proving programs correct can be extended to reason about these properties. An immediate application of our analysis is in reasoning about the robustness of programs that execute on uncertain inputs. In the longer run, it raises hopes for a toolkit for reasoning about programs that freely combines logical and analytical mathematics.
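A small empirical illustration of the property (a standard example, sketched here on invented data): sorting is 1-Lipschitz under the max-norm, so perturbing each input by at most δ moves each output position by at most δ.

```python
def max_dist(a, b):
    """Max-norm distance between two equal-length sequences."""
    return max(abs(x - y) for x, y in zip(a, b))

xs = [5.0, 1.0, 9.0, 3.0]
delta = 0.05
# Perturb each input by at most delta.
perturbed = [x + d for x, d in zip(xs, [0.04, -0.03, 0.02, -0.04])]

out_change = max_dist(sorted(xs), sorted(perturbed))
assert out_change <= delta  # Lipschitz constant 1: output moves at most delta
```

A single test like this only checks one perturbation, of course; the paper's contribution is a verification framework that proves such Lipschitz bounds for all inputs.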
Using Bounded Model Checking to Focus Fixpoint Iterations, 2011
Cited by 13 (6 self)

Abstract: Two classical sources of imprecision in static analysis by abstract interpretation are widening and merge operations. Merge operations can be done away with by distinguishing paths, as in trace partitioning, at the expense of enumerating an exponential number of paths. In this article, we describe how to avoid such systematic exploration by focusing on a single path at a time, designated by SMT-solving. Our method combines well with acceleration techniques, thus doing away with widenings as well in some cases. We illustrate it over the well-known domain of convex polyhedra.
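The merge imprecision the article targets can be shown in a toy setting (intervals rather than the paper's convex polyhedra, and an invented example): joining the abstract states of two branches admits values that no single path produces.

```python
def join(i1, i2):
    """Least upper bound of two intervals (lo, hi) in the interval domain."""
    return (min(i1[0], i2[0]), max(i1[1], i2[1]))

# Abstract states of x after the two paths through `if c: x = 0 else: x = 10`:
path_then = (0, 0)
path_else = (10, 10)

merged = join(path_then, path_else)
assert merged == (0, 10)  # the merge admits values (e.g. x = 5) no path yields

# Analyzing one designated path at a time -- as when an SMT solver picks the
# path -- keeps the exact per-path facts instead:
assert path_then == (0, 0) and path_else == (10, 10)
```

Enumerating all paths explicitly recovers this precision but blows up exponentially; the article's point is to let the solver designate one relevant path at a time.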
Succinct representations for abstract interpretation. In Static Analysis (SAS), 2012
Cited by 8 (4 self)

Abstract: Abstract interpretation techniques can be made more precise by distinguishing paths inside loops, at the expense of possibly exponential complexity. SMT-solving techniques and sparse representations of paths and sets of paths avoid this pitfall. We improve previously proposed techniques for guided static analysis and the generation of disjunctive invariants by combining them with techniques for succinct representations of paths and symbolic representations for transitions based on static single assignment. Because of the non-monotonicity of the results of abstract interpretation with widening operators, it is difficult to conclude that one abstraction is more precise than another based on theoretical local precision results. We thus conducted extensive comparisons between our new techniques and previous ones on a variety of open-source packages.
Quantitative Abstraction Refinement
Cited by 7 (1 self)

Abstract: We propose a general framework for abstraction with respect to quantitative properties, such as worst-case execution time or power consumption. Our framework provides a systematic way to do counterexample-guided abstraction refinement for quantitative properties. The salient aspect of the framework is that it allows anytime verification, that is, verification algorithms that can be stopped at any time (for example, due to exhaustion of memory) and report approximations that improve monotonically when the algorithms are given more time. We instantiate the framework with a number of quantitative abstractions and refinement schemes, which differ in how much quantitative information they keep from the original system. We introduce both state-based and trace-based quantitative abstractions, and we describe conditions that define classes of quantitative properties for which the abstractions provide over-approximations. We give algorithms for evaluating the quantitative properties on the abstract systems. We present algorithms for counterexample-based refinements for quantitative properties for both state-based and segment-based abstractions. We perform a case study on the worst-case execution time of executables to evaluate the anytime-verification aspect and the quantitative abstractions we proposed.
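A hedged sketch of the state-based abstraction idea on an invented system (not the paper's algorithms): abstract states group concrete states, each abstract step is charged the maximum cost of the group (an over-approximation of worst-case cost), and splitting a group can only tighten the bound, so stopping refinement early still yields a sound answer.

```python
# Concrete transition system; the expensive state "b" is unreachable from "a".
succ = {"a": ["c"], "b": ["b"], "c": ["d"], "d": ["d"]}
costs = {"a": 1, "b": 9, "c": 2, "d": 3}  # per-step cost of each state

def abstract_bound(partition, init="a", steps=3):
    """Worst-case cost of `steps` steps under a partition into abstract states."""
    block_of = {s: frozenset(b) for b in partition for s in b}
    cur = {block_of[init]}
    total = 0
    for _ in range(steps):
        # Charge the worst member cost over all currently reachable blocks.
        total += max(max(costs[s] for s in b) for b in cur)
        # Abstract successor: any block containing a successor of a member.
        cur = {block_of[t] for b in cur for s in b for t in succ[s]}
    return total

coarse = [{"a", "b", "c", "d"}]          # one abstract state: b pollutes every step
refined = [{"a"}, {"b"}, {"c", "d"}]     # split off enough to exclude b

assert abstract_bound(coarse) >= abstract_bound(refined)  # monotone improvement
```

Here the coarse partition reports 9 + 9 + 9 = 27, while the refined one reports 1 + 3 + 3 = 7, already close to the true worst-case cost 1 + 2 + 3 = 6; each split spends more memory and time to shrink the over-approximation.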
Amortized resource analysis with polymorphic recursion and partial big-step operational semantics. In Proc. 8th APLAS, volume 6461 of LNCS, 2010
Cited by 7 (6 self)

Abstract: This paper studies the problem of statically determining upper bounds on the resource consumption of first-order functional programs. Previous work approached the problem with an automatic type-based amortized analysis for polynomial resource bounds. The analysis is parametric in the resource and can be instantiated to heap space, stack space, or clock cycles. Experiments with a prototype implementation have shown that programs are analyzed efficiently and that the computed bounds exactly match the measured worst-case resource behavior for many functions. This paper describes the inference algorithm that is used in the implementation of the system. It can deal with resource-polymorphic recursion, which is required in the type derivations of many functions. The computation of the bounds is fully automatic if a maximal degree of the polynomials is given. The soundness of the inference is proved with respect to a novel operational semantics for partial evaluations, showing that the inferred bounds hold for terminating as well as non-terminating computations. A corollary is that run-time bounds also establish the termination of programs.