Results 1–10 of 29
Program Development Using Abstract Interpretation (and The Ciao System Preprocessor)
In 10th International Static Analysis Symposium (SAS’03), number 2694 in LNCS, 2003
Cited by 32 (23 self)
Abstract. The technique of Abstract Interpretation has allowed the development of very sophisticated global program analyses which are at the same time provably correct and practical. We present in a tutorial fashion a novel program development framework which uses abstract interpretation as a fundamental tool. The framework uses modular, incremental abstract interpretation to obtain information about the program. This information is used to validate programs, to detect bugs with respect to partial specifications written using assertions (in the program itself and/or in system libraries), to generate and simplify run-time tests, and to perform specialization, parallelization, and resource usage control, all in a provably correct way. In the case of validation and debugging, the assertions can refer to a variety of program points such as procedure entry, procedure exit, points within procedures, or global computations. The system can reason with much richer information than, for example, traditional types. This includes data structure shape (including pointer sharing), bounds on data structure sizes, and other operational variable instantiation properties, as well as procedure-level properties such as determinacy, termination, non-failure, and bounds on resource consumption (time or space cost). CiaoPP, the preprocessor of the Ciao multi-paradigm programming system, which implements the described functionality, will be used to illustrate the fundamental ideas.
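The assertion-checking idea described in this abstract can be sketched on a toy sign domain (illustrative Python only; none of these names come from CiaoPP or its assertion language):

```python
# A minimal illustration of assertion checking via abstract interpretation,
# using a toy sign domain. All names here are hypothetical, not CiaoPP's API.

def abs_sign(n):
    """Abstraction: map a concrete integer to its sign."""
    return "pos" if n > 0 else "neg" if n < 0 else "zero"

def abs_mul(a, b):
    """Abstract multiplication on signs (sound w.r.t. concrete *)."""
    if "zero" in (a, b):
        return "zero"
    return "pos" if a == b else "neg"

def check_assertion(sign_x, sign_y):
    """Statically validate the partial spec 'x*y >= 0' when the signs of
    the inputs are known: the analysis proves it without running code."""
    return abs_mul(sign_x, sign_y) in ("pos", "zero")

print(check_assertion("neg", "neg"))  # True: (-)*(-) is provably non-negative
```

The same shape scales to the richer domains the abstract mentions (shape, sharing, size bounds): only the abstract values and abstract operations change.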
Inference of Well-Typings for Logic Programs with Application to Termination Analysis
2005
Cited by 14 (7 self)
This paper develops a method to infer a polymorphic well-typing for a logic program. One of the main motivations is to contribute to a better automation of termination analysis in logic programs, by deriving types from which norms can automatically be constructed. Previous work on type-based termination analysis used either types declared by the user, or automatically generated monomorphic types describing the success set of predicates. Declared types are typically more precise and result in stronger termination conditions than those obtained with inferred types. Our type inference procedure involves solving set constraints generated from the program and derives a well-typing, in contrast to a success-set approximation. Experiments show that our automatically inferred well-typings are close to the declared types and thus result in termination conditions that are as good as those obtained with declared types for all our experiments to date. We describe the method, its implementation, and experiments with termination analysis based on the inferred types.
Combining Norms to Prove Termination
2002
Cited by 13 (5 self)
Automatic termination analyzers typically measure the size of terms by applying norms, which are mappings from terms to the natural numbers. This paper illustrates how to enable the use of size functions defined as tuples of these simpler norm functions. This approach simplifies the problem of automatically deriving a candidate norm with which to prove termination: instead of deriving a single, complex norm function, it is sufficient to determine a collection of simpler norms, some combination of which leads to a proof of termination. We propose that a collection of simple norms, one for each of the recursive datatypes in the program, is often a suitable choice. We first demonstrate the power of combining norm functions and then the adequacy of combining norms based on regular types.
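The central idea of this abstract, measuring terms with a tuple of simple norms rather than one complex norm, can be sketched as follows (terms modeled as nested Python tuples; all names are illustrative, not from the paper's implementation):

```python
# Sketch: a candidate measure built as a tuple of two simple norms.
# Terms are nested tuples like ("cons", head, tail).

def list_length(t):
    """Norm 1: length of a cons-list, 0 for anything else."""
    return 1 + list_length(t[2]) if isinstance(t, tuple) and t[0] == "cons" else 0

def term_size(t):
    """Norm 2: total number of function symbols in the term."""
    return 1 + sum(term_size(a) for a in t[1:]) if isinstance(t, tuple) else 1

def combined_norm(t):
    """The candidate measure is simply the tuple of the two norms."""
    return (list_length(t), term_size(t))

nil = ("nil",)
xs = ("cons", ("a",), ("cons", ("b",), nil))
# A recursive call on the tail strictly decreases the combined measure
# componentwise, which suffices for a termination argument.
assert combined_norm(xs) > combined_norm(xs[2])
```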
Termination Analysis with Types is More Accurate
In ICLP 2003: Proceedings of the 19th International Conference on Logic Programming, C. Palamidessi, Ed., Lecture Notes in Computer Science, 2003
Cited by 12 (3 self)
In this paper we show how we can use size and groundness analyses lifted to regular and (polymorphic) Hindley/Milner typed programs to determine more accurate termination of (type correct) programs.
Semantic Query Optimization in the Presence of Types
Cited by 10 (1 self)
Both semantic and type-based query optimization rely on the idea that queries often exhibit non-trivial rewritings if the state space of the database is restricted. Despite their close connection, these two problems have to date always been studied separately. We present a unifying, logic-based framework for query optimization in the presence of data dependencies and type information. It builds upon the classical chase algorithm and extends existing query minimization techniques to considerably larger classes of queries and dependencies. In particular, our setting requires chasing conjunctive queries (possibly with union and negation) in the presence of dependencies containing negation and disjunction. We study the applicability of the chase in this setting, develop novel conditions that guarantee its termination, identify fragments for which minimal query computation is always possible (w.r.t. a generic cost function), and investigate the complexity of related decision problems.
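One chase step of the kind this framework builds on can be sketched as follows (a hypothetical tuple-generating dependency over a toy fact base; identifiers are illustrative and not the paper's implementation):

```python
# Sketch of one chase step with a tuple-generating dependency (TGD),
# here "every employee has some department": emp(x) -> exists y. dept(x, y).

from itertools import count

_fresh = count()

def chase_step(facts):
    """Apply the TGD once: for each emp(x) with no dept(x, _),
    add dept(x, N) with a fresh labelled null N."""
    new = set(facts)
    for rel, *args in facts:
        if rel == "emp":
            x = args[0]
            if not any(f[0] == "dept" and f[1] == x for f in facts):
                new.add(("dept", x, f"_N{next(_fresh)}"))
    return new

db = {("emp", "alice"), ("dept", "bob", "sales"), ("emp", "bob")}
result = chase_step(db)
# alice gains a department with a labelled null; bob already satisfies the TGD
assert any(f[0] == "dept" and f[1] == "alice" for f in result)
assert len(result) == 4
```

Iterating such steps until no dependency fires is the chase; the paper's termination conditions concern exactly when that iteration is guaranteed to stop.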
Reuse of Results in Termination Analysis of Typed Logic Programs
In Static Analysis, 9th International Symposium, 2002
Cited by 8 (1 self)
Recent works by the authors address the problem of automating the selection of a candidate norm for the purpose of termination analysis. These works illustrate a powerful technique in which a collection of simple type-based norms, one for each data type in the program, are combined to provide the candidate norm. This paper extends these results by investigating type polymorphism. We show that by considering polymorphic types we reduce, without sacrificing precision, the number of type-based norms which must be combined to provide the candidate norm. Moreover, we show that when a generic polymorphic typed program component occurs in one or more specific type contexts, we need not reanalyse it. All of the information concerning its termination and its effect on the termination of other predicates in that context can be derived directly from the context-independent analysis of that component based on norms derived from the polymorphic types.
Abstract conjunctive partial deduction using regular types and its application to model checking
In Proc. of LOPSTR, number 2372 in LNCS, 2001
Cited by 8 (0 self)
We present an abstract partial deduction technique which uses regular types as its domain and which can handle conjunctions, and thus perform deforestation and tupling. We provide a detailed description of all the required operations and present an implementation within the ecce system. We discuss the power of this new specialisation algorithm, especially in the light of verifying and specialising infinite state process algebras. Here, our new algorithm can provide a more precise treatment of synchronisation and can be used for refinement checking.
When Size Does Matter: Termination Analysis for Typed Logic Programs
In Logic-based Program Synthesis and Transformation, 11th International Workshop, LOPSTR 2001, Selected Papers, volume 2372 of LNCS, 2002
Cited by 6 (4 self)
Proofs of termination typically proceed by mapping program states to a well-founded domain and showing that successive states of the computation are mapped to elements decreasing in size. Automated termination analysers for logic programs achieve this by measuring and comparing the sizes of successive calls to recursive predicates. The size of a call is measured by a level mapping, which in turn is based on a norm on the arguments of the call. A norm maps a term to a natural number.
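The norm and level-mapping definitions in this abstract can be illustrated for append/3 (a Python sketch with Prolog terms modeled as tuples; all identifiers are illustrative):

```python
# Sketch: a level mapping for append/3 based on the list-length norm.
# The recursive clause append([H|T], Ys, [H|Zs]) :- append(T, Ys, Zs)
# calls itself on the tail, so the level strictly decreases.

def list_length(t):
    """The list-length norm: a norm maps a term to a natural number."""
    n = 0
    while isinstance(t, tuple) and t[0] == ".":  # Prolog's cons functor
        n, t = n + 1, t[2]
    return n

def level(call):
    """Level mapping for append(Xs, Ys, Zs): measure the first argument."""
    return list_length(call[1])

nil = ("[]",)
xs = (".", "a", (".", "b", nil))
head_call = ("append", xs, nil, "Zs")
body_call = ("append", xs[2], nil, "Zs1")  # the recursive call on the tail
assert level(head_call) > level(body_call)  # 2 > 1: the measure decreases
```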
Monotone AC-tree automata
In LPAR’05, 2005
Cited by 5 (1 self)
Abstract. We consider several questions about monotone AC-tree automata, a class of equational tree automata whose transition rules correspond to rules in Kuroda normal form of context-sensitive grammars. Whereas it has been proved that this class has a decision procedure to determine whether a given monotone AC-tree automaton accepts no terms, other important decidability and complexity results have not yet been well investigated. In this paper, we prove that the membership problem for monotone AC-tree automata is PSPACE-complete. We then study the expressiveness of monotone AC-tree automata: precisely, we prove that the family of AC-regular tree languages is strictly subsumed in that of AC-monotone tree languages. The proof technique used in obtaining the above result yields the answers to two further questions, specifically that the family of monotone AC-tree languages is not closed under complementation, and that the inclusion problem for monotone AC-tree automata is undecidable.
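For contrast with the AC-equational case studied here, membership for an ordinary (regular) bottom-up tree automaton is a simple recursion; the automaton below is a made-up example accepting lists of 'a' leaves (illustrative Python, not from the paper):

```python
# Membership test for a regular bottom-up tree automaton.
# Transition rules: (symbol, tuple of child states) -> state.
RULES = {
    ("a", ()): "qa",
    ("nil", ()): "qlist",
    ("cons", ("qa", "qlist")): "qlist",
}
FINAL = {"qlist"}

def run(term):
    """Compute the state reached on a term (sym, child, ...), or None."""
    sym, *children = term
    states = tuple(run(c) for c in children)
    return RULES.get((sym, states))

t = ("cons", ("a",), ("cons", ("a",), ("nil",)))
assert run(t) in FINAL  # the automaton accepts the two-element list
```

The PSPACE-completeness result above shows that adding AC equations (flattening over an associative-commutative symbol) makes this membership question far harder than the straightforward recursion sketched here.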
A posteriori soundness for nondeterministic abstract interpretations
 In VMCAI ’09: Proceedings of the 10th International Conference on Verification, Model Checking, and Abstract Interpretation
Cited by 4 (3 self)
Abstract. An abstract interpretation’s resource-allocation policy (e.g., one heap summary node per allocation site) largely determines both its speed and precision. Historically, context has driven allocation policies, and as a result, these policies are said to determine the “context-sensitivity” of the analysis. This work gives analysis designers newfound freedom to manipulate speed and precision by severing the link between allocation policy and context-sensitivity: we find that abstract allocation policies may be unhinged not only from context, but also from even a predefined correspondence with a concrete allocation policy. We do so by proving that abstract allocation policies can be made nondeterministic without sacrificing correctness; this nondeterminism permits precision-guided allocation policies previously assumed to be unsafe. To prove correctness, we introduce the notion of a posteriori soundness for an analysis. A proof of a posteriori soundness differs from a standard proof of soundness in that the abstraction maps used in an a posteriori proof cannot be constructed until after an analysis has been run. Delaying construction allows them to be built so as to justify the decisions made by nondeterminism. The crux of the a posteriori soundness theorem is to demonstrate that a justifying abstraction map can always be constructed.
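The classic allocation policy this abstract generalizes, one abstract heap node per allocation site, can be sketched as follows (illustrative Python; not the paper's formalism):

```python
# Sketch of the allocation-site policy: every allocation at the same
# program point is folded into one summary node, bounding the abstract
# heap by the number of allocation sites. Names are illustrative.

class AbstractHeap:
    def __init__(self):
        self.summaries = {}  # allocation site -> summary node id

    def alloc(self, site):
        """Return the (single) summary node for this allocation site,
        creating it on first use."""
        return self.summaries.setdefault(site, f"node@{site}")

heap = AbstractHeap()
a = heap.alloc("line42")
b = heap.alloc("line42")   # a second allocation at the same site...
c = heap.alloc("line99")
assert a == b and a != c   # ...merges into the same summary node
```

The paper's point is that `alloc` need not be a fixed function of the site (or of context) at all: it may choose nondeterministically, with soundness justified only after the analysis has run.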