Results 1 - 10 of 13
A Classification and Survey of Analysis Strategies For Software Product Lines
Cited by 33 (18 self)
Software-product-line engineering has gained considerable momentum in recent years, both in industry and in academia. A software product line is a family of software products that share a common set of features. Software product lines challenge traditional analysis techniques, such as type checking, model checking, and theorem proving, in their quest of ensuring correctness and reliability of software. Simply creating and analyzing all products of a product line is usually not feasible, due to the potentially exponential number of valid feature combinations. Recently, researchers began to develop analysis techniques that take the distinguishing properties of software product lines into account, for example, by checking feature-related code in isolation or by exploiting variability information during analysis. The emerging field of product-line analyses is both broad and diverse, so it is difficult for researchers and practitioners to understand their similarities and differences. We propose a classification of product-line analyses to enable systematic research and application. Based on our insights with classifying and comparing a corpus of 123 research articles, we …
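The combinatorial blow-up the abstract refers to can be made concrete with a small sketch. Assuming a hypothetical feature model with five features and a single cross-tree constraint (the names and the constraint are illustrative, not from the paper), enumerating all valid products already requires filtering the full power set:

```python
from itertools import chain, combinations

# Hypothetical feature model: a product is any subset of these features,
# subject to one constraint: CACHE requires PERSISTENCE.
FEATURES = ["BASE", "LOGGING", "CACHE", "PERSISTENCE", "ENCRYPTION"]

def valid(config):
    # The lone cross-tree constraint in this toy model.
    return "CACHE" not in config or "PERSISTENCE" in config

def all_products(features):
    # Every subset of the feature set, then filtered by the constraint.
    subsets = chain.from_iterable(
        combinations(features, r) for r in range(len(features) + 1))
    return [set(s) for s in subsets if valid(set(s))]

products = all_products(FEATURES)
# 2**5 = 32 raw combinations; the 8 that pick CACHE without PERSISTENCE
# are invalid, leaving 24 products to analyze individually.
print(len(products))  # 24
```

With n independent optional features the count grows as 2**n, which is why the product-by-product strategy the abstract calls infeasible breaks down quickly and variability-aware analyses become attractive.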
Compositional computational reflection
, 2014
Cited by 7 (2 self)
Abstract. Current work on computational reflection is single-minded; each reflective procedure is written with a specific application or scope in mind. Composition of these reflective procedures is done by a proof-generating tactic language such as Ltac. This composition, however, comes at the cost of both larger proof terms and redundant preprocessing. In this work, we propose a methodology for writing composable reflective procedures that solve many small tasks in a single invocation. The key technical insights are techniques for reasoning semantically about extensible syntax in intensional type theory. Our techniques make it possible to compose sound procedures and write generic procedures parametrized by lemmas mimicking Coq’s support for hint databases.
Modular Type-Safety Proofs in Agda
Cited by 5 (0 self)
Methods for reusing code are widespread and well researched, but methods for reusing proofs are still emerging. We consider the use of dependent types for this purpose, introducing a modular approach for composing mechanized proofs. We show that common techniques for abstracting algorithms over data structures naturally translate to abstractions over proofs. We introduce a language composed of a series of smaller language components, each defined as functors, and tie them together by taking the fixed point of their sum [Malcolm, 1990]. We then give proofs of type preservation for each language component and show how to compose these proofs into a proof for the entire language, again by taking the fixed point of a sum of functors.
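The "language as a fixed point of a sum of functors" idea can be sketched even without dependent types. The following dynamically-typed Python analogue (class and function names are illustrative, and Python cannot express the static guarantees the paper proves in Agda) shows two language fragments defined independently and an evaluator assembled from their per-fragment algebras:

```python
# Fragment 1: arithmetic syntax, defined on its own.
class Lit:
    def __init__(self, n): self.n = n
class Add:
    def __init__(self, l, r): self.l, self.r = l, r

# Fragment 2: conditionals, also defined on its own.
class If:
    def __init__(self, c, t, e): self.c, self.t, self.e = c, t, e

# Each fragment supplies its own evaluation algebra; `rec` plays the
# role of the recursive call tied at the fixed point. Returning None
# means "this fragment does not handle that node".
def eval_arith(node, rec):
    if isinstance(node, Lit): return node.n
    if isinstance(node, Add): return rec(node.l) + rec(node.r)

def eval_cond(node, rec):
    if isinstance(node, If): return rec(node.t) if rec(node.c) else rec(node.e)

# The evaluator for the composed language is the "sum" of the algebras.
def make_eval(*algebras):
    def go(node):
        for alg in algebras:
            v = alg(node, go)
            if v is not None:
                return v
        raise TypeError(f"no fragment handles {node!r}")
    return go

evaluate = make_eval(eval_arith, eval_cond)
term = If(Lit(1), Add(Lit(2), Lit(3)), Lit(0))
print(evaluate(term))  # 5
```

In the paper, the same shape is carried over to proofs: a type-preservation argument per fragment, composed by the same fixed-point construction that composes the fragments themselves.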
Modular monadic meta-theory
Cited by 5 (0 self)
This paper presents 3MT, a framework for developing modular mechanized meta-theory of languages with effects. Using 3MT, individual language features and their corresponding definitions – semantic functions, theorem statements and proofs – can be built separately and then reused to create different languages with fully mechanized meta-theory. 3MT tackles the multi-faceted problem of modularity in this setting by combining modular datatypes and monads to define effectful semantic functions on a per-feature basis, without fixing a particular set of effects or language constructs. The main challenge that 3MT addresses is how to modularize the theorems and proofs for these modular monadic definitions. Theorem statements like type soundness depend intimately on the effects used in a language, making modularity particularly challenging to achieve. 3MT overcomes this problem by splitting theorems into two parts: a reusable theorem that captures type soundness of a feature in a language-independent way, and a language-specific type soundness theorem. Proofs of the first only mention the effects of a specific feature, and proofs of the second need only know that the first theorem holds for the features included in a language. To establish both theorems, 3MT uses two key reasoning techniques: modular induction and algebraic laws about effects. Several effectful language features, including references and errors, illustrate the capabilities of our framework. We reuse these features to build fully mechanized definitions and proofs for a number of languages, including a version of mini-ML with effects.
TeJaS: Retrofitting Type Systems for JavaScript
Cited by 4 (0 self)
JavaScript programs vary widely in functionality, complexity, and use, and analyses of these programs must accommodate such variations. Type-based analyses are typically the simplest such analyses, but due to the language’s subtle idioms and many application-specific needs—such as ensuring general-purpose type correctness, security properties, or proper library usage—we have found that a single type system does not suffice for all purposes. However, these varied uses still share many reusable common elements. In this paper we present TeJaS, a framework for building type systems for JavaScript. TeJaS has been engineered modularly to encourage experimentation. Its initial type environment is reified, to admit easy modeling of the various execution contexts of JavaScript programs, and its type language and typing rules are extensible, to enable variations of the type system to be constructed easily. The paper presents the base TeJaS type system, which performs traditional type-checking for JavaScript. Because JavaScript demands complex types, we explain several design decisions to improve user ergonomics. We then describe TeJaS’s modular structure, and illustrate it by reconstructing the essence of a very different type system for JavaScript. Systems built from TeJaS have been applied to several real-world, third-party JavaScript programs.
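The extensibility the abstract describes — a base type checker whose rules can be swapped or extended per application — can be sketched with a rule table keyed by syntax construct. This is an illustrative design, not TeJaS's actual (OCaml-based) API, and the construct names are invented:

```python
# Typing rules are registered per construct, so a variant type system
# can add or override rules without editing the base checker.
RULES = {}

def rule(tag):
    def register(fn):
        RULES[tag] = fn
        return fn
    return register

def typecheck(node, env):
    tag = node[0]
    if tag not in RULES:
        raise TypeError(f"no typing rule for {tag!r}")
    return RULES[tag](node, env)

# Base rules: literals, variables, addition.
@rule("num")
def t_num(node, env): return "number"

@rule("var")
def t_var(node, env): return env[node[1]]

@rule("add")
def t_add(node, env):
    if typecheck(node[1], env) == typecheck(node[2], env) == "number":
        return "number"
    raise TypeError("+ expects numbers")

# An extension adds a JavaScript-flavoured rule without touching the base:
# typeof checks its argument but always yields a string.
@rule("typeof")
def t_typeof(node, env):
    typecheck(node[1], env)
    return "string"

print(typecheck(("add", ("num", 1), ("var", "x")), {"x": "number"}))  # number
print(typecheck(("typeof", ("num", 3)), {}))  # string
```

The reified initial type environment the abstract mentions corresponds here to `env` being an ordinary value that each deployment can populate differently (browser globals, Node globals, and so on).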
Compilation à la Carte
Cited by 3 (3 self)
In previous work, we proposed a new approach to the problem of implementing compilers in a modular manner, by combining earlier work on the development of modular interpreters using monad transformers with the à la carte approach to modular syntax. In this article, we refine and extend our existing framework in a number of directions. In particular, we show how generalised algebraic datatypes can be used to support a more modular approach to typing individual language features, we increase the expressive power of the framework by considering mutable state, variable binding, and the issue of noncommutative effects, and we show how the Zinc Abstract Machine can be adapted to provide a modular universal target machine for our modular compilers.
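A minimal sketch of the modular-compiler idea, under the same caveats as before (Python instead of Haskell, invented names, no monad transformers or GADTs): each syntax fragment contributes its own compilation case targeting a shared stack machine, and fragments compose without editing one another.

```python
# Syntax fragments, defined independently.
class Lit:
    def __init__(self, n): self.n = n
class Add:
    def __init__(self, l, r): self.l, self.r = l, r
class Mul:
    def __init__(self, l, r): self.l, self.r = l, r

# One compilation case per fragment; None means "not my node".
def comp_lit(node, rec):
    if isinstance(node, Lit): return [("PUSH", node.n)]
def comp_add(node, rec):
    if isinstance(node, Add): return rec(node.l) + rec(node.r) + [("ADD",)]
def comp_mul(node, rec):  # a later extension, added without editing the others
    if isinstance(node, Mul): return rec(node.l) + rec(node.r) + [("MUL",)]

def compiler(*cases):
    def go(node):
        for case in cases:
            code = case(node, go)
            if code is not None:
                return code
        raise TypeError(f"no compilation case for {node!r}")
    return go

# A tiny stack machine as the shared target.
def run(code):
    stack = []
    for instr in code:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        elif instr[0] == "ADD":
            b, a = stack.pop(), stack.pop(); stack.append(a + b)
        elif instr[0] == "MUL":
            b, a = stack.pop(), stack.pop(); stack.append(a * b)
    return stack[-1]

compile_expr = compiler(comp_lit, comp_add, comp_mul)
print(run(compile_expr(Mul(Add(Lit(2), Lit(3)), Lit(4)))))  # 20
```

The paper's actual contributions — GADT-based per-feature typing, mutable state, variable binding, noncommutative effects, and the Zinc Abstract Machine as target — go well beyond this sketch, but the composition pattern is the same.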
Type Soundness and Race Freedom for Mezzo
In Proceedings of the 12th International Symposium on Functional and Logic Programming (FLOPS 2014), Lecture Notes in Computer Science, 2014
Cited by 2 (2 self)
Abstract. The programming language Mezzo is equipped with a rich type system that controls aliasing and access to mutable memory. We incorporate shared-memory concurrency into Mezzo and present a modular formalization of its core type system, in the form of a concurrent λ-calculus, which we extend with references and locks. We prove that well-typed programs do not go wrong and are data-race free. Our definitions and proofs are machine-checked.
Pick’n’Fix: Capturing Control Flow in Modular Compilers
Cited by 1 (1 self)
Abstract. We present a modular framework for implementing languages with effects and control structures such as loops and conditionals. This framework enables modular definitions of both syntax and semantics as well as modular implementations of compilers and virtual machines. In order to compile control structures, in particular cyclic ones, we employ Oliveira and Cook’s purely functional representation of graphs. Moreover, to separate control flow features semantically from other language features, we represent source languages using Johann and Ghani’s encoding of generalised algebraic datatypes as fixpoints of higher-order functors. We demonstrate the usage of our modular compiler framework with an extended running example and highlight the extensibility of our modular compiler implementations.
Pick’n’Fix: Modular Control Structures
Cited by 1 (1 self)
We present a modular framework for implementing languages with effects and control structures such as loops and conditionals. This framework enables modular definitions of both syntax and semantics as well as modular implementations of compilers and virtual machines. In order to compile control structures, in particular cyclic ones, we employ Oliveira and Cook’s purely functional representation of graphs. Moreover, to separate control flow features semantically from other language features, we represent source languages using Johann and Ghani’s encoding of generalised algebraic datatypes as fixpoints of higher-order functors. We demonstrate the usage of our modular compiler framework with two running examples and highlight the extensibility of our modular semantic definitions and compiler implementations.
unknown title
My research interests are in the area of programming languages, with a focus on functional programming and object-oriented programming. I am especially interested in the modularity aspects of programs, proofs and programming languages. I am also interested in better programming models for graphs, parallelism and concurrency. I believe that existing programming languages and proof/programming techniques have several limitations with respect to modularity. Starting with McIlroy's vision of software as components [17], modularity has been seen as a holy grail of software engineering for over 40 years. However, the reality is that programming languages still struggle with basic modularity issues. Most of these issues are well-known to the programming languages and software engineering communities, which are still trying to address them. The lack of extensibility, as famously emphasized in the expression problem [26], is one of the most basic issues. Another issue is crosscutting and orthogonal concerns (or aspects) such as logging, memoization or security, which have been popularized by Aspect-Oriented Programming [15]. Finally, another problem is the lack of generic programming abstractions in programming languages, which leads to tedious, similar-looking boilerplate code in programs.
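The expression problem mentioned above can be shown in miniature. With an ordinary class hierarchy, adding a new data variant is easy but adding a new operation means editing every class; one common workaround, sketched below with invented names, keeps operations outside the classes in per-type dispatch tables so that both directions of extension stay open:

```python
# Data variants, defined once.
class Lit:
    def __init__(self, n): self.n = n
class Add:
    def __init__(self, l, r): self.l, self.r = l, r

def op(table, e):
    # Dispatch an operation (a per-type table) on a node.
    return table[type(e)](e)

# One operation: evaluation.
EVAL = {
    Lit: lambda e: e.n,
    Add: lambda e: op(EVAL, e.l) + op(EVAL, e.r),
}

# A second operation, added later without touching Lit or Add.
SHOW = {
    Lit: lambda e: str(e.n),
    Add: lambda e: f"({op(SHOW, e.l)} + {op(SHOW, e.r)})",
}

e = Add(Lit(1), Add(Lit(2), Lit(3)))
print(op(EVAL, e))  # 6
print(op(SHOW, e))  # (1 + (2 + 3))
```

New variants still compose too: a `Neg` class plus one entry in each table extends the language without modifying existing code, though unlike the typed solutions the research above pursues, nothing here statically guarantees that every table covers every variant.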