Results 1–10 of 13
Contracts Made Manifest
, 2009
Abstract

Cited by 19 (3 self)
Since Findler and Felleisen [2002] introduced higher-order contracts, many variants of their system have been proposed. Broadly, these fall into two groups: some follow Findler and Felleisen in using latent contracts, purely dynamic checks that are transparent to the type system; others use manifest contracts, where refinement types record the most recent check that has been applied. These two approaches are generally assumed to be equivalent—different ways of implementing the same idea, one retaining a simple type system, and the other providing more static information. Our goal is to formalize and clarify this folklore understanding. Our work extends that of Gronski and Flanagan [2007], who defined a latent calculus λC and a manifest calculus λH, gave a translation φ from λC to λH, and proved that if a λC term reduces to a constant, then so does its φ-image. We enrich their account with a translation ψ in the opposite direction and prove an analogous theorem for ψ. More importantly, we generalize the whole framework to dependent contracts, where the predicates in contracts can mention variables from the local context. This extension is both pragmatically crucial, supporting a much more interesting range of contracts, and theoretically challenging. We define dependent versions of λC (following Findler and Felleisen’s semantics) and λH, establish type soundness—a challenging result in itself, for λH—and extend φ and ψ accordingly. Interestingly, the intuition that the two systems are equivalent appears to break down here: we show that ψ preserves behavior exactly, but that a natural extension of φ to the dependent case will sometimes yield terms that blame more because of a subtle difference in the treatment of dependent function contracts when the codomain contract itself abuses the argument.
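The latent-contract idea summarized above can be illustrated with a minimal sketch (Python here, not the paper's λC calculus; `flat`, `fun`, `Blame`, and the blame strings are all names invented for this example): a flat contract checks a predicate immediately, while a function contract defers its checks to each application, blaming the caller when the argument check fails.

```python
# A sketch of latent (purely dynamic) contracts in the style of
# Findler and Felleisen. All names here are illustrative assumptions,
# not the paper's notation.

class Blame(Exception):
    """Raised when a contract check fails; carries the blamed party."""

def flat(pred):
    """A flat contract: check `pred` on the value right away."""
    def monitor(value, who):
        if pred(value):
            return value
        raise Blame(who)
    return monitor

def fun(dom, cod):
    """A function contract: check the argument against `dom` (blaming
    the caller) and the result against `cod` (blaming the function)."""
    def monitor(f, who):
        def wrapped(x):
            return cod(f(dom(x, "caller of " + who)), who)
        return wrapped
    return monitor

# Usage: wrap a division function so that a zero argument blames the
# caller and a non-positive result blames the function itself.
nonzero = flat(lambda n: n != 0)
positive = flat(lambda n: n > 0)
checked_div = fun(nonzero, positive)(lambda n: 100 // n, "div")
print(checked_div(4))  # passes both checks -> 25
```

Manifest contracts would instead record such checks in refinement types; this sketch shows only the latent side, where the type system never sees the checks.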
Metatheory à la carte
 In POPL ’13
, 2013
Abstract

Cited by 14 (4 self)
Formalizing metatheory, or proofs about programming languages, in a proof assistant has many well-known benefits. However, the considerable effort involved in mechanizing proofs has prevented it from becoming standard practice. This cost can be amortized by reusing as much of an existing formalization as possible when building a new language or extending an existing one. Unfortunately, reuse of components is typically ad hoc, with the language designer cutting and pasting existing definitions and proofs, and expending considerable effort to patch up the results. This paper presents a more structured approach to the reuse of formalizations of programming language semantics through the composition of modular definitions and proofs. The key contribution is the development of an approach to induction for extensible Church encodings which uses a novel reinterpretation of the universal property of folds. These encodings provide the foundation for a framework, formalized in Coq, which uses type classes to automate the composition of proofs from modular components. Several interesting language features, including binders and general recursion, illustrate the capabilities of our framework. We reuse these features to build fully mechanized definitions and proofs for a number of languages, including a version of mini-ML. Bounded induction enables proofs of properties for non-inductive semantic functions, and mediating type classes enable proof adaptation for more feature-rich languages.
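The "datatypes as folds" idea behind extensible Church encodings can be sketched in a few lines (Python here, not the paper's Coq; `lit`, `add`, and the algebra dictionaries are names invented for this example): a term is represented as its own fold, so independently defined interpretations compose without modifying existing definitions.

```python
# Church-encoding sketch: a term is a function that, given one handler
# per constructor (an "algebra"), folds itself. All names are invented
# for illustration; the paper works with Coq, not Python.

def lit(n):
    return lambda alg: alg["lit"](n)

def add(left, right):
    return lambda alg: alg["add"](left(alg), right(alg))

# Two independent interpretations of the same terms. A new
# interpretation (or, by merging handler dictionaries, a new language
# feature) requires no change to `lit` or `add`.
eval_alg = {"lit": lambda n: n, "add": lambda a, b: a + b}
show_alg = {"lit": str, "add": lambda a, b: f"({a} + {b})"}

term = add(lit(1), add(lit(2), lit(3)))
print(term(eval_alg))  # 6
print(term(show_alg))  # (1 + (2 + 3))
```

The paper's contribution is precisely what this sketch lacks: a principled induction scheme for such encodings, so that proofs, not just functions, compose modularly.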
Binders Unbound
 In Proc. of the 16th International Conference on Functional Programming (ICFP)
, 2011
Abstract

Cited by 10 (1 self)
Implementors of compilers, program refactorers, theorem provers, proof checkers, and other systems that manipulate syntax know that dealing with name binding is difficult to do well. Operations such as α-equivalence and capture-avoiding substitution seem simple, yet subtle bugs often go undetected. Furthermore, their implementations are tedious, requiring “boilerplate” code that must be updated whenever the object language definition changes. Many researchers have therefore sought to specify binding syntax declaratively, so that tools can correctly handle the details behind the scenes. This idea has been the inspiration for many new systems (such as Beluga, Delphin, FreshML, FreshOCaml, Cαml, FreshLib, and Ott), but there is still room for improvement in expressivity, simplicity, and convenience. In this paper, we present a new domain-specific language, UNBOUND, for specifying binding structure. Our language is particularly expressive—it supports multiple atom types, pattern binders, type annotations, recursive binders, and nested binding (necessary for telescopes, a feature found in dependently-typed languages). However, our specification language is also simple, consisting of just five basic combinators. We provide a formal semantics for this language derived from a locally nameless representation and prove that it satisfies a number of desirable properties. We also present an implementation of our binding specification language as a GHC Haskell library implementing an embedded domain-specific language (EDSL). By using Haskell type constructors to represent binding combinators, we implement the EDSL succinctly using datatype-generic programming. Our implementation supports a number of features necessary for practical programming, including flexibility in the treatment of user-defined types, best-effort name preservation (for error messages), and integration with Haskell’s monad transformer library.
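The boilerplate this abstract describes can be made concrete with a small sketch (plain Python, not the UNBOUND API; `to_db` and `alpha_eq` are names invented here): α-equivalence becomes structural equality once bound names are replaced by de Bruijn indices, which is exactly the kind of code a binding-specification language generates for you.

```python
# Lambda terms as tuples: ("var", name) | ("app", f, a) | ("lam", name, body).
# All names here are invented for illustration.

def to_db(term, env=()):
    """Replace each bound name by its distance to its binder (innermost = 0);
    free names are kept as-is."""
    kind = term[0]
    if kind == "var":
        name = term[1]
        return ("bound", env.index(name)) if name in env else ("free", name)
    if kind == "app":
        return ("app", to_db(term[1], env), to_db(term[2], env))
    if kind == "lam":
        return ("lam", to_db(term[2], (term[1],) + env))
    raise ValueError(f"unknown term: {term!r}")

def alpha_eq(s, t):
    """Alpha-equivalence: bound names vanish, so tuple equality suffices."""
    return to_db(s) == to_db(t)

# \x. x and \y. y are alpha-equivalent.
print(alpha_eq(("lam", "x", ("var", "x")), ("lam", "y", ("var", "y"))))  # True
```

The subtle part is not this happy path but substitution, nested patterns, and telescopes; those are where hand-written versions of this code typically hide bugs.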
Dependent types and program equivalence
 In Proceedings of the 37th ACM SIGACT-SIGPLAN Symposium on Principles of Programming Languages (POPL). ACM
, 2009
Abstract

Cited by 7 (2 self)
The definition of type equivalence is one of the most important design issues for any typed language. In dependently-typed languages, because terms appear in types, this definition must rely on a definition of term equivalence. In that case, decidability of type checking requires decidability for the term equivalence relation. Almost all dependently-typed languages require this relation to be decidable. Some, such as Coq, Epigram, or Agda, do so by employing analyses to force all programs to terminate. Conversely, others, such as DML, ATS, Ωmega, or Haskell, allow non-terminating computation, but do not allow those terms to appear in types. Instead, they identify a terminating index language and use singleton types to connect indices to computation. In both cases, decidable type checking comes at a cost, in terms of complexity and expressiveness. Conversely, the benefits to be gained by decidable type checking are modest. Termination analyses allow dependently-typed programs to verify total correctness properties. However, decidable type checking is not a prerequisite for type safety. Furthermore, decidability does not imply tractability. A decidable approximation of program equivalence may not be useful in practice. Therefore, we take a different approach: instead of a fixed notion of term equivalence, we parameterize our type system with an abstract relation that is not necessarily decidable. We then design a novel set of typing rules that require only weak properties of this abstract relation in the proofs of the preservation and progress lemmas. This design provides flexibility: we compare valid instantiations of term equivalence which range from beta-equivalence, to contextual equivalence, to some exotic equivalences.
GMeta: A Generic Formal Metatheory Framework for FirstOrder Representations
Abstract

Cited by 2 (1 self)
This paper presents GMeta: a generic framework for first-order representations of variable binding that provides once and for all many of the so-called infrastructure lemmas and definitions required in mechanizations of formal metatheory. The key idea is to employ datatype-generic programming (DGP) and modular programming techniques to deal with the infrastructure overhead. Using a generic universe for representing a large family of object languages, we define datatype-generic libraries of infrastructure for first-order representations such as locally nameless or de Bruijn indices. Modules are used to provide templates: a convenient interface between the datatype-generic libraries and the end users of GMeta. We conducted case studies based on the POPLmark challenge, and showed that dealing with challenging binding constructs, like the ones found in System F<:, is possible with GMeta. All of GMeta’s generic infrastructure is implemented in the Coq theorem prover. Furthermore, due to GMeta’s modular design, the libraries can be easily used, extended, and customized by users.
A Generic Formal Metatheory Framework for FirstOrder Representations
Abstract

Cited by 1 (0 self)
This paper presents GMETA: a generic framework for first-order representations of variable binding that provides once and for all many of the so-called infrastructure lemmas and definitions required in mechanizations of formal metatheory. The framework employs datatype-generic programming and modular programming techniques to provide a universe representing a family of datatypes. This universe is generic in two different ways: it is language-generic in the sense that several object languages can be represented within the universe; and it is representation-generic, meaning that it is parameterizable over the particular choice of first-order representations for binders (for example, locally nameless or de Bruijn). Using this universe, several libraries providing generic infrastructure lemmas and definitions are implemented. These libraries are used in case studies based on the POPLmark challenge, showing that dealing with challenging binding constructs, like the ones found in System F<:, is possible with GMETA. All of GMETA’s generic infrastructure is implemented in the Coq theorem prover, ensuring the soundness of that infrastructure. Furthermore, due to GMETA’s modular design, the libraries can be easily used, extended, and customized by end users.
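The "language-generic" claim can be illustrated with a toy sketch (Python rather than GMETA's Coq universe; `gsize`, `expr_to_g`, and `lam_to_g` are names invented for this example): once several object languages are mapped into one generic representation, a single piece of infrastructure, here a term-size function, serves them all.

```python
# Generic representation: a term is a (tag, [children]) pair.
# Everything here is an illustrative sketch, not GMETA's actual universe.

def gsize(g):
    """Generic size: works for any term encoded as (tag, [children])."""
    tag, children = g
    return 1 + sum(gsize(c) for c in children)

# Object language 1: arithmetic expressions.
def expr_to_g(e):
    if e[0] == "lit":
        return (f"lit {e[1]}", [])
    if e[0] == "add":
        return ("add", [expr_to_g(e[1]), expr_to_g(e[2])])
    raise ValueError(e)

# Object language 2: lambda terms.
def lam_to_g(t):
    if t[0] == "var":
        return (t[1], [])
    if t[0] == "app":
        return ("app", [lam_to_g(t[1]), lam_to_g(t[2])])
    if t[0] == "lam":
        return (f"lam {t[1]}", [lam_to_g(t[2])])
    raise ValueError(t)

print(gsize(expr_to_g(("add", ("lit", 1), ("lit", 2)))))  # 3
print(gsize(lam_to_g(("lam", "x", ("var", "x")))))        # 2
```

GMETA does the same at the level of lemmas and proofs: substitution and well-formedness infrastructure is written once against the universe and instantiated per object language and per binder representation.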
OF THE UNIVERSITY OF MINNESOTA BY
Abstract
Many people have supported me during the development of this thesis and I owe them all a debt of gratitude. Firstly, I would like to thank my advisor Gopalan Nadathur for his patience and guidance which have played a significant part in my development as a researcher. His willingness to share his opinions on everything from academic life to playing squash has helped me to develop a perspective and to have fun while doing this. I look forward to continuing my interactions with him far into the future. I am grateful to Dale Miller for sharing with me an excitement for research and an appreciation of the uncertainty that precedes understanding. I have never met anybody else who so enjoys when things seem amiss, because he knows that a new perspective will eventually emerge and bring clarity. This thesis has been heavily influenced by the time I have spent working with Alwen Tiu, David Baelde, Zach Snow, and Xiaochu Qi. Understanding their work has given me a deeper understanding of my own research and its role in the bigger picture. I am thankful for the time I have had with each and every one of them.
The Gourmet and Other Stories of Modern China
Abstract
I know you have faith in me; let me take it with me on the road before me and I will be able to go on forever. “Tang Qiaodi”
Union, Intersection, and Refinement Types and Reasoning About Type Disjointness for Secure Protocol Implementations
, 2012
Abstract
We present a new type system for verifying the security of cryptographic protocol implementations. The type system combines prior work on refinement types with union, intersection, and polymorphic types, and adds the novel ability to reason statically about the disjointness of types. The increased expressivity enables the analysis of important protocol classes that were previously out of scope for the type-based analyses of protocol implementations. In particular, our types can statically characterize: (i) more usages of asymmetric cryptography, such as signatures of private data and encryptions of authenticated data; (ii) authenticity and integrity properties achieved by showing knowledge of secret data; (iii) applications based on zero-knowledge proofs. The type system comes with a mechanized proof of correctness and an efficient typechecker.
tion]: Logics and Meanings of Programs—Specifying and Verify
Abstract
Programs that treat datatypes with binders, such as theorem provers or higher-order compilers, are regularly used for mission-critical purposes, and must be both reliable and performant. Formally proving such programs with as much automation as possible is highly desirable. In this paper, we propose a generic approach to handling datatypes with binders, both in the program and in its specification, in a way that facilitates automated reasoning about such datatypes and also leads to reasonably efficient code. Our method is implemented in the Why3 environment for program verification. We validate it on the examples of a lambda interpreter with several reduction strategies and a simple tableaux-based theorem prover.