LNgen: Tool support for locally nameless representations (2010)

by B Aydemir, S Weirich

Results 1 - 10 of 16

Contracts Made Manifest

by Michael Greenberg, Benjamin C. Pierce, Stephanie Weirich, 2009
"... Since Findler and Felleisen [2002] introduced higher-order contracts, many variants of their system have been proposed. Broadly, these fall into two groups: some follow Findler and Felleisen in using latent contracts, purely dynamic checks that are transparent to the type system; others use manifest ..."
Abstract - Cited by 24 (4 self) - Add to MetaCart
Since Findler and Felleisen [2002] introduced higher-order contracts, many variants of their system have been proposed. Broadly, these fall into two groups: some follow Findler and Felleisen in using latent contracts, purely dynamic checks that are transparent to the type system; others use manifest contracts, where refinement types record the most recent check that has been applied. These two approaches are generally assumed to be equivalent—different ways of implementing the same idea, one retaining a simple type system, and the other providing more static information. Our goal is to formalize and clarify this folklore understanding. Our work extends that of Gronski and Flanagan [2007], who defined a latent calculus λC and a manifest calculus λH, gave a translation φ from λC to λH, and proved that if a λC term reduces to a constant, then so does its φ-image. We enrich their account with a translation ψ in the opposite direction and prove an analogous theorem for ψ. More importantly, we generalize the whole framework to dependent contracts, where the predicates in contracts can mention variables from the local context. This extension is both pragmatically crucial, supporting a much more interesting range of contracts, and theoretically challenging. We define dependent versions of λC (following Findler and Felleisen’s semantics) and λH, establish type soundness—a challenging result in itself, for λH—and extend φ and ψ accordingly. Interestingly, the intuition that the two systems are equivalent appears to break down here: we show that ψ preserves behavior exactly, but that a natural extension of φ to the dependent case will sometimes yield terms that blame more because of a subtle difference in the treatment of dependent function contracts when the codomain contract itself abuses the argument.
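To make the latent/manifest distinction concrete, here is a rough Haskell sketch of a latent contract as a purely dynamic check that the type system never sees (the names Contract, assertC, and nonNegative are invented for this illustration and are not the paper's λC calculus):

    -- A latent contract is just a runtime predicate on values.
    type Contract a = a -> Bool

    -- Check a contract, "blaming" the given party on failure
    -- (a crude stand-in for the blame labels of higher-order contracts).
    assertC :: String -> Contract a -> a -> a
    assertC party c x
      | c x       = x
      | otherwise = error ("contract violated, blame " ++ party)

    nonNegative :: Contract Int
    nonNegative = (>= 0)

    -- The checked value still has plain type Int; nothing static is recorded.
    safeSqrt :: Int -> Double
    safeSqrt n = sqrt (fromIntegral (assertC "caller" nonNegative n))

In a manifest system such as λH, the outcome of the same check would instead be reflected in a refinement type (roughly {x:Int | x >= 0}), which is what makes the translations φ and ψ interesting.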

Meta-theory à la carte

by Benjamin Delaware, Bruno C. D. S. Oliveira, Tom Schrijvers - In POPL ’13, 2013
"... Formalizing meta-theory, or proofs about programming languages, in a proof assistant has many well-known benefits. However, the considerable effort involved in mechanizing proofs has prevented it from becoming standard practice. This cost can be amortized by reusing as much of an existing formalizat ..."
Abstract - Cited by 13 (3 self) - Add to MetaCart
Formalizing meta-theory, or proofs about programming languages, in a proof assistant has many well-known benefits. However, the considerable effort involved in mechanizing proofs has prevented it from becoming standard practice. This cost can be amortized by reusing as much of an existing formalization as possible when building a new language or extending an existing one. Unfortunately, reuse of components is typically ad-hoc, with the language designer cutting and pasting existing definitions and proofs, and expending considerable effort to patch up the results. This paper presents a more structured approach to the reuse of formalizations of programming language semantics through the composition of modular definitions and proofs. The key contribution is the development of an approach to induction for extensible Church encodings which uses a novel reinterpretation of the universal property of folds. These encodings provide the foundation for a framework, formalized in Coq, which uses type classes to automate the composition of proofs from modular components. Several interesting language features, including binders and general recursion, illustrate the capabilities of our framework. We reuse these features to build fully mechanized definitions and proofs for a number of languages, including a version of mini-ML. Bounded induction enables proofs of properties for non-inductive semantic functions, and mediating type classes enable proof adaptation for more feature-rich languages.
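The framework itself is a Coq development built on extensible Church encodings, but the underlying idea of assembling a language and its semantic functions from independent features can be sketched in Haskell using Swierstra's Data Types à la Carte pattern (a loose analogue only; all names below are illustrative):

    {-# LANGUAGE TypeOperators, DeriveFunctor #-}

    -- Two independent syntax "features".
    data AddF e = Lit Int | Add e e deriving Functor
    data MulF e = Mul e e           deriving Functor

    -- Coproduct of features, and the fixpoint tying a language together.
    data (f :+: g) e = InL (f e) | InR (g e) deriving Functor
    newtype Fix f = In (f (Fix f))

    fold :: Functor f => (f a -> a) -> Fix f -> a
    fold alg (In t) = alg (fmap (fold alg) t)

    -- An evaluator is written once per feature as an algebra...
    class Eval f where evalAlg :: f Int -> Int
    instance Eval AddF where
      evalAlg (Lit n)   = n
      evalAlg (Add x y) = x + y
    instance Eval MulF where
      evalAlg (Mul x y) = x * y

    -- ...and the algebras compose along with the syntax.
    instance (Eval f, Eval g) => Eval (f :+: g) where
      evalAlg (InL x) = evalAlg x
      evalAlg (InR y) = evalAlg y

    eval :: (Functor f, Eval f) => Fix f -> Int
    eval = fold evalAlg

    -- Example: eval ex == 6 for a term built from both features.
    ex :: Fix (AddF :+: MulF)
    ex = In (InR (Mul (In (InL (Lit 2))) (In (InL (Lit 3)))))

The paper's contribution is, roughly, how to carry inductive proofs (not just functions like eval) through this kind of composition in Coq.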

Citation Context

...approach [1]. This involves developing a number of straightforward, but tedious infrastructure lemmas and definitions for each new language. Such tedious infrastructure can be automatically generated [2] or reused from datatype-generic definitions [21]. However this typically requires additional tool support. A higher-order representation like PHOAS [7] avoids most infrastructure definitions. While ...

Binders Unbound

by Stephanie Weirich, Brent A. Yorgey, Tim Sheard - In Proc. of the 16th International Conference on Functional Programming (ICFP), 2011
"... Implementors of compilers, program refactorers, theorem provers, proof checkers, and other systems that manipulate syntax know that dealing with name binding is difficult to do well. Operations such as α-equivalence and capture-avoiding substitution seem simple, yet subtle bugs often go undetected. ..."
Abstract - Cited by 11 (1 self) - Add to MetaCart
Implementors of compilers, program refactorers, theorem provers, proof checkers, and other systems that manipulate syntax know that dealing with name binding is difficult to do well. Operations such as α-equivalence and capture-avoiding substitution seem simple, yet subtle bugs often go undetected. Furthermore, their implementations are tedious, requiring “boilerplate” code that must be updated whenever the object language definition changes. Many researchers have therefore sought to specify binding syntax declaratively, so that tools can correctly handle the details behind the scenes. This idea has been the inspiration for many new systems (such as Beluga, Delphin, FreshML, FreshOCaml, Cαml, FreshLib, and Ott) but there is still room for improvement in expressivity, simplicity and convenience. In this paper, we present a new domain-specific language, UNBOUND, for specifying binding structure. Our language is particularly expressive—it supports multiple atom types, pattern binders, type annotations, recursive binders, and nested binding (necessary for telescopes, a feature found in dependently-typed languages). However, our specification language is also simple, consisting of just five basic combinators. We provide a formal semantics for this language derived from a locally nameless representation and prove that it satisfies a number of desirable properties. We also present an implementation of our binding specification language as a GHC Haskell library implementing an embedded domain-specific language (EDSL). By using Haskell type constructors to represent binding combinators, we implement the EDSL succinctly using datatype-generic programming. Our implementation supports a number of features necessary for practical programming, including flexibility in the treatment of user-defined types, best-effort name preservation (for error messages), and integration with Haskell’s monad transformer library.
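As a taste of the resulting EDSL, here is the usual untyped lambda calculus example written in the style of the Unbound library; the sketch below targets the unbound-generics package, a later GHC.Generics-based reimplementation, and the exact module and class names should be treated as assumptions to check against the library's documentation:

    {-# LANGUAGE DeriveGeneric, DeriveDataTypeable #-}
    import Unbound.Generics.LocallyNameless
    import GHC.Generics (Generic)
    import Data.Typeable (Typeable)

    -- Binding structure is declared with combinators such as Name and Bind.
    data Term
      = Var (Name Term)
      | App Term Term
      | Lam (Bind (Name Term) Term)
      deriving (Show, Generic, Typeable)

    -- Alpha-equivalence, free variables, and substitution come for free.
    instance Alpha Term
    instance Subst Term Term where
      isvar (Var x) = Just (SubstName x)
      isvar _       = Nothing

    -- Capture-avoiding reduction of a top-level beta redex.
    beta :: Term -> Maybe Term
    beta (App (Lam b) arg) = Just . runFreshM $ do
      (x, body) <- unbind b
      return (subst x arg body)
    beta _ = Nothing

Here unbind freshens the bound name before exposing the body, which is what keeps the substitution capture-avoiding.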

Citation Context

...ward is a testament to the elegance of the locally nameless representation. There is already a lot of work to draw on from the metatheory of locally nameless representations in the single binder case [1, 2]; much of the metatheory here can be seen as an extension of that prior work. 5.1 Local closure One important property of the locally nameless representation is that only some terms are good represent...
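For readers unfamiliar with the property referred to here: in a locally nameless term, bound variables are de Bruijn indices while free variables are names, so only "locally closed" terms (no index escaping its binders) represent real object-language terms. A minimal, hand-rolled Haskell sketch of the idea (illustrative only; LNgen itself generates Coq definitions and lemmas):

    -- Locally nameless lambda terms: indices for bound, names for free variables.
    data Term
      = BVar Int        -- bound variable, a de Bruijn index
      | FVar String     -- free variable, a name
      | App Term Term
      | Lam Term        -- the body binds index 0

    -- Open an abstraction body: replace index k by the free name x.
    open :: Int -> String -> Term -> Term
    open k x (BVar i) | i == k = FVar x
    open _ _ t@(BVar _)        = t
    open _ _ t@(FVar _)        = t
    open k x (App t u)         = App (open k x t) (open k x u)
    open k x (Lam t)           = Lam (open (k + 1) x t)

    -- Local closure: every index is captured by an enclosing Lam.
    -- (Paper developments usually state this as an inductive predicate
    -- defined via open; a depth counter is the same check operationally.)
    lc :: Term -> Bool
    lc = go 0
      where
        go d (BVar i)  = i < d
        go _ (FVar _)  = True
        go d (App t u) = go d t && go d u
        go d (Lam t)   = go (d + 1) t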

Dependent types and program equivalence

by Limin Jia, Jianzhou Zhao, Vilhelm Sjöberg, Stephanie Weirich - In Proceedings of the 37th ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages (POPL). ACM, 2009
"... The definition of type equivalence is one of the most important design issues for any typed language. In dependentlytyped languages, because terms appear in types, this definition must rely on a definition of term equivalence. In that case, decidability of type checking requires decidability for the ..."
Abstract - Cited by 8 (3 self) - Add to MetaCart
The definition of type equivalence is one of the most important design issues for any typed language. In dependently-typed languages, because terms appear in types, this definition must rely on a definition of term equivalence. In that case, decidability of type checking requires decidability for the term equivalence relation. Almost all dependently-typed languages require this relation to be decidable. Some, such as Coq, Epigram or Agda, do so by employing analyses to force all programs to terminate. Conversely, others, such as DML, ATS, Ωmega, or Haskell, allow nonterminating computation, but do not allow those terms to appear in types. Instead, they identify a terminating index language and use singleton types to connect indices to computation. In both cases, decidable type checking comes at a cost, in terms of complexity and expressiveness. Conversely, the benefits to be gained by decidable type checking are modest. Termination analyses allow dependently typed programs to verify total correctness properties. However, decidable type checking is not a prerequisite for type safety. Furthermore, decidability does not imply tractability. A decidable approximation of program equivalence may not be useful in practice. Therefore, we take a different approach: instead of a fixed notion for term equivalence, we parameterize our type system with an abstract relation that is not necessarily decidable. We then design a novel set of typing rules that require only weak properties of this abstract relation in the proof of the preservation and progress lemmas. This design provides flexibility: we compare valid instantiations of term equivalence which range from beta-equivalence, to contextual equivalence, to some exotic equivalences.
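The parameterization described above shows up most directly in the conversion rule, which in a schematic rendering (ours, not the paper's exact rule) simply appeals to an abstract equivalence ≅ between types induced by the chosen term equivalence:

    \frac{\Gamma \vdash e : A \qquad A \cong B \qquad \Gamma \vdash B}
         {\Gamma \vdash e : B}
    \quad (\textsc{Conv})

Preservation and progress are then proved from a few weak assumptions about ≅, which is why β-equivalence, contextual equivalence, or even undecidable relations can be plugged in.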

GMeta: A Generic Formal Metatheory Framework for First-Order Representations

by Gyesik Lee, Bruno C. D. S. Oliveira, Sungkeun Cho, Kwangkeun Yi
"... This paper presents GMeta: a generic framework for first-order representations of variable binding that provides once and for all many of the so-called infrastructure lemmas and definitions required in mechanizations of formal metatheory. The key idea is to employ datatype-generic programming (DGP) ..."
Abstract - Cited by 2 (1 self) - Add to MetaCart
This paper presents GMeta: a generic framework for first-order representations of variable binding that provides once and for all many of the so-called infrastructure lemmas and definitions required in mechanizations of formal metatheory. The key idea is to employ datatype-generic programming (DGP) and modular programming techniques to deal with the infrastructure overhead. Using a generic universe for representing a large family of object languages we define datatype-generic libraries of infrastructure for first-order representations such as locally nameless or de Bruijn indices. Modules are used to provide templates: a convenient interface between the datatype-generic libraries and the end users of GMeta. We conducted case studies based on the POPLmark challenge, and showed that dealing with challenging binding constructs, like the ones found in System F<:, is possible with GMeta. All of GMeta’s generic infrastructure is implemented in the Coq theorem prover. Furthermore, due to GMeta’s modular design, the libraries can be easily used, extended, and customized by users.
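To give a flavor of the "generic universe" idea in a few lines (a toy Haskell analogue; GMeta's actual universe is a Coq development, is considerably richer, and also tracks binding positions), object-language syntax can be described by codes, and infrastructure can then be written once over every coded language:

    {-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

    -- A toy universe of codes for object-language syntax.
    data Code = VAR | REC | SUM Code Code | PROD Code Code

    -- One "layer" of syntax described by a code; r stands for subterms.
    data Layer (c :: Code) r where
      V   :: Int                    -> Layer 'VAR r          -- de Bruijn variable
      R   :: r                      -> Layer 'REC r          -- recursive position
      InL :: Layer a r              -> Layer ('SUM a b) r
      InR :: Layer b r              -> Layer ('SUM a b) r
      P   :: Layer a r -> Layer b r -> Layer ('PROD a b) r

    -- Tie the knot: terms of the language described by code c.
    newtype Syn c = Roll (Layer c (Syn c))

    -- Example: untyped lambda terms are  var + (app: term * term) + (lam: term).
    type LamCode = 'SUM 'VAR ('SUM ('PROD 'REC 'REC) 'REC)

    -- Infrastructure written once for the whole universe,
    -- e.g. counting variable occurrences in any coded language.
    vars :: Syn c -> Int
    vars (Roll layer) = go layer
      where
        go :: Layer c' (Syn c'') -> Int
        go (V _)   = 1
        go (R t)   = vars t
        go (InL x) = go x
        go (InR y) = go y
        go (P x y) = go x + go y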

Citation Context

...es not need to know how to prove many basic infrastructure lemmas, since those are provided by GMeta’s libraries. Finally, we should mention that one advantage of generative approaches such as LNgen (Aydemir and Weirich 2009) is that the cost-of-entry, in terms of using the lemmas and definitions provided by LNgen, is a bit lower than in GMeta. This is because the generated infrastructure is directly defined in terms of ...

A Generic Formal Metatheory Framework for First-Order Representations

by Gyesik Lee, Bruno C. D. S. Oliveira, Sungkeun Cho, Kwangkeun Yi
"... This paper presents GMETA: a generic framework for first-order representations of variable binding that provides once and for all many of the so-called infrastructure lemmas and definitions required in mechanizations of formal metatheory. The framework employs datatype-generic programming and modula ..."
Abstract - Cited by 1 (0 self) - Add to MetaCart
This paper presents GMETA: a generic framework for first-order representations of variable binding that provides once and for all many of the so-called infrastructure lemmas and definitions required in mechanizations of formal metatheory. The framework employs datatype-generic programming and modular programming techniques to provide a universe representing a family of datatypes. This universe is generic in two different ways: it is language-generic in the sense that several object languages can be represented within the universe; and it is representation-generic, meaning that it is parameterizable over the particular choice of first-order representations for binders (for example, locally nameless or de Bruijn). Using this universe, several libraries providing generic infrastructure lemmas and definitions are implemented. These libraries are used in case studies based on the POPLmark challenge, showing that dealing with challenging binding constructs, like the ones found in System F<:, is possible with GMETA. All of GMETA’s generic infrastructure is implemented in the Coq theorem prover, ensuring the soundness of that infrastructure. Furthermore, due to GMETA’s modular design, the libraries can be easily used, extended and customized by end users.

Citation Context

...GMETA provides much of the tedious infrastructure boilerplate that would constitute a large part of the whole development otherwise. Closest to our work are generative approaches like LNgen (Aydemir and Weirich 2009), which uses an external tool, based on Ott (Sewell et al. 2010) specifications, to generate the infrastructure lemmas and definitions for a particular language automatically. Generative approaches h...

Verified Programs with Binders

by Martin Clochard, Claude Marché, Andrei Paskevich, 2014
"... ..."
Abstract - Add to MetaCart
Abstract not found

CS Study Group FY07 Phase 2 Machine-Checked Metatheory for Security-Oriented Languages

by PIs Stephanie Weirich, Steve Zdancewic
"... DTIC ® has determined on >£<. / c%$/£) that this Technical Document has the Distribution Statement checked below. The current distribution for this document can ..."
Abstract - Add to MetaCart
DTIC ® has determined on &gt;£&lt;. / c%$/£) that this Technical Document has the Distribution Statement checked below. The current distribution for this document can

Citation Context

...eir statements and proofs follow directly from the syntax of the language and thus are uninteresting artifacts of our methodology. The paper "LNgen: Tool Support for Locally Nameless Representations" [5] describes our tool in detail. LNgen uses the same input language as Ott [49], a tool for translating language specifications written in an intuitive syntax into output for LaTeX and proof assistants....

Experience report: Mechanizing Core Fzip using the locally nameless approach (extended abstract)

by Benoît Montagu
"... For a couple of years, much effort has been put in the development of techniques that ease the mechanization of proofs involving binders. We report such a mechanized development of metatheory, the type soundness of Core F � [3], by a non expert user of Coq [2], using the locally nameless representat ..."
Abstract - Add to MetaCart
For a couple of years, much effort has been put into the development of techniques that ease the mechanization of proofs involving binders. We report such a mechanized development of metatheory, the type soundness of Core Fzip [3], by a non-expert user of Coq [2], using the locally nameless representation of binders and cofinite quantification, with the help of the tools LNgen [1] and Ott [4].
1. Fzip and its formal proof in a nutshell
Core Fzip (F-zip) is a variant of System F that allows for more freedom in the structure of programs that make use of existential types, by considering existentials with an open scope. It is equipped with a small-step reduction semantics and a sound type system. The paper proof is neither very informative, nor very difficult, and consists of the subject reduction and the progress properties. The mechanized proof was carried out in about one month by the author, who is not an expert user of Coq. It makes use of LNgen [1] and the experimental locally nameless backend of Ott [4] to reduce the burden of the locally nameless encoding and its infrastructure lemmas. The only complex automation we used is the one provided by the Metatheory library from UPenn, which was of great help. Thus, a clean up of the development as well as clever tactics could certainly reduce the size of the whole proof. Much time is spent in proof search, so that Coq compiles it in about 45 minutes on a recent computer, while type checking takes just a few minutes.
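For readers who have not met the term: "cofinite quantification" states binder rules for all names x outside some finite set L, rather than for one particular fresh name, which makes the induction hypotheses strong enough without renaming lemmas. Schematically (our rendering in the style of Aydemir et al.'s "Engineering Formal Metatheory", not a rule taken from this report), the typing rule for abstraction reads:

    \frac{\forall x \notin L, \quad \Gamma, x : T_1 \vdash t^{x} : T_2}
         {\Gamma \vdash (\lambda{:}T_1.\ t) : T_1 \to T_2}

where t^x denotes opening the body t by replacing bound index 0 with the free name x; LNgen generates such opening functions and the associated infrastructure lemmas.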

Citation Context

... of metatheory, the type soundness of Core Fzip [3], by a non-expert user of Coq [2], using the locally nameless representation of binders and cofinite quantification, with the help of the tools LNgen [1] and Ott [4]. 1. Fzip and its formal proof in a nutshell Core Fzip (F-zip) is a variant of System F that allows for more freedom in the structure of programs that make use of existential types, by consi...

Union, Intersection, and Refinement Types and Reasoning About Type Disjointness for Secure Protocol Implementations

by Michael Backes, Catalin Hritcu, Matteo Maffei, 2012
"... We present a new type system for verifying the security of cryptographic protocol implementations. The type system combines prior work on refinement types, with union, intersection, and polymorphic types, and with the novel ability to reason statically about the disjointness of types. The increased ..."
Abstract - Add to MetaCart
We present a new type system for verifying the security of cryptographic protocol implementations. The type system combines prior work on refinement types, with union, intersection, and polymorphic types, and with the novel ability to reason statically about the disjointness of types. The increased expressivity enables the analysis of important protocol classes that were previously out of scope for the type-based analyses of protocol implementations. In particular, our types can statically characterize: (i) more usages of asymmetric cryptography, such as signatures of private data and encryptions of authenticated data; (ii) authenticity and integrity properties achieved by showing knowledge of secret data; (iii) applications based on zero-knowledge proofs. The type system comes with a mechanized proof of correctness and an efficient type-checker.

Citation Context

...e definitions from a 1kLOC long Ott specification, but for the more complex rules we often needed to patch the output of Ott. We used LNgen [16] to generate an additional 25kLOC of infrastructure lemmas, which proved invaluable when working with the locally nameless representation. During the formalization we found and fixed three relatively ...
