Results 11-20 of 216
The Theory of LEGO: A Proof Checker for the Extended Calculus of Constructions
1994
"... LEGO is a computer program for interactive typechecking in the Extended Calculus of Constructions and two of its subsystems. LEGO also supports the extension of these three systems with inductive types. These type systems can be viewed as logics, and as meta languages for expressing logics, and LEGO ..."
Abstract

Cited by 68 (10 self)
 Add to MetaCart
LEGO is a computer program for interactive typechecking in the Extended Calculus of Constructions and two of its subsystems. LEGO also supports the extension of these three systems with inductive types. These type systems can be viewed as logics, and as metalanguages for expressing logics, and LEGO is intended to be used for interactively constructing proofs in mathematical theories presented in these logics. I have developed LEGO over six years, starting from an implementation of the Calculus of Constructions by Gérard Huet. LEGO has been used for problems at the limits of our abilities to do formal mathematics. In this thesis I explain some aspects of the metatheory of LEGO's type systems, leading to a machine-checked proof that typechecking is decidable for all three type theories supported by LEGO, and to a verified algorithm for deciding their typing judgements, assuming only that they are normalizing. In order to do this, the theory of Pure Type Systems (PTS) is extended and f...
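The flavor of the decidability result the abstract describes can be seen concretely in a much smaller setting. The following is our own illustrative sketch in Python of a terminating typechecking algorithm for the simply-typed lambda calculus only; it is far weaker than the Extended Calculus of Constructions that LEGO handles, and the tuple encoding of terms and types is an assumption of ours, not the thesis's.

```python
# A toy decidable typechecker for the simply-typed lambda calculus.
# Encoding (ours, for illustration):
#   terms: ('var', x) | ('lam', x, ty, body) | ('app', f, a)
#   types: 'base' | ('arrow', dom, cod)

def typecheck(term, env=None):
    """Return the type of `term`, or raise TypeError.
    Termination is immediate: every recursive call is on a strict subterm,
    which is the kernel of a decidability argument for typechecking."""
    env = env or {}
    tag = term[0]
    if tag == 'var':
        return env[term[1]]
    if tag == 'lam':
        _, x, ty, body = term
        return ('arrow', ty, typecheck(body, {**env, x: ty}))
    if tag == 'app':
        _, f, a = term
        ft = typecheck(f, env)
        if ft[0] != 'arrow' or ft[1] != typecheck(a, env):
            raise TypeError("ill-typed application")
        return ft[2]
    raise TypeError(f"unknown term: {term!r}")

ident = ('lam', 'x', 'base', ('var', 'x'))
print(typecheck(ident))  # ('arrow', 'base', 'base')
```

Self-application of `ident` fails the check, since the argument's arrow type does not match the expected domain `'base'`.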
A Typed Pattern Calculus
ACM Trans. Program. Lang. Syst., 1996
"... The theory of programming with patternmatching function definitions has been studied mainly in the framework of firstorder rewrite systems. We present a typed functional calculus that emphasizes the strong connection between the structure of whole pattern definitions and their types. In this calcu ..."
Abstract

Cited by 66 (15 self)
 Add to MetaCart
The theory of programming with pattern-matching function definitions has been studied mainly in the framework of first-order rewrite systems. We present a typed functional calculus that emphasizes the strong connection between the structure of whole pattern definitions and their types. In this calculus, typechecking guarantees the absence of run-time errors caused by non-exhaustive pattern-matching definitions. Its operational semantics is deterministic in a natural way, without the imposition of ad hoc solutions such as clause order or "best fit". In the spirit of the Curry-Howard isomorphism, we design the calculus as a computational interpretation of the Gentzen sequent proofs for intuitionistic propositional logic. We prove the basic properties connecting typing and evaluation: subject reduction and strong normalization. We believe that this calculus offers a rational reconstruction of the pattern-matching features found in successful functional languages.
Translucent Sums: A Foundation for Higher-Order Module Systems
1997
"... The ease of understanding, maintaining, and developing a large program depends crucially on how it is divided up into modules. The possible ways a program can be divided are constrained by the available modular programming facilities ("module system") of the programming language being used. Experien ..."
Abstract

Cited by 58 (0 self)
 Add to MetaCart
The ease of understanding, maintaining, and developing a large program depends crucially on how it is divided up into modules. The possible ways a program can be divided are constrained by the available modular programming facilities ("module system") of the programming language being used. Experience with the Standard ML module system has shown the usefulness of functions mapping modules to modules, and of modules with module subcomponents. For example, functions over modules permit abstract data types (ADTs) to be parameterized by other ADTs, and submodules permit modules to be organized hierarchically. Module systems with such facilities are called higher-order, by analogy with higher-order functions. Previous higher-order module systems can be classified as either opaque or transparent. Opaque systems totally obscure information about the identity of type components of modules, often resulting in overly abstract types. This loss of type identities precludes most interesting uses of hi...
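The abstract's "functions mapping modules to modules" can be mimicked, loosely, in any language with first-class values. A sketch of the idea in Python (our informal analogy only; it captures none of the type-theoretic content of translucent sums or Standard ML functors, and the `Ord`/`MakeSortedList` names are ours):

```python
# A "module" as a value bundling operations, and a "functor" as an ordinary
# function from modules to modules: here, an ordering module is mapped to a
# sorted-container module parameterized by that ordering.

from dataclasses import dataclass

@dataclass(frozen=True)
class Ord:
    """A tiny 'module': just a less-or-equal operation."""
    leq: callable

def MakeSortedList(O: Ord):
    """The 'functor': builds a sorted-list class over the given ordering."""
    class SortedList:
        def __init__(self):
            self.items = []
        def insert(self, x):
            # Insertion sort step, driven by the ordering module O.
            i = 0
            while i < len(self.items) and O.leq(self.items[i], x):
                i += 1
            self.items.insert(i, x)
    return SortedList

IntOrd = Ord(leq=lambda a, b: a <= b)
RevOrd = Ord(leq=lambda a, b: b <= a)

S = MakeSortedList(IntOrd)()
for x in [3, 1, 2]:
    S.insert(x)
print(S.items)  # [1, 2, 3]
```

Applying the same "functor" to `RevOrd` yields a module sorting in the opposite order, which is the parameterization-of-ADTs-by-ADTs pattern the abstract mentions.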
Specifying and Implementing Theorem Provers in a Higher-Order Logic Programming Language
1989
"... We argue that a logic programming language with a higherorder intuitionistic logic as its foundation can be used both to naturally specify and implement theorem provers. The language extends traditional logic programming languages by replacing firstorder terms with simplytyped λterms, replacing ..."
Abstract

Cited by 46 (7 self)
 Add to MetaCart
We argue that a logic programming language with a higher-order intuitionistic logic as its foundation can be used both to naturally specify and to implement theorem provers. The language extends traditional logic programming languages by replacing first-order terms with simply-typed λ-terms, replacing first-order unification with higher-order unification, and allowing implication and universal quantification in queries and the bodies of clauses. Inference rules for a variety of proof systems can be naturally specified in this language. The higher-order features of the language contribute to a concise specification of provisos concerning variable occurrences in formulas and the discharge of assumptions present in many proof systems. In addition, abstraction in meta-terms allows the construction of terms representing object-level proofs which capture the notions of abstraction found in many proof systems. The operational interpretations of the connectives of the language provide a set of basic search operations which describe goal-directed search for proofs. To emphasize the generality of the metalanguage, we compare it to another general specification language: the Logical Framework (LF). We describe a translation which compiles a specification of a logic in LF to a set of formulas of our metalanguage, and ...
A lambda calculus for quantum computation with classical control
In Proceedings of the 7th International Conference on Typed Lambda Calculi and Applications (TLCA), volume 3461 of Lecture Notes in Computer Science, 2005
"... ..."
Modeling an algebraic stepper
Proceedings of the 10th European Symposium on Programming, volume 2028 of Lecture Notes in Computer Science, 2001
"... Abstract. Programmers rely on the correctness of the tools in their programming environments. In the past, semanticists have studied the correctness of compilers and compiler analyses, which are the most important tools. In this paper, we make the case that other tools, such as debuggers and stepper ..."
Abstract

Cited by 43 (17 self)
 Add to MetaCart
Abstract. Programmers rely on the correctness of the tools in their programming environments. In the past, semanticists have studied the correctness of compilers and compiler analyses, which are the most important tools. In this paper, we make the case that other tools, such as debuggers and steppers, deserve semantic models too, and that using these models can help in developing these tools. Our concrete starting point is the algebraic stepper in DrScheme, our Scheme programming environment. The algebraic stepper explains a Scheme computation in terms of an algebraic rewriting of the program text. A program is rewritten until it is in a canonical form (if it has one). The canonical form is the final result. The stepper operates within the existing evaluator, by placing breakpoints and by reconstructing source expressions from source information placed on the stack. This approach raises two questions. First, do the run-time breakpoints correspond to the steps of the reduction semantics? Second, does the debugging mechanism insert enough information to reconstruct source expressions? To answer these questions, we develop a high-level semantic model of the extended compiler and run-time machinery. Rather than modeling the evaluation as a low-level machine, we model the relevant low-level features of the stepper's implementation in a high-level reduction semantics. We expect the approach to apply to other semantics-based tools.
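The "algebraic rewriting until canonical form" that the stepper presents can be illustrated on a toy scale. A sketch in Python (ours, not DrScheme's implementation; the tuple encoding of arithmetic expressions and the leftmost reduction strategy are our simplifying assumptions):

```python
# A toy algebraic stepper: rewrite one leftmost arithmetic redex at a time,
# recording each intermediate program text, until the expression reaches
# canonical form (a plain number) -- the final result, as in the abstract.

def step(e):
    """One leftmost reduction step; return None if e is already canonical."""
    if isinstance(e, int):
        return None
    op, l, r = e
    if not isinstance(l, int):
        return (op, step(l), r)       # reduce inside the left operand
    if not isinstance(r, int):
        return (op, l, step(r))       # then inside the right operand
    return l + r if op == '+' else l * r  # a redex: both operands canonical

def show(e):
    """Render an expression in Scheme-like prefix notation."""
    return str(e) if isinstance(e, int) else f"({e[0]} {show(e[1])} {show(e[2])})"

def trace(e):
    """The sequence of program texts the stepper would display."""
    out = []
    while e is not None:
        out.append(show(e))
        e = step(e)
    return out

for line in trace(('+', ('*', 2, 3), ('*', 4, 5))):
    print(line)
```

This prints the reduction sequence `(+ (* 2 3) (* 4 5))`, `(+ 6 (* 4 5))`, `(+ 6 20)`, `26`; the correctness question the paper studies is whether an implementation driven by breakpoints inside a real evaluator displays exactly such a sequence.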
The Realizability Approach to Computable Analysis and Topology
2000
"... policies, either expressed or implied, of the NSF, NAFSA, or the U.S. government. ..."
Abstract

Cited by 41 (19 self)
 Add to MetaCart
policies, either expressed or implied, of the NSF, NAFSA, or the U.S. government.
Concurrent Transition Systems
Theoretical Computer Science, 1989
"... : Concurrent transition systems (CTS's), are ordinary nondeterministic transition systems that have been equipped with additional concurrency information, specified in terms of a binary residual operation on transitions. Each CTS C freely generates a complete CTS or computation category C , whose ..."
Abstract

Cited by 40 (5 self)
 Add to MetaCart
Concurrent transition systems (CTSs) are ordinary nondeterministic transition systems that have been equipped with additional concurrency information, specified in terms of a binary residual operation on transitions. Each CTS C freely generates a complete CTS or computation category C, whose arrows are equivalence classes of finite computation sequences, modulo a congruence induced by the concurrency information. The categorical composition on C induces a "prefix" partial order on its arrows, and the computations of C are conveniently defined to be the ideals of this partial order. The definition of computations as ideals has some pleasant properties, one of which is that the notion of a maximal ideal can, in certain circumstances, serve as a replacement for the more troublesome notion of a fair computation sequence. To illustrate the utility of CTSs, we use them to define and investigate a dataflow-like model of concurrent computation. The model consists of machines, which ...
The Call-by-Need Lambda Calculus
Journal of Functional Programming, 1994
"... We present a calculus that captures the operational semantics of callbyneed. The callbyneed lambda calculus is confluent, has a notion of standard reduction, and entails the same observational equivalence relation as the callbyname calculus. The system can be formulated with or without explici ..."
Abstract

Cited by 39 (2 self)
 Add to MetaCart
We present a calculus that captures the operational semantics of call-by-need. The call-by-need lambda calculus is confluent, has a notion of standard reduction, and entails the same observational equivalence relation as the call-by-name calculus. The system can be formulated with or without explicit let bindings, admits useful notions of marking and developments, and has a straightforward operational interpretation. Introduction: The correspondence between call-by-value lambda calculi and strict functional languages (such as the pure subset of Standard ML) is quite good; the correspondence between call-by-name lambda calculi and lazy functional languages (such as Miranda or Haskell) is not so good. Call-by-name re-evaluates an argument each time it is used, a prohibitive expense. Thus, many lazy languages are implemented using the call-by-need mechanism proposed by Wadsworth (1971), which overwrites an argument with its value the first time it is evaluated, avoiding the need for any s...