Results 11-20 of 87
Nominal rewriting
 Information and Computation
Abstract

Cited by 19 (7 self)
Nominal rewriting is based on the observation that if we add support for alpha-equivalence to first-order syntax using the nominal-set approach, then systems with binding, including higher-order reduction schemes such as lambda-calculus beta-reduction, can be smoothly represented. Nominal rewriting maintains a strict distinction between variables of the object-language (atoms) and of the meta-language (variables or unknowns). Atoms may be bound by a special abstraction operation, but variables cannot be bound, giving the framework a pronounced first-order character, since substitution of terms for variables is not capture-avoiding. We show how good properties of first-order rewriting survive the extension, by giving an efficient rewriting algorithm, a critical pair lemma, and a confluence theorem.
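The atom/variable distinction the abstract describes can be sketched in a few lines. The following is a minimal illustration (a hypothetical term representation of my own, not the paper's formalism): substituting a term for a metavariable is plain first-order replacement that goes straight under atom abstractions, so atoms can be captured by design.

```python
# Terms: ("atom", a), ("var", X), ("abs", a, t), ("app", f, [args]).
# Atoms are object-level names; variables X are meta-level unknowns.
def subst(term, X, s):
    """Replace metavariable X by term s everywhere; NOT capture-avoiding."""
    kind = term[0]
    if kind == "var":
        return s if term[1] == X else term
    if kind == "atom":
        return term
    if kind == "abs":
        # the abstracted atom does not protect X: substitution goes under it
        return ("abs", term[1], subst(term[2], X, s))
    if kind == "app":
        return ("app", term[1], [subst(t, X, s) for t in term[2]])

# [a]X with X := a  yields  [a]a : the atom a is captured, intentionally.
t = ("abs", "a", ("var", "X"))
print(subst(t, "X", ("atom", "a")))  # ('abs', 'a', ('atom', 'a'))
```

This deliberate capture is what gives nominal rewriting its first-order character; alpha-equivalence is handled separately, on atoms only.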
Developing Developments
, 1994
Abstract

Cited by 18 (2 self)
Confluence of orthogonal rewriting systems can be proved using the Finite Developments Theorem. We present, in a general setting, several adaptations of this proof method for obtaining confluence of `not quite' orthogonal systems.

1. Introduction
Rewriting as studied here is based on the analogy: rewriting = substitution + rules. This analogy is useful since it enables a clear-cut distinction between the `designer'-defined substitution process, i.e. the management of resources, and the `user'-defined rewrite rules of a rewriting system. For example, application of the `user'-defined term rewriting rule 2 × x → x + x to the term 2 × 3 gives rise to the duplication of the term 3 in the result 3 + 3. How this duplication is actually performed (for example, using sharing) depends on the `designer's' implementation of substitution. This decomposition has been shown useful in [OR94, Oos94] in the case of first-order term rewriting systems (TRSs, [DJ90, Klo92]) and higher-order term r...
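The duplicating rule in the example above can be sketched directly (a toy encoding of mine, not from the paper): matching binds the rule variable x, and building the right-hand side uses x twice, so the matched subterm is duplicated.

```python
# Rule: ("times", 2, x)  ->  ("plus", x, x), applied at the root.
def rewrite(term):
    if isinstance(term, tuple) and term[0] == "times" and term[1] == 2:
        x = term[2]
        return ("plus", x, x)  # x occurs twice: the subterm is duplicated
    return term

print(rewrite(("times", 2, 3)))  # ('plus', 3, 3)
```

Whether the duplicated 3 is physically copied or shared between the two occurrences is exactly the `designer's' substitution-implementation choice the introduction describes.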
Arithmetic as a theory modulo
 Proceedings of RTA’05
, 2005
Abstract

Cited by 18 (3 self)
We present constructive arithmetic in Deduction modulo with rewrite rules only. In natural deduction and in sequent calculus, the cut elimination theorem and the analysis of the structure of cut-free proofs are the key to many results about predicate logic with no axioms: analyticity and non-provability results, completeness results for proof search algorithms, decidability results for fragments, constructivity results for the intuitionistic case... Unfortunately, the properties of cut-free proofs do not extend in the presence of axioms, and the cut elimination theorem is not as powerful in this case as it is in pure logic. This motivates the extension of the notion of cut for various axiomatic theories such as arithmetic, Church's simple type theory, set theory and others. In general, we can say that a new axiom will necessitate a specific extension of the notion of cut: there is still no notion of cut general enough to be applied to any axiomatic theory. Deduction modulo [2, 3] is one attempt, among others, towards this aim.
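The idea of replacing arithmetic axioms by computation rules can be illustrated with the standard Peano rewrite rules (a generic sketch, not the paper's actual rule system): x + 0 → x and x + s(y) → s(x + y) compute addition instead of axiomatizing it.

```python
# Terms: 0, ("s", t), ("plus", a, b). Normalize by the two addition rules.
def normalize(t):
    if isinstance(t, tuple) and t[0] == "plus":
        a, b = normalize(t[1]), normalize(t[2])
        if b == 0:                                  # x + 0 -> x
            return a
        if isinstance(b, tuple) and b[0] == "s":    # x + s(y) -> s(x + y)
            return ("s", normalize(("plus", a, b[1])))
    if isinstance(t, tuple) and t[0] == "s":
        return ("s", normalize(t[1]))
    return t

two = ("s", ("s", 0))
print(normalize(("plus", two, two)))  # ('s', ('s', ('s', ('s', 0))))
```

In Deduction modulo, equalities like 2 + 2 = 4 then hold by computation, without a cut on an instance of an addition axiom.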
Comparing Combinatory Reduction Systems and Higher-Order Rewrite Systems
, 1993
Abstract

Cited by 18 (3 self)
In this paper two formats of higher-order rewriting are compared: Combinatory Reduction Systems, introduced by Klop [Klo80], and Higher-order Rewrite Systems, defined by Nipkow [Nipa]. Although it has always been obvious that both formats are closely related to each other, up to now the exact relationship between them has not been clear. This was an unsatisfactory situation, since it meant that proofs for closely related frameworks were given twice. We present two translations, one from Combinatory Reduction Systems into Higher-Order Rewrite Systems and one vice versa, based on a detailed comparison of both formats. Since the translations are very `neat' in the sense that the rewrite relation is preserved and (almost) reflected, we can conclude that, as far as rewrite theory is concerned, Combinatory Reduction Systems and Higher-Order Rewrite Systems are equivalent, the only difference being that Combinatory Reduction Systems employ a more `lazy' evaluation strategy. Moreover, due to this result...
Combinatory Reduction Systems with Explicit Substitution
 REWRITING TECHNIQUES AND APPLICATIONS (RTA), LECTURE NOTES IN COMPUTER SCIENCE
, 1996
Abstract

Cited by 16 (2 self)
We generalise the notion of explicit substitution from the lambda-calculus to higher-order rewriting, realised by combinatory reduction systems (CRSs). In this general framework this is achieved by identifying the "explicit" subclass of CRSs within which rewriting can be implemented efficiently.
For every CRS R we show how to construct an explicit substitution variant, Rx, which is a conservative extension of R (and hence confluent when R is confluent). Furthermore, we give a syntactic criterion on the rewrite rules that identifies a large subset of the CRSs, the redex-preserving CRSs, for which we show that Rx preserves strong normalisation of R.
We believe that this is a significant first step towards providing a methodology for reasoning about the operational properties of higher-order rewriting in general, and higher-order program transformations in particular, since confluence ensures correctness of such transformations, and preservation of strong normalisation ensures that the transformations are always safe, in both cases independently of the reduction strategy used.
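The core move from R to Rx can be sketched in the plain lambda calculus (a toy of mine, not the paper's general CRS construction): instead of performing a beta step's substitution at the meta-level in one go, a term constructor ("sub", t, x, s) records it explicitly, and small-step rules push it through the term. Capture-avoiding renaming is elided here for brevity.

```python
def step(t):
    """One root step: beta creates an explicit substitution; the other
    rules propagate it. Returns None when no rule applies at the root."""
    tag = t[0]
    if tag == "app" and t[1][0] == "lam":          # (lam x. b) a -> b[x := a]
        _, (_, x, b), a = t
        return ("sub", b, x, a)
    if tag == "sub":
        _, body, x, s = t
        if body[0] == "var":                       # x[x:=s] -> s ; y[x:=s] -> y
            return s if body[1] == x else body
        if body[0] == "app":
            return ("app", ("sub", body[1], x, s), ("sub", body[2], x, s))
        if body[0] == "lam" and body[1] != x:      # NB: no capture handling
            return ("lam", body[1], ("sub", body[2], x, s))
    return None

def normalize(t):
    while True:
        r = step(t)
        if r is None:
            if t[0] == "app":
                return ("app", normalize(t[1]), normalize(t[2]))
            if t[0] == "lam":
                return ("lam", t[1], normalize(t[2]))
            return t  # variable, or a stuck substitution left explicit
        t = r

# (lam x. x x) y  ->  (x x)[x := y]  ->  y y
term = ("app", ("lam", "x", ("app", ("var", "x"), ("var", "x"))), ("var", "y"))
print(normalize(term))  # ('app', ('var', 'y'), ('var', 'y'))
```

Each propagation rule is itself an ordinary rewrite rule, which is why the construction yields another CRS rather than leaving the system.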
Inductive types in the calculus of algebraic constructions
 FUNDAMENTA INFORMATICAE 65(12) (2005) 61–86 JOURNAL VERSION OF TLCA’03
, 2005
Abstract

Cited by 15 (4 self)
In a previous work, we proved that almost all of the Calculus of Inductive Constructions (CIC), the basis of the proof assistant Coq, can be seen as a Calculus of Algebraic Constructions (CAC), an extension of the Calculus of Constructions with functions and predicates defined by higher-order rewrite rules. In this paper, we prove that CIC as a whole can be seen as a CAC, and that it can be extended with non-strictly positive types and inductive-recursive types, together with non-free constructors and pattern-matching on defined symbols.
What is a Purely Functional Language?
 Journal of Functional Programming
, 1998
Abstract

Cited by 14 (6 self)
Functional programming languages are informally classified into pure and impure languages. The precise meaning of this distinction has been a matter of controversy. We therefore investigate a formal definition of purity. We begin by showing that some proposed definitions that rely on confluence, soundness of the beta axiom, preservation of pure observational equivalences, and independence of the order of evaluation do not withstand close scrutiny. We propose instead a definition based on parameter-passing independence. Intuitively, the definition implies that functions are pure mappings from arguments to results; the operational decision of how to pass the arguments is irrelevant. In the context of Haskell, our definition is consistent with the fact that the traditional call-by-name denotational semantics coincides with the traditional call-by-need implementation. Furthermore, our definition is compatible with the stream-based, continuation-based, and monad-based integration of computa...
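The intuition behind parameter-passing independence can be shown with a toy experiment (my construction, not the paper's formal definition): when argument expressions are pure, the order in which a call evaluates them is unobservable; an effectful argument expression makes that order visible.

```python
# Two parameter-passing strategies that differ only in argument order.
def eval_left_right(f, ea, eb):
    a = ea(); b = eb()
    return f(a, b)

def eval_right_left(f, ea, eb):
    b = eb(); a = ea()
    return f(a, b)

sub = lambda a, b: a - b

# Pure argument expressions: both strategies agree.
assert eval_left_right(sub, lambda: 5, lambda: 3) == \
       eval_right_left(sub, lambda: 5, lambda: 3) == 2

# Effectful argument expressions: the strategy leaks into the result.
def make_tick():
    state = {"n": 0}
    def tick():
        state["n"] += 1
        return state["n"]
    return tick

t = make_tick()
print(eval_left_right(sub, t, t))   # 1 - 2 = -1
t = make_tick()
print(eval_right_left(sub, t, t))   # 2 - 1 = 1
```

A language is pure, on this reading, when no program can tell the strategies apart, so the second half of the experiment has no counterpart.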
A proof of strong normalisation using domain theory
 IN LICS’06
, 2006
Abstract

Cited by 13 (1 self)
U. Berger [11] significantly simplified Tait's normalisation proof for bar recursion [27] (see also [9]), replacing Tait's introduction of infinite terms by the construction of a domain having the property that a term is strongly normalising if its semantics is. The goal of this paper is to show that, using ideas from the theory of intersection types [2, 6, 7, 21] and Martin-Löf's domain interpretation of type theory [18], we can in turn simplify U. Berger's argument in the construction of such a domain model. We think that our domain model can be used to give modular proofs of strong normalisation for various type theories. As an example, we show in some detail how it can be used to prove strong normalisation for Martin-Löf dependent type theory extended with bar recursion, and with some form of proof-irrelevance.
Interaction Nets and Term Rewriting Systems
, 1998
Abstract

Cited by 13 (7 self)
Term rewriting systems provide a framework in which it is possible to specify and program in a traditional syntax (oriented equations). Interaction nets, on the other hand, provide a graphical syntax for the same purpose, but can be regarded as being closer to an implementation, since the reduction process is local and asynchronous, and all the operations are made explicit, including discarding and copying of data. Our aim is to bridge the gap between the above formalisms by showing how to understand interaction nets in a term rewriting framework. This allows us to transfer results from one paradigm to the other, deriving syntactical properties of interaction nets from the (well-studied) properties of term rewriting systems, in particular concerning termination and modularity.

Keywords: term rewriting, interaction nets, termination, modularity.

1 Introduction
Term rewriting systems provide a general framework for specifying and reasoning about computation. They can be regarde...