Results 1–10 of 31
On Higher Order Recursive Program Schemes
 In: Proc. of the 19th International Colloquium on Trees in Algebra and Programming, CAAP'94
Abstract

Cited by 20 (16 self)
We define Higher Order Recursive Program Schemes (HRPSs) by allowing metasubstitutions (as in the λ-calculus) in right-hand sides of function and quantifier definitions. A study of several kinds of similarity of redexes makes it possible to lift properties of (first-order) Recursive Program Schemes to the higher-order case. The main result is the decidability of weak normalization in HRPSs, which immediately implies that HRPSs do not have full computational power. We analyze the structural properties of HRPSs and introduce several kinds of persistent expression reduction systems (PERSs) that enjoy similar properties. Finally, we design an optimal evaluation procedure for PERSs. 1 Introduction Higher Order Recursive Program Schemes (HRPSs) are recursive definitions of functions, predicates, and quantifiers, considered as rewriting systems. Similar definitions are used when one extends a theory by introducing new symbols [16]: ∃aA ⇔ (τa(A)/a)A, and ∃!aA ⇔ ∃aA ∧ ∀a∀b(A ∧ (b/a)A ⇒ a = b) ...
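The first-order case that HRPSs generalize can be made concrete with a small sketch (my own illustration, with hypothetical symbols F, g, h, a; this is not the paper's decision procedure, which decides weak normalization exactly — the bounded unfolding below only approximates it):

```python
# A first-order recursive program scheme as a rewrite system.
# A term is a (symbol, args...) tuple; scheme variables are plain strings.

def apply_env(term, env):
    """Substitute scheme variables in a right-hand side."""
    if isinstance(term, str):                   # a scheme variable
        return env.get(term, term)
    head, *args = term
    return (head, *[apply_env(a, env) for a in args])

def unfold(term, rules):
    """Contract every outermost occurrence of a defined symbol once."""
    head, *args = term
    if head in rules:
        params, rhs = rules[head]
        return apply_env(rhs, dict(zip(params, args)))
    return (head, *[unfold(a, rules) for a in args])

def has_defined(term, rules):
    head, *args = term
    return head in rules or any(has_defined(a, rules) for a in args)

def weakly_normalizes(term, rules, bound=50):
    """Bounded unfolding: only an illustration, not a real decision procedure."""
    for _ in range(bound):
        if not has_defined(term, rules):
            return True
        term = unfold(term, rules)
    return False

rules_ok = {'F': (('x',), ('g', 'x', ('a',)))}               # F(x) = g(x, a)
rules_div = {'F': (('x',), ('g', 'x', ('F', ('h', 'x'))))}   # F(x) = g(x, F(h(x)))
```

Here `rules_ok` reaches a term without defined symbols after one unfolding, while `rules_div` reintroduces F forever, so no unfolding bound ever suffices.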
The Longest Perpetual Reductions in Orthogonal Expression Reduction Systems
 In: Proc. of the 3rd International Conference on Logical Foundations of Computer Science, LFCS'94, A. Nerode and Yu.V. Matiyasevich, eds., Springer LNCS
, 1994
Abstract

Cited by 18 (8 self)
We consider reductions in Orthogonal Expression Reduction Systems (OERSs), that is, Orthogonal Term Rewriting Systems with bound variables and substitutions, as in the λ-calculus. We design a strategy that, for any given term t, constructs a longest reduction starting from t if t is strongly normalizable, and constructs an infinite reduction otherwise. The Conservation Theorem for OERSs follows easily from the properties of the strategy. We develop a method for computing the length of a longest reduction starting from a strongly normalizable term. We study properties of pure substitutions and several kinds of similarity of redexes. We apply these results to construct an algorithm for computing lengths of longest reductions in strongly persistent OERSs that does not require actual transformation of the input term. As a corollary, we have an algorithm for computing lengths of longest developments in OERSs. 1 Introduction A strategy is perpetual if, given a term t, it constructs an infinit...
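For a toy first-order TRS (much simpler than an OERS: no bound variables or substitutions), the "length of a longest reduction" can be computed by brute-force search; the sketch below, with a hypothetical `add` rule over Peano numerals, only illustrates the quantity that the paper computes without search:

```python
# Rules (hypothetical): add(0, y) -> y ; add(s(x), y) -> s(add(x, y)).
# Terms are nested tuples / strings, so they are hashable and cacheable.
from functools import lru_cache

def rewrites(t):
    """Yield every term reachable from t in one rewrite step."""
    if not isinstance(t, tuple):
        return
    head, *args = t
    if head == 'add' and args[0] == '0':
        yield args[1]
    if head == 'add' and isinstance(args[0], tuple) and args[0][0] == 's':
        yield ('s', ('add', args[0][1], args[1]))
    for i, a in enumerate(args):            # rewrite inside subterms
        for r in rewrites(a):
            yield (head, *args[:i], r, *args[i+1:])

@lru_cache(maxsize=None)
def longest(t):
    """Length of a longest reduction from t (t must be strongly normalizing)."""
    return max((1 + longest(r) for r in rewrites(t)), default=0)

def num(n):
    """Peano numeral: num(2) = ('s', ('s', '0'))."""
    return '0' if n == 0 else ('s', num(n - 1))
```

For example, `add(s(s(0)), s(0))` needs exactly three contractions to reach its normal form `s(s(s(0)))`.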
Confluence of Extensional and Non-Extensional λ-calculi with Explicit Substitutions
 Theoretical Computer Science
Abstract

Cited by 12 (2 self)
This paper studies confluence of extensional and non-extensional λ-calculi with explicit substitutions, where extensionality is interpreted by η-expansion. For that, we propose a scheme for explicit substitutions which describes those abstract properties that are sufficient to guarantee confluence. Our method makes it possible to treat at the same time many well-known calculi such as λσ, λσ⇑, λφ, λs, λυ, λf, λd and λdn. Keywords: functional programming, λ-calculi, explicit substitutions, confluence, extensionality. 1 Introduction The λ-calculus is a convenient framework to study functional programming, where the evaluation process is modeled by β-reduction. The main mechanism used to perform β-reduction is substitution, which consists of the replacement of formal parameters by actual arguments. The correctness of substitution is guaranteed by a systematic renaming of bound variables, an inconvenience which can be simply avoided in the λ-calculus à la de Bruijn by using natur...
Context-sensitive Conditional Expression Reduction Systems
 In Proc. of the International Workshop on Graph Rewriting and Computation, SEGRAGRA'95
, 1995
Abstract

Cited by 12 (4 self)
We introduce Context-sensitive Conditional Expression Reduction Systems (CERSs) by extending and generalizing the notion of conditional TRS to the higher-order case. We justify our framework in two ways. First, we define orthogonality for CERSs and show that the usual results for orthogonal systems (finiteness of developments, confluence, permutation equivalence) carry over immediately. This can be used e.g. to infer confluence from the subject reduction property in several typed λ-calculi possibly enriched with pattern-matching definitions. Second, we express several proof and transition systems as CERSs. In particular, we give encodings of Hilbert-style proof systems, Gentzen-style sequent calculi, rewrite systems with rule priorities, and the π-calculus into CERSs. This last encoding is an (important) example of real context-sensitive rewriting. 1 Introduction A term rewriting system is a pair consisting of an alphabet and a set of rewrite rules. The alphabet is used freely to gene...
Relative Normalization in Orthogonal Expression Reduction Systems
 In: Proc. of the 4th International Workshop on Conditional (and Typed) Term Rewriting Systems, CTRS'94, Springer LNCS
, 1994
Abstract

Cited by 11 (10 self)
We study reductions in orthogonal (left-linear and non-ambiguous) Expression Reduction Systems, a formalism for Term Rewriting Systems with bound variables and substitutions. To generalise the normalization theory of Huet and Lévy, we introduce the notion of neededness with respect to a set of reductions Π or a set of terms S, so that each existing notion of neededness can be obtained by specifying Π or S. We impose natural conditions on S, called stability, that are sufficient and necessary for each term not in S-normal form (i.e., not in S) to have at least one S-needed redex, and for repeated contraction of S-needed redexes in a term t to lead to an S-normal form of t whenever there is one. Our relative neededness notion is based on tracing (open) components, which are occurrences of contexts not containing any bound variable, rather than tracing redexes or subterms. 1 Introduction Since a normalizable term, in a rewriting system, may have an infinite reduction, it is important to...
Combining Algebraic Rewriting, Extensional Lambda Calculi, and Fixpoints
Abstract

Cited by 8 (3 self)
It is well known that confluence and strong normalization are preserved when combining algebraic rewriting systems with the simply typed lambda calculus. It is equally well known that confluence fails when adding either the usual contraction rule for η, or recursion together with the usual contraction rule for surjective pairing. We show that confluence and strong normalization are modular properties for the combination of algebraic rewriting systems with typed lambda calculi enriched with expansive extensional rules for η and surjective pairing. We also show how to preserve confluence in a modular way when adding fixpoints to different rewriting systems. This result is also obtained by a simple translation technique allowing us to simulate bounded recursion. 1 Introduction Confluence and strong normalization for the combination of lambda calculus and algebraic rewriting systems have been the object of many studies [BT88, JO91, BTG94, HM90], where the modularity of these properties is s...
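The "bounded recursion" mentioned at the end can be illustrated by a small sketch (my own, with hypothetical names, not the paper's translation): replace an unrestricted fixpoint by a fixed number of unfoldings, which trivially preserves termination:

```python
# Bounded fixpoint: the n-th finite unfolding f(f(...f(bottom)...)) of a
# functional f. Unlike a true fixpoint combinator, this always terminates.

def bounded_fix(f, n):
    """Return the n-th unfolding of the functional f."""
    def diverge(*args):
        raise RecursionError("unfolding bound exhausted")
    g = diverge
    for _ in range(n):
        g = f(g)
    return g

# A recursive definition presented as a functional of its own recursive call:
fact_step = lambda rec: lambda k: 1 if k == 0 else k * rec(k - 1)

fact5 = bounded_fix(fact_step, 6)   # 6 unfoldings suffice for inputs up to 5
```

If the bound is too small the computation hits the `diverge` placeholder instead of looping, which is precisely what makes bounded recursion compatible with strong normalization.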
Higher-Order Rewriting and Partial Evaluation
 In: Rewriting Techniques and Applications, Lecture Notes in Computer Science
, 1998
Abstract

Cited by 6 (4 self)
We demonstrate the usefulness of higher-order rewriting techniques for specializing programs, i.e., for partial evaluation. More precisely, we demonstrate how casting program specializers as combinatory reduction systems (CRSs) makes it possible to formalize the corresponding program transformations as meta-reductions, i.e., reductions in the internal "substitution calculus." For partial-evaluation problems, this means that instead of having to prove on a case-by-case basis that one's "two-level functions" operate properly, one can concisely formalize them as a combinatory reduction system and obtain as a corollary that static reduction does not go wrong and yields a well-formed residual program.
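A classic partial-evaluation example (standard in the literature, not taken from this paper) shows the static/dynamic split behind "two-level functions": specializing `power` to a static exponent by unfolding the recursion at specialization time:

```python
# The exponent n is static, the base x is dynamic. Static reduction fully
# unfolds the recursion on n and emits a residual program that contains
# only dynamic multiplications -- no recursion and no test on n remain.

def specialize_power(n):
    """Return Python source for power_n(x) = x**n, built by static reduction."""
    body = "1"
    for _ in range(n):                 # the static recursion, fully unfolded
        body = f"x * ({body})"
    return f"def power_{n}(x):\n    return {body}\n"

src = specialize_power(3)              # residual: x * (x * (x * (1)))
namespace = {}
exec(src, namespace)                   # compile the residual program
```

The CRS view formalizes exactly this: the unfolding above is a meta-reduction, and "static reduction does not go wrong" means the residual `power_3` is always a well-formed program.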
Minimal Relative Normalization in Orthogonal Expression Reduction Systems
 In Proc. of the 16th International Conference on Foundations of Software Technology and Theoretical Computer Science, FST&TCS'96, Springer LNCS
, 1996
Abstract

Cited by 5 (2 self)
In previous papers, the authors studied normalization relative to desirable sets S of `partial results', where it is shown that such sets must be stable. For example, the sets of normal forms, head normal forms, and weak head normal forms in the λ-calculus are all stable. They showed that, for any stable S, S-needed reductions are S-normalizing. This paper continues the investigation into the theory of relative normalization. In particular, we prove the existence of minimal normalizing reductions for regular stable sets of results. All the above-mentioned sets are regular. We give a necessary and sufficient criterion for a normalizing reduction (w.r.t. a regular stable S) to be minimal. Finally, we establish a relationship between relative minimal and optimal reductions, revealing a conflict between minimality and optimality: for regular stable sets of results, a term need not possess a reduction that is minimal and optimal at the same time. 1 Introduction The Normalization Theorem in ...
Data Fields
 In Proc. Workshop on Generic Programming, Marstrand
, 1998
Abstract

Cited by 4 (4 self)
This position paper describes the data field model, a general model for indexed data structures. The aim of this model is to capture the essence of the style of programming where computing on data structures is expressed by operations directly on the structures rather than operations on the individual elements. Array and data-parallel languages support this programming style, and functional languages often provide second-order operations on lists and other data structures for the same purpose. The data field model is designed to be abstract enough to encompass a wide range of explicitly or implicitly indexed structures. Thus, algorithms which are expressed in terms of data fields and general operations on them will be independent of the choice of structure from this range, i.e., generic w.r.t. this choice. This means that the data field approach has something in common with polytypic programming and the theory of shapes.
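One minimal reading of the data field idea, sketched under my own assumptions (the paper's formal definition may differ): a data field pairs an index-to-value function with an explicit domain, and collective operations act on whole fields rather than on elements:

```python
# A data field = (index -> value function, explicit domain). Elementwise
# operations are defined once on fields, independently of whether the
# underlying structure is an array, a sparse table, or something else.

class DataField:
    def __init__(self, fn, domain):
        self.fn, self.domain = fn, set(domain)

    def __add__(self, other):
        """Elementwise sum, restricted to the common domain."""
        dom = self.domain & other.domain
        return DataField(lambda i: self.fn(i) + other.fn(i), dom)

    def to_dict(self):
        """Materialize the field (e.g. into an array- or table-like form)."""
        return {i: self.fn(i) for i in sorted(self.domain)}

a = DataField(lambda i: i * i, range(4))    # behaves like the array [0, 1, 4, 9]
b = DataField(lambda i: 1, range(2, 6))     # a constant field on indices 2..5
```

An algorithm written against `DataField` never inspects how the field is stored, which is the genericity the position paper argues for.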