Results 1–10 of 12
Operational Techniques in PVS – A Preliminary Evaluation
 In Proceedings of the Australasian Theory Symposium, CATS ’01
, 2001
Cited by 9 (1 self)
In this paper we present a preliminary analysis of the suitability of using PVS as a tool for developing operational semantics and programming logics in a semi-automatic fashion. To this end we present a formalized proof of the Church–Rosser theorem for a version of the call-by-value lambda calculus in the spirit of Landin’s ISWIM. The proof is developed in the PVS system, and is used as a test bed or benchmark for evaluating the applicability of that system for carrying out more complex operational arguments. Our approach is relatively unusual in that it is based on the named-variable approach, and concentrates on the call-by-value version of the rule. Although there are numerous computer-based proofs of the Church–Rosser theorem in the literature, all of the existing proofs eliminate the need to treat conversion. The novel aspects of our approach are that: we use the PVS system, especially its built-in abstract data types facility, to verify a version of the Church–Rosser theorem; we formalize a version of the calculus as it normally appears in textbooks, rather than tailoring it to suit the machine or system; we treat an ISWIM variation on the call-by-value version of the calculus, rather than the simpler traditional call-by-name version. However, the main aim of the work reported here was to evaluate PVS as a tool for developing state-of-the-art operationally based programming logics for realistic programming languages.
Functional Genetic Programming with Combinators
Cited by 5 (0 self)
Abstract. Prior program representations for genetic programming that incorporated features of modern programming languages solved harder problems than earlier representations, but required more complex genetic operators. We develop the idea of using combinator expressions as a program representation for genetic programming. This representation makes it possible to evolve programs with a variety of programming language constructs using simple genetic operators. We investigate the effort required to evolve combinator-expression solutions to several problems: linear regression, even parity on N inputs, and implementation of the stack and queue data structures. Genetic programming with combinator expressions compares favorably to prior approaches, namely the works ...
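One reason combinator expressions suit simple genetic operators is that every well-formed application tree over the primitives is a valid program, so subtree crossover cannot produce an ill-scoped term. The Python sketch below of S/K/I reduction is our own illustration (the tuple representation and function names are assumptions, not the paper's implementation):

```python
def reduce_step(expr):
    """One leftmost-outermost reduction step on a combinator expression.
    An expression is either a symbol ('S', 'K', 'I') or a pair (f, x)
    meaning 'f applied to x'. Returns (new_expr, changed)."""
    if not isinstance(expr, tuple):
        return expr, False                     # bare combinator: normal form
    f, x = expr
    if f == "I":                               # I x      -> x
        return x, True
    if isinstance(f, tuple) and f[0] == "K":   # K a b    -> a
        return f[1], True
    if isinstance(f, tuple) and isinstance(f[0], tuple) and f[0][0] == "S":
        a, b = f[0][1], f[1]                   # S a b x  -> (a x) (b x)
        return ((a, x), (b, x)), True
    f2, changed = reduce_step(f)               # otherwise search inside
    if changed:
        return (f2, x), True
    x2, changed = reduce_step(x)
    return (f, x2), changed

def normalize(expr, fuel=1000):
    """Reduce to normal form, with a fuel bound since reduction may diverge."""
    changed = True
    while changed and fuel > 0:
        expr, changed = reduce_step(expr)
        fuel -= 1
    return expr
```

For example, `S K K` behaves as the identity: `normalize(((("S", "K"), "K"), "I"))` reduces through `(K I) (K I)` back to `I`. The fuel bound matters in a GP setting, where evolved expressions may fail to terminate.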
Theoretical Foundations for Practical ‘Totally Functional Programming’
, 2007
Interpretation is an implicit part of today’s programming; it has great power but is overused and has significant costs. For example, interpreters are typically hard to understand and hard to reason about. The methodology of “Totally Functional Programming” (TFP) is a reasoned attempt to redress the problem of interpretation. It incorporates an awareness of the undesirability of interpretation with observations that definitions and a certain style of programming appear to offer alternatives to it. Application of TFP is expected to lead to a number of significant outcomes, theoretical as well as practical. Primary among these are novel programming languages to lessen or eliminate the use of interpretation in programming, leading to better-quality software. However, TFP contains a number of lacunae in its current formulation, which hinder development of these outcomes. Among others, formal semantics and type systems for TFP languages are yet to be discovered, the means to reduce interpretation in programs are to be determined, and a detailed explication is needed of interpretation, definition, and the differences between the two. Most important of all, however, is the need to develop a complete understanding of the nature of interpretation. In this work, suitable type systems for TFP languages are identified, and guidance is given regarding the construction of appropriate formal semantics. Techniques, based around the ‘fold’ operator, are identified and developed for modifying programs so as to reduce the amount of interpretation they contain. Interpretation as a means of language extension is also investigated. Finally, the nature of interpretation is considered. Numerous hypotheses relating to it are considered in detail. Combining the results of those analyses with discoveries from elsewhere in this work leads to the proposal that interpretation is not, in fact, symbol-based computation, but something more fundamental: computation that varies with input. We discuss in detail various implications of this characterisation, including its practical application. An often more useful property, ‘inherent interpretiveness’, is also motivated and discussed in depth. Overall, our inquiries act to give conceptual and theoretical foundations for practical TFP.
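The ‘fold’-based technique this abstract mentions can be glimpsed in miniature: instead of representing data symbolically and running an interpreter over it, one represents each value *as* its own fold, so the interpreter disappears into the data. The Python sketch below (our illustration of the general idea, not the thesis's notation) does this for natural numbers:

```python
# A natural number is represented as its fold: a function that, given a
# step function and a base value, applies step n times to base.
# No separate "interpreter" walks a symbolic structure.

def zero(step, base):
    return base

def succ(n):
    """Successor: one more application of step."""
    return lambda step, base: step(n(step, base))

def add(m, n):
    """Addition without interpretation: m continues where n left off."""
    return lambda step, base: m(step, n(step, base))

def to_int(n):
    """Observe a fold-number by folding with ordinary arithmetic."""
    return n(lambda acc: acc + 1, 0)

three = succ(succ(succ(zero)))
```

Here `add` needs no case analysis on the shape of its arguments; the numbers themselves carry the recursion, which is the sense in which such definitions "offer alternatives" to interpretation.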
Rather comprehensive.
by Timothy Kelley. This book provides both the algorithms and the proofs that they work.
Some Formal Considerations on Gabbay’s Restart Rule in Natural Deduction and Goal-Directed Reasoning
In this paper we make some observations about Natural Deduction derivations
Church–Rosser Made Easy
DOI 10.3233/FI-2010-306, IOS Press
Abstract. The Church–Rosser theorem states that the λ-calculus is confluent under β-reductions. The standard proof of this result is due to Tait and Martin-Löf. In this note, we present an alternative proof based on the notion of acceptable orderings. The technique is easily modified to give confluence of the βη-calculus.
Keywords: lambda-calculus, confluence, Church–Rosser theorem
ARITHMETIC COMPUTATIONS AND MEMORY MANAGEMENT USING A BINARY TREE ENCODING OF NATURAL NUMBERS
, 2011
Two applications of a binary tree data type based on a simple pairing function (a bijection between natural numbers and pairs of natural numbers) are explored. First, the tree is used to encode natural numbers, and algorithms that perform basic arithmetic computations are presented along with formal proofs of their correctness. Second, using this “canonical” representation as a base type, algorithms for encoding and decoding additional isomorphic data types of other mathematical constructs (sets, sequences, etc.) are also developed. An experimental application to a memory management system is constructed and explored using these isomorphic types. A practical analysis of this system’s runtime complexity and space savings is provided, along with a proof-of-concept framework for both applications of the binary tree type, in the Java programming language.
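A concrete sketch of the encoding idea: any pairing bijection between N×N and N induces a bijection between naturals and binary trees, with 0 mapped to the leaf and n+1 mapped to a node whose subtrees encode the unpaired halves of n. The Python below uses the classical Cantor pairing function; the thesis's actual pairing function and its Java implementation may differ, so treat this as an assumption for illustration.

```python
import math

def pair(x, y):
    """Cantor pairing: a bijection from pairs of naturals to naturals."""
    return (x + y) * (x + y + 1) // 2 + y

def unpair(z):
    """Inverse of the Cantor pairing."""
    w = (math.isqrt(8 * z + 1) - 1) // 2   # index of the diagonal containing z
    y = z - w * (w + 1) // 2
    return w - y, y

def encode(n):
    """Natural number -> binary tree; None is the leaf, a 2-tuple is a node."""
    if n == 0:
        return None
    a, b = unpair(n - 1)
    return (encode(a), encode(b))

def decode(t):
    """Binary tree -> natural number; inverse of encode."""
    if t is None:
        return 0
    return pair(decode(t[0]), decode(t[1])) + 1
```

Because `pair`/`unpair` are mutually inverse, `decode(encode(n)) == n` for every natural n, which is the bijection the abstract's "canonical representation" rests on.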
Completeness of secondorder propositional S4 and H in topological semantics
We add propositional quantifiers to the propositional modal logic S4 and to the propositional intuitionistic logic H, introducing axiom schemes that are the natural analogs of axiom schemes typically used for first-order quantifiers in classical logic. We show that the resulting logics are sound and complete for a topological semantics extending, in a natural way, the topological semantics for S4 and for H.
Resource control and strong normalisation
, 2012
We introduce the resource control cube, a system consisting of eight intuitionistic lambda calculi with either implicit or explicit control of resources and with either natural deduction or sequent calculus. The four calculi of the cube that correspond to natural deduction have been proposed by Kesner and Renaud, and the four calculi that correspond to sequent lambda calculi are introduced in this paper. The presentation is parametrized by the set of resources (weakening or contraction), which enables a uniform treatment of the eight calculi of the cube. The simply typed resource control cube, on the one hand, expands the Curry–Howard correspondence to intuitionistic natural deduction and intuitionistic sequent logic with implicit or explicit structural rules and, on the other hand, is related to substructural logics. We propose a general intersection type system for the resource control cube calculi. Our main contribution is a characterisation of strong normalisation of reductions in this cube. First, we prove that typeability implies strong normalisation in the “natural deduction base” of the cube by adapting the reducibility method. We then prove that typeability implies strong normalisation in the “sequent base” of the cube by using a combination of well-orders and a suitable embedding into the “natural deduction base”. Finally, we prove that strong normalisation implies typeability in the cube using head subject expansion. All proofs are general and can be made specific to each calculus of the cube.