Results 1–10 of 15
A Naïve Time Analysis and its Theory of Cost Equivalence
 Journal of Logic and Computation
, 1995
"... Techniques for reasoning about extensional properties of functional programs are well understood, but methods for analysing the underlying intensional or operational properties have been much neglected. This paper begins with the development of a simple but useful calculus for time analysis of nons ..."
Abstract

Cited by 44 (7 self)
Techniques for reasoning about extensional properties of functional programs are well understood, but methods for analysing the underlying intensional or operational properties have been much neglected. This paper begins with the development of a simple but useful calculus for time analysis of non-strict functional programs with lazy lists. One limitation of this basic calculus is that the ordinary equational reasoning on functional programs is not valid. In order to buy back some of these equational properties we develop a nonstandard operational equivalence relation called cost equivalence, by considering the number of computation steps as an `observable' component of the evaluation process. We define this relation by analogy with Park's definition of bisimulation in CCS. This formulation allows us to show that cost equivalence is a contextual congruence (and thus is substitutive with respect to the basic calculus) and provides useful proof techniques for establishing cost equivalen...
Computational Comonads and Intensional Semantics
, 1991
"... We explore some foundational issues in the development of a theory of intensional semantics. A programming language may be given a variety of semantics, differing in the level of abstraction; one generally chooses the semantics at an abstraction level appropriate for reasoning about a particular kin ..."
Abstract

Cited by 34 (2 self)
We explore some foundational issues in the development of a theory of intensional semantics. A programming language may be given a variety of semantics, differing in the level of abstraction; one generally chooses the semantics at an abstraction level appropriate for reasoning about a particular kind of program property. Extensional semantics are typically appropriate for proving properties such as partial correctness, but an intensional semantics at a lower abstraction level is required in order to reason about computation strategy and thereby support reasoning about intensional aspects of behavior such as order of evaluation and efficiency. It is obviously desirable to be able to establish sensible relationships between two semantics for the same language, and we seek a general category-theoretic framework that permits this. Beginning with an "extensional" category, whose morphisms we can think of as functions of some kind, we model a notion of computation as a comonad with certain e...
A Monadic Calculus for Parallel Costing of a Functional Language of Arrays
 Euro-Par'97 Parallel Processing, volume 1300 of Lecture Notes in Computer Science
, 1997
"... . Vec is a higherorder functional language of nested arrays, which includes a general folding operation. Static computation of the shape of its programs is used to support a compositional cost calculus based on a cost monad. This, in turn, is based on a cost algebra, whose operations may be customi ..."
Abstract

Cited by 27 (9 self)
Vec is a higher-order functional language of nested arrays, which includes a general folding operation. Static computation of the shape of its programs is used to support a compositional cost calculus based on a cost monad. This, in turn, is based on a cost algebra, whose operations may be customized to handle different cost regimes, especially for parallel programming. We present examples based on sequential costing and on the PRAM model of parallel computation. The latter has been implemented in Haskell, and applied to some linear algebra examples.

1 Introduction

Second-order combinators such as map, fold and zip provide programmers with a concise, abstract language for writing skeletons for implicitly parallel programs, as in [Ski94], but there is a hitch. These combinators are defined for list programs (see [BW88]), but efficient implementations (which is the point of parallelism, after all) are based on arrays. This disparity becomes acute when working with nested arrays, which...
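The cost-monad idea described above can be sketched concretely: a value is paired with an accumulated cost drawn from a cost algebra, and the algebra's combining operations can be customised per cost regime, e.g. summing costs for sequential composition but taking the maximum for a PRAM-style parallel map. This is a hypothetical Python rendering, not Vec's actual implementation (which the abstract says is in Haskell); the `Cost`, `par_map`, and `step` names are illustrative.

```python
# A writer-monad-style cost accounting sketch over a customisable cost algebra.

class Cost:
    """A computation's result paired with its accumulated cost."""
    def __init__(self, value, cost=0):
        self.value, self.cost = value, cost

    def bind(self, f):
        """Sequential composition: costs add."""
        r = f(self.value)
        return Cost(r.value, self.cost + r.cost)

def step(v):
    """A unit-cost primitive operation."""
    return Cost(v, 1)

def par_map(f, xs):
    """PRAM-style parallel map: one processor per element, so the cost of
    the whole map is the *maximum* element cost, not the sum."""
    rs = [f(x) for x in xs]
    return Cost([r.value for r in rs],
                max((r.cost for r in rs), default=0))

# Doubling 8 elements "in parallel" costs 1 step under the PRAM algebra...
doubled = par_map(lambda x: step(2 * x), range(8))
print(doubled.value, doubled.cost)   # [0, 2, 4, 6, 8, 10, 12, 14] 1

# ...but folding them sequentially pays one step per addition.
total = Cost(0)
for x in doubled.value:
    total = total.bind(lambda acc, x=x: step(acc + x))
print(total.value, total.cost)       # 56 8
```

Swapping the `+`/`max` pair in `bind` and `par_map` for other monoid operations is what the abstract means by customising the cost algebra to different cost regimes.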
Operational Theories of Improvement in Functional Languages (Extended Abstract)
 In Proceedings of the Fourth Glasgow Workshop on Functional Programming
, 1991
"... ) David Sands y Department of Computing, Imperial College 180 Queens Gate, London SW7 2BZ email: ds@uk.ac.ic.doc Abstract In this paper we address the technical foundations essential to the aim of providing a semantic basis for the formal treatment of relative efficiency in functional langu ..."
Abstract

Cited by 23 (9 self)
David Sands, Department of Computing, Imperial College, 180 Queens Gate, London SW7 2BZ. Email: ds@uk.ac.ic.doc

In this paper we address the technical foundations essential to the aim of providing a semantic basis for the formal treatment of relative efficiency in functional languages. For a general class of "functional" computation systems, we define a family of improvement preorderings which express, in a variety of ways, when one expression is more efficient than another. The main results of this paper build on Howe's study of equality in lazy computation systems, and are concerned with the question of when a given improvement relation is subject to the usual forms of (in)equational reasoning (so that, for example, we can improve an expression by improving any subexpression). For a general class of computation systems we establish conditions on the operators of the language which guarantee that an improvement relation is a precongruence. In addition, for...
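The improvement preordering the abstract describes, "one expression is more efficient than another", can be sketched in a drastically simplified form: here it is checked only pointwise on sample inputs of closed first-order programs, not quantified over all contexts as the paper's relation is, and the explicit step counter is an illustrative assumption.

```python
# A toy improvement check: f improves g if, on every sampled input,
# f computes the same value as g and never takes more steps.

def steps_of(f, n):
    """Run f on n with an explicit step counter threaded through."""
    counter = {"steps": 0}
    result = f(n, counter)
    return result, counter["steps"]

def sum_naive(n, c):
    """Sum 0..n by recursion: n + 1 steps."""
    c["steps"] += 1
    return 0 if n == 0 else n + sum_naive(n - 1, c)

def sum_formula(n, c):
    """Sum 0..n by Gauss's closed form: 1 step."""
    c["steps"] += 1
    return n * (n + 1) // 2

def improves(f, g, inputs):
    """f improves g on the sample inputs: same results, never more steps."""
    return all(
        steps_of(f, n)[0] == steps_of(g, n)[0]
        and steps_of(f, n)[1] <= steps_of(g, n)[1]
        for n in inputs
    )

print(improves(sum_formula, sum_naive, range(20)))   # True
print(improves(sum_naive, sum_formula, range(20)))   # False
```

The asymmetry of the two checks shows why improvement is a preorder rather than an equivalence; the paper's precongruence results are what license replacing a subexpression by an improvement inside any program context, which this pointwise sketch deliberately does not attempt to establish.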
From SOS Rules to Proof Principles: An Operational Metatheory for Functional Languages
 In Proc. POPL'97, the 24th ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages
, 1997
"... Structural Operational Semantics (SOS) is a widely used formalism for specifying the computational meaning of programs, and is commonly used in specifying the semantics of functional languages. Despite this widespread use there has been relatively little work on the imetatheoryj for such semantics. ..."
Abstract

Cited by 20 (1 self)
Structural Operational Semantics (SOS) is a widely used formalism for specifying the computational meaning of programs, and is commonly used in specifying the semantics of functional languages. Despite this widespread use there has been relatively little work on the "metatheory" for such semantics. As a consequence the operational approach to reasoning is considered ad hoc, since the same basic proof techniques and reasoning tools are re-established over and over, once for each operational semantics specification. This paper develops some metatheory for a certain class of SOS language specifications for functional languages. We define a rule format, Globally Deterministic SOS (GDSOS), and establish some proof principles for reasoning about equivalence which are sound for all languages which can be expressed in this format. More specifically, if the SOS rules for the operators of a language conform to the syntax of the GDSOS format, then

• a syntactic analogy of continuity holds, which rel...
Timing Analysis of Combinational Circuits in Intuitionistic Propositional Logic
 Formal Methods in System Design
, 1999
"... Classical logic has so far been the logic of choice in formal hardware verification. This paper proposes the application of intuitionistic logic to the timing analysis of digital circuits. The intuitionistic setting serves two purposes. The modeltheoretic properties are exploited to handle the s ..."
Abstract

Cited by 7 (1 self)
Classical logic has so far been the logic of choice in formal hardware verification. This paper proposes the application of intuitionistic logic to the timing analysis of digital circuits. The intuitionistic setting serves two purposes. The model-theoretic properties are exploited to handle the second-order nature of bounded delays in a purely propositional setting without need to introduce explicit time and temporal operators. The proof-theoretic properties are exploited to extract quantitative timing information and to reintroduce explicit time in a convenient and systematic way. We present a natural Kripke-style semantics for intuitionistic propositional logic, as a special case of a Kripke constraint model for Propositional Lax Logic [15], in which validity is validity up to stabilisation, and implication ⊃ comes out as "boundedly gives rise to." We show that this semantics is equivalently characterised by a notion of realisability with stabilisation bounds as realisers...
A Timing Refinement of Intuitionistic Proofs and its Application to the Timing Analysis of Combinational Circuits
 Proceedings of the 5th International Workshop on Theorem Proving with Analytic Tableaux and Related Methods
, 1996
"... Up until now classical logic has been the logic of choice in formal hardware verification. This report advances the application of intuitionistic logic to the timing analysis of digital circuits. The intuitionistic setting serves two purposes at the same time. The modeltheoretic properties are e ..."
Abstract

Cited by 6 (3 self)
Up until now classical logic has been the logic of choice in formal hardware verification. This report advances the application of intuitionistic logic to the timing analysis of digital circuits. The intuitionistic setting serves two purposes at the same time. The model-theoretic properties are exploited to handle the second-order nature of bounded delays in a purely propositional setting without need to introduce explicit time and temporal operators. The proof-theoretic properties are exploited to extract quantitative timing information and to reintroduce explicit time in a convenient and systematic way. We present a natural Kripke-style semantics for intuitionistic propositional logic, as a special case of a Kripke constraint model for Propositional Lax Logic [4], in which validity is validity up to stabilization. We show that this semantics is equivalently characterized in terms of stabilization bounds so that implication ⊃ comes out as "boundedly gives rise to." An int...
Adventures in time and space
 33rd ACM Symposium on Principles of Programming Languages
, 2006
"... Abstract. This paper investigates what is essentially a callbyvalue version of PCF under a complexitytheoretically motivated type system. The programming formalism, ATR, has its firstorder programs characterize the polynomialtime computable functions, and its secondorder programs characterize ..."
Abstract

Cited by 4 (4 self)
Abstract. This paper investigates what is essentially a call-by-value version of PCF under a complexity-theoretically motivated type system. The programming formalism, ATR, has its first-order programs characterize the polynomial-time computable functions, and its second-order programs characterize the type-2 basic feasible functionals of Mehlhorn and of Cook and Urquhart. (The ATR types are confined to levels 0, 1, and 2.) The type system comes in two parts, one that primarily restricts the sizes of values of expressions and a second that primarily restricts the time required to evaluate expressions. The size-restricted part is motivated by Bellantoni and Cook's and Leivant's implicit characterizations of polynomial time. The time-restricting part is an affine version of Barber and Plotkin's DILL. Two semantics are constructed for ATR. The first is a pruning of the naïve denotational semantics for ATR. This pruning removes certain functions that cause otherwise feasible forms of recursion to go wrong. The second semantics is a model for ATR's time complexity relative to a certain abstract machine. This model provides a setting for complexity recurrences arising from ATR recursions, the solutions of which yield second-order polynomial time bounds. The time-complexity semantics is also shown to be sound relative to the costs of interpretation on the abstract machine.
A Denotational Approach to Measuring Complexity in Functional Programs
, 2003
"... ..."
(Show Context)
Diagrammatic Representations in DomainSpecific Languages
, 2000
"... One emerging approach to reducing the labour and costs of software development favours the specialisation of techniques to particular application domains. The rationale is that programs within a given domain often share enough common features and assumptions to enable the incorporation of substantia ..."
Abstract

Cited by 1 (1 self)
One emerging approach to reducing the labour and costs of software development favours the specialisation of techniques to particular application domains. The rationale is that programs within a given domain often share enough common features and assumptions to enable the incorporation of substantial support mechanisms into domain-specific programming languages and associated tools. Instead of being machine-oriented, algorithmic implementations, programs in many domain-specific languages (DSLs) are rather user-level, problem-oriented specifications of solutions. Taken further, this view suggests that the most appropriate representation of programs in many domains is diagrammatic, in a way which derives from existing design notations in the domain. This thesis conducts an investigation, using mathematical techniques and supported by case studies, of issues arising from the use of diagrammatic representations in DSLs. Its structure is conceptually divided into two parts: the first is co...