Results 1-10 of 30
Universal coalgebra: a theory of systems, 2000
Cited by 405 (43 self)
In the semantics of programming, finite data types, such as finite lists, have traditionally been modelled by initial algebras. Later, final coalgebras were used in order to deal with infinite data types. Coalgebras, which are the dual of algebras, turned out to be suited, moreover, as models for certain types of automata and, more generally, for (transition and dynamical) systems. An important property of initial algebras is that they satisfy the familiar principle of induction. Such a principle was missing for coalgebras until the work of Aczel (Non-Well-Founded Sets, CSLI Lecture Notes, Vol. 14, Center for the Study of Language and Information, Stanford, 1988) on a theory of non-well-founded sets, in which he introduced a proof principle nowadays called coinduction. It was formulated in terms of bisimulation, a notion originally stemming from the world of concurrent programming languages. Using the notion of coalgebra homomorphism, the definition of bisimulation on coalgebras can be shown to be formally dual to that of congruence on algebras. Thus, the three basic notions of universal algebra: algebra, homomorphism of algebras, and congruence, turn out to correspond to coalgebra, homomorphism of coalgebras, and bisimulation, respectively. In this paper, the latter are taken ...
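The duality the abstract describes can be made concrete with a small sketch (not taken from the paper): a coalgebra for the stream functor F(X) = A x X maps each state to an observation and a next state, and the coinduction proof principle says two states are behaviourally equal iff they are related by a bisimulation. The machines `m1` and `m2` below are hypothetical toy examples.

```python
# Two coalgebras for F(X) = A x X: each state maps to
# (observed output, next state). Hypothetical toy machines.
m1 = {"p": (0, "q"), "q": (1, "p")}                  # emits 0,1,0,1,...
m2 = {"a": (0, "b"), "b": (1, "c"), "c": (0, "b")}   # also emits 0,1,0,1,...

def bisimilar(m1, s1, m2, s2):
    """Coinduction as an algorithm: grow a relation R closed under
    one-step transitions; s1 and s2 are bisimilar iff such an R exists."""
    relation = set()
    todo = [(s1, s2)]
    while todo:
        x, y = todo.pop()
        if (x, y) in relation:
            continue
        ox, x2 = m1[x]
        oy, y2 = m2[y]
        if ox != oy:            # observations must agree
            return False
        relation.add((x, y))
        todo.append((x2, y2))   # successors must again be related
    return True
```

Here `bisimilar(m1, "p", m2, "a")` holds even though the machines have different state spaces: bisimilarity compares observable behaviour, not internal structure.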
Non-Deterministic Kleene Coalgebras
Cited by 26 (10 self)
In this paper, we present a systematic way of deriving (1) languages of (generalised) regular expressions, and (2) sound and complete axiomatizations thereof, for a wide variety of systems. This generalizes both the results of Kleene (on regular languages and deterministic finite automata) and Milner (on regular behaviours and finite labelled transition systems), and includes many other systems such as Mealy and Moore machines.
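One standard way to see Kleene's regular expressions coalgebraically is through Brzozowski derivatives: every expression has an output (does it accept the empty word?) and a transition (its derivative by a letter), which is exactly a deterministic-automaton structure on the set of expressions. The sketch below is illustrative and not taken from the paper; the tuple-based mini-syntax is hypothetical.

```python
# Regular expressions as tuples: ("0",) empty language, ("1",) empty word,
# ("c", a) single letter, ("+", r, s) union, (".", r, s) concatenation,
# ("*", r) Kleene star.

def nullable(r):
    """Output part of the coalgebra: does r accept the empty word?"""
    tag = r[0]
    if tag == "0": return False
    if tag == "1": return True
    if tag == "c": return False
    if tag == "+": return nullable(r[1]) or nullable(r[2])
    if tag == ".": return nullable(r[1]) and nullable(r[2])
    if tag == "*": return True

def deriv(r, a):
    """Transition part: the Brzozowski derivative of r by letter a."""
    tag = r[0]
    if tag in ("0", "1"): return ("0",)
    if tag == "c": return ("1",) if r[1] == a else ("0",)
    if tag == "+": return ("+", deriv(r[1], a), deriv(r[2], a))
    if tag == ".":
        d = (".", deriv(r[1], a), r[2])
        return ("+", d, deriv(r[2], a)) if nullable(r[1]) else d
    if tag == "*": return (".", deriv(r[1], a), r)

def accepts(r, word):
    """Run the deterministic automaton whose states are expressions."""
    for a in word:
        r = deriv(r, a)
    return nullable(r)

# (ab)* accepts "", "ab", "abab", ...
r = ("*", (".", ("c", "a"), ("c", "b")))
```

The point matching the abstract: membership testing needs no separate automaton construction, because the expressions themselves already carry the coalgebra structure.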
A Coalgebraic Foundation for Linear Time Semantics
In Category Theory and Computer Science, 1999
Cited by 16 (1 self)
We present a coalgebraic approach to trace equivalence semantics based on lifting behaviour endofunctors for deterministic action to Kleisli categories of monads for nondeterministic choice. In Set, this gives a category with ordinary transition systems as objects and with morphisms characterised in terms of a linear notion of bisimulation. The final object in this category is the canonical abstract model for trace equivalence and can be obtained by extending the final coalgebra of the deterministic action behaviour to the Kleisli category of the non-empty powerset monad. The corresponding final coalgebra semantics is fully abstract with respect to trace equivalence.
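The linear-time view the abstract describes forgets branching structure: two systems count as equal when they admit the same traces, even if they are not bisimilar. A textbook pair illustrating the difference (a sketch, not the paper's construction) is a.(b+c) versus a.b + a.c:

```python
# Two nondeterministic transition systems, each a dict
# state -> list of (action, successor). Hypothetical example states.
# t1: s -a-> u, then u offers both b and c.
# t2: s branches on a into u1 (only b) or u2 (only c).
t1 = {"s": [("a", "u")], "u": [("b", "x"), ("c", "x")], "x": []}
t2 = {"s": [("a", "u1"), ("a", "u2")], "u1": [("b", "x")],
      "u2": [("c", "x")], "x": []}

def traces(lts, state, n):
    """All action sequences of length <= n from `state` -- the
    linear-time abstraction of behaviour."""
    if n == 0:
        return {()}
    result = {()}
    for action, target in lts[state]:
        result |= {(action,) + t for t in traces(lts, target, n - 1)}
    return result
```

`t1` and `t2` have identical trace sets, yet after an `a`-step `t2` may have already committed to `b` or to `c`; bisimulation distinguishes them, trace equivalence does not, which is exactly why trace semantics needs its own final-coalgebra model.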
Components as processes: An exercise in coalgebraic modeling
In FMOODS'2000 - Formal Methods for Open Object-Oriented Distributed Systems, 2000
Cited by 15 (6 self)
Software components, arising typically in systems' analysis and design, are characterized by a public interface and a private encapsulated state. They persist (and evolve) in time, according to some behavioural patterns. This paper is an exercise in modeling such components as coalgebras for some kinds of endofunctors on Set, capturing both (interface) types and behavioural aspects. The construction of component categories, cofibred over the interface space, emerges by generalizing the usual notion of a coalgebra morphism. A collection of composition operators, as well as a generic notion of bisimilarity, is discussed.
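The "public interface, private state" picture can be sketched as a Mealy-style coalgebra: a transition function from state and input to output and new state, where callers only ever see the outputs. The stack component below is a hypothetical illustration, not an example from the paper.

```python
# A component as a coalgebra state x input -> (output, new state).
# The state (a Python list) is kept behind the interface; illustrative only.
def stack_step(state, op):
    kind, *args = op
    if kind == "push":
        return (None, state + [args[0]])
    if kind == "pop":
        if not state:
            return ("empty", state)     # observable failure, state unchanged
        return (state[-1], state[:-1])

def run(step, state, inputs):
    """Feed inputs through the component, collecting the observable
    outputs; the final state never escapes the interface."""
    outputs = []
    for i in inputs:
        out, state = step(state, i)
        outputs.append(out)
    return outputs
```

Two components with different internal representations (say, a list-backed and a linked stack) would count as bisimilar precisely when `run` can never tell them apart, matching the generic notion of bisimilarity the abstract mentions.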
Coinductive Interpreters for Process Calculi
In Sixth International Symposium on Functional and Logic Programming, volume 2441 of LNCS, 2002
Cited by 8 (5 self)
This paper suggests functional programming languages with coinductive types as suitable devices for prototyping process calculi. The proposed approach is independent of any particular process calculus and makes explicit the different ingredients present in the design of any such calculus. In particular, structural aspects of the underlying behaviour model become clearly separated from the interaction structure which defines the synchronisation discipline. The approach is illustrated by the detailed development in Charity of an interpreter for a family of process languages.
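The flavour of such an interpreter can be conveyed in a few lines: a process term is unfolded into its possible (action, continuation) transitions, and the behaviour is whatever tree that unfolding generates. The CCS-like tuple syntax below is a hypothetical sketch, not the paper's Charity development, and for brevity its parallel operator interleaves without synchronisation.

```python
# Minimal process terms: ("nil",) inaction, ("pre", a, p) action prefix,
# ("+", p, q) choice, ("|", p, q) interleaving parallel composition.
def steps(p):
    """Unfold a term into its one-step transitions (action, continuation) --
    the coalgebraic reading of the operational semantics."""
    tag = p[0]
    if tag == "nil":
        return []
    if tag == "pre":
        return [(p[1], p[2])]
    if tag == "+":
        return steps(p[1]) + steps(p[2])
    if tag == "|":
        return ([(a, ("|", p1, p[2])) for a, p1 in steps(p[1])] +
                [(a, ("|", p[1], q1)) for a, q1 in steps(p[2])])

# a.0 | b.0 can start with either action.
proc = ("|", ("pre", "a", ("nil",)), ("pre", "b", ("nil",)))
first_actions = sorted(a for a, _ in steps(proc))
```

Changing the `"|"` clause is where a synchronisation discipline would plug in, which is the separation of concerns the abstract emphasises.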
Component-Based Coalgebraic Specification and Verification in RSL
UNU/IIST Report No. 267, Macau, 2002
Cited by 6 (2 self)
Component-based software development has become a popular paradigm in software engineering. From the theoretical point of view, components can be seen as coalgebras. We present a coalgebraic technique for component-based system specification and verification which is based on RSL, the wide-spectrum specification language of the RAISE method. A bisimulation relation between components is defined to support component reuse and is used to verify behaviour during specification development. Final coalgebras are used to construct minimal implementations of given specifications.
Decidability and Complexity Issues for Infinite-State Processes, 2003
Cited by 4 (0 self)
This thesis studies decidability and complexity borderlines for the algorithmic verification of infinite-state systems. Verification techniques for finite-state processes are a successful approach to the validation of reactive programs operating over finite domains. When infinite data domains are introduced, most verification problems in general become undecidable. By imposing certain restrictions on the considered models (while still including infinite-state spaces), it is possible to provide algorithmic decision procedures for a variety of properties. Hence the models of infinite-state systems can be classified according to which verification problems are decidable, and in the positive case according to complexity considerations.
A Semantics for Approximate Program Transformations, 2013
Cited by 3 (0 self)
An approximate program transformation is a transformation that can change the semantics of a program within a specified empirical error bound. Such transformations have wide applications: they can decrease computation time, power consumption, and memory usage, and can, in some cases, allow implementations of incomputable operations. Correctness proofs of approximate program transformations are by definition quantitative. Unfortunately, unlike with standard program transformations, there is as of yet no modular way to prove the correctness of an approximate transformation itself. Error bounds must be proved for each transformed program individually, and must be reproved each time a program is modified or a different set of approximations is applied. In this paper, we give a semantics that enables quantitative reasoning about a large class of approximate program transformations in a local, composable way. Our semantics is based on a notion of distance between programs that defines what it means for an approximate transformation to be correct up to an error bound. The key insight is that distances between programs cannot in general be formulated in terms of metric spaces and real numbers. Instead, our semantics admits natural notions of distance for each type construct; for example, numbers are used as distances for numerical data, functions are used as distances for functional data, and polymorphic lambda-terms are used as distances for polymorphic data. We then show how our semantics applies to two example approximations: replacing reals with floating-point numbers, and loop perforation.
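Loop perforation, one of the paper's two examples, is easy to demonstrate concretely: skip a fraction of loop iterations and accept a bounded deviation from the exact result in exchange for less work. The sketch below is illustrative only; the data and the way the error is measured are hypothetical, not the paper's semantics.

```python
# Loop perforation: visit only every `stride`-th element, trading
# accuracy for roughly a 1/stride reduction in work.
def mean(xs):
    return sum(xs) / len(xs)

def perforated_mean(xs, stride=2):
    """Perforated loop: sample the input at the given stride."""
    sampled = xs[::stride]
    return sum(sampled) / len(sampled)

# Synthetic data cycling through the digits 0..9.
data = [float(i % 10) for i in range(1000)]
exact = mean(data)                            # 4.5
approx = perforated_mean(data, stride=2)      # sees only 0,2,4,6,8 -> 4.0
error = abs(exact - approx)                   # 0.5 on this input
```

The measured error here is specific to this input; the paper's contribution is precisely a semantics in which such a bound can be established for the transformation itself, compositionally, rather than re-measured per program.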