Results 1–10 of 31
Semantic foundations of concurrent constraint programming
, 1990
Abstract

Cited by 263 (27 self)
Concurrent constraint programming [Sar89,SR90] is a simple and powerful model of concurrent computation based on the notions of store-as-constraint and process as information transducer. The store-as-valuation conception of von Neumann computing is replaced by the notion that the store is a constraint (a finite representation of a possibly infinite set of valuations) which provides partial information about the possible values that variables can take. Instead of “reading” and “writing” the values of variables, processes may now ask (check if a constraint is entailed by the store) and tell (augment the store with a new constraint). This is a very general paradigm which subsumes (among others) nondeterminate dataflow and the (concurrent) (constraint) logic programming languages. This paper develops the basic ideas involved in giving a coherent semantic account of these languages. Our first contribution is to give a simple and general formulation of the notion that a constraint system is a system of partial information (à la the information systems of Scott). Parameter passing and hiding is handled by borrowing ideas from the cylindric algebras of Henkin, Monk and Tarski to introduce diagonal elements and “cylindrification” operations (which mimic the projection of information induced by existential quantifiers). The second contribution is to introduce the notion of determinate concurrent constraint programming languages. The combinators treated are ask, tell, parallel composition, hiding and recursion. We present a simple model for this language based on the specification-oriented methodology of [OH86]. The crucial insight is to focus on observing the resting points of a process: those stores in which the process quiesces without producing more information.
It turns out that for the determinate language, the set of resting points of a process completely characterizes its behavior on all inputs, since each process can be identified with a closure operator over the underlying constraint system. Very natural definitions of parallel composition, communication and hiding are given. For example, the parallel composition of two agents can be characterized by just the intersection of the sets of constraints associated with them. We also give a complete axiomatization of equality in this model.
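The ask/tell discipline and the closure-operator view described in this abstract can be illustrated with a toy sketch. Everything here is an illustrative assumption, not the paper's formalism: constraints are modelled as ground facts, the store as a set of facts, entailment as membership, and the agent definitions are invented for the example. The paper's constraint systems are far more general.

```python
def tell(store, constraint):
    # tell: augment the store with a new constraint
    return store | {constraint}

def ask(store, constraint):
    # ask: check whether the store entails the constraint
    # (here, trivially, whether the fact is already present)
    return constraint in store

def parallel(p, q):
    # Parallel composition as a joint fixpoint: iterate both agents
    # (each a closure operator on stores) until the store quiesces,
    # i.e. reaches a "resting point" where no more information is added.
    def composed(store):
        prev = None
        while store != prev:
            prev = store
            store = q(p(store))
        return store
    return composed

# Two determinate agents: each asks for a constraint and, if entailed,
# tells a new one; otherwise it leaves the store unchanged.
p = lambda s: tell(s, "y=1") if ask(s, "x=0") else s
q = lambda s: tell(s, "z=2") if ask(s, "y=1") else s

print(sorted(parallel(p, q)({"x=0"})))  # ['x=0', 'y=1', 'z=2']
```

Note how the composed agent is itself extensive, monotone and idempotent on stores, matching the abstract's identification of determinate processes with closure operators.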
Categorical Logic
 A chapter in the forthcoming Volume VI of the Handbook of Logic in Computer Science
, 1995
Presheaf Models for Concurrency
, 1999
Abstract

Cited by 45 (19 self)
In this dissertation we investigate presheaf models for concurrent computation. Our aim is to provide a systematic treatment of bisimulation for a wide range of concurrent process calculi. Bisimilarity is defined abstractly in terms of open maps as in the work of Joyal, Nielsen and Winskel. Their work inspired this thesis by suggesting that presheaf categories could provide abstract models for concurrency with a built-in notion of bisimulation. We show how
Proof nets and the complexity of processing center-embedded constructions
 Journal of Logic, Language and Information
, 1998
Abstract

Cited by 18 (0 self)
This paper shows how proof nets can be used to formalize the notion of “incomplete dependency” used in psycholinguistic theories of the unacceptability of center-embedded constructions. Such theories of human language processing can usually be restated in terms of geometrical constraints on proof nets. The paper ends with a discussion of the relationship between these constraints and incremental semantic interpretation.
A computer-checked verification of Milner's scheduler
 Proceedings of the 2nd International Symposium on Theoretical Aspects of Computer Software, TACS '94
, 1994
Abstract

Cited by 18 (5 self)
We present an equational verification of Milner's scheduler, which we checked by computer. To our knowledge this is the first time that the scheduler has been proof-checked for a general number n of scheduled processes.
Relating Resource-Based Semantics to Categorial Semantics
, 1997
Abstract

Cited by 17 (1 self)
This paper shows that a significant fragment of the glue approach can be reformulated to separate out the meaning composition in a way that is very similar to that of the categorial approaches. Specifically, we show the following:
Type analysis and data structure selection
 Constructing Programs from Specifications
, 1991
Abstract

Cited by 16 (0 self)
types such as sets and maps with efficient data structures. Their transformation rests on the discovery of finite universal sets, called bases, to be used for avoiding data replication and for creating aggregate data structures that implement associative access by simpler cursor or pointer access. The SETL implementation used global analysis similar to classical dataflow for typings and for set inclusion and membership relationships to determine bases. However, the optimized data structures selected by this optimization did not include a primitive linked list or array, and all optimized data structures retained some degree of hashing. Hence, this heuristic approach only resulted in an expected improvement in performance over default implementations. The analysis was complicated by SETL's imperative style, weak typing, and low-level control structures. The implemented optimizer was large (about 20,000 lines of SETL source
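The base-oriented representation this abstract describes can be illustrated with a toy sketch. All names here (Base, BasedSet, locate) are hypothetical and not from the SETL optimizer; the point is only that once elements are interned into a finite universal set, a subset of it can be stored as a bit vector indexed by each element's position, so membership tests become simple indexing rather than repeated hashing of element values.

```python
class Base:
    """A finite universal set; elements are interned once and given a cursor."""
    def __init__(self):
        self.index = {}   # element -> position (cursor) in the base
        self.elems = []   # position -> element
    def locate(self, x):
        # Intern x into the base and return its cursor.
        if x not in self.index:
            self.index[x] = len(self.elems)
            self.elems.append(x)
        return self.index[x]

class BasedSet:
    """A subset of a base, stored as a bit vector over base positions."""
    def __init__(self, base):
        self.base = base
        self.bits = []
    def add(self, x):
        i = self.base.locate(x)
        if i >= len(self.bits):
            self.bits.extend([False] * (i + 1 - len(self.bits)))
        self.bits[i] = True
    def __contains__(self, x):
        i = self.base.index.get(x)
        return i is not None and i < len(self.bits) and self.bits[i]

b = Base()
s = BasedSet(b)
s.add("alpha")
s.add("gamma")
print("alpha" in s, "beta" in s)  # True False
```

Several based sets can share one base, so each shared element is stored once and every set over it is just a bit vector, which is the data-replication saving the abstract alludes to.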
A modular type-checking algorithm for type theory with singleton types and proof irrelevance
 In TLCA '09, volume 5608 of LNCS
, 2009