Results 1–10 of 40
An Efficient Unification Algorithm
Transactions on Programming Languages and Systems (TOPLAS), 1982
Abstract

Cited by 336 (1 self)
The unification problem in first-order predicate calculus is described in general terms as the solution of a system of equations, and a nondeterministic algorithm is given. A new unification algorithm, characterized by having the acyclicity test efficiently embedded into it, is derived from the nondeterministic one, and a PASCAL implementation is given. A comparison with other well-known unification algorithms shows that the algorithm described here performs well in all cases.
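The system-of-equations view the abstract describes can be sketched with a naive unification procedure that carries an explicit occurs (acyclicity) check. This is only an illustrative Python sketch — the names are ours, not the paper's PASCAL code, and unlike the paper's efficient algorithm it can take exponential time in the worst case:

```python
# Naive first-order unification over a list of term equations.
# Terms: strings starting with '?' are variables; tuples are (functor, args...).

def unify(eqs, subst=None):
    """Solve a list of term equations; return a substitution dict or None."""
    if subst is None:
        subst = {}
    eqs = list(eqs)
    while eqs:
        s, t = eqs.pop()
        s, t = walk(s, subst), walk(t, subst)
        if s == t:
            continue
        if is_var(s):
            if occurs(s, t, subst):        # acyclicity (occurs) check
                return None
            subst[s] = t
        elif is_var(t):
            eqs.append((t, s))
        elif isinstance(s, tuple) and isinstance(t, tuple) \
                and s[0] == t[0] and len(s) == len(t):
            eqs.extend(zip(s[1:], t[1:]))  # decompose f(s1..sn) = f(t1..tn)
        else:
            return None                    # clash of functors or arities
    return subst

def is_var(t):
    return isinstance(t, str) and t.startswith('?')

def walk(t, subst):
    """Follow variable bindings to the current representative term."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    """True iff variable v occurs in term t under the substitution."""
    t = walk(t, subst)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, a, subst) for a in t[1:])
    return False
```

For example, `unify([(('f', '?x'), ('f', 'a'))])` yields `{'?x': 'a'}`, while `unify([('?x', ('f', '?x'))])` returns `None` because the occurs check rejects the cyclic binding `?x = f(?x)`.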
A Foundation for Actor Computation
Journal of Functional Programming, 1998
Abstract

Cited by 222 (51 self)
We present an actor language which is an extension of a simple functional language, and provide a precise operational semantics for this extension. Actor configurations represent open distributed systems, by which we mean that the specification of an actor system explicitly takes into account the interface with external components. We study the composability of such systems. We define and study various notions of testing equivalence on actor expressions and configurations. The model we develop provides fairness. An important result is that the three forms of equivalence, namely, convex, must, and may equivalences, collapse to two in the presence of fairness. We further develop methods for proving laws of equivalence and provide example proofs to illustrate our methodology.
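The actor model the abstract formalizes — independent entities that each react to one message at a time, under a fairness assumption — can be illustrated with a toy scheduler. This sketch is our own simplification, not the paper's operational semantics; the class and function names are invented for illustration:

```python
# Toy actor system: each actor has a mailbox and a behavior invoked on one
# message at a time. The round-robin scheduler is "fair" in the informal
# sense that every deliverable message is eventually delivered.

from collections import deque

class Actor:
    def __init__(self, behavior):
        self.behavior = behavior   # callable(system, self_actor, message)
        self.mailbox = deque()

class System:
    def __init__(self):
        self.actors = []
        self.log = []

    def create(self, behavior):
        a = Actor(behavior)
        self.actors.append(a)
        return a

    def send(self, actor, msg):
        actor.mailbox.append(msg)

    def run(self, max_steps=100):
        # Round-robin passes: deliver at most one message per actor per pass.
        for _ in range(max_steps):
            progressed = False
            for a in self.actors:
                if a.mailbox:
                    a.behavior(self, a, a.mailbox.popleft())
                    progressed = True
            if not progressed:
                break

def logger(system, self_actor, msg):
    system.log.append(msg)

sched = System()
t = sched.create(logger)
sched.send(t, 'ping')
sched.send(t, 'pong')
sched.run()
# sched.log == ['ping', 'pong']
```

An open configuration, in the paper's sense, would additionally declare which actor names are visible to external components; the sketch above is a closed system for brevity.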
Experiments with Discrimination-Tree Indexing and Path Indexing for Term Retrieval
Journal of Automated Reasoning, 1990
Abstract

Cited by 43 (0 self)
This article addresses the problem of indexing and retrieving first-order predicate calculus terms in the context of automated deduction programs. The four retrieval operations of concern are to find variants, generalizations, instances, and terms that unify with a given term. Discrimination-tree indexing is reviewed, and several variations are presented. The path-indexing method is also reviewed. Experiments were conducted on large sets of terms to determine how the properties of the terms affect the performance of the two indexing methods. Results of the experiments are presented.
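The basic discrimination-tree idea can be sketched under the common simplification of collapsing all variables to a single wildcard symbol along a term's preorder traversal; the paper's variations refine exactly this scheme. The code below is an illustrative sketch, not the article's implementation:

```python
# Discrimination-tree indexing sketch: index terms by their preorder symbol
# sequence, with every variable abstracted to the wildcard '*'.
# Terms: strings starting with '?' are variables; tuples are (functor, args...).

def preorder(term):
    """Flatten a term into its preorder symbol list; variables become '*'."""
    if isinstance(term, str):
        return ['*'] if term.startswith('?') else [term]
    out = [term[0]]
    for arg in term[1:]:
        out.extend(preorder(arg))
    return out

class DiscTree:
    def __init__(self):
        self.root = {}

    def insert(self, term):
        node = self.root
        for sym in preorder(term):
            node = node.setdefault(sym, {})
        node['$'] = term  # leaf marker storing the indexed term

    def variants(self, term):
        """Retrieve indexed terms with the same preorder string, i.e.
        candidate variants under the single-wildcard abstraction."""
        node = self.root
        for sym in preorder(term):
            if sym not in node:
                return []
            node = node[sym]
        return [node['$']] if '$' in node else []

tree = DiscTree()
tree.insert(('f', '?x', 'a'))
tree.insert(('f', 'a', 'a'))
# tree.variants(('f', '?y', 'a')) == [('f', '?x', 'a')]
```

Retrieval of generalizations, instances, and unifiable terms works similarly but must branch at wildcard nodes, which is where the experimental trade-offs against path indexing arise.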
Variable Precision Logic
Artificial Intelligence, 1986
Abstract

Cited by 33 (6 self)
Variable precision logic is concerned with problems of reasoning with incomplete information and under time constraints. It offers mechanisms for handling trade-offs between the precision of inferences and the computational efficiency of deriving them. Of the two aspects of precision, the specificity of conclusions and the certainty of belief in them, we address here primarily the latter, and employ censored production rules as an underlying representation and computation mechanism. Such rules are created by augmenting ordinary production rules with an exception condition, and are written in the form if A then B unless C, where C is the exception condition.
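A censored rule "if A then B unless C" can be read operationally as a rule whose conclusion is blocked when the exception is known to hold, and whose exception check may be skipped under time pressure at some cost in certainty. The following is our own toy reading of that trade-off, not the paper's formalism:

```python
# Toy censored production rule: "if A then B unless C".
# Facts are represented as a set of ground atom strings.

def censored_rule(facts, antecedent, conclusion, exception,
                  check_exception=True):
    """Return the conclusion, or None if the rule does not fire.
    With check_exception=False we skip the (possibly costly) exception
    test, trading certainty for speed as in variable precision logic."""
    if antecedent not in facts:
        return None
    if check_exception and exception in facts:
        return None          # censor: "unless C" blocks the inference
    return conclusion

facts = {'bird(tweety)', 'penguin(tweety)'}

# Full-precision check: the exception penguin(tweety) censors the conclusion.
full = censored_rule(facts, 'bird(tweety)', 'flies(tweety)', 'penguin(tweety)')

# Fast, lower-certainty answer: the exception check is skipped.
fast = censored_rule(facts, 'bird(tweety)', 'flies(tweety)', 'penguin(tweety)',
                     check_exception=False)
# full is None; fast == 'flies(tweety)'
```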
Interpretation in Design: The Problem Of Tacit And Explicit . . .
1993
Abstract

Cited by 31 (14 self)
This work analyzes the central role of interpretation in non-routine design. Based on this analysis, a theory of computer support for interpretation in cooperative design is constructed. The theory is grounded in studies of design and interpretation. It is illustrated by mechanisms provided by a software substrate for computer-based design environments, applied to a sample task of lunar habitat design. Computer support of
On Memory Limitations In Natural Language Processing
1980
Abstract

Cited by 29 (1 self)
This paper, though, will not discuss bound anaphora. Right Node Raising, Section 9.3: (488) 'I took and you went
The Paradigms of Programming
Communications of the ACM, 1979
Abstract

Cited by 27 (0 self)
tee) cited Professor Floyd for "helping to found the following important subfields of computer science: the theory of parsing, the semantics of programming languages, automatic program verification, automatic program synthesis, and analysis of algorithms." Professor Floyd, who received both his A.B. and B.S. from the University of Chicago, in 1953 and 1958 respectively, is a self-taught computer scientist. His study of computing began in 1956 when, as a night operator for an IBM 650, he found the time to learn about programming between loads of card hoppers. Floyd implemented one of the first Algol 60 compilers, finishing his work on this project in 1962. In the process, he did some early work on compiler optimization. Subsequently, in the
Type inference and semi-unification
In Proceedings of the ACM Conference on LISP and Functional Programming (LFP), Snowbird, 1988
Abstract

Cited by 25 (6 self)
In the last ten years declaration-free programming languages with a polymorphic typing discipline (ML, B) have been developed to approximate the flexibility and conciseness of dynamically typed languages (LISP, SETL) while retaining the safety and execution efficiency of conventional statically typed languages (Algol 68, Pascal). These polymorphic languages can be type checked at compile time, yet allow functions whose arguments range over a variety of types. We investigate several polymorphic type systems, the most powerful of which, termed the Milner-Mycroft Calculus, extends the so-called let-polymorphism found in, e.g., ML with a polymorphic typing rule for recursive definitions. We show that semi-unification, the problem of solving inequalities over first-order terms, characterizes type checking in the Milner-Mycroft Calculus up to polynomial time, even in the restricted case where nested definitions are disallowed. This permits us to extend some infeasibility results for related combinatorial problems to type inference and to correct several claims and statements in the literature. We prove the existence of unique most general solutions of term inequalities, called most general semi-unifiers, and present an algorithm for computing them that terminates for all known inputs due to a novel "extended occurs check". We conjecture this algorithm to be
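Semi-unification solves systems of inequalities s ≤ t over first-order terms, where s ≤ t means some substitution maps s onto t. Its core subproblem, term matching (is t an instance of s?), can be sketched simply; the full algorithm, with the extended occurs check the abstract mentions, is far subtler. This is our own illustrative sketch, not the paper's algorithm:

```python
# First-order term matching: find rho with rho(s) == t, or None.
# Terms: strings starting with '?' are variables; tuples are (functor, args...).

def match(s, t, rho=None):
    """Return a substitution rho mapping pattern s onto target t, or None."""
    if rho is None:
        rho = {}
    if isinstance(s, str) and s.startswith('?'):
        if s in rho:
            # A variable may bind only once; repeated occurrences must agree.
            return rho if rho[s] == t else None
        rho[s] = t
        return rho
    if isinstance(s, tuple) and isinstance(t, tuple) \
            and s[0] == t[0] and len(s) == len(t):
        for a, b in zip(s[1:], t[1:]):
            if match(a, b, rho) is None:
                return None
        return rho
    return rho if s == t else None

# f(?x, ?x) <= f(g(a), g(a)) holds, witnessed by rho = {?x: g(a)} ...
ok = match(('f', '?x', '?x'), ('f', ('g', 'a'), ('g', 'a')))
# ... but f(?x, ?x) <= f(a, b) fails: ?x cannot be both a and b.
bad = match(('f', '?x', '?x'), ('f', 'a', 'b'))
```

A semi-unification instance asks for one substitution σ that makes every inequality σ(s_i) ≤ σ(t_i) hold simultaneously, which is what ties the problem to type checking recursive definitions in the Milner-Mycroft Calculus.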