Results 1 - 4 of 4
Unification: A multidisciplinary survey
 ACM Computing Surveys
, 1989
Abstract

Cited by 105 (0 self)
The unification problem and several variants are presented. Various algorithms and data structures are discussed. Research on unification arising in several areas of computer science is surveyed; these areas include theorem proving, logic programming, and natural language processing. Sections of the paper include examples that highlight particular uses ...
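The core operation the survey treats can be illustrated with a minimal sketch of first-order unification with an occurs check. This is an illustrative toy, not an algorithm from the survey; the term representation (variables as strings beginning with `?`, compound terms as `(functor, [args])` tuples) is our own.

```python
# Minimal first-order unification sketch (illustrative representation):
# variables are strings starting with '?', compound terms are
# (functor, [args]) tuples, constants are plain strings.

def walk(term, subst):
    """Follow variable bindings in the substitution."""
    while isinstance(term, str) and term.startswith('?') and term in subst:
        term = subst[term]
    return term

def occurs(var, term, subst):
    """Occurs check: does var appear inside term under subst?"""
    term = walk(term, subst)
    if term == var:
        return True
    if isinstance(term, tuple):
        return any(occurs(var, arg, subst) for arg in term[1])
    return False

def unify(t1, t2, subst=None):
    """Return a most general unifier as a dict, or None on failure."""
    if subst is None:
        subst = {}
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if isinstance(t1, str) and t1.startswith('?'):
        return None if occurs(t1, t2, subst) else {**subst, t1: t2}
    if isinstance(t2, str) and t2.startswith('?'):
        return None if occurs(t2, t1, subst) else {**subst, t2: t1}
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1[1]) == len(t2[1])):
        for a, b in zip(t1[1], t2[1]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None
```

For example, unifying f(?x, a) with f(b, ?y) yields the substitution {?x: b, ?y: a}, while ?x fails to unify with f(?x) because of the occurs check.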
Explaining Type Inference
 Science of Computer Programming
, 1995
Abstract

Cited by 53 (0 self)
Type inference is the compile-time process of reconstructing missing type information in a program based on the usage of its variables. ML and Haskell are two languages where this aspect of compilation has enjoyed some popularity, allowing type information to be omitted while static type checking is still performed. Type inference may be expected to have some application in the prototyping and scripting languages which are becoming increasingly popular. A difficulty with type inference is the confusing and sometimes counterintuitive diagnostics produced by the type checker as a result of type errors. A modification of the Hindley-Milner type inference algorithm is presented, which allows the specific reasoning which led to a program variable having a particular type to be recorded for type explanation. This approach is close to the intuitive process used in practice for debugging type errors.
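For context, the flavor of Hindley-Milner inference that the paper modifies can be sketched for a tiny lambda calculus. This is a toy with illustrative names; it omits let-polymorphism, the occurs check, and the explanation machinery the paper adds.

```python
# Toy sketch of Hindley-Milner-style inference (not the paper's
# modified algorithm). Types: 'int', type variables '?tN', or
# function types ('->', domain, codomain).
import itertools

_fresh = (f"?t{i}" for i in itertools.count())

def is_var(t):
    return isinstance(t, str) and t.startswith('?')

def apply(subst, t):
    """Apply a substitution to a type."""
    if is_var(t) and t in subst:
        return apply(subst, subst[t])
    if isinstance(t, tuple):
        return ('->', apply(subst, t[1]), apply(subst, t[2]))
    return t

def unify(t1, t2, subst):
    """Unify two types, extending subst; no occurs check for brevity."""
    t1, t2 = apply(subst, t1), apply(subst, t2)
    if t1 == t2:
        return subst
    if is_var(t1):
        return {**subst, t1: t2}
    if is_var(t2):
        return {**subst, t2: t1}
    if isinstance(t1, tuple) and isinstance(t2, tuple):
        return unify(t1[2], t2[2], unify(t1[1], t2[1], subst))
    raise TypeError(f"cannot unify {t1} with {t2}")

def infer(expr, env, subst):
    """Infer a type for expr: an int literal, a variable name,
    ('lam', x, body), or ('app', f, arg)."""
    if isinstance(expr, int):
        return 'int', subst
    if isinstance(expr, str):
        return env[expr], subst
    if expr[0] == 'lam':
        tv = next(_fresh)                      # fresh type for the parameter
        body_t, subst = infer(expr[2], {**env, expr[1]: tv}, subst)
        return ('->', tv, body_t), subst
    f_t, subst = infer(expr[1], env, subst)    # application case
    a_t, subst = infer(expr[2], env, subst)
    tv = next(_fresh)                          # fresh type for the result
    return tv, unify(f_t, ('->', a_t, tv), subst)
```

Applying the identity function to an integer literal, as in (lambda x. x) 1, infers the result type 'int'; the constraint recorded by `unify` at the application node is exactly the kind of step the paper proposes to keep for explanation.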
Matching and Unification for the Object-Oriented Symbolic Computation System AlgBench
 In Proc. of the 3rd Intern. Symposium on Design and Implementation of Symbolic Computation Systems (DISCO'93), Springer-Verlag, LNCS 722
, 1993
Abstract

Cited by 2 (1 self)
Term matching has become one of the most important primitive operations for symbolic computation. This paper describes the extension of the object-oriented symbolic computation system AlgBench with pattern matching and unification facilities. The various pattern objects are organized in subclasses of the class of composite expressions. This leads to a clear design and to a distributed implementation of the pattern matcher in the subclasses. New pattern object classes can consequently be added easily to the system. Huet's and our simple mark-and-retract algorithm for standard unification, as well as Stickel's algorithm for associative-commutative unification, have been implemented in an object-oriented style. Unifiers are selected at run time. We extend Mathematica's type-constrained pattern matching by taking into account inheritance information from a user-defined hierarchy of object types. The argument unification is basically instance variable unification. The improvement of the ...
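As a rough illustration of the term matching the paper takes as primitive, here is a minimal one-way matcher: a pattern containing variables is matched against a ground subject term. The representation is our own toy, not AlgBench's object-oriented class hierarchy.

```python
# Minimal one-way term matching sketch (illustrative representation):
# pattern variables are strings starting with '?', compound terms are
# (head, [args]) tuples, constants are plain strings or numbers.

def match(pattern, subject, bindings=None):
    """Match pattern against a ground subject; return the variable
    bindings as a dict, or None if matching fails."""
    if bindings is None:
        bindings = {}
    if isinstance(pattern, str) and pattern.startswith('?'):
        if pattern in bindings:  # repeated variable must match consistently
            return bindings if bindings[pattern] == subject else None
        return {**bindings, pattern: subject}
    if isinstance(pattern, tuple) and isinstance(subject, tuple):
        if pattern[0] != subject[0] or len(pattern[1]) != len(subject[1]):
            return None
        for p, s in zip(pattern[1], subject[1]):
            bindings = match(p, s, bindings)
            if bindings is None:
                return None
        return bindings
    return bindings if pattern == subject else None
```

Matching f(?x, g(?x)) against f(a, g(a)) binds ?x to a, while f(a, g(b)) fails because the repeated variable ?x would need two different values. Unlike full unification, only the pattern side may contain variables.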
Computational Complexity in Natural Language
Abstract

Cited by 1 (0 self)
We have become so used to viewing natural language in computational terms that we need occasionally to remind ourselves of the methodological commitment this view entails. That commitment is this: we assume that to understand linguistic tasks—tasks such as recognizing sentences, determining their structure, extracting their meaning, and manipulating the information they contain—is to discover the algorithms required to perform those tasks, and to investigate their computational properties. To be sure, the physical realization of the corresponding processes in humans is a legitimate study too, but one from which the computational investigation of language may be pursued in Splendid Isolation. Complexity Theory is the mathematical study of the resources—both in time and space—required to perform computational tasks. What bounds can we place—from above or below—on the number of steps taken to compute such-and-such a function, or a function belonging to such-and-such a class? What bounds can we place on the amount of memory required? It is not surprising, therefore, that in the study of natural language, complexity-theoretic issues abound. Since any computational task can be the object of complexity-theoretic investigation, it would be hopeless even to attempt a complete survey of Complexity Theory in the study of natural language. We focus therefore on a selection of topics in natural language where there has been a particular accumulation of complexity-theoretic results. Section 2 discusses parsing and recognition; Section 3 discusses the computation of logical form; and Section 4 discusses the problem of determining logical relationships between sentences in natural language. But we begin with a brief review of Complexity Theory itself. A draft chapter for the Blackwell Computational Linguistics and Natural Language Processing Handbook, edited by Alex Clark, Chris Fox and Shalom Lappin.
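A classic example of the kind of complexity-theoretic result surveyed for parsing and recognition is that context-free recognition is polynomial-time: the CYK algorithm decides membership for a grammar in Chomsky normal form in O(n^3) time. The sketch below is illustrative; the tiny grammar and its names are our own.

```python
# Sketch of CYK recognition for a CFG in Chomsky normal form
# (illustrative toy grammar; O(n^3) in the sentence length).
from itertools import product

def cyk(words, lexicon, rules, start='S'):
    """lexicon: word -> set of nonterminals A with A -> word;
    rules: (B, C) -> set of nonterminals A with A -> B C."""
    n = len(words)
    # table[i][j]: nonterminals deriving the span words[i..j]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        table[i][i] = set(lexicon.get(w, ()))
    for span in range(2, n + 1):           # wider spans from narrower ones
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):          # split point
                for B, C in product(table[i][k], table[k + 1][j]):
                    table[i][j] |= rules.get((B, C), set())
    return start in table[0][n - 1]
```

With the toy grammar S -> NP VP, VP -> V NP, and a three-word lexicon, the sentence "she eats fish" is recognized while ungrammatical word sequences are rejected.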