Results 1 - 10 of 17
Inductive Data Type Systems
, 2002
"... In a previous work ("Abstract Data Type Systems", TCS 173(2), 1997), the last two authors presented a combined language made of a (strongly normalizing) algebraic rewrite system and a typed λ-calculus enriched by pattern-matching definitions following a certain format, called the "General Schema", which generalizes ..."
Abstract

Cited by 755 (22 self)
 Add to MetaCart
In a previous work ("Abstract Data Type Systems", TCS 173(2), 1997), the last two authors presented a combined language made of a (strongly normalizing) algebraic rewrite system and a typed λ-calculus enriched by pattern-matching definitions following a certain format, called the "General Schema", which generalizes the usual recursor definitions for natural numbers and similar "basic inductive types". This combined language was shown to be strongly normalizing. The purpose of this paper is to reformulate and extend the General Schema in order to make it easily extensible, to capture a more general class of inductive types, called "strictly positive", and to ease the strong normalization proof of the resulting system. This result provides a computation model for the combination of an algebraic specification language based on abstract data types and of a strongly typed functional language with strictly positive inductive types.
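The recursor definitions the abstract mentions, for natural numbers, can be sketched as follows. This is a hypothetical Python transcription of my own, not the paper's typed calculus: naturals are encoded as tagged tuples and `rec` is the usual structural recursor.

```python
# Hypothetical sketch of the natural-number recursor behind "basic
# inductive types": rec(0, b, s) = b;  rec(S m, b, s) = s(m, rec(m, b, s)).

def zero():
    return ("zero",)

def succ(n):
    return ("succ", n)

def rec(n, base, step):
    """Structural recursion over an encoded natural number."""
    if n[0] == "zero":
        return base
    m = n[1]
    return step(m, rec(m, base, step))

def to_int(n):
    # Count the successors.
    return rec(n, 0, lambda _, acc: acc + 1)

def add(a, b):
    # Addition defined by recursion on the first argument.
    return rec(a, b, lambda _, acc: succ(acc))

two = succ(succ(zero()))
three = succ(two)
```

Definitions in the paper's "General Schema" format are pattern-matching equations of exactly this shape, constrained so that recursive calls are on structurally smaller arguments.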
Type Inference with Polymorphic Recursion
 Transactions on Programming Languages and Systems
, 1991
"... The Damas-Milner Calculus is the typed λ-calculus underlying the type system for ML and several other strongly typed polymorphic functional languages such as Miranda and Haskell. Mycroft has extended its problematic monomorphic typing rule for recursive definitions with a polymorphic typing rule. H ..."
Abstract

Cited by 135 (0 self)
 Add to MetaCart
The Damas-Milner Calculus is the typed λ-calculus underlying the type system for ML and several other strongly typed polymorphic functional languages such as Miranda and Haskell. Mycroft has extended its problematic monomorphic typing rule for recursive definitions with a polymorphic typing rule. He proved the resulting type system, which we call the Milner-Mycroft Calculus, sound with respect to Milner's semantics, and showed that it preserves the principal typing property of the Damas-Milner Calculus. The extension is of practical significance in typed logic programming languages and, more generally, in any language with (mutually) recursive definitions. In this paper we show that the type inference problem for the Milner-Mycroft Calculus is log-space equivalent to semi-unification, the problem of solving subsumption inequations between first-order terms. This result has been proved independently by Kfoury et al. In connection with the recently established undecidability of semi-unification, this implies that typability in the Milner-Mycroft Calculus is undecidable. We present some reasons why type inference with polymorphic recursion appears to be practical despite its undecidability. This also sheds some light on the observed practicality of ML.
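The building block of the subsumption inequations mentioned above is checking whether one first-order term is an instance of another. The sketch below is my own illustration, not the paper's reduction: it implements one-sided matching, where full semi-unification must solve systems of such inequations whose substitutions interact, which is what makes it undecidable.

```python
# Illustrative sketch: is term t an instance of term s under some
# substitution for s's variables? Variables are strings; compound terms
# are tuples (functor, arg1, ...). This single-inequation check is easy;
# semi-unification couples many such inequations and is undecidable.

def match(s, t, subst=None):
    """Return a substitution making s equal to t, or None on failure."""
    if subst is None:
        subst = {}
    if isinstance(s, str):                  # variable in the general term
        if s in subst:
            return subst if subst[s] == t else None
        subst[s] = t
        return subst
    if isinstance(t, str) or s[0] != t[0] or len(s) != len(t):
        return None
    for sa, ta in zip(s[1:], t[1:]):
        subst = match(sa, ta, subst)
        if subst is None:
            return None
    return subst

# f(X, g(X)) subsumes f(a, g(a)) but not f(a, g(b)).
theta = match(("f", "X", ("g", "X")), ("f", ("a",), ("g", ("a",))))
```

The repeated variable `X` is what forces consistency between the two inequation sides, the same mechanism that, iterated across a system, encodes the hard instances.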
Records for Logic Programming
 Journal of Logic Programming
, 1994
"... CFT is a new constraint system providing records as logical data structure for constraint (logic) programming. It can be seen as a generalization of the rational tree system employed in Prolog II, where finer-grained constraints are used, and where subtrees are identified by keywords rather than by ..."
Abstract

Cited by 95 (17 self)
 Add to MetaCart
CFT is a new constraint system providing records as logical data structure for constraint (logic) programming. It can be seen as a generalization of the rational tree system employed in Prolog II, where finer-grained constraints are used, and where subtrees are identified by keywords rather than by position. CFT is defined by a first-order structure consisting of so-called feature trees. Feature trees generalize the ordinary trees corresponding to first-order terms by having their edges labeled with field names called features. The mathematical semantics given by the feature tree structure is complemented with a logical semantics given by five axiom schemes, which we conjecture to comprise a complete axiomatization of the feature tree structure. We present a decision method for CFT, which decides entailment / disentailment between possibly existentially quantified constraints. Since CFT satisfies the independence property, our decision method can also be employed for checking the sat...
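The key operational difference from positional first-order terms can be illustrated with a small sketch, my own assumption-laden simplification rather than the paper's formal system: feature trees as sort-labeled dicts, unified by field name so the same record can be described piecewise.

```python
# Rough sketch of keyword-based (feature) unification: a tree is a pair
# (sort_label_or_None, {feature_name: subtree}). Merging proceeds by
# field name rather than by argument position, as in CFT-style records.

def unify_feature_trees(a, b):
    """Merge two feature descriptions; raise on a sort clash."""
    sort_a, feats_a = a
    sort_b, feats_b = b
    if sort_a is not None and sort_b is not None and sort_a != sort_b:
        raise ValueError(f"sort clash: {sort_a} vs {sort_b}")
    sort = sort_a if sort_a is not None else sort_b
    feats = dict(feats_a)
    for name, sub in feats_b.items():
        # Shared features unify recursively; new features are adjoined.
        feats[name] = unify_feature_trees(feats[name], sub) if name in feats else sub
    return (sort, feats)

# Two partial descriptions of the same record combine by keyword.
x = ("point", {"x": ("int", {})})
y = (None, {"y": ("int", {})})
merged = unify_feature_trees(x, y)
```

Note that adding a new feature never fails here, which mirrors why finer-grained constraints (sort, feature presence, feature value) are separated in CFT instead of being bundled into a fixed arity.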
Efficient Type Inference for Higher-Order Binding-Time Analysis
 In Functional Programming and Computer Architecture
, 1991
"... Binding-time analysis determines when variables and expressions in a program can be bound to their values, distinguishing between early (compile-time) and late (run-time) binding. Binding-time information can be used by compilers to produce more efficient target programs by partially evaluating prog ..."
Abstract

Cited by 91 (4 self)
 Add to MetaCart
Binding-time analysis determines when variables and expressions in a program can be bound to their values, distinguishing between early (compile-time) and late (run-time) binding. Binding-time information can be used by compilers to produce more efficient target programs by partially evaluating programs at compile time. Binding-time analysis has been formulated in abstract interpretation contexts and more recently in a type-theoretic setting. In a type-theoretic setting, binding-time analysis is a type inference problem: the problem of inferring a completion of a λ-term e with binding-time annotations such that e satisfies the typing rules. Nielson and Nielson and Schmidt have shown that every simply typed λ-term has a unique completion ê that minimizes late binding in TML, a monomorphic type system with explicit binding-time annotations, and they present exponential-time algorithms for computing such minimal completions. Gomard proves the same results for a variant of his two-level λ-calculus without a so-called "lifting" rule. He presents another algorithm for inferring completions in this somewhat restricted type system and states that it can be implemented in time O(n^3). He conjectures that the completions computed are minimal.
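What binding-time annotations buy a partial evaluator can be shown with the classic power example. This is a toy illustration of my own, not the paper's TML system: once the exponent is known to be static, the recursion can be unfolded at specialization time, leaving a residual program over the dynamic base.

```python
# Toy partial evaluation: the exponent n is static (early-bound), the
# base x is dynamic (late-bound). Specialization unfolds the loop over n
# and emits residual code mentioning only x.

def specialize_power(n):
    """Return the residual program for x**n as Python source."""
    expr = "1"
    for _ in range(n):
        expr = f"x * ({expr})" if expr != "1" else "x"
    return f"lambda x: {expr}"

residual = specialize_power(3)
cube = eval(residual)  # residual program: no loop, no exponent left
```

A binding-time analysis automates exactly the static/dynamic classification that was done by hand here, and a minimal completion is the one that marks as many subexpressions static as the typing rules allow.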
Extensions and Applications of Higher-order Unification
, 1990
"... ... unification problems. Then, in this framework, we develop a new unification algorithm for a λ-calculus with dependent function (Π) types. This algorithm is especially useful as it provides for mechanization in the very expressive Logical Framework (LF). The development ..."
Abstract

Cited by 25 (1 self)
 Add to MetaCart
... unification problems. Then, in this framework, we develop a new unification algorithm for a λ-calculus with dependent function (Π) types. This algorithm is especially useful as it provides for mechanization in the very expressive Logical Framework (LF). The general idea is to use a λ-calculus as a metalanguage for representing various other languages (object-languages). The rich structure of a typed λ-calculus, as opposed to traditional first-order abstract syntax trees, allows us to express rules, e.g., program transformation rules. We can then use unification in the metalanguage to mechanize application of these rules. The development involves significant complications not arising in Huet's corresponding algorithm for the simply typed λ-calculus, primarily because it must deal with ill-typed terms. We then extend this algorithm first for dependent product (Σ) types, and second for implicit polymorphism. In the latter case, the algorithm is incomplete, though still quite useful in practice. The last part of the dissertation provides examples of the usefulness of the algorithms.
Types in Functional Unification Grammars
, 1990
"... Functional Unification Grammars (FUGs) are popular for natural language applications because the formalism uses very few primitives and is uniform and expressive. In our work on text generation, we have found that it also has annoying limitations: it is not suited for the expression of simple, yet v ..."
Abstract

Cited by 19 (8 self)
 Add to MetaCart
Functional Unification Grammars (FUGs) are popular for natural language applications because the formalism uses very few primitives and is uniform and expressive. In our work on text generation, we have found that it also has annoying limitations: it is not suited for the expression of simple, yet very common, taxonomic relations and it does not allow the specification of completeness conditions. We have implemented an extension of traditional functional unification. This extension addresses these limitations while preserving the desirable properties of FUGs. It is based on the notions of typed features and typed constituents. We show the advantages of this extension in the context of a grammar used for text generation.
A Composite Domain for Freeness, Sharing, and Compoundness Analysis of Logic Programs
 Department of Computer Science
, 1994
"... Accurate sharing and freeness properties of program variables have been inferred in the past by means of combined dataflow analyses of logic programs. Groundness, linearity, and structural information must be taken into account in order to obtain sufficient precision. Abstract equation systems are o ..."
Abstract

Cited by 8 (1 self)
 Add to MetaCart
Accurate sharing and freeness properties of program variables have been inferred in the past by means of combined dataflow analyses of logic programs. Groundness, linearity, and structural information must be taken into account in order to obtain sufficient precision. Abstract equation systems are one of the formalisms previously proposed to represent all these properties of the runtime values of program variables. The present work is concerned with the specification of correct and practical operations on the domain of abstract equation systems, which constitute a prerequisite for the domain to be suited for an implementation in the framework of abstract interpretation. The main technical contribution of the paper is the presentation of a novel and powerful algorithm for resolving an abstract equation system, the formal proof of its correctness, and a study of the invariance of certain linearity and freeness properties under the application of relevant most general unifiers. Keywords: Logic programming, abstract interpretation, mode analysis.
Decidability of Bounded Higher-Order Unification
, 2002
"... It is shown that unifiability of terms in the simply typed lambda calculus with beta and eta rules becomes decidable if there is a bound on the number of bound variables and lambdas in a unifier in eta-expanded beta-normal form. ..."
Abstract

Cited by 7 (0 self)
 Add to MetaCart
It is shown that unifiability of terms in the simply typed lambda calculus with beta and eta rules becomes decidable if there is a bound on the number of bound variables and lambdas in a unifier in eta-expanded beta-normal form.
Elimination of Negation in Term Algebras
 In Mathematical Foundations of Computer Science
, 1991
"... We give an informal review of the problem of eliminating negation in term algebras and its applications. The initial results appear to be very specialized with complex combinatorial proofs. Nevertheless they have applications and relevance to a number of important areas: unification, learning, ab ..."
Abstract

Cited by 6 (0 self)
 Add to MetaCart
We give an informal review of the problem of eliminating negation in term algebras and its applications. The initial results appear to be very specialized, with complex combinatorial proofs. Nevertheless they have applications and relevance to a number of important areas: unification, learning, abstract data types and rewriting systems, constraints and constructive negation in logic languages.
1 Initial Motivation: Learning
Plotkin [36] proposed a formal model for inductive inference which was based upon Popplestone's suggestion that since unification is useful in automatic deduction, its dual might prove helpful for induction. A similar formalism was independently introduced by Reynolds [38], who was more concerned with its algebraic properties than with its applications. The algebraic properties were further investigated by Huet [9, 10], who also studied the case of the infinitary Herbrand universe. The key result in this theory is that, for any set of terms, there exists a...
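The "dual of unification" mentioned above is Plotkin's least general generalization (lgg): instead of specializing two terms to a common instance, it abstracts them to their most specific common pattern. The sketch below is my own small illustration of the idea, not an algorithm from the paper.

```python
# Sketch of Plotkin-style least general generalization (anti-unification).
# Terms: variables are strings, compound terms are tuples (functor, args...).
# Matching functors recurse; mismatching subterm pairs are abstracted to
# variables, with the same mismatch pair always mapped to the same variable.

def lgg(s, t, table=None, counter=None):
    if table is None:
        table, counter = {}, [0]
    if (not isinstance(s, str) and not isinstance(t, str)
            and s[0] == t[0] and len(s) == len(t)):
        return (s[0],) + tuple(lgg(a, b, table, counter)
                               for a, b in zip(s[1:], t[1:]))
    key = (s, t)                      # reuse one variable per mismatch pair
    if key not in table:
        table[key] = f"V{counter[0]}"
        counter[0] += 1
    return table[key]

# lgg(f(a, a), f(b, b)) = f(V0, V0): the repeated mismatch is shared,
# which is what makes the generalization *least* general.
g = lgg(("f", ("a",), ("a",)), ("f", ("b",), ("b",)))
```

The shared-variable table is the essential ingredient: dropping it would still yield a common generalization, just not the least one.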
On Inductive Inference of Cyclic Structures
 Annals of Mathematics and Artificial Intelligence, volume F. J.C. Baltzer Scientific Pub
, 2000
"... We examine the problem of inductive inference in the domain of pointer-based data structures. We show how these data structures can be formalized as rational trees. Our main technical results concern the expressiveness of a language of rational term expressions. These results place limitations on te ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
We examine the problem of inductive inference in the domain of pointer-based data structures. We show how these data structures can be formalized as rational trees. Our main technical results concern the expressiveness of a language of rational term expressions. These results place limitations on techniques of inductive inference for this description language. The results are also relevant to implementation of negation in logic programming languages.
1 Introduction
Consider the class of data structures consisting of records and pointers among them, such pointers occurring as fields of a record. We call this the class of pointer-based data structures. It contains some acyclic data structures, such as lists, but most data structures of this sort are cyclic: doubly-linked lists, circular lists, threaded search trees as well as other, more ad hoc, structures used, for example, in object-oriented databases where cyclicity in the data is reflected in the data structure. Clearly this cla...
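Unification over the rational trees used to formalize these cyclic structures differs from ordinary unification in omitting the occurs check while still terminating on cycles. The sketch below is my own illustration of that idea, not the paper's formalism: union-find merging plus a visited set, in the spirit of Prolog II.

```python
# Sketch of rational-tree unification: no occurs check, and a set of
# already-visited node pairs so unification terminates on cyclic graphs.

class Node:
    """A term graph node: functor None means an unbound variable."""
    def __init__(self, functor=None, args=()):
        self.functor, self.args, self.parent = functor, list(args), None

def find(n):
    # Union-find: follow parent links to the class representative.
    while n.parent is not None:
        n = n.parent
    return n

def unify(a, b, seen=None):
    if seen is None:
        seen = set()
    a, b = find(a), find(b)
    if a is b or (id(a), id(b)) in seen:
        return True                   # already unified, or cycle revisited
    seen.add((id(a), id(b)))
    if a.functor is None:             # variable binds without occurs check
        a.parent = b
        return True
    if b.functor is None:
        b.parent = a
        return True
    if a.functor != b.functor or len(a.args) != len(b.args):
        return False
    a.parent = b                      # merge BEFORE recursing: handles cycles
    return all(unify(x, y, seen) for x, y in zip(a.args, b.args))

# Two circular lists l = cons(1, l) and m = cons(1, m) unify.
l = Node("cons", [Node("1")]); l.args.append(l)
m = Node("cons", [Node("1")]); m.args.append(m)
ok = unify(l, m)
```

Merging the equivalence classes before descending into the arguments is the step that replaces the occurs check: a re-encountered pair is simply already equal.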