Results 11–20 of 163
Explaining Type Inference
Science of Computer Programming, 1995
"... Type inference is the compiletime process of reconstructing missing type information in a program based on the usage of its variables. ML and Haskell are two languages where this aspect of compilation has enjoyed some popularity, allowing type information to be omitted while static type checking is ..."
Abstract

Cited by 53 (0 self)
Type inference is the compile-time process of reconstructing missing type information in a program based on the usage of its variables. ML and Haskell are two languages where this aspect of compilation has enjoyed some popularity, allowing type information to be omitted while static type checking is still performed. Type inference may be expected to have some application in the prototyping and scripting languages which are becoming increasingly popular. A difficulty with type inference is the confusing and sometimes counterintuitive diagnostics produced by the type checker as a result of type errors. A modification of the Hindley-Milner type inference algorithm is presented, which allows the specific reasoning which led to a program variable having a particular type to be recorded for type explanation. This approach is close to the intuitive process used in practice for debugging type errors.
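The idea can be made concrete with a small Haskell sketch (illustrative only, not the paper's algorithm): the types below are reconstructed by Hindley-Milner style inference with no annotations, and the commented-out definition shows the kind of conflicting use whose diagnostics the paper sets out to explain. All function names are invented for the example.

-- Inferred without annotations: average :: Fractional a => [a] -> a
average xs = sum xs / fromIntegral (length xs)

-- Inferred: pair :: a -> b -> (a, b)
pair x y = (x, y)

main :: IO ()
main = do
  print (average [1.0, 2.0, 3.0])   -- 2.0
  print (pair 'a' True)             -- ('a',True)

-- A source of confusing diagnostics: n is used both as a Bool and as a number,
-- and the checker may report the error far from the use that first fixed n's type.
-- describe n = if n then "yes" else show (n + 1)   -- rejected by the type checker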
Polymorphism and Type Inference in Database Programming
"... In order to find a static type system that adequately supports database languages, we need to express the most general type of a program that involves database operations. This can be achieved through an extension to the type system of ML that captures the polymorphic nature of field selection, toge ..."
Abstract

Cited by 39 (10 self)
In order to find a static type system that adequately supports database languages, we need to express the most general type of a program that involves database operations. This can be achieved through an extension to the type system of ML that captures the polymorphic nature of field selection, together with a technique that generalizes relational operators to arbitrary data structures. The combination provides a statically typed language in which generalized relational databases may be cleanly represented as typed structures. As in ML, types are inferred, which relieves the programmer of making the type assertions that may be required in a complex database environment. These extensions may also be used to provide static polymorphic type checking in object-oriented languages and databases. A problem that arises with object-oriented databases is the apparent need for dynamic type checking when dealing with queries on heterogeneous collections of objects. An extension of the type system needed for generalized relational operations can also be used for manipulating collections of dynamically typed values in a statically typed language. A prototype language based on these ideas has been implemented. While it lacks a proper treatment of persistent data, it demonstrates that a wide variety of database structures can be cleanly represented in a polymorphic programming language.
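A rough Haskell analogue of the idea (using type classes, which is a different mechanism from the ML record extension the paper proposes; all names are illustrative): field selection that is polymorphic in the record type, so one selection operator works over several "relations".

data Employee = Employee { empName :: String, salary :: Int }
data Customer = Customer { custName :: String, city :: String }

class HasName r where
  name :: r -> String

instance HasName Employee where name = empName
instance HasName Customer where name = custName

-- A generalized selection operator: works on any collection whose elements
-- expose a name, not just on one fixed relation type.
selectByName :: HasName r => String -> [r] -> [r]
selectByName n = filter ((== n) . name)

main :: IO ()
main = do
  print (map name (selectByName "Ada" [Employee "Ada" 100, Employee "Bob" 90]))
  print (map name (selectByName "Ada" [Customer "Ada" "Paris"]))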
The Integration of Functions into Logic Programming: A Survey
1994
"... Functional and logic programming are the most important declarative programming paradigms, and interest in combining them has grown over the last decade. Early research concentrated on the definition and improvement of execution principles for such integrated languages, while more recently efficient ..."
Abstract

Cited by 36 (0 self)
Functional and logic programming are the most important declarative programming paradigms, and interest in combining them has grown over the last decade. Early research concentrated on the definition and improvement of execution principles for such integrated languages, while more recently efficient implementations of these execution principles have been developed, so that these languages have become relevant for practical applications. In this paper we survey the development of the operational semantics as well as ...
Graph-based Implementation of a Functional Logic Language
1989
"... We investigate the development of a graph reduction machine for a higherorder functional logic language by extension of an appropriate architecture for purely functional languages. To execute logic programs the machine must be capable of performing unification and backtracking. We show the integ ..."
Abstract

Cited by 34 (14 self)
We investigate the development of a graph reduction machine for a higher-order functional logic language by extension of an appropriate architecture for purely functional languages. To execute logic programs the machine must be capable of performing unification and backtracking. We show the integration of these mechanisms in a programmed (functional) graph reduction machine. The new machine has been implemented on a transputer system.
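The unification half of the machinery can be sketched in a few lines of Haskell (a plain first-order unifier with an occurs check; the paper's contribution, integrating unification and backtracking into a graph reduction machine, is not modelled here, and all names are illustrative).

import Control.Monad (foldM)
import qualified Data.Map as M

data Term = Var String | App String [Term] deriving (Eq, Show)

type Subst = M.Map String Term

-- Apply a substitution, chasing bound variables.
apply :: Subst -> Term -> Term
apply s (Var v)    = maybe (Var v) (apply s) (M.lookup v s)
apply s (App f ts) = App f (map (apply s) ts)

-- Unify two terms under an existing substitution.
unify :: Term -> Term -> Subst -> Maybe Subst
unify t1 t2 s = go (apply s t1) (apply s t2)
  where
    go (Var v) u | u == Var v = Just s
    go (Var v) u              = bind v u
    go u (Var v)              = bind v u
    go (App f ts) (App g us)
      | f == g && length ts == length us = foldM step s (zip ts us)
      | otherwise                        = Nothing
      where step acc (a, b) = unify a b acc
    bind v u
      | occurs v u = Nothing              -- occurs check
      | otherwise  = Just (M.insert v u s)
    occurs v (Var w)    = v == w
    occurs v (App _ us) = any (occurs v) us

main :: IO ()
main = print (unify (App "f" [Var "x", App "a" []])
                    (App "f" [App "b" [], Var "y"])
                    M.empty)
-- Just (fromList [("x",App "b" []),("y",App "a" [])])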
The complexity of type inference for higher-order typed lambda calculi
In Proc. 18th ACM Symposium on Principles of Programming Languages, 1991
"... We analyse the computational complexity of type inference for untyped X,terms in the secondorder polymorphic typed Xcalculus (F2) invented by Girard and Reynolds, as well as higherorder extensions F3,F4,...,/ ^ proposed by Girard. We prove that recognising the i^typable terms requires exponential ..."
Abstract

Cited by 28 (11 self)
We analyse the computational complexity of type inference for untyped λ-terms in the second-order polymorphic typed λ-calculus (F2) invented by Girard and Reynolds, as well as higher-order extensions F3, F4, ..., Fω proposed by Girard. We prove that recognising the F2-typable terms requires exponential time, and for Fω the problem is nonelementary. We show as well a sequence of lower bounds on recognising the Fk-typable terms, where the bound for Fk+1 is exponentially larger than that for Fk. The lower bounds are based on generic simulation of Turing Machines, where computation is simulated at the expression and type level simultaneously. Non-accepting computations are mapped to non-normalising reduction sequences, and hence non-typable terms. The accepting computations are mapped to typable terms, where higher-order types encode reduction sequences, and first-order types encode the entire computation as a circuit, based on a unification simulation of Boolean logic. A primary technical tool in this reduction is the composition of polymorphic functions having different domains and ranges. These results are the first nontrivial lower bounds on type inference for the Girard/Reynolds ...
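A tiny illustration in Haskell (GHC's RankNTypes extension standing in for F2; nothing here reproduces the paper's lower-bound construction, and the names are invented): self-application is rejected by Hindley-Milner inference but becomes typable once the argument is given a polymorphic type, and polymorphic functions with different domains and ranges can be composed, the technical tool the abstract mentions.

{-# LANGUAGE RankNTypes #-}

-- Rejected by Hindley-Milner inference if written without a signature,
-- but typable once x carries the rank-2 polymorphic type below.
selfApply :: (forall a. a -> a) -> b -> b
selfApply x = x x

-- Composing polymorphic functions with different domains and ranges.
composePoly :: (forall a. [a] -> Int) -> (forall b. b -> [b]) -> c -> Int
composePoly f g = f . g

main :: IO ()
main = do
  print (selfApply id 3)                        -- 3
  print (composePoly length (replicate 2) 'x')  -- 2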
On the Expressiveness of Purely Functional I/O Systems
1989
"... Functional programming languages have traditionally lacked complete, flexible, and yet referentially transparent I/O mechanisms. Previous proposals for I/O have used either the notion of lazy streams or continuations to model interaction with the external world. We discuss and generalize these mo ..."
Abstract

Cited by 22 (2 self)
Functional programming languages have traditionally lacked complete, flexible, and yet referentially transparent I/O mechanisms. Previous proposals for I/O have used either the notion of lazy streams or continuations to model interaction with the external world. We discuss and generalize these models and introduce a third, which we call the systems model, to perform I/O. The expressiveness of the styles is compared by means of an example. We then give a series of surprisingly simple translations between the three models, demonstrating that they are not as different as their programming styles suggest, and implying that the styles could be mixed within a single program. The need to express nondeterministic behavior in a functional language is well recognized. So is the problem of doing so without destroying referential transparency. We survey past approaches to this problem, and suggest a solution in the context of the I/O models described. The I/O system of the purely func...
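A minimal runnable sketch of the lazy-stream style the abstract refers to (illustrative, not the paper's formulation; the Request/Response constructors and function names are invented): the program is a pure function from a list of responses to a list of requests, and a toy scheduler ties the two streams together lazily.

data Request  = GetLine | PutLine String
data Response = Ok | Line String

-- The "program": read one line, echo it back with a prefix.
program :: [Response] -> [Request]
program responses =
  GetLine :
  case responses of
    Line s : _ -> [PutLine ("you said: " ++ s)]
    _          -> []

-- A toy scheduler: interprets requests against a list of stdin lines,
-- producing the response stream the program consumes and the lines to print.
runDialogue :: ([Response] -> [Request]) -> [String] -> [String]
runDialogue prog stdinLines = outputs
  where
    requests             = prog responses
    (responses, outputs) = interp stdinLines requests

    interp _ []                     = ([], [])
    interp ins (PutLine s : rs)     = let (resps, outs) = interp ins rs
                                      in  (Ok : resps, s : outs)
    interp (i : ins) (GetLine : rs) = let (resps, outs) = interp ins rs
                                      in  (Line i : resps, outs)
    interp []        (GetLine : rs) = let (resps, outs) = interp [] rs
                                      in  (Line "" : resps, outs)

main :: IO ()
main = mapM_ putStrLn (runDialogue program ["hello"])
-- prints: you said: hello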
Rank 2 Intersection Type Assignment in Term Rewriting Systems
Fundamenta Informaticae, 1996
"... A notion of type assignment on Curryfied Term Rewriting Systems is introduced that uses Intersection Types of Rank 2, and in which all function symbols are assumed to have a type. Type assignment will consist of specifying derivation rules that describe how types can be assigned to terms, using the ..."
Abstract

Cited by 22 (14 self)
A notion of type assignment on Curryfied Term Rewriting Systems is introduced that uses Intersection Types of Rank 2, and in which all function symbols are assumed to have a type. Type assignment will consist of specifying derivation rules that describe how types can be assigned to terms, using the types of function symbols. Using a modified unification procedure, for each term the principal pair (of basis and type) will be defined in the following sense: from these all admissible pairs can be generated by chains of operations on pairs, consisting of the operations substitution, copying, and weakening. In general, given an arbitrary typeable CuTRS, the subject reduction property does not hold. Using the principal type for the left-hand side of a rewrite rule, a sufficient and decidable condition will be formulated that typeable rewrite rules should satisfy in order to obtain this property.
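As a small data-level illustration (hypothetical names, not the paper's derivation rules or unification procedure), here is a Haskell encoding of rank-2 intersection types and the principal pair one would expect for the Curryfied self-application Ap x x: the basis assigns x the intersection (a -> b) /\ a and the term gets type b.

-- Simple types (rank 0/1) and rank-2 types, where an intersection (a finite
-- list of simple types) may appear only to the left of an arrow.
data SType = TVar String | Arrow SType SType
  deriving Show
data Rank2 = Lift SType | Fun [SType] Rank2
  deriving Show

-- Principal pair (basis, type) expected for the self-application Ap x x:
-- basis { x : (a -> b) /\ a }, result type b.
principalPair :: ([(String, [SType])], SType)
principalPair =
  ( [("x", [Arrow (TVar "a") (TVar "b"), TVar "a"])]
  , TVar "b" )

-- The rank-2 type one would give a rule  D x -> Ap x x:  ((a -> b) /\ a) -> b.
dType :: Rank2
dType = Fun [Arrow (TVar "a") (TVar "b"), TVar "a"] (Lift (TVar "b"))

main :: IO ()
main = do
  print principalPair
  print dType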
Strong Normalization of Explicit Substitutions via Cut Elimination in Proof Nets
1997
"... In this paper, we show the correspondence existing between normalization in calculi with explicit substitution and cut elimination in sequent calculus for Linear Logic, via Proof Nets. This correspondence allows us to prove that a typed version of the #xcalculus [30, 29] is strongly normalizing, as ..."
Abstract

Cited by 22 (4 self)
In this paper, we show the correspondence existing between normalization in calculi with explicit substitutions and cut elimination in sequent calculus for Linear Logic, via Proof Nets. This correspondence allows us to prove that a typed version of the λx-calculus [30, 29] is strongly normalizing, as are typed versions of all the calculi isomorphic to it, such as the calculus of [24], λs [19], λd [21], and λf [11]. In order to achieve this result, we introduce a new notion of reduction in Proof Nets: this extended reduction is still confluent and strongly normalizing, and is of interest in its own right, as it corresponds to more identifications of proofs in Linear Logic that differ by inessential details. These results show that calculi with explicit substitutions are really an intermediate formalism between the lambda calculus and proof nets, and suggest a completely new way to look at the problems still open in the field of explicit substitutions.
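A compact Haskell sketch of a calculus with explicit substitutions in the style of λx (the names and the selection of rules are mine, and Barendregt's variable convention is assumed, so no renaming is performed): Beta creates an explicit closure M[x := N], and the remaining rules push it through the term one constructor at a time, the kind of step-by-step substitution whose strong normalization the paper establishes.

data Term
  = V String
  | Lam String Term
  | App Term Term
  | Sub Term String Term        -- M[x := N], an explicit substitution
  deriving Show

-- One top-level reduction step.
step :: Term -> Maybe Term
step (App (Lam x m) n)   = Just (Sub m x n)                     -- Beta
step (Sub (V y) x n)
  | y == x               = Just n                               -- substitute
  | otherwise            = Just (V y)                           -- discard
step (Sub (App m p) x n) = Just (App (Sub m x n) (Sub p x n))   -- distribute
step (Sub (Lam y m) x n) = Just (Lam y (Sub m x n))             -- go under the binder (y fresh by convention)
step _                   = Nothing

-- Normalize by repeatedly reducing the leftmost-outermost redex.
normalize :: Term -> Term
normalize t = maybe t normalize (anyStep t)
  where
    anyStep u = case step u of
      Just u' -> Just u'
      Nothing -> case u of
        App a b   -> orElse (fmap (`App` b) (anyStep a)) (fmap (App a) (anyStep b))
        Lam x a   -> fmap (Lam x) (anyStep a)
        Sub a x b -> fmap (\a' -> Sub a' x b) (anyStep a)
        _         -> Nothing
    orElse (Just x) _ = Just x
    orElse Nothing  y = y

main :: IO ()
main = print (normalize (App (Lam "x" (App (V "x") (V "x"))) (V "z")))
-- App (V "z") (V "z")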