Results 11 – 20 of 35
Finite Model Theory In The Simply Typed Lambda Calculus
, 1994
Abstract

Cited by 8 (5 self)
Church's simply typed λ-calculus is a very basic framework for functional programming language research. However, it is common to augment this framework with additional programming constructs, because its expressive power for functions over the domain of Church numerals is very limited. In this thesis: (1) We reexamine the expressive power of the "pure" simply typed λ-calculus, but over encodings of finite relational structures, i.e., finite models or databases. In this novel framework the simply typed λ-calculus expresses all elementary functions from finite models to finite models. In addition, many common database query languages, e.g., relational algebra, Datalog:, and the Abiteboul/Beeri complex object algebra, can be embedded into it. The embeddings are feasible in the sense that the terms corresponding to PTIME queries can be evaluated in polynomial time. (2) We examine fixed-order fragments of the simply typed λ-calculus to determine machine-independent characterizations of complexity classes. For this we augment the calculus with atomic constants and equality among atomic constants. We show that over ordered structures, the order 3, 4, 5, and 6 fragments express exactly the first-order, PTIME, PSPACE, and EXPTIME queries, respectively, and we conjecture that for general k ≥ 1, order 2k + 4 expresses exactly the k-EXPTIME queries and order 2k + 5 expresses exactly the k-EXPSPACE queries. (3) We also reexamine other functional characterizations of PTIME, and we show that method schemas with ordered objects express exactly PTIME. This is a first-order framework proposed for object-oriented databases, as opposed to the above higher-order frameworks. In summary, this research provides a link between finite model theory (and thus computational complexity), dat...
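The limited expressiveness over Church numerals that motivates the thesis can be made concrete. The sketch below is an untyped Python stand-in for λ-terms (Python is not the simply typed calculus, so it illustrates only the encoding, not the typing restriction): Church numerals with their definable addition and multiplication. In the simply typed setting the numeric functions definable this way are essentially the extended polynomials, which is why richer input encodings such as finite models are interesting.

```python
# Church numerals as higher-order functions: n is encoded as
# \f. \x. f applied n times to x. Illustrative, untyped Python
# stand-in for lambda-terms, not the thesis's finite-model encoding.
def church(n):
    """Return the Church numeral for the natural number n."""
    return lambda f: lambda x: x if n == 0 else f(church(n - 1)(f)(x))

def to_int(c):
    """Decode a Church numeral by counting applications of successor."""
    return c(lambda k: k + 1)(0)

# Addition and multiplication are definable directly on the encodings.
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))  # f^(m+n)
mul = lambda m: lambda n: lambda f: m(n(f))                  # f^(m*n)
```

Over this numeral domain the simply typed terms define only a small class of functions; the thesis's move to encodings of finite relational structures is what recovers all elementary queries.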
Dependency Analysis for Standard ML
 ACM Transactions on Programming Languages and Systems
, 1998
Abstract

Cited by 7 (2 self)
Automatic dependency analysis is a useful addition to a system like CM, our compilation manager for Standard ML of New Jersey. It relieves the programmer from the tedious and error-prone task of having to specify compilation dependencies by hand and thereby makes its usage more user-friendly. But dependency analysis is not easy, as the general problem for Standard ML is NP-complete. Therefore, CM has to impose certain restrictions on the programming language to recover tractability. We prove the NP-completeness result, discuss the restrictions on ML that are used by CM, and provide the resulting analysis algorithms.
1 Introduction
For programs written in Standard ML [MTH90, MTHM97], the order of compilation matters. But the task of maintaining order within collections of sources can be tedious. Therefore, CM [Blu95], the compilation manager for Standard ML of New Jersey [AM91], offers automatic dependency analysis. CM provides a language for specifying the semantic structure of large pr...
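Once the dependencies have been analyzed, determining a legal compilation order is a topological sort of the dependency graph. A minimal Python sketch on a hypothetical set of ML source files (the file names and edges are invented; CM's actual analysis and language restrictions are the subject of the paper):

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical dependency graph: each file maps to the files it depends
# on, which must therefore be compiled first.
deps = {
    "main.sml":   {"parser.sml", "eval.sml"},
    "parser.sml": {"ast.sml"},
    "eval.sml":   {"ast.sml"},
    "ast.sml":    set(),
}

# static_order() yields each file only after all of its dependencies.
compile_order = list(TopologicalSorter(deps).static_order())
```

The hard part CM solves is producing the `deps` table itself from ML sources; as the abstract notes, doing that in full generality is NP-complete.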
An Analysis of the CoreML Language: Expressive Power and Type Reconstruction
 In Proc. 21st Int'l Coll. Automata, Languages, and Programming
, 1994
Abstract

Cited by 5 (3 self)
CoreML is a basic subset of most functional programming languages. It consists of the simply typed (or monomorphic) λ-calculus, simply typed equality over atomic constants, and let as the only polymorphic construct. We present a synthesis of recent results which characterize this "toy" language's expressive power as well as its type reconstruction (or type inference) problem. More specifically: (1) CoreML can express exactly the ELEMENTARY queries, where a program input is a database encoded as a term and a query program is a term whose application to the input normalizes to the output database. In addition, it is possible to express all the PTIME queries so that this normalization process is polynomial in the input size. (2) The polymorphism of let can be explained using a simple algorithmic reduction to monomorphism, and provides flexibility without affecting expressibility. Algorithms for type reconstruction offer the additional convenience of static typing without type declarations. Given polymorphism, the price of this convenience is an increase in complexity from linear time in the size of the program typed (without let) to completeness in exponential time (with let).
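The exponential-time lower bound with let is usually traced to the fact that let-polymorphic types can be exponentially large in the program. The classic family is `let x1 = (x0, x0) in ... let xn = (x_{n-1}, x_{n-1}) in xn`, whose inferred type doubles at every binding. A small Python sketch of the size count (the recurrence is standard; the naming is ours):

```python
def type_size(n):
    """Number of nodes in the inferred type of
         let x1 = (x0, x0) in ... let xn = (x_{n-1}, x_{n-1}) in xn
       where x0 has a type of size 1. Each let wraps the previous type
       in a pair type (t * t): two copies of t plus one product node."""
    size = 1
    for _ in range(n):
        size = 2 * size + 1
    return size  # = 2^(n+1) - 1, exponential in the program size
```

So a program of n let-bindings, linear in size, can force a type of size 2^(n+1) - 1, which is why reconstruction with let cannot stay linear time.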
Type Inference for FirstClass Messages with Feature Constraints
 International Journal of Foundations of Computer Science
, 1998
Abstract

Cited by 5 (0 self)
We present a constraint system OF of feature trees that is appropriate to specify and implement type inference for first-class messages. OF extends traditional systems of feature constraints by a selection constraint x⟨y⟩z "by first-class feature tree" y, in contrast to the standard selection constraint x[f]y "by fixed feature" f. We investigate the satisfiability problem of OF and show that it can be solved in polynomial time, and even in quadratic time in an important special case. We compare OF with Treinen's constraint system EF of feature constraints with first-class features, which has an NP-complete satisfiability problem. This comparison yields that the satisfiability problem for OF with negation is NP-hard. Based on OF we give a simple account of type inference for first-class messages in the spirit of Nishimura's recent proposal, and we show that it has polynomial time complexity. We also highlight an immediate extension that is desirable but makes type inference NP-hard.
Haskell-style Overloading is NP-hard
 In Proceedings of the 1994 International Conference on Computer Languages
, 1994
Abstract

Cited by 3 (0 self)
Extensions of the ML type system, based on constrained type schemes, have been proposed for languages with overloading. Type inference in these systems requires solving the following satisfiability problem: given a set of type assumptions C over finite types and a type basis A, is there a substitution S that satisfies C, in that A ⊢ CS is derivable? Under arbitrary overloading, the problem is undecidable. Haskell limits overloading to a form, similar to that proposed by Kaes, called parametric overloading. We formally characterize parametric overloading in terms of a regular tree language and prove that, although decidable, satisfiability is NP-hard when overloading is parametric.
1 Introduction
A practical limitation of the ML type system is that it prohibits global overloading in a programming language by restricting to at most one the number of assumptions per identifier in a type context, a limitation noted by Milner himself [Mil78]. Suppose we wish to assert that a free identifier...
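At its core, the hardness comes from searching over instantiations of overloaded type variables against the available instances. A toy brute-force satisfiability check in Python (the class names, instance table, and restriction to base types are all invented for illustration; Haskell's real constraint entailment is much richer):

```python
from itertools import product

# Hypothetical instance declarations: which ground types satisfy a class.
instances = {
    "Eq":       {"int", "bool"},
    "Num":      {"int"},
    "BoolOnly": {"bool"},
}
base_types = ["int", "bool"]

def satisfiable(constraints, variables):
    """constraints: list of (class_name, type_variable) pairs.
    Try every assignment of variables to base types -- exponential in
    the number of variables, matching the search character of the
    NP-hardness result for this simplified setting."""
    for assignment in product(base_types, repeat=len(variables)):
        env = dict(zip(variables, assignment))
        if all(env[v] in instances[c] for c, v in constraints):
            return True
    return False
```

For example, `Eq a, Num a` is satisfiable (take `a = int`), while `Num a, BoolOnly a` is not, since no base type is in both instance sets.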
Type Inference for Recursive Definitions
 In Proc. 14th Ann. IEEE Symp. Logic in Comput. Sci
, 2000
Abstract

Cited by 2 (1 self)
We consider type systems that combine universal types, recursive types, and object types. We study type inference in these systems under a rank restriction, following Leivant's notion of rank. To motivate our work, we present several examples showing how our systems can be used to type programs encountered in practice. We show that type inference in the rank-k system is decidable for k ≤ 2 and undecidable for k ≥ 3. (Similar results based on different techniques are known to hold for System F, without recursive types and object types.) Our undecidability result is obtained by a reduction from a particular adaptation (which we call "regular") of the semi-unification problem, whose undecidability is, interestingly, obtained by methods totally different from those used in the case of standard (or finite) semi-unification.
Keywords: type systems, type inference, lambda calculus, unification, software specification.
1 Introduction
1.1 Background and Motivation
Type inference, the ...
Principal Typing and Mutual Recursion
, 2001
Abstract

Cited by 2 (2 self)
As pointed out by Damas [Dam84], the Damas-Milner system (ML) has principal types, but not principal typings. Damas also defined in his thesis a slightly modified version of ML, which we call ML0, that, given a typing context and an expression, derives exactly the same types, and he provided an algorithm (named T) that infers principal typings for ML0. This work extends each of ML0 and T with a new rule for typing mutually recursive let-bindings. The proposed rule can type more expressions than the corresponding rule used in ML, by allowing mutually recursive definitions to be used polymorphically by other definitions.
Natural semantics as a static program analysis framework
 ACM TRANSACTIONS ON PROGRAMMING LANGUAGES AND SYSTEMS (TOPLAS
, 2004
Abstract

Cited by 2 (0 self)
Natural semantics specifications have become mainstream in the formal specification of programming language semantics during the last ten years. In this paper, we set up sorted natural semantics as a specification framework which is able to express static semantic information of programming languages declaratively in a uniform way, and which at the same time allows the corresponding analyses to be generated. Such static semantic information comprises context-sensitive properties which are checked in the semantic analysis phase of compilers, as well as further static program analyses such as classical data and control flow analyses or type and effect systems. The latter require fixed-point analyses to determine their solutions. We show that, given a sorted natural semantics specification, we can generate the corresponding analysis. To this end, we characterize the solution of such an analysis by the notion of a proof tree. We show that a proof tree can be computed by solving an equivalent residuation problem. In the case of semantic analysis, this solution can be found by a basic algorithm. We show that its efficiency can be enhanced using solution strategies. We also demonstrate our prototype implementation of the basic algorithm, which proves its applicability in practical situations. With the results of this paper, we have established natural semantics as a framework which closes the gap between declarative and
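The fixed-point analyses mentioned above follow a common pattern: iterate transfer equations over a flow graph until nothing changes. A minimal round-robin solver in Python for a reaching-definitions-style forward problem on a hypothetical four-node graph with a back edge (node names and gen sets are invented; this sketches the generic fixed-point computation, not the paper's residuation algorithm):

```python
# Forward dataflow: IN[n] joins the OUT sets of n's predecessors,
# OUT[n] adds the definitions generated at n. Iterate to a fixed point.
preds = {0: [], 1: [0, 3], 2: [1], 3: [2]}       # 3 -> 1 is a back edge
gen = {0: {"a"}, 1: {"b"}, 2: set(), 3: {"c"}}   # definitions born at n

IN = {n: set() for n in preds}
OUT = {n: set() for n in preds}
changed = True
while changed:  # stop when no OUT set changes: a fixed point is reached
    changed = False
    for n in preds:
        IN[n] = set().union(*(OUT[p] for p in preds[n]))
        new_out = gen[n] | IN[n]
        if new_out != OUT[n]:
            OUT[n], changed = new_out, True
```

Termination is guaranteed because the sets only grow within a finite lattice; the "solution strategies" of the paper concern visiting nodes in an order that reaches this fixed point faster.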
A more direct algorithm for type inference in the rank-2 fragment of the second-order λ-calculus
, 2006
Abstract

Cited by 2 (1 self)
We present an algorithm for rank-2 type inference in the second-order λ-calculus. Our algorithm differs from the well-known algorithm of Kfoury and Wells in that it employs quadratically fewer type variables and inequalities. Our algorithm consists of a translation from a λ-term to an instance of RASUP (a decidable superset of ASUP) in which the variables correspond more directly to features in the original term. We claim that our construction, being simpler and more direct, is more amenable to proof and extension.
Reflections on complexity of ML type reconstruction
, 1997
Abstract

Cited by 1 (0 self)
This is a collection of some more or less chaotic remarks on the ML type system, definitely not sufficient to fill a research paper of reasonable quality, but perhaps interesting enough to be written down as a note. At the beginning the idea was to investigate the complexity of type reconstruction and typability in bounded-order fragments of ML. Unexpectedly, the problem turned out to be hard, and finally I obtained only partial results. I do not feel like spending more time on this topic, so the text is not polished, and the proofs, if included at all, are only sketched and of rather poor mathematical quality. I believe, however, that some remarks, especially those of a "philosophical" nature, shed some light on the ML type system and may be of some value to the reader interested especially in the interaction between theory and practice of ML type reconstruction.
1 Introduction
The ML type system was developed by Robin Milner in the late seventies [26, 3], but was influenced by much ol...