Results 1 - 10 of 16
A New Approach to Generic Functional Programming
In The 27th Annual ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages, 1999
"... This paper describes a new approach to generic functional programming, which allows us to define functions generically for all datatypes expressible in Haskell. A generic function is one that is defined by induction on the structure of types. Typical examples include pretty printers, parsers, and co ..."
Abstract

Cited by 96 (13 self)
This paper describes a new approach to generic functional programming, which allows us to define functions generically for all datatypes expressible in Haskell. A generic function is one that is defined by induction on the structure of types. Typical examples include pretty printers, parsers, and comparison functions. The advanced type system of Haskell presents a real challenge: datatypes may be parameterized not only by types but also by type constructors, type definitions may involve mutual recursion, and recursive calls of type constructors can be arbitrarily nested. We show that despite this complexity a generic function is uniquely defined by giving cases for primitive types and type constructors (such as disjoint unions and cartesian products). Given this information a generic function can be specialized to arbitrary Haskell datatypes. The key idea of the approach is to model types by terms of the simply typed lambda calculus augmented by a family of recursion operators. While co...
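The core idea of "cases for sums and products" can be sketched in a few lines of Haskell. This is an illustrative simplification, not the paper's encoding (which models types as lambda terms with recursion operators): a generic binary encoder is defined only for units, sums, and products, and is then transported to an ordinary datatype (here, lists) by viewing it through its structural isomorphism.

```haskell
{-# LANGUAGE TypeOperators #-}
-- Minimal sketch of a generic function defined by induction on structure.
data U       = U                 -- unit: nullary constructors
data a :+: b = L a | R b         -- sum: choice between constructors
data a :*: b = a :*: b           -- product: sequence of constructor fields

class Encode a where
  encode :: a -> [Bool]

instance Encode U where
  encode U = []                  -- a unit carries no information

instance (Encode a, Encode b) => Encode (a :+: b) where
  encode (L x) = False : encode x   -- one tag bit per choice
  encode (R y) = True  : encode y

instance (Encode a, Encode b) => Encode (a :*: b) where
  encode (x :*: y) = encode x ++ encode y

-- Specializing to lists: [] corresponds to L U, (x:xs) to R (x :*: xs).
instance Encode a => Encode [a] where
  encode xs = encode (view xs)
    where view :: [c] -> U :+: (c :*: [c])
          view []     = L U
          view (y:ys) = R (y :*: ys)
```

Note that nothing list-specific was written beyond the `view` isomorphism; the same three structural instances serve every datatype given such a view.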
A Descriptive Approach to Language-Theoretic Complexity
1996
"... Contents 1 Language Complexity in Generative Grammar 3 Part I The Descriptive Complexity of Strongly ContextFree Languages 11 2 Introduction to Part I 13 3 Trees as Elementary Structures 15 4 L 2 K;P and SnS 25 5 Definability and NonDefinability in L 2 K;P 35 6 Conclusion of Part I 57 DRAFT ..."
Abstract

Cited by 52 (3 self)
Contents:
1 Language Complexity in Generative Grammar (p. 3)
Part I: The Descriptive Complexity of Strongly Context-Free Languages (p. 11)
2 Introduction to Part I (p. 13)
3 Trees as Elementary Structures (p. 15)
4 L^2_{K,P} and SnS (p. 25)
5 Definability and Non-Definability in L^2_{K,P} (p. 35)
6 Conclusion of Part I (p. 57)
Part II: The Generative Capacity of GB Theories (p. 59)
7 Introduction to Part II (p. 61)
8 The Fundamental Structures of GB Theories (p. 69)
9 GB and Non-definability in L^2_{K,P} (p. 79)
10 Formalizing X-Bar Theory (p. 93)
11 The Lexicon, Subcategorization, Theta-theory, and Case Theory (p. 111)
12 Binding and Control (p. 119)
13 Chains (p. 131)
14 Reconstruction (p. 157)
15 Limitations of the Interpretation (p. 173)
16 Conclusion of Part II (p. 179)
A Index of Definitions (p. 183)
Bibliography
Macro Tree Transducers, Attribute Grammars, and MSO Definable Tree Translations
Inform. and Comput., 1998
"... A characterization is given of the class of tree translations definable in monadic second order logic (MSO), in terms of macro tree transducers. The first main result is that the MSO definable tree translations are exactly those tree translations realized by macro tree transducers (MTTs) with reg ..."
Abstract

Cited by 46 (20 self)
A characterization is given of the class of tree translations definable in monadic second order logic (MSO), in terms of macro tree transducers. The first main result is that the MSO definable tree translations are exactly those tree translations realized by macro tree transducers (MTTs) with regular lookahead that are single-use restricted. For this, the single-use restriction known from attribute grammars is generalized to MTTs. Since MTTs are closed under regular lookahead, this implies that every MSO definable tree translation can be realized by an MTT. The second main result is that the class of MSO definable tree translations can also be obtained by restricting MTTs with regular lookahead to be finite copying, i.e., to require that each input subtree is processed only a bounded number of times. The single-use restriction is a rather strong, static restriction on the rules of an MTT, whereas the finite copying restriction is a more liberal, dynamic restriction on the ...
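To make the MTT model concrete, here is a toy macro tree transducer sketched in Haskell (an illustration of the formalism, not an example from the paper): a state becomes a function whose extra arguments are its accumulating output parameters. The single state `q` below reverses a monadic input tree; each input node is processed exactly once, so the transducer is finite copying (bound 1) and its translation is therefore MSO definable.

```haskell
-- Monadic trees, i.e., strings over {A, B} with end marker E.
data Mon = A Mon | B Mon | E deriving (Show, Eq)

-- MTT rules for state q with one accumulating parameter y:
--   q(A x, y) -> q(x, A y)
--   q(B x, y) -> q(x, B y)
--   q(E,   y) -> y
q :: Mon -> Mon -> Mon
q (A x) y = q x (A y)
q (B x) y = q x (B y)
q E     y = y

-- The translation computed by the transducer: reversal.
reverseMon :: Mon -> Mon
reverseMon t = q t E
```

The accumulating parameter is what distinguishes MTTs from plain top-down tree transducers, which could not express reversal.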
Generalizing Generalized Tries
1999
"... A trie is a search tree scheme that employs the structure of search keys to organize information. Tries were originally devised as a means to represent a collection of records indexed by strings over a fixed alphabet. Based on work by C.P. Wadsworth and others, R.H. Connelly and F.L. Morris generali ..."
Abstract

Cited by 31 (8 self)
A trie is a search tree scheme that employs the structure of search keys to organize information. Tries were originally devised as a means to represent a collection of records indexed by strings over a fixed alphabet. Based on work by C.P. Wadsworth and others, R.H. Connelly and F.L. Morris generalized the concept to permit indexing by elements of an arbitrary monomorphic datatype. Here we go one step further and define tries and operations on tries generically for arbitrary first-order polymorphic datatypes. The derivation is based on techniques recently developed in the context of polytypic programming. It is well known that for the implementation of generalized tries nested datatypes and polymorphic recursion are needed. Implementing tries for polymorphic datatypes places even greater demands on the type system: it requires rank-2 type signatures and higher-order polymorphic nested datatypes. Despite these requirements the definition of generalized tries for polymorphic datatypes is...
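The structural laws behind generalized tries can be sketched directly as a Haskell type class (a simplified illustration of the technique, not the paper's full development): a trie over the unit key is just `Maybe`, a trie over a sum key is a pair of tries, and a trie over a product key is a trie of tries, which is where nesting arises.

```haskell
{-# LANGUAGE TypeFamilies #-}
-- Structural (generalized) tries, sketched with an associated data family:
--   Trie ()           v  ~  Maybe v
--   Trie (Either a b) v  ~  (Trie a v, Trie b v)
--   Trie (a, b)       v  ~  Trie a (Trie b v)   -- nesting
class Key k where
  data Trie k :: * -> *
  lookupT :: k -> Trie k v -> Maybe v

instance Key () where
  data Trie () v = UnitT (Maybe v)
  lookupT () (UnitT m) = m

instance (Key a, Key b) => Key (Either a b) where
  data Trie (Either a b) v = SumT (Trie a v) (Trie b v)
  lookupT (Left x)  (SumT ta _) = lookupT x ta
  lookupT (Right y) (SumT _ tb) = lookupT y tb

instance (Key a, Key b) => Key (a, b) where
  data Trie (a, b) v = ProdT (Trie a (Trie b v))
  lookupT (x, y) (ProdT t) = lookupT x t >>= lookupT y
```

Extending this scheme to recursive key types (lists, trees) is exactly where the polymorphic recursion and nested datatypes mentioned in the abstract become necessary.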
Logical aspects of set constraints
In Proc. 1993 Conf. Computer Science Logic, 1993
"... Abstract. Set constraints are inclusion relations between sets of ground terms over a ranked alphabet. They have been used extensively in program analysis and type inference. Here we present an equational axiomatization of the algebra of set constraints. Models of these axioms are called termset alg ..."
Abstract

Cited by 27 (5 self)
Set constraints are inclusion relations between sets of ground terms over a ranked alphabet. They have been used extensively in program analysis and type inference. Here we present an equational axiomatization of the algebra of set constraints. Models of these axioms are called termset algebras. They are related to the Boolean algebras with operators of Jónsson and Tarski. We also define a family of combinatorial models called topological term automata, which are essentially the term automata studied by Kozen, Palsberg, and Schwartzbach, endowed with a topology such that all relevant operations are continuous. These models are similar to Kripke frames for modal or dynamic logic. We establish a Stone duality between termset algebras and topological term automata, and use this to derive a completeness theorem for a related multidimensional modal logic. Finally, we prove a small model property by filtration, and argue that this result contains the essence of several algorithms appearing in the literature on set constraints.
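For orientation, the flavor of a "Boolean algebra with operators" can be shown in two equations. These are the standard Jónsson-Tarski operator conditions (normality and additivity in each argument), given here only as an illustration; the paper's termset algebra axiomatization extends a Boolean algebra with one such operator per constructor of the ranked alphabet and adds further axioms specific to set constraints.

```latex
% Normality: an operator annihilates the bottom element in any argument.
f(\ldots, 0, \ldots) = 0
% Additivity: an operator distributes over join in each argument.
f(\ldots, x \cup y, \ldots) = f(\ldots, x, \ldots) \cup f(\ldots, y, \ldots)
```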
Explicit Cyclic Substitutions
1993
"... In this paper we consider rewrite systems that describe the lambdacalculus enriched with recursive and nonrecursive local definitions by generalizing the `explicit substitutions' used by Abadi, Cardelli, Curien, and Lévy [1] to describe sharing in lambdaterms. This leads to `explicit cyclic subst ..."
Abstract

Cited by 25 (2 self)
In this paper we consider rewrite systems that describe the lambda calculus enriched with recursive and non-recursive local definitions by generalizing the `explicit substitutions' used by Abadi, Cardelli, Curien, and Lévy [1] to describe sharing in lambda terms. This leads to `explicit cyclic substitutions' that can describe the mutual sharing of local recursive definitions. We demonstrate how this may be used to describe standard binding constructions (let and letrec), both directly, using substitution and fixed point induction, and using `small-step' rewriting semantics where substitution is interleaved with the mechanics of the following beta-reductions. With this we hope to contribute to the synthesis of denotational and operational specifications of sharing and recursion.
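The binding constructions in question can be recalled with the textbook desugaring (this is the standard fixed-point treatment, not the paper's explicit-substitution calculus): `let` is eliminated by substitution or application, whereas `letrec` must close a cycle of definitions with a fixed point, and mutual recursion becomes a fixed point over a tuple of definitions.

```haskell
-- let x = e in b      ~   (\x -> b) e
-- letrec x = e in b   ~   (\x -> b) (fix (\x -> e))
fix :: (a -> a) -> a
fix f = f (fix f)

-- Mutual recursion via a fixed point over a pair:
--   letrec even' n = n == 0 || odd'  (n - 1)
--          odd'  n = n /= 0 && even' (n - 1)
--   in (even', odd')
-- The irrefutable pattern ~(e, o) is essential: without it, matching
-- would force the fixed point before the pair exists.
evenOdd :: (Int -> Bool, Int -> Bool)
evenOdd = fix (\ ~(e, o) ->
  ( \n -> n == 0 || o (n - 1)
  , \n -> n /= 0 && e (n - 1) ))
```

The cyclic sharing that this encoding loses (each `fix` unrolling copies the definitions) is precisely what explicit cyclic substitutions are designed to make visible in the rewrite system.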
Satisfying Subtype Inequalities in Polynomial Space
1997
"... This paper studies the complexity of type inference in lambdacalculus with subtyping. Type inference is equivalent to solving systems of subtype inequalities. We consider simple types ordered structurally from an arbitrary set of base subtype assumptions. In this case, we give a PSPACE upper bound. ..."
Abstract

Cited by 18 (0 self)
This paper studies the complexity of type inference in the lambda calculus with subtyping. Type inference is equivalent to solving systems of subtype inequalities. We consider simple types ordered structurally from an arbitrary set of base subtype assumptions. In this case, we give a PSPACE upper bound. Together with the known lower bound, this result settles completely the complexity of type inference over simple types, which is PSPACE-complete. We use a technique of independent theoretical interest that simplifies existing methods developed in the literature. Finally, the algorithm, although mainly theoretical, can lead to a slight practical improvement of existing implementations.
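The "structural" ordering mentioned in the abstract means that subtype inequalities decompose along the shape of the types. A minimal sketch of that decomposition step (illustrative only; the paper's PSPACE algorithm is considerably more involved, and we assume here that the base order is given reflexively and transitively closed):

```haskell
-- Simple types over named base types.
data Ty = Base String | Arrow Ty Ty deriving (Show, Eq)

-- Assumed base subtypings, e.g. [("int", "real")],
-- assumed reflexively and transitively closed.
type BaseOrder = [(String, String)]

-- Reduce a system of inequalities s <= t to base-type inequalities:
-- arrows decompose contravariantly on the left, covariantly on the
-- right; a structural mismatch means the system is unsatisfiable.
decompose :: BaseOrder -> [(Ty, Ty)] -> Maybe [(String, String)]
decompose _ [] = Just []
decompose base ((Arrow s t, Arrow s' t') : rest) =
  decompose base ((s', s) : (t, t') : rest)
decompose base ((Base a, Base b) : rest)
  | a == b || (a, b) `elem` base = ((a, b) :) <$> decompose base rest
  | otherwise                    = Nothing
decompose _ _ = Nothing
```

The hard part of the inference problem, which this sketch omits, is that types contain variables, so decomposition alone does not terminate the search; bounding that search is what the PSPACE result is about.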
Macro Tree Translations of Linear Size Increase are MSO Definable
SIAM J. Comput., 2001
"... Abstract. The first main result is that if a macro tree translation is of linear size increase, i.e., if the size of every output tree is linearly bounded by the size of the corresponding input tree, then the translation is MSO definable (i.e., definable in monadic secondorder logic). This gives a ..."
Abstract

Cited by 16 (6 self)
The first main result is that if a macro tree translation is of linear size increase, i.e., if the size of every output tree is linearly bounded by the size of the corresponding input tree, then the translation is MSO definable (i.e., definable in monadic second-order logic). This gives a new characterization of the MSO definable tree translations in terms of macro tree transducers: they are exactly the macro tree translations of linear size increase. The second main result is that given a macro tree transducer, it can be decided whether or not its translation is MSO definable, and if it is, then an equivalent MSO transducer can be constructed. Similar results hold for attribute grammars, which define a subclass of the macro tree translations.
Polytypic Programming With Ease
1999
"... A functional polytypic program is one that is parameterised by datatype. Since polytypic functions are defined by induction on types rather than by induction on values they typically operate on a higher level of abstraction than their monotypic counterparts. However, polytypic programming is not nec ..."
Abstract

Cited by 13 (5 self)
A functional polytypic program is one that is parameterised by datatype. Since polytypic functions are defined by induction on types rather than by induction on values, they typically operate on a higher level of abstraction than their monotypic counterparts. However, polytypic programming is not necessarily more complicated than conventional programming. We show that a polytypic function is uniquely defined by its action on constant functors, projection functors, sums, and products. This information is sufficient to specialize a polytypic function to arbitrary polymorphic datatypes, including mutually recursive datatypes and nested datatypes. The key idea is to use infinite trees as index sets for polytypic functions and to interpret datatypes as algebraic trees. This approach appears to be simpler, more general, and more efficient than previous ones, which are based on the initial algebra semantics of datatypes. Polytypic functions enjoy polytypic properties. We show that well-kno...
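The functor-level view that the abstract contrasts with (initial algebra semantics) can be recalled in a few lines of Haskell, as a simplified illustration rather than the paper's construction: a datatype is the least fixed point of a pattern functor built from constants, projections, sums, and products, and a polytypic function is obtained by folding an algebra over that fixed point.

```haskell
{-# LANGUAGE DeriveFunctor #-}
-- Least fixed point of a pattern functor.
newtype Fix f = In (f (Fix f))

-- Pattern functor for lists: NilF is the constant case,
-- ConsF combines an element position with a recursive position.
data ListF a r = NilF | ConsF a r deriving Functor

type List a = Fix (ListF a)

-- The generic fold (catamorphism) from initial algebra semantics.
cata :: Functor f => (f b -> b) -> Fix f -> b
cata alg (In t) = alg (fmap (cata alg) t)

-- A "size" algebra: the action on each structural case of the functor.
sizeListF :: ListF a Int -> Int
sizeListF NilF        = 0       -- constants contribute nothing
sizeListF (ConsF _ n) = 1 + n   -- one element plus the recursive count

size :: List a -> Int
size = cata sizeListF
```

The paper's point is that indexing polytypic functions by algebraic trees of functors handles mutually recursive and nested datatypes, which fall outside this simple `Fix`-based scheme.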