Results 1–10 of 24
Monadic Parser Combinators
, 1996
Abstract
Cited by 59 (2 self)
In functional programming, a popular approach to building recursive descent parsers is to model parsers as functions, and to define higher-order functions (or combinators) that implement grammar constructions such as sequencing, choice, and repetition. Such parsers form an instance of a monad, an algebraic structure from mathematics that has proved useful for addressing a number of computational problems. The purpose of this article is to provide a step-by-step tutorial on the monadic approach to building functional parsers, and to explain some of the benefits that result from exploiting monads. No prior knowledge of parser combinators or of monads is assumed. Indeed, this article can also be viewed as a first introduction to the use of monads in programming.
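The approach the abstract describes — parsers as functions, with combinators for sequencing, choice, and repetition, forming a monad — can be sketched in a few lines of Haskell. This is a minimal illustration using only the Prelude; the names (Parser, item, sat, orElse, digits) are illustrative, not the article's actual library.

```haskell
-- A parser is a function from input to a list of (result, remaining input)
-- pairs; the empty list signals failure ("list of successes").
newtype Parser a = Parser { runParser :: String -> [(a, String)] }

instance Functor Parser where
  fmap f p = Parser $ \s -> [ (f a, rest) | (a, rest) <- runParser p s ]

instance Applicative Parser where
  pure a    = Parser $ \s -> [(a, s)]
  pf <*> pa = Parser $ \s ->
    [ (f a, s'') | (f, s') <- runParser pf s, (a, s'') <- runParser pa s' ]

-- Sequencing: bind threads the remaining input through to the next parser.
instance Monad Parser where
  p >>= f = Parser $ \s ->
    concat [ runParser (f a) s' | (a, s') <- runParser p s ]

-- Choice: try p; if it produces no results, try q.
orElse :: Parser a -> Parser a -> Parser a
orElse p q = Parser $ \s -> case runParser p s of
  [] -> runParser q s
  rs -> rs

-- Consume a single character, failing on empty input.
item :: Parser Char
item = Parser $ \s -> case s of
  []     -> []
  (c:cs) -> [(c, cs)]

-- Consume a character satisfying a predicate.
sat :: (Char -> Bool) -> Parser Char
sat p = item >>= \c -> if p c then pure c else Parser (const [])

-- Repetition via recursion: one or more digits, sequenced with do-notation.
digits :: Parser String
digits = do
  d  <- sat (`elem` ['0'..'9'])
  ds <- digits `orElse` pure ""
  pure (d : ds)

main :: IO ()
main = print (runParser digits "123ab")  -- prints [("123","ab")]
```

Because Parser is a Monad, do-notation handles sequencing for free, which is much of the convenience the article's tutorial builds on.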
Lag, Drag, Void and Use: Heap Profiling and Space-Efficient Compilation Revisited
 In Proc. Intl. Conf. on Functional Programming
, 1996
Abstract
Cited by 15 (2 self)
The context for this paper is functional computation by graph reduction. Our overall aim is more efficient use of memory. The specific topic is the detection of dormant cells in the live graph: those retained in heap memory though not actually playing a useful role in computation. We describe a profiler that can identify heap consumption by such `useless' cells. Unlike heap profilers based on traversals of the live heap, this profiler works by examining cells post-mortem. The new profiler has revealed a surprisingly large proportion of `useless' cells, even in some programs that previously seemed space-efficient, such as the bootstrapping Haskell compiler nhc.
1 Introduction
A typical computation by graph reduction involves a large and changing population of heap-memory cells. Taking a census of this population at regular intervals can be very instructive, both for functional programmers and for functional-language implementors. A heap profiler [RW93] records population counts for ...
Combinator parsing: A short tutorial
 Language Engineering and Rigorous Software Development, International LerNet ALFA Summer School 2008, Piriápolis, Uruguay, February 24–March 1, 2008, Revised Tutorial Lectures. Volume 5520 of Lecture Notes in Computer Science, Springer-Verlag (2009) 252–
Abstract
Cited by 9 (2 self)
Abstract. There are numerous ways to implement a parser for a given syntax; using parser combinators is a powerful approach to parsing which derives much of its power and expressiveness from the type system and semantics of the host programming language. This tutorial begins with the construction of a small library of parsing combinators. This library introduces the basics of combinator parsing and, more generally, demonstrates how domain-specific embedded languages are able to leverage the facilities of the host language. After having constructed our small combinator library, we investigate some shortcomings of the naïve implementation introduced in the first part, and incrementally develop an implementation without these problems. Finally we discuss some further extensions of the presented library and compare our approach with similar libraries.
Total Parser Combinators
, 2009
Abstract
Cited by 7 (3 self)
A monadic parser combinator library which guarantees termination of parsing, while still allowing many forms of left recursion, is described. The library’s interface is similar to that of many other parser combinator libraries, with two important differences: one is that the interface clearly specifies which parts of the constructed parsers may be infinite, and which parts have to be finite, using a combination of induction and coinduction; and the other is that the parser type is unusually informative. The library comes with a formal semantics, using which it is proved that the parser combinators are as expressive as possible. The implementation
Heap Compression and Binary I/O in Haskell
 In 2nd ACM Haskell Workshop
, 1997
Abstract
Cited by 7 (1 self)
Two new facilities for Haskell are described: compression of data values in memory, and a new scheme for binary I/O. These facilities, although they can be used individually, can also be combined because they use the same binary representations for values. Heap compression in memory is valuable because it enables programs to run on smaller machines, or conversely allows programs to store more data in the same amount of memory. Binary I/O is valuable because it makes the file storage and retrieval of heap data structures smooth and painless. The combination of heap compression and binary I/O allows data transfer to be both fast and space-efficient. All the facilities described have been implemented in a variant of Röjemo's nhc compiler. Example applications are demonstrated, with performance results for space and speed.
1 Introduction
1.1 Data representation
Implementors of lazy functional languages tend to use an internal representation of data which is uniform, based on graphs of...
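The key idea the abstract states — one binary representation serving both in-memory storage and file I/O — can be illustrated with a toy tag-based encoding of a tree to a byte list and back. The Tree type and tag scheme here are hypothetical, purely for illustration; they are not nhc's actual cell format.

```haskell
import Data.Word (Word8)

-- A small algebraic data type to serialize.
data Tree = Leaf Word8 | Node Tree Tree
  deriving (Eq, Show)

-- Tag byte 0 introduces a leaf (followed by its payload byte);
-- tag byte 1 introduces a node (followed by both subtrees in order).
encode :: Tree -> [Word8]
encode (Leaf w)   = [0, w]
encode (Node l r) = 1 : encode l ++ encode r

-- Decoding returns the value plus any unconsumed bytes, so encodings
-- can be concatenated in a file or a compressed heap region.
decode :: [Word8] -> Maybe (Tree, [Word8])
decode (0 : w : rest) = Just (Leaf w, rest)
decode (1 : rest)     = do
  (l, rest')  <- decode rest
  (r, rest'') <- decode rest'
  Just (Node l r, rest'')
decode _              = Nothing

main :: IO ()
main = do
  let t = Node (Leaf 7) (Node (Leaf 8) (Leaf 9))
  print (encode t)                           -- [1,0,7,1,0,8,0,9]
  print (decode (encode t) == Just (t, []))  -- True
```

Because the same tagged byte stream can be held compressed in memory or written to a file verbatim, a round-trip through either path recovers the value — the property that lets the paper's two facilities share one representation.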
New Dimensions in Heap Profiling
 JOURNAL OF FUNCTIONAL PROGRAMMING
, 1996
Abstract
Cited by 6 (1 self)
First-generation heap profilers for lazy functional languages have proved to be effective tools for locating some kinds of space faults, but in other cases they cannot provide sufficient information to solve the problem. This paper describes the design, implementation and use of a new profiler that goes beyond the two-dimensional `who produces what' view of heap cells to provide information about their more dynamic and structural attributes. Specifically, the new profiler can distinguish between cells according to their eventual lifetime, or on the basis of the closure retainers by virtue of which they remain part of the live heap. A bootstrapping Haskell compiler (nhc) hosts the implementation: among examples of the profiler's use we include self-application to nhc. Another example is the original heap-profiling case study clausify, which now consumes even less memory and is much faster.
Lag, Drag and Post-Mortem Heap Profiling
Abstract
Cited by 3 (0 self)
The context for this paper is functional computation by graph reduction. Our overall aim is more efficient use of memory. The specific topic is the detection of dormant cells in the live graph: those retained in heap memory though not actually playing a useful role in computation. We describe a profiler that can identify heap consumption by such `useless' cells. Unlike heap profilers based on traversals of the live heap, this profiler works by examining cells post-mortem. Early experience with applications confirms our suspicions about space-efficiency: as usual, it is even worse than expected!
1 Introduction
A typical computation by graph reduction involves a large and changing population of heap-memory cells. Taking a census of this population at regular intervals can be very instructive, both for functional programmers and for functional-language implementors. A heap profiler [RW93] records population counts for different classes of cells (e.g. representing different values, creat...
Structurally Recursive Descent Parsing
, 2008
Abstract
Cited by 2 (0 self)
Recursive descent parsing does not terminate for left recursive grammars. We turn recursive descent parsing into structurally recursive descent parsing, acceptable by total dependently typed languages like Agda, by using the type system to rule out left recursion. The resulting library retains much of the flavour of ordinary “list of successes” combinator parsers. In particular, the type indices used to rule out left recursion can in many cases be inferred automatically, so that
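The non-termination problem the abstract starts from, and the standard structural fix, can be seen concretely in plain Haskell (rather than Agda). The throwaway parser type and all names here are illustrative.

```haskell
import Data.Char (isDigit, digitToInt)

-- A deterministic toy parser: Nothing on failure.
newtype P a = P { runP :: String -> Maybe (a, String) }

instance Functor P where
  fmap f p = P $ \s -> fmap (\(a, r) -> (f a, r)) (runP p s)

instance Applicative P where
  pure a    = P $ \s -> Just (a, s)
  pf <*> pa = P $ \s -> do
    (f, s')  <- runP pf s
    (a, s'') <- runP pa s'
    Just (f a, s'')

instance Monad P where
  p >>= f = P $ \s -> do
    (a, s') <- runP p s
    runP (f a) s'

orElse :: P a -> P a -> P a
orElse p q = P $ \s -> maybe (runP q s) Just (runP p s)

char :: Char -> P Char
char c = P $ \s -> case s of
  (x:xs) | x == c -> Just (x, xs)
  _               -> Nothing

num :: P Int
num = P $ \s -> case s of
  (x:xs) | isDigit x -> Just (digitToInt x, xs)
  _                  -> Nothing

-- Left-recursive grammar: expr ::= expr '+' num | num
-- A direct transcription calls itself before consuming any input and loops:
--   exprLR = ((+) <$> exprLR <*> (char '+' *> num)) `orElse` num
-- The structural fix consumes a num first, then iterates over the '+' tail,
-- so every recursive call happens on strictly shorter input.
expr :: P Int
expr = num >>= rest
  where
    rest acc = (do _ <- char '+'
                   n <- num
                   rest (acc + n))
               `orElse` pure acc

main :: IO ()
main = print (runP expr "1+2+3")  -- Just (6,"")
```

The paper's contribution is to make exactly this kind of "input gets shorter" argument checkable by Agda's type system, rather than left implicit as it is above.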
Partial parsing: combining choice with commitment
Abstract
Cited by 2 (0 self)
Abstract. Parser combinators, often monadic, are a venerable and widely used solution to read data from some external format. However, the capability to return a partial parse has, until now, been largely missing. When only a small portion of the entire data is desired, it has been necessary either to parse the entire input in any case, or to break up the grammar into smaller pieces and move some work outside the world of combinators. This paper presents a technique for mixing lazy, demand-driven parsing with strict parsing, all within the same set of combinators. The grammar specification remains complete and unbroken, yet only sufficient input is consumed to satisfy the result demanded. It is built on a combination of applicative and monadic parsers. Monadic parsing alone is insufficient to allow a choice operator to coexist with the early commitment needed for lazy results. Applicative parsing alone can give partial results, but does not permit context-sensitive grammars. But used together, we gain both partiality and a flexible ease of use. Performance results demonstrate that partial parsing is often faster and more space-efficient than strict parsing, but never worse. The trade-off is that partiality has consequences when dealing with ill-formed input.
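The abstract's claim that applicative parsing alone "does not permit context-sensitive grammars" can be illustrated with a length-prefixed field, where a value produced earlier in the parse decides how much later input to consume. That data dependency needs monadic bind. The parser type and names below are illustrative, not the paper's combinators.

```haskell
import Control.Monad (replicateM)
import Data.Char (isDigit, digitToInt)

-- A deterministic toy parser: Nothing on failure.
newtype P a = P { runP :: String -> Maybe (a, String) }

instance Functor P where
  fmap f p = P $ \s -> fmap (\(a, r) -> (f a, r)) (runP p s)

instance Applicative P where
  pure a    = P $ \s -> Just (a, s)
  pf <*> pa = P $ \s -> do
    (f, s')  <- runP pf s
    (a, s'') <- runP pa s'
    Just (f a, s'')

instance Monad P where
  p >>= f = P $ \s -> do
    (a, s') <- runP p s
    runP (f a) s'

item :: P Char
item = P $ \s -> case s of
  (c:cs) -> Just (c, cs)
  []     -> Nothing

digit :: P Int
digit = P $ \s -> case s of
  (c:cs) | isDigit c -> Just (digitToInt c, cs)
  _                  -> Nothing

-- The count returned by `digit` determines the shape of the rest of the
-- parse. In an Applicative, the structure of the parser is fixed before
-- any input is seen, so this dependency requires (>>=).
lenPrefixed :: P String
lenPrefixed = digit >>= \n -> replicateM n item

main :: IO ()
main = print (runP lenPrefixed "3abcdef")  -- Just ("abc","def")
```

The paper's point is that each style has a complementary strength: applicative structure is static enough to stream partial results lazily, while monadic bind supplies context sensitivity like the example above, and the two can be mixed in one combinator set.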