Results 1 - 7 of 7
Structuring Depth-First Search Algorithms in Haskell
, 1995
Abstract

Cited by 26 (0 self)
Depth-first search is the key to a wide variety of graph algorithms. In this paper we express depth-first search in a lazy functional language, obtaining a linear-time implementation. Unlike traditional imperative presentations, we use the structuring methods of functional languages to construct algorithms from individual reusable components. This style of algorithm construction turns out to be quite amenable to formal proof, which we exemplify through a calculational-style proof of a far-from-obvious strongly-connected components algorithm. Classifications: Computing Paradigms (functional programming); Environments, Implementations, and Experience (programming, graph algorithms).
1 Introduction
The importance of depth-first search (DFS) for graph algorithms was established twenty years ago by Tarjan (1972) and Hopcroft and Tarjan (1973) in their seminal work. They demonstrated how depth-first search could be used to construct a variety of efficient graph algorithms. In practice, this...
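The same component style survives today in GHC's containers library as Data.Graph, whose design follows this line of work. A minimal sketch assuming that module (the graph and vertex names are illustrative):

```haskell
import Data.Graph (Graph, buildG, dff, topSort)
import Data.Tree (rootLabel)

-- A small DAG on vertices 0..3: 0 -> 1, 0 -> 2, 1 -> 3, 2 -> 3.
g :: Graph
g = buildG (0, 3) [(0, 1), (0, 2), (1, 3), (2, 3)]

main :: IO ()
main = do
  -- The depth-first spanning forest is the reusable component:
  -- here every vertex is reachable from 0, so there is one tree.
  print (map rootLabel (dff g))
  -- Topological sorting falls out of the same forest.
  print (topSort g)
```

Laziness means the forest is only built as far as each consumer demands it, which is how the individual components compose without losing the linear-time bound.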
Inductive Graphs and Functional Graph Algorithms
, 2001
Abstract

Cited by 16 (2 self)
We propose a new style of writing graph algorithms in functional languages which is based on an alternative view of graphs as inductively defined data types. We show how this graph model can be implemented efficiently, and then we demonstrate how graph algorithms can be succinctly given by recursive function definitions based on the inductive graph view. We also regard this as a contribution to the teaching of algorithms and data structures in functional languages since we can use the functional-style graph algorithms instead of the imperative algorithms that are dominant today.
Keywords: Graphs in Functional Languages, Recursive Graph Algorithms, Teaching Graph Algorithms in Functional Languages
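A toy rendering of the inductive view (the constructor names here are illustrative, not the actual API of the paper's library): a graph is either empty or a node context joined onto a smaller graph, so graph algorithms become ordinary recursive definitions.

```haskell
type Node = Int

-- A context records a node with its incoming and outgoing neighbours.
data Context = Context { preds :: [Node], node :: Node, succs :: [Node] }

-- A graph is either empty or a context "and" the rest of the graph.
data Graph = Empty | Context :& Graph
infixr 5 :&

-- Pattern matching on the decomposition replaces visited-marking:
nodes :: Graph -> [Node]
nodes Empty    = []
nodes (c :& g) = node c : nodes g

-- Reversing every edge is a one-line recursion over contexts.
grev :: Graph -> Graph
grev Empty                = Empty
grev (Context p v s :& g) = Context s v p :& grev g

g1 :: Graph
g1 = Context [] 1 [2] :& Context [1] 2 [] :& Empty

main :: IO ()
main = print (nodes g1, nodes (grev g1))
```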
Lazy Depth-First Search and Linear Graph Algorithms in Haskell
 Glasgow Workshop on Functional Programming
, 1994
Abstract

Cited by 6 (0 self)
Depth-first search is the key to a wide variety of graph algorithms. In this paper we explore the implementation of depth-first search in a lazy functional language. For the first time in such languages we obtain a linear-time implementation. But we go further. Unlike traditional imperative presentations, algorithms are constructed from individual components, which may be reused to create new algorithms. Furthermore, the style of program is quite amenable to formal proof, which we exemplify through a calculational-style proof of a strongly-connected components algorithm.
1 Introduction
Graph algorithms have long been a challenge to programmers of lazy functional languages. It has not been at all clear how to express such algorithms without using side effects to achieve efficiency. For example, many texts provide implementations of search algorithms which are quadratic in the size of the graph (see Paulson (1991), Holyer (1991), or Harrison (1993)), compared with the standard linear im...
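The component reuse described here can be made concrete with GHC's Data.Graph, which descends from this work. A hedged sketch of the strongly-connected components algorithm as a pipeline of reusable parts (the postorder helper is defined locally, not taken from the library):

```haskell
import Data.Graph (Graph, Vertex, buildG, dff, dfs, transposeG)
import Data.List (sort)
import Data.Tree (Forest, Tree (..), flatten)

-- Post-order of a depth-first tree: children first, then the root.
postorder :: Tree a -> [a]
postorder (Node a ts) = concatMap postorder ts ++ [a]

-- Strongly-connected components by composing components: search the
-- transposed graph, reverse its post-order, and use that order to
-- direct a second search of the original graph. Each resulting tree
-- spans exactly one strongly-connected component.
sccForest :: Graph -> Forest Vertex
sccForest g = dfs g (reverse (concatMap postorder (dff (transposeG g))))

main :: IO ()
main = do
  -- Two components: {0,1} and {2,3}.
  let g = buildG (0, 3) [(0, 1), (1, 0), (1, 2), (2, 3), (3, 2)]
  print (sort (map (sort . flatten) (sccForest g)))
```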
Functional Graph Algorithms with Depth-First Search (Preliminary Summary)
 in Glasgow Functional Programming Workshop
, 1993
Abstract

Cited by 5 (0 self)
Performing a depth-first search of a graph is one of the fundamental approaches for solving a variety of graph algorithms. Implementing depth-first search efficiently in a pure functional language has only become possible with the advent of imperative functional programming. In this paper we mix the techniques of pure functional programming in the same cauldron as depth-first search to yield a more lucid approach to viewing a variety of graph algorithms. This claim will be illustrated with several examples.
1 Introduction
Graph algorithms have long been a challenge to functional programmers. It has not been at all clear how to express such algorithms without using side effects to achieve efficiency. For example, many texts provide implementations of search algorithms which are quadratic in the size of the graph (see [7, 3, 2], for instance), compared with the standard linear implementations given for imperative languages (see [1], for instance). In this paper we implement a variety of ...
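The "imperative functional programming" relied on here is monadic state with locally encapsulated effects. A minimal sketch, assuming GHC's ST monad and mutable unboxed arrays (module names as in today's array package):

```haskell
import Control.Monad.ST (ST, runST)
import Data.Array (Array, bounds, indices, listArray, (!))
import Data.Array.ST (STUArray, newArray, readArray, writeArray)

-- Adjacency-list representation: vertex i maps to its successors.
type Graph = Array Int [Int]

-- Depth-first pre-order in linear time: the "visited" array is
-- mutated inside runST, but dfsOrder itself is a pure function.
dfsOrder :: Graph -> [Int]
dfsOrder g = runST $ do
    visited <- newArray (bounds g) False
    concat <$> mapM (go visited) (indices g)
  where
    go :: STUArray s Int Bool -> Int -> ST s [Int]
    go visited v = do
      seen <- readArray visited v
      if seen
        then return []
        else do
          writeArray visited v True
          rest <- mapM (go visited) (g ! v)
          return (v : concat rest)

main :: IO ()
main = print (dfsOrder (listArray (0, 3) [[1, 2], [3], [3], []]))
```

The mutation never escapes runST, so callers see an ordinary pure function with the imperative linear-time bound.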
The resource constrained shortest path problem implemented in a lazy functional language
 Journal of Functional Programming
, 1994
Abstract

Cited by 4 (0 self)
The resource constrained shortest path problem is an NP-hard problem for which many ingenious algorithms have been developed. These algorithms are usually implemented in FORTRAN or another imperative programming language. We have implemented some of the simpler algorithms in a lazy functional language. Benefits accrue in the software engineering of the implementations. Our implementations have been applied to a standard benchmark of data files, which is available from the Operational Research Library of Imperial College, London. The performance of the lazy functional implementations, even with the comparatively simple algorithms that we have used, is competitive with a reference FORTRAN implementation.
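The simpler labelling algorithms for this problem share one core operation: Pareto dominance between (cost, resource) labels at a node. A hypothetical sketch of that operation (the names and the single-resource setup are illustrative, not taken from the paper):

```haskell
-- A label records the cost and the resource consumed along one
-- partial path to a node.
type Label = (Int, Int)  -- (cost, resource)

-- One label dominates another when it is no worse in both
-- dimensions and strictly better in at least one.
dominates :: Label -> Label -> Bool
dominates (c1, r1) (c2, r2) = c1 <= c2 && r1 <= r2 && (c1, r1) /= (c2, r2)

-- Keep a candidate only if it is undominated, and prune any
-- existing labels it renders obsolete.
insertLabel :: Label -> [Label] -> [Label]
insertLabel l ls
  | any (`dominates` l) ls = ls
  | otherwise              = l : filter (not . dominates l) ls

main :: IO ()
main = print (insertLabel (3, 2) [(2, 3), (4, 1)])
```

Pruning dominated labels is what keeps the label sets small enough for the NP-hard problem to be tractable on benchmark instances.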
Mapping and Visiting in Functional and Object-oriented Programming
, 2008
Abstract

Cited by 3 (3 self)
Mapping and visiting represent different programming styles for traversals of collections of data. Mapping is rooted in the functional programming paradigm, and visiting is rooted in the object-oriented programming paradigm. This paper explores the similarities and differences between mapping and visiting, seen across the traditions in the two different programming paradigms. The paper is concluded with recommendations for mapping and visiting in programming languages that support both the functional and the object-oriented paradigms.
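The contrast can be sketched in a functional language (a toy example, not drawn from the paper): mapping is a uniform structure-preserving traversal, while the object-oriented visitor corresponds to a record of one handler per constructor, i.e. a fold.

```haskell
data Expr = Lit Int | Add Expr Expr | Neg Expr

-- Mapping: rebuild the structure, transforming the data at the leaves.
mapLits :: (Int -> Int) -> Expr -> Expr
mapLits f (Lit n)   = Lit (f n)
mapLits f (Add l r) = Add (mapLits f l) (mapLits f r)
mapLits f (Neg e)   = Neg (mapLits f e)

-- Visiting: a visitor bundles one handler per variant, and accept
-- plays the role of the accept method that drives the traversal.
data Visitor a = Visitor
  { visitLit :: Int -> a
  , visitAdd :: a -> a -> a
  , visitNeg :: a -> a
  }

accept :: Visitor a -> Expr -> a
accept v (Lit n)   = visitLit v n
accept v (Add l r) = visitAdd v (accept v l) (accept v r)
accept v (Neg e)   = visitNeg v (accept v e)

-- An evaluating visitor.
evalV :: Visitor Int
evalV = Visitor { visitLit = id, visitAdd = (+), visitNeg = negate }

main :: IO ()
main = print (accept evalV (mapLits (* 2) (Add (Lit 1) (Neg (Lit 2)))))
```

Mapping returns the same shape with new leaves; visiting collapses the shape to a summary value, which is where the two styles most clearly diverge.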
Examination committee:
Abstract
Many object-oriented programming languages provide type safety by allowing programmers to introduce distinct object types. In the case of Java this also introduces a considerable or even prohibitive cost, especially when dealing with small objects over primitive types. Consequently, Java library implementations typically abuse primitive types and are not type safe in practice. We present a solution that allows type safety in Java with little, if any, performance penalty, hence allowing for development of safe and efficient applications and libraries. The solution provides the safety of object-oriented code, but avoids all overhead when the full generality and expressive capabilities of objects are not required. This is accomplished by treating named objects as primitive types during compilation. This allows for reusable and easily maintainable Java code that rivals natively compiled languages in efficiency. The proposed technique differs from previous work in that the distinction between objects is made by name, rather than by implementation. Software implemented using the approach results in an order-of-magnitude improvement in execution speed and space use. It is likely that a native implementation and integration of our technique will also improve compilation time and ease of use, thus encouraging developers to use opaque object types in Java.
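For comparison with the functional entries above (an analogy, not the paper's technique): types that are distinct by name at compile time yet cost-free at run time are what Haskell's newtype provides, where the wrapper is erased during compilation.

```haskell
-- Metres and Seconds are distinct to the type checker, so mixing
-- them up is a compile-time error, but both erase to a bare Double
-- at run time: no wrapper object is ever allocated.
newtype Metres  = Metres Double
newtype Seconds = Seconds Double

speed :: Metres -> Seconds -> Double
speed (Metres d) (Seconds t) = d / t

main :: IO ()
main = print (speed (Metres 100) (Seconds 20))
```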