Results 1–7 of 7
NESL: A Nested Data-Parallel Language
, 1990
"... This report describes Nesl, a stronglytyped, applicative, dataparallel language. Nesl is intended to be used as a portable interface for programming a variety of parallel and vector supercomputers, and as a basis for teaching parallel algorithms. Parallelism is supplied through a simple set of dat ..."
Abstract

Cited by 134 (4 self)
This report describes NESL, a strongly-typed, applicative, data-parallel language. NESL is intended to be used as a portable interface for programming a variety of parallel and vector supercomputers, and as a basis for teaching parallel algorithms. Parallelism is supplied through a simple set of data-parallel constructs based on vectors, including a mechanism for applying any function over the elements of a vector in parallel, and a broad set of parallel functions that manipulate vectors. NESL fully supports nested vectors and nested parallelism: the ability to take a parallel function and then apply it over multiple instances in parallel. Nested parallelism is important for implementing algorithms with complex and dynamically changing data structures, such as required in many graph or sparse matrix algorithms. NESL also provides a mechanism for calculating the asymptotic running time for a program on various parallel machine models, including the parallel random access machine (PRAM...
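To make the nested-parallelism idea in this abstract concrete, here is a minimal Python sketch of a sparse matrix-vector product, the kind of irregular problem the abstract mentions. List comprehensions stand in (sequentially) for NESL's parallel apply-to-each; this is an illustrative analogue, not NESL code, and the function name and data layout are my own.

```python
# Sparse matrix-vector product in a nested data-parallel style.
# Each row is a sequence of (column, value) pairs; rows have varying
# lengths, which is exactly where nested parallelism pays off. The
# outer comprehension plays the role of NESL's apply-to-each over
# rows, and the inner sum applies within each row.
def sparse_matvec(rows, x):
    return [sum(v * x[c] for c, v in row) for row in rows]

# 3x3 matrix [[2,0,1],[0,3,0],[4,0,5]] stored sparsely:
rows = [[(0, 2.0), (2, 1.0)], [(1, 3.0)], [(0, 4.0), (2, 5.0)]]
print(sparse_matvec(rows, [1.0, 1.0, 1.0]))  # [3.0, 3.0, 9.0]
```

Because the inner work per row varies, a flat data-parallel model would need manual load balancing here; nested parallelism expresses it directly.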
NESL: A Nested Data-Parallel Language (Version 2.6)
, 1993
"... The views and conclusions contained in this document are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of Wright Laboratory or the U. S. Government. Keywords: Dataparallel, parallel algorithms, supe ..."
Abstract

Cited by 95 (7 self)
The views and conclusions contained in this document are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of Wright Laboratory or the U.S. Government. Keywords: Data-parallel, parallel algorithms, supercomputers, nested parallelism. This report describes NESL, a strongly-typed, applicative, data-parallel language. NESL is intended to be used as a portable interface for programming a variety of parallel and vector computers, and as a basis for teaching parallel algorithms. Parallelism is supplied through a simple set of data-parallel constructs based on sequences, including a mechanism for applying any function over the elements of a sequence in parallel and a rich set of parallel functions that manipulate sequences. NESL fully supports nested sequences and nested parallelism: the ability to take a parallel function and apply it over multiple instances in parallel. Nested parallelism is important for implementing algorithms with irregular nested loops (where the inner loop lengths depend on the outer iteration) and for divide-and-conquer algorithms. NESL also provides a performance model for calculating the asymptotic performance of a program on...
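The divide-and-conquer style mentioned above can be illustrated with quicksort, a standard nested-parallel teaching example, sketched here in Python. The two recursive calls are independent and would execute in parallel under nested parallelism; plain recursion stands in for that here, and the function name is my own.

```python
# Quicksort in the nested data-parallel style: the three filters are
# data-parallel operations over the sequence, and the two recursive
# calls are independent, so under nested parallelism both the filters
# and the recursion run in parallel. This sequential sketch only
# mirrors the structure; it is not NESL code.
def qsort(seq):
    if len(seq) <= 1:
        return seq
    pivot = seq[len(seq) // 2]
    less    = [x for x in seq if x < pivot]   # parallel filter
    equal   = [x for x in seq if x == pivot]  # parallel filter
    greater = [x for x in seq if x > pivot]   # parallel filter
    return qsort(less) + equal + qsort(greater)

print(qsort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```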
Approximate Pattern Matching with Samples
 In Proc. of ISAAC'94
, 1994
"... . We simplify in this paper the algorithm by Chang and Lawler for the approximate string matching problem, by adopting the concept of sampling. We have a more general analysis of expected time with the simplified algorithm for the onedimensional case under a nonuniform probability distribution ..."
Abstract

Cited by 22 (1 self)
In this paper we simplify the algorithm by Chang and Lawler for the approximate string matching problem by adopting the concept of sampling. We give a more general analysis of the expected time of the simplified algorithm for the one-dimensional case under a non-uniform probability distribution, and we show that our method can easily be generalized to the two-dimensional approximate pattern matching problem with sublinear expected time.
1 Introduction
Since the inaugural papers on string matching algorithms were published by Knuth, Morris and Pratt [11] and Boyer and Moore [5], the problem has diversified in various directions. Let us call string matching one-dimensional pattern matching. One direction is two-dimensional pattern matching and another is approximate pattern matching, where up to k differences are allowed for a match. Yet another theme is two-dimensional approximate pattern matching. There are numerous papers in these new research areas. We cite just a few of them to compare...
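For context, the k-differences problem that Chang and Lawler's algorithm (and its simplification above) accelerates can be stated with the standard O(nm) dynamic program. The sketch below is that baseline formulation in Python, not the sampling algorithm from the paper; the function name is my own.

```python
# The k-differences problem: report every text position where the
# pattern matches with at most k edits (insertions, deletions,
# substitutions). Standard column-by-column dynamic program; row 0 is
# kept at 0 so a match may start at any text position.
def k_diff_matches(pattern, text, k):
    m = len(pattern)
    col = list(range(m + 1))          # distances against empty text
    ends = []
    for j, t in enumerate(text, 1):
        prev = col[:]
        col[0] = 0                    # a match may start anywhere
        for i in range(1, m + 1):
            cost = 0 if pattern[i - 1] == t else 1
            col[i] = min(prev[i - 1] + cost,  # substitute / match
                         prev[i] + 1,         # insert into pattern
                         col[i - 1] + 1)      # delete from pattern
        if col[m] <= k:
            ends.append(j)            # occurrence ends at text[:j]
    return ends

print(k_diff_matches("abc", "xabcyabd", 1))
```

Chang and Lawler's contribution, and the simplification above, is to filter out most text regions so this quadratic work is only spent near plausible matches.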
Optimally Fast Parallel Algorithms for Preprocessing and Pattern Matching in One and Two Dimensions
, 1993
"... All algorithms below are optimal alphabetindependent parallel CRCW PRAM algorithms. In one dimension: Given a pattern string of length m for the stringmatching problem, we design an algorithm that computes a deterministic sample of a sufficiently long substring in constant time. This problem use ..."
Abstract

Cited by 19 (10 self)
All algorithms below are optimal alphabet-independent parallel CRCW PRAM algorithms. In one dimension: Given a pattern string of length m for the string-matching problem, we design an algorithm that computes a deterministic sample of a sufficiently long substring in constant time. This problem used to be a bottleneck in the pattern preprocessing for one- and two-dimensional pattern matching. The best previous time bound was O(log^2 m / log log m). We use this algorithm to obtain the following results.
1. Improving the preprocessing of the constant-time text search algorithm [12] from O(log^2 m / log log m) to O(log log m), which is now best possible.
2. A constant-time deterministic string-matching algorithm in the case that the text length n satisfies n = Ω(m^(1+ε)) for a constant ε > 0.
3. A simple probabilistic string-matching algorithm that has constant time with high probability for random input.
4. A constant expected time Las Vegas algorithm for computing t...
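The filtering role a sample plays can be sketched as follows: test only a few pattern positions at each alignment, and run the full comparison only on survivors. This Python sketch uses an arbitrary, caller-supplied sample set; the papers above construct deterministic samples with provable elimination guarantees, which this illustration does not attempt.

```python
# Filtering with a sample: at each alignment, first check only the
# sampled pattern positions (cheap), and verify the full pattern only
# at alignments that pass. The guarantee of a *deterministic* sample
# is that the cheap test eliminates almost all alignments; here the
# sample positions are arbitrary, so no such guarantee holds.
def find_with_sample(pattern, text, sample):
    m = len(pattern)
    hits = []
    for s in range(len(text) - m + 1):
        if all(text[s + i] == pattern[i] for i in sample):  # cheap test
            if text[s:s + m] == pattern:                    # full check
                hits.append(s)
    return hits

print(find_with_sample("abab", "ababab", [0, 3]))  # [0, 2]
```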
Work-Time-Optimal Parallel Algorithms for String Problems (Extended Abstract)
 In Proc. 27th ACM Symp. on the Theory of Computing
, 1995
"... ) Artur Czumaj Zvi Galil y Leszek G¸asieniec z Kunsoo Park x Wojciech Plandowski  Abstract A parallel algorithm is workoptimal if it uses the smallest possible work; a workoptimal algorithm is worktime optimal if it also uses the smallest possible time. We design worktimeoptimal al ..."
Abstract

Cited by 5 (2 self)
Artur Czumaj, Zvi Galil, Leszek Gąsieniec, Kunsoo Park, Wojciech Plandowski.
Abstract: A parallel algorithm is work-optimal if it uses the smallest possible work; a work-optimal algorithm is work-time optimal if it also uses the smallest possible time. We design work-time-optimal algorithms for a number of string processing problems on the EREW PRAM and the hypercube. They include string matching and two-dimensional pattern matching. No such algorithms have been known before for any of these problems.
1 Introduction
We call a parallel algorithm work-optimal if it has the smallest possible work. Notice that this definition is stricter than the one requiring only the same work as the best known sequential algorithm, and it requires proving a lower bound. In most cases work-optimality means either linear work or O(n log n) work because no higher lower bounds are known. We call a work-optimal algo...
(Heinz Nixdorf Institute, University of Paderborn, D-33095 Paderborn, Germany.)
Work-Time Optimal Parallel Prefix Matching (Extended Abstract)
 Proceedings of E.S.A
, 1994
"... ) Leszek Gasieniec 1? Kunsoo Park 2?? 1 Instytut Informatyki, Uniwersytet Warszawski Banacha 2, 02097 Warszawa, Poland 2 Department of Computer Engineering, Seoul National University Seoul 151742, Korea Abstract. Consider the prefix matching problem: Given a pattern P of length m and a ..."
Abstract

Cited by 1 (0 self)
Leszek Gąsieniec (1), Kunsoo Park (2). (1) Instytut Informatyki, Uniwersytet Warszawski, Banacha 2, 02-097 Warszawa, Poland. (2) Department of Computer Engineering, Seoul National University, Seoul 151-742, Korea.
Abstract. Consider the prefix matching problem: Given a pattern P of length m and a text T of length n, find for all positions i in T the longest prefix of P starting at i. We present a parallel algorithm for the prefix matching problem over general alphabets whose text search takes optimal O(α(m)) time and preprocessing takes optimal O(log log m) time, where α(m) is the inverse Ackermann function. An Ω(log log m) lower bound for the prefix matching problem is implied by the same lower bound for string matching. However, that lower bound applies only to preprocessing of the pattern, and the searching phase can be faster. We prove an Ω(α(m)) lower bound for any linear-work searching phase. Therefore our algorithm is work-time optimal in both prepr...
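The sequential analogue of this prefix matching problem is solved in linear time by the Z-algorithm: computing the Z-array of P + '#' + T yields, for each position of T, the longest prefix of P starting there. The sketch below is that standard sequential baseline in Python (assuming '#' occurs in neither string), not the parallel algorithm from the paper.

```python
# Z-array: z[i] = length of the longest common prefix of s and s[i:].
# Maintains the rightmost match window [l, r) to stay linear time.
def z_array(s):
    n = len(s)
    z = [0] * n
    z[0] = n
    l = r = 0
    for i in range(1, n):
        if i < r:
            z[i] = min(r - i, z[i - l])   # reuse earlier comparisons
        while i + z[i] < n and s[z[i]] == s[i + z[i]]:
            z[i] += 1                      # extend the match directly
        if i + z[i] > r:
            l, r = i, i + z[i]
    return z

# Prefix matching via the Z-array of P + '#' + T: entries past the
# separator give, per text position, the longest prefix of P there.
def prefix_matches(p, t):
    z = z_array(p + "#" + t)
    return [min(v, len(p)) for v in z[len(p) + 1:]]

print(prefix_matches("aba", "ababa"))  # [3, 0, 3, 0, 1]
```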
Class Notes: Programming Parallel Algorithms
 Notes for the DAGS'93 School on Parallel Programming
, 1993
"... These are the lecture notes for CS 15840B, a handson class in programming parallel algorithms. The class was taught in the fall of 1992 by Guy Blelloch, using the programming language NESL. It stressed the clean and concise expression of a variety of parallel algorithms. About 35 graduate students ..."
Abstract
These are the lecture notes for CS 15-840B, a hands-on class in programming parallel algorithms. The class was taught in the fall of 1992 by Guy Blelloch, using the programming language NESL. It stressed the clean and concise expression of a variety of parallel algorithms. About 35 graduate students attended the class, of whom 28 took it for credit. These notes were written by students in the class, and were then reviewed and organized by Guy Blelloch and Jonathan Hardwick. The sample NESL code has been converted from the older LISP-style syntax into the new ML-style syntax. These notes are not in a polished form, and probably contain several errors and omissions, particularly with respect to references in the literature. Corrections are welcome.