Results 1-10 of 11
NESL: A Nested Data-Parallel Language
 Carnegie Mellon University
, 1992
Abstract

Cited by 144 (4 self)
This report describes NESL, a strongly-typed, applicative, data-parallel language. NESL is intended to be used as a portable interface for programming a variety of parallel and vector supercomputers, and as a basis for teaching parallel algorithms. Parallelism is supplied through a simple set of data-parallel constructs based on vectors, including a mechanism for applying any function over the elements of a vector in parallel, and a broad set of parallel functions that manipulate vectors. NESL fully supports nested vectors and nested parallelism: the ability to take a parallel function and then apply it over multiple instances in parallel. Nested parallelism is important for implementing algorithms with complex and dynamically changing data structures, such as required in many graph or sparse matrix algorithms. NESL also provides a mechanism for calculating the asymptotic running time for a program on various parallel machine models, including the parallel random access machine (PRAM).
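The apply-over-elements construct the abstract describes can be illustrated in ordinary Python; this is a sequential sketch of the semantics only (the function names are ours, not NESL's), since in NESL each application may actually run in parallel:

```python
# Illustrative sketch (not NESL): the semantics of an apply-to-each
# construct, modeled sequentially. In NESL the applications are
# independent and may execute in parallel; here we only model the result.

def apply_to_each(f, xs):
    """Apply f to every element of xs; the applications are independent."""
    return [f(x) for x in xs]

# Nested parallelism: the function applied "in parallel" is itself parallel.
# Example: sum each row of a nested (ragged) sequence of numbers.
def row_sums(rows):
    return apply_to_each(sum, rows)   # outer apply over the rows

nested = [[1, 2, 3], [4], [5, 6]]     # ragged nested sequence
print(row_sums(nested))               # -> [6, 4, 11]
```

The point of the nested form is that both the outer apply over rows and each inner `sum` expose parallelism, which matters for ragged structures such as adjacency lists or sparse matrix rows.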
NESL: A nested data-parallel language (version 2.6)
, 1993
Abstract

Cited by 97 (7 self)
The views and conclusions contained in this document are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of Wright Laboratory or the U.S. Government. Keywords: Data-parallel, parallel algorithms, supercomputers, nested parallelism. This report describes Nesl, a strongly-typed, applicative, data-parallel language. Nesl is intended to be used as a portable interface for programming a variety of parallel and vector computers, and as a basis for teaching parallel algorithms. Parallelism is supplied through a simple set of data-parallel constructs based on sequences, including a mechanism for applying any function over the elements of a sequence in parallel and a rich set of parallel functions that manipulate sequences. Nesl fully supports nested sequences and nested parallelism: the ability to take a parallel function and apply it over multiple instances in parallel. Nested parallelism is important for implementing algorithms with irregular nested loops (where the inner loop lengths depend on the outer iteration) and for divide-and-conquer algorithms. Nesl also provides a performance model for calculating the asymptotic performance of a program on
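The performance model mentioned at the end of the abstract is commonly presented as a work/depth model; the combining rules below are an illustrative toy version (the rule names and the `+1` overheads are our assumptions, not NESL's actual cost semantics): for a parallel apply-to-each, work adds across elements while depth takes the maximum, since the applications can proceed simultaneously.

```python
# Toy sketch of a work/depth cost model in the spirit of a nested
# data-parallel performance model (illustrative rules, not NESL's own).

def cost_apply_to_each(costs):
    """Combine per-element (work, depth) pairs for one parallel apply.

    costs: list of (work, depth) tuples, one per element.
    Work is additive; depth is the maximum, because the element
    applications run side by side. The +1 charges the apply itself.
    """
    work = sum(w for w, _ in costs) + 1
    depth = max((d for _, d in costs), default=0) + 1
    return work, depth

# Example: summing each row of [[1,2,3],[4],[5,6]], charging a naive
# sequential sum (work = depth = row length) per element.
rows = [[1, 2, 3], [4], [5, 6]]
per_row = [(len(r), len(r)) for r in rows]
print(cost_apply_to_each(per_row))  # work = 3+1+2+1 = 7, depth = 3+1 = 4
```

Under these rules the depth of the whole apply is governed by the longest row, which is why ragged nested loops still get an asymptotic benefit from the outer parallelism.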
Approximate Pattern Matching with Samples
 In Proc. of ISAAC'94
, 1994
Abstract

Cited by 24 (1 self)
We simplify in this paper the algorithm by Chang and Lawler for the approximate string matching problem, by adopting the concept of sampling. We give a more general analysis of expected time for the simplified algorithm in the one-dimensional case under a non-uniform probability distribution, and we show that our method can easily be generalized to the two-dimensional approximate pattern matching problem with sublinear expected time. 1 Introduction Since the inaugural papers on string matching algorithms were published by Knuth, Morris and Pratt [11] and Boyer and Moore [5], the problem diversified into various directions. Let us call string matching one-dimensional pattern matching. One direction is two-dimensional pattern matching and another is approximate pattern matching, where up to k differences are allowed for a match. Yet another theme is two-dimensional approximate pattern matching. There are numerous papers in these new research areas. We cite just a few of them to compare...
Optimally Fast Parallel Algorithms for Preprocessing and Pattern Matching in One and Two Dimensions
, 1993
Abstract

Cited by 19 (10 self)
All algorithms below are optimal alphabet-independent parallel CRCW PRAM algorithms. In one dimension: Given a pattern string of length m for the string-matching problem, we design an algorithm that computes a deterministic sample of a sufficiently long substring in constant time. This problem used to be a bottleneck in the pattern preprocessing for one- and two-dimensional pattern matching. The best previous time bound was O(log² m / log log m). We use this algorithm to obtain the following results.
1. Improving the preprocessing of the constant-time text search algorithm [12] from O(log² m / log log m) to O(log log m), which is now best possible.
2. A constant-time deterministic string-matching algorithm in the case that the text length n satisfies n = Ω(m^(1+ε)) for a constant ε > 0.
3. A simple probabilistic string-matching algorithm that has constant time with high probability for random input.
4. A constant expected time Las Vegas algorithm for computing t...
Work-Time-Optimal Parallel Algorithms for String Problems (Extended Abstract)
 In Proc. 27th ACM Symp. on the Theory of Computing
, 1995
Abstract

Cited by 4 (2 self)
Artur Czumaj, Zvi Galil, Leszek Gąsieniec, Kunsoo Park, Wojciech Plandowski. Abstract. A parallel algorithm is work-optimal if it uses the smallest possible work; a work-optimal algorithm is work-time optimal if it also uses the smallest possible time. We design work-time-optimal algorithms for a number of string processing problems on the EREW PRAM and the hypercube. They include string matching and two-dimensional pattern matching. No such algorithms have been known before for any of these problems. 1 Introduction We call a parallel algorithm work-optimal if it has the smallest possible work. Notice that this definition is stricter than the one requiring only the same work as the best known sequential algorithm, and it requires proving a lower bound. In most cases work-optimality means either linear work or O(n log n) work because no higher lower bounds are known. We call a work-optimal algo... (Heinz Nixdorf Institute, University of Paderborn, D-33095 Paderborn, Germany.)
Work-Time Optimal Parallel Prefix Matching (Extended Abstract)
 Proceedings of E.S.A
, 1994
Abstract

Cited by 1 (0 self)
Leszek Gąsieniec (Instytut Informatyki, Uniwersytet Warszawski, Banacha 2, 02-097 Warszawa, Poland) and Kunsoo Park (Department of Computer Engineering, Seoul National University, Seoul 151-742, Korea). Abstract. Consider the prefix matching problem: Given a pattern P of length m and a text T of length n, find for all positions i in T the longest prefix of P starting at i. We present a parallel algorithm for the prefix matching problem over general alphabets whose text search takes optimal O(α(m)) time and preprocessing takes optimal O(log log m) time, where α(m) is the inverse Ackermann function. An Ω(log log m) lower bound for the prefix matching problem is implied by the same lower bound for string matching. However, the lower bound applies only to preprocessing of the pattern, and the searching phase can be faster. We prove an Ω(α(m)) lower bound for any linear-work searching phase. Therefore our algorithm is work-time optimal in both prepr...
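The prefix matching problem stated in the abstract is easy to pin down with a naive sequential version (this O(nm) sketch is ours, for illustration only; the paper's contribution is the optimal parallel algorithm, not this loop):

```python
def prefix_match(pattern, text):
    """For each position i of text, return the length of the longest
    prefix of pattern that begins at text[i]. Naive O(n*m) sequential
    reference version of the problem definition."""
    out = []
    for i in range(len(text)):
        j = 0
        while (i + j < len(text) and j < len(pattern)
               and text[i + j] == pattern[j]):
            j += 1
        out.append(j)
    return out

print(prefix_match("aba", "ababaa"))  # -> [3, 0, 3, 0, 1, 1]
```

Note that string matching is the special case of asking at which positions the returned length equals m, which is why the string-matching lower bound carries over to prefix matching.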
Multiple Filtration and Approximate Pattern Matching
, 1995
Abstract
P. A. Pevzner and M. S. Waterman. Abstract. Given a text of length n and a query of length q, we present an algorithm for finding all locations of m-tuples in the text and in the query that differ by at most k mismatches. This problem is motivated by the dot-matrix constructions for sequence comparison and optimal oligonucleotide probe selection routinely used in molecular biology. In the case q = m the problem coincides with the classical approximate string matching with k mismatches problem. We present a new approach to this problem based on multiple hashing, which may have advantages over some sophisticated and theoretically efficient methods that have been proposed. This paper describes a two-stage process. The first stage (multiple filtration) uses a new technique to preselect roughly similar m-tuples. The second stage compares these m-tuples using an accurate method. We demonstrate the advantages of multiple filtration in comparison with other techniques for approximate pattern matching. Key Words. String matching, Computational molecular biology. 1. Introduction. Suppose we are given a string of length n, T[1..n], called the text, a shorter string of length q, Q[1..q], called the query, and integers k and m. The substring matching problem with k mismatches [CL] is to find all "starting"
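The filter-then-verify structure the abstract describes can be sketched with a single pigeonhole filter (deliberately simpler than the paper's multiple filtration, and assuming m ≥ k+1): if a text window matches the pattern with at most k mismatches, then at least one of k+1 disjoint blocks of the pattern occurs in the window exactly, so exact block matches generate all candidates, which a second stage verifies.

```python
# Hedged sketch of the filtration idea: a single pigeonhole filter plus
# exact verification (the paper's "multiple filtration" is stronger).
# Assumes len(pattern) >= k + 1 so each block is non-empty.

def mismatches(a, b):
    """Hamming distance of two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def k_mismatch_positions(text, pattern, k):
    """All i with mismatches(text[i:i+m], pattern) <= k: filter by
    exact matches of pattern blocks, then verify each candidate."""
    m = len(pattern)
    block = m // (k + 1)                   # length of each of k+1 blocks
    candidates = set()
    for b in range(k + 1):
        piece = pattern[b * block:(b + 1) * block]
        start = 0
        while True:
            j = text.find(piece, start)    # exact occurrence of the block
            if j < 0:
                break
            i = j - b * block              # implied window start
            if 0 <= i <= len(text) - m:
                candidates.add(i)
            start = j + 1
    # Verification stage: only surviving candidates are checked exactly,
    # so no false matches are reported.
    return sorted(i for i in candidates
                  if mismatches(text[i:i + m], pattern) <= k)

print(k_mismatch_positions("abcxbcabcabc", "abc", 1))  # -> [0, 3, 6, 9]
```

The filter is conservative (it may admit candidates that verification rejects) but never misses a true match, which is the property any filtration stage must preserve.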
Faster algorithms for string matching with k mismatches
, 2000
Abstract
Faster algorithms for string matching with k mismatches www.elsevier.com/locate/jalgor
© 1997 Springer-Verlag New York Inc. Detecting False Matches in String-Matching Algorithms
Abstract
Abstract. Consider a text string of length n, a pattern string of length m, and a match vector of length n which declares each location in the text to be either a mismatch (the pattern does not occur beginning at that location in the text) or a potential match (the pattern may occur beginning at that location in the text). Some of the potential matches could be false, i.e., the pattern may not occur beginning at some location in the text declared to be a potential match. We investigate the complexity of two problems in this context, namely, checking if there is any false match, and identifying all the false matches in the match vector. We present an algorithm on the CRCW PRAM that checks if there exists a false match in O(1) time using O(n) processors. This algorithm does not require preprocessing the pattern. Therefore, checking for false matches is provably simpler than string matching, since string matching takes Ω(log log m) time on the CRCW PRAM. We use this simple algorithm to convert the Karp–Rabin Monte Carlo type string-matching algorithm into a Las Vegas type algorithm without asymptotic loss in complexity. We also present an efficient algorithm for identifying all the false matches and, as a consequence, show that string-matching algorithms take Ω(log log m) time even given the flexibility to output a few false matches. Key Words. Parallel algorithms, Randomized (Las Vegas) string matching, Checking string-matching algorithms. 1. Introduction. Given
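The Monte Carlo to Las Vegas conversion mentioned in the abstract can be illustrated sequentially: run Karp-Rabin fingerprinting, then verify every fingerprint hit before reporting it, so no false match survives. This is our own sequential sketch with an arbitrarily chosen base and modulus; the paper's contribution is doing the checking step in parallel on a CRCW PRAM.

```python
# Sequential illustration of the Monte Carlo -> Las Vegas idea:
# Karp-Rabin rolling fingerprints propose candidates, and an explicit
# comparison verifies each one, so reported matches are never false.

def rabin_karp_las_vegas(text, pattern, base=256, mod=1_000_003):
    m, n = len(pattern), len(text)
    if m == 0 or n < m:
        return []
    h = pow(base, m - 1, mod)                 # weight of outgoing char
    p_hash = t_hash = 0
    for c in pattern:
        p_hash = (p_hash * base + ord(c)) % mod
    for c in text[:m]:
        t_hash = (t_hash * base + ord(c)) % mod
    matches = []
    for i in range(n - m + 1):
        # Verification step: fingerprint equality alone is Monte Carlo;
        # the direct comparison makes the output Las Vegas (always correct).
        if t_hash == p_hash and text[i:i + m] == pattern:
            matches.append(i)
        if i < n - m:                         # roll the window forward
            t_hash = ((t_hash - ord(text[i]) * h) * base
                      + ord(text[i + m])) % mod
    return matches

print(rabin_karp_las_vegas("abracadabra", "abra"))  # -> [0, 7]
```

Sequentially the verification costs O(m) per candidate in the worst case; the point of the paper is that checking all candidates for falseness can be done in O(1) parallel time with O(n) processors.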
Class Notes: Programming Parallel Algorithms
 Notes for the DAGS'93 School on Parallel Programming
, 1993
Abstract
These are the lecture notes for CS 15-840B, a hands-on class in programming parallel algorithms. The class was taught in the fall of 1992 by Guy Blelloch, using the programming language NESL. It stressed the clean and concise expression of a variety of parallel algorithms. About 35 graduate students attended the class, of whom 28 took it for credit. These notes were written by students in the class, and were then reviewed and organized by Guy Blelloch and Jonathan Hardwick. The sample NESL code has been converted from the older LISP-style syntax into the new ML-style syntax. These notes are not in a polished form, and probably contain several errors and omissions, particularly with respect to references in the literature. Corrections are welcome.