## Parallel Dynamic Programming (1992)

Citations: 18 (1 self)

### BibTeX

```bibtex
@TECHREPORT{Galil92paralleldynamic,
  author      = {Zvi Galil and Kunsoo Park},
  title       = {Parallel Dynamic Programming},
  institution = {},
  year        = {1992}
}
```

### Abstract

We study the parallel computation of dynamic programming. We consider four important dynamic programming problems that have wide application and have been studied extensively in sequential computation: (1) the 1D problem, (2) the gap problem, (3) the parenthesis problem, and (4) the RNA problem. The parenthesis problem has fast parallel algorithms; almost no work has been done on parallelizing the other three. We present a unifying framework for the parallel computation of dynamic programming. We use two well-known methods, the closure method and the matrix product method, as general paradigms for developing parallel algorithms. Combined with various techniques, they lead to a number of new results. Our main results are optimal sublinear-time algorithms for the 1D, parenthesis, and RNA problems.

### Citations

2541 | The Design and Analysis of Computer Algorithms
- Aho, Hopcroft, et al.
- 1974
Citation Context: ...well-known methods, but go further to improve the total number of operations. In Sections 4, 5, 6 we use the closure method for Problems 2, 3, 4. 2. The Closure Method 2.1. Closed semirings Aho et al. [2] introduced a closed semiring SR = (S, +, ·, 0, 1), where S is a set of elements, + and · are binary operations on S, and 0, 1 are additive and multiplicative identities, respectively. A not...
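The closed-semiring abstraction in the snippet can be made concrete with its (min, +) instance, the semiring behind shortest-path-style dynamic programming. The sketch below is illustrative only, not code from the paper; the function names are my own.

```python
# A minimal sketch (assumption: the (min, +) instance) of a closed
# semiring (S, +, ·, 0, 1) = (R ∪ {∞}, min, +, ∞, 0).
INF = float("inf")

def semiring_add(a, b):
    return min(a, b)      # the semiring "+" is min; ∞ is its identity

def semiring_mul(a, b):
    return a + b          # the semiring "·" is ordinary addition; 0 is its identity

def mat_mul(A, B):
    """Matrix product over the (min, +) semiring."""
    n = len(A)
    C = [[INF] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] = semiring_add(C[i][j], semiring_mul(A[i][k], B[k][j]))
    return C
```

With an edge-weight matrix as input, squaring under this product computes cheapest two-step paths, which is why the closure of such a matrix yields all-pairs shortest paths.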

288 | A survey of parallel algorithms for shared-memory machines
- Karp, Ramachandran
- 1988
Citation Context: ...3. The parenthesis problem: O(n^{3/4} log n) time in optimal O(n^3) operations. 4. The RNA problem: O(√n log n) time in optimal O(n^4) operations. Our model of parallel computation is the CREW PRAM [6,14]. The PRAM has a collection of identical processors and a separate collection of memory cells, and any processor can access any memory cell in unit time. The CREW (concurrent read exclusive write) PRA...

240 | The parallel evaluation of general arithmetic expressions
- Brent
- 1974
Citation Context: ...in unit time. The CREW (concurrent read exclusive write) PRAM allows several processors to read the same memory cell at once, but disallows concurrent writes to a cell. The following theorem by Brent [4] is useful in analyzing parallel algorithms, since it allows us to count only the time and total number of operations. Theorem 1. [4] If a parallel computation can be performed in time t using q opera...

90 | RNA secondary structure: a complete mathematical analysis
- Waterman, Smith
- 1978
Citation Context: ...i, 0] and D[0, j] for 0 ≤ i, j ≤ n, compute D[i, j] = min_{0≤p<i, 0≤q<j} {D[p, q] + w(p, q, i, j)} for 1 ≤ i, j ≤ n. (4) This problem has been used to compute the secondary structure of RNA without multiple loops [19]. It will be called the RNA problem. A fifth important dynamic programming problem is the edit distance problem [1,3,9]. In the edit distance problem an entry of its dynamic programming table depends ...
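Recurrence (4) can be written down sequentially in a few lines, which makes its O(n^4) operation count visible (each of the n^2 entries minimizes over up to n^2 predecessors). This is my own sketch, not the paper's parallel algorithm; the cost function `w` and the border values are caller-supplied placeholders.

```python
# Sequential sketch of Recurrence (4), the "RNA problem":
#   D[i,j] = min over 0<=p<i, 0<=q<j of D[p,q] + w(p,q,i,j),
# given the border entries D[i,0] and D[0,j].
INF = float("inf")

def rna_dp(n, w, border):
    """border(i, j) supplies D[i,0] and D[0,j]; returns the (n+1)x(n+1) table."""
    D = [[INF] * (n + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        D[i][0] = border(i, 0)
        D[0][i] = border(0, i)
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            # every predecessor D[p][q] with p < i and q < j is already filled
            D[i][j] = min(D[p][q] + w(p, q, i, j)
                          for p in range(i) for q in range(j))
    return D
```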

76 | Finding the maximum, merging, and sorting in a parallel computation model
- Shiloach, Vishkin
- 1981
Citation Context: ...e with n/log n CREW processors. Faster CRCW algorithms for the four problems can be obtained by replacing the complexity of the minimum finding by O(log log n) time with n/log log n CRCW processors [17]. In the following two sections we give two general approaches, the closure method and the matrix product method, for the 1D problem. These approaches are based on well-known methods, but go further to...
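The O(log n)-time minimum finding the snippet builds on is a balanced-tree reduction: each round takes pairwise minima in parallel, halving the number of candidates. The sketch below is my own sequential simulation of those rounds, not code from the paper.

```python
# Simulated tree reduction for minimum finding: n values need about
# log2(n) "parallel" rounds, each round halving the candidate list.
def tree_min(values):
    vals = list(values)
    rounds = 0
    while len(vals) > 1:
        # one parallel round: every pair is reduced to its minimum
        vals = [min(vals[i], vals[i + 1]) if i + 1 < len(vals) else vals[i]
                for i in range(0, len(vals), 2)]
        rounds += 1
    return vals[0], rounds
```

On a PRAM the minima within one round are independent, so each round is one time step; the round count is what the cited O(log n) bound refers to.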

63 | Notes on searching in multidimensional monotone arrays
- Aggarwal, Park
- 1988
Citation Context: ...blem has been used to compute the secondary structure of RNA without multiple loops [19]. It will be called the RNA problem. A fifth important dynamic programming problem is the edit distance problem [1,3,9]. In the edit distance problem an entry of its dynamic programming table depends on O(1) entries. Note that an entry of D depends on O(n) entries in Problems 1, 2, 3, and on O(n^2) entries in Problem...

54 | Efficient parallel algorithms for string editing and related problems
- Apostolico, Atallah, et al.
- 1990
Citation Context: ...blem has been used to compute the secondary structure of RNA without multiple loops [19]. It will be called the RNA problem. A fifth important dynamic programming problem is the edit distance problem [1,3,9]. In the edit distance problem an entry of its dynamic programming table depends on O(1) entries. Note that an entry of D depends on O(n) entries in Problems 1, 2, 3, and on O(n^2) entries in Problem...

49 | Speeding up dynamic programming with applications to molecular biology
- Galil, Giancarlo
- 1989
Citation Context: ...[p, j] + w′(p, i)} for 0 ≤ i ≤ m, 0 ≤ j ≤ n. (2) We assume that m and n are of the same order of magnitude. This is the problem of computing the edit distance when allowing gaps of insertions and deletions [8]. It will be called the gap problem. The gap problem arises in molecular biology, geology, and speech recognition. Problem 3. Given w and D[i, i+1] for 0 ≤ i < n, compute D[i, j] = min_{i<r<j} {D[i, r] ...

35 | The least weight subsequence problem
- Hirschberg, Larmore
- 1987
Citation Context: ...ems. Problem 1. Given a real-valued function w and D[0], compute D[j] = min_{0≤i<j} {D[i] + w(i, j)} for 1 ≤ j ≤ n. (1) This problem was called the least weight subsequence problem by Hirschberg and Larmore [12]. It will be called the 1D problem. Its applications include an optimum paragraph formation problem and the problem of finding a minimum height B-tree. Problem 2. Given w, w′, s_ij, and D[0, 0] = 0...
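Recurrence (1) is short enough to state directly as code. The sequential O(n^2) sketch below is my own, with `w` as a caller-supplied cost function; the paper's contribution is parallelizing exactly this minimization.

```python
# Sequential O(n^2) sketch of the 1D / least weight subsequence problem:
#   D[j] = min over 0<=i<j of D[i] + w(i, j),  for 1 <= j <= n.
def one_d_dp(n, w, d0=0.0):
    D = [d0] + [0.0] * n
    for j in range(1, n + 1):
        # each entry depends on all earlier entries, i.e. O(n) predecessors
        D[j] = min(D[i] + w(i, j) for i in range(j))
    return D
```

In the paragraph-formation application, w(i, j) would be the badness of putting words i+1..j on one line, and D[n] the cost of the best line-break sequence.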

30 | Parallel Algorithmic Techniques for Combinatorial Computation
- Eppstein, Galil
- 1988
Citation Context: ...3. The parenthesis problem: O(n^{3/4} log n) time in optimal O(n^3) operations. 4. The RNA problem: O(√n log n) time in optimal O(n^4) operations. Our model of parallel computation is the CREW PRAM [6,14]. The PRAM has a collection of identical processors and a separate collection of memory cells, and any processor can access any memory cell in unit time. The CREW (concurrent read exclusive write) PRA...

30 | Efficient dynamic programming using quadrangle inequalities
- Yao
- 1980
Citation Context: ...the minimum cost of parenthesizing n elements. Its applications include the matrix chain product, the construction of optimal binary search trees, and the maximum perimeter inscribed polygon problem [20]. Problem 4. Given w and D[i, 0] and D[0, j] for 0 ≤ i, j ≤ n, compute D[i, j] = min_{0≤p<i, 0≤q<j} {D[p, q] + w(p, q, i, j)} for 1 ≤ i, j ≤ n. (4) This problem has been used to compute the secondary structure of ...
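In its matrix-chain instance, the parenthesis problem the snippet describes is the textbook O(n^3) dynamic program. The sketch below is that standard sequential version, not the paper's parallel algorithm; it assumes the k-th matrix has shape dims[k-1] x dims[k].

```python
# Sequential O(n^3) sketch of the parenthesis recurrence
#   D[i,j] = min over i<r<j of D[i,r] + D[r,j] + cost of the final merge,
# instantiated for matrix chain multiplication.
def matrix_chain(dims):
    n = len(dims) - 1                      # number of matrices in the chain
    D = [[0] * (n + 1) for _ in range(n + 1)]   # D[i][i+1] = 0 is the base case
    for length in range(2, n + 1):         # solve shorter chains first
        for i in range(0, n - length + 1):
            j = i + length
            # try every split point r; merging the halves multiplies a
            # dims[i] x dims[r] result by a dims[r] x dims[j] result
            D[i][j] = min(D[i][r] + D[r][j] + dims[i] * dims[r] * dims[j]
                          for r in range(i + 1, j))
    return D[0][n]
```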

29 | Dynamic programming with convexity, concavity and sparsity
- Galil, Park
- 1992
Citation Context: ...blem has been used to compute the secondary structure of RNA without multiple loops [19]. It will be called the RNA problem. A fifth important dynamic programming problem is the edit distance problem [1,3,9]. In the edit distance problem an entry of its dynamic programming table depends on O(1) entries. Note that an entry of D depends on O(n) entries in Problems 1, 2, 3, and on O(n^2) entries in Problem...

26 | Sequence comparison with mixed convex and concave costs
- Eppstein
- 1990
Citation Context: ...Theorem 2. The 1D problem is solved in O(√n log n) time using optimal O(n^2) operations. 2.4. Case w(i, j) = g(j − i) When the cost function w(i, j) is a function of the difference j − i [5], we can solve the 1D problem more efficiently. In this case H becomes an upper Toeplitz matrix (i.e., H(i, j) = H(0, j − i) for all i ≤ j). Lemma 4. For any k ≥ 1, H^k is upper Toeplitz. Proof. By ...

26 | On Efficient Parallel Computation for Some Dynamic Programming Problems
- Rytter
- 1988
Citation Context: ...rom Problems 1, 2, 4 because this problem is to find a binary tree of minimum cost, while the three problems are to find a path of minimum length. First, we give a clean version of Rytter's algorithm [16], and then improve the total number of operations. Let f(i, j) be the cost of the optimal binary tree rooted at (i, j) (i.e., f(i, j) = D[i, j]). We define a partial tree T to be a tree rooted at some...

17 | Parallel dynamic programming
- Huang, Liu, et al.
- 1990
Citation Context: ...in the case of CRCW processors. For the parenthesis problem Rytter gave an O(log^2 n) time algorithm with n^6/log n CREW processors, which was later improved to n^6/log^5 n by Viswanathan et al. [18]. Huang et al. [13] also gave an O(√n log n) time algorithm with n^{3.5}/log n CREW processors. To the best of our knowledge, no work has explicitly dealt with Problems 1, 2, 4. Greenberg et al. [10...

13 | An efficient formula for linear recurrences
- Fiduccia
- 1985
Citation Context: ...tive identity). Thus A_i = A for all i. Now we can compute B = A^n by successive squarings, but we still have O(n^3) operations because the last squaring itself requires O(n^3) operations. Lemma 5. [7] Let v_{1−n} = (0, ..., 0, 1) and v_{i+1} = A · v_i for i ≥ 1 − n. Then A^n = [v_n, v_{n−1}, ..., v_1]. Thus the last column of A^n is v_1 = (a_1, ..., a_n), and A^n is compute...
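The "successive squarings" step in the snippet is the standard matrix power with O(log n) multiplications; Fibonacci via its 2x2 companion matrix is the simplest instance. This is an illustrative sketch of that baseline step only, not Fiduccia's improved formula.

```python
# Matrix power by repeated squaring: A^p in O(log p) matrix products.
def mat_prod(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(A, p):
    n = len(A)
    R = [[int(i == j) for j in range(n)] for i in range(n)]  # identity
    while p:
        if p & 1:
            R = mat_prod(R, A)   # fold in this power-of-two factor
        A = mat_prod(A, A)       # square: A, A^2, A^4, ...
        p >>= 1
    return R

def fib(n):
    """F(0)=0, F(1)=1, read off the companion matrix power [[1,1],[1,0]]^n."""
    if n == 0:
        return 0
    return mat_pow([[1, 1], [1, 0]], n)[0][1]
```

As the snippet notes, squaring n x n matrices this way still costs O(n^3) per product; Fiduccia's lemma avoids the last full squaring by recovering A^n column by column from the iterates v_i.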

8 | A Sublinear Parallel Algorithm for Some Dynamic Programming Problems
- Huang, Liu, et al.
- 1992
Citation Context: ...W processors. For the parenthesis problem Rytter gave an O(log^2 n) time algorithm with n^6/log n CREW processors, which was later improved to n^6/log^5 n by Viswanathan et al. [18]. Huang et al. [13] also gave an O(√n log n) time algorithm with n^{3.5}/log n CREW processors. To the best of our knowledge, no work has explicitly dealt with Problems 1, 2, 4. Greenberg et al. [10] solved a linear r...

7 | Deriving Very Efficient Algorithms for Evaluating Linear Recurrence Relations Using the Program Transformation Technique
- Pettorossi, Burstall
- 1982
Citation Context: ...nother method called program transformation: If we transform Recurrence 1 so that D[2j] depends on D[j], D[j−1], ..., D[0], then we can compute D in O(log^2 n) time. Pettorossi and Burstall [15] derived such a transformation, but for the 1D problem the computation based on the transformation turns out to be equivalent to that of the matrix product method. There are many open problems. Among th...

6 | Computing Fibonacci Numbers (and Similarly Defined Functions)
- Gries, Levin
Citation Context: ...) and v_{i+1} = A · v_i for i ≥ 1 − n. Then A^n = [v_n, v_{n−1}, ..., v_1]. Thus the last column of A^n is v_1 = (a_1, ..., a_n), and A^n is computed by the following rules (also in [11]): 1. For i < n and j < n, A^n(i, j) = a_i · A^n(1, j+1) + A^n(i+1, j+1). 2. For j < n, A^n(n, j) = a_n · A^n(1, j+1). These rules give a dynamic programming recurrence for comput...

5 | Efficient parallel algorithms for linear recurrence computation
- Greenberg, Ladner, et al.
- 1982
Citation Context: ...18]. Huang et al. [13] also gave an O(√n log n) time algorithm with n^{3.5}/log n CREW processors. To the best of our knowledge, no work has explicitly dealt with Problems 1, 2, 4. Greenberg et al. [10] solved a linear recurrence in O(log^2 n) time with n^3/log^2 n CREW processors, which solves the 1D problem as a special case in the same complexity. In this paper we present a unifying framework f...