Results 1–10 of 23
Perspectives of Monge Properties in Optimization
1995
Cited by 54 (3 self)
Abstract:
An m × n matrix C is called a Monge matrix if c_ij + c_rs ≤ c_is + c_rj for all 1 ≤ i < r ≤ m, 1 ≤ j < s ≤ n. In this paper we present a survey on Monge matrices and related Monge properties and their role in combinatorial optimization. Specifically, we deal with the following three main topics: (i) fundamental combinatorial properties of Monge structures, (ii) applications of Monge properties to optimization problems, and (iii) recognition of Monge properties.
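The defining inequality above can be checked directly. A minimal sketch (naive brute-force verification; the function name is ours, and distance matrices of sorted points on a line are used as a classic example of a Monge family):

```python
def is_monge(C):
    """Brute-force check of the Monge property:
    C[i][j] + C[r][s] <= C[i][s] + C[r][j] for all i < r and j < s
    (0-indexed here; the survey states it 1-indexed)."""
    m, n = len(C), len(C[0])
    return all(
        C[i][j] + C[r][s] <= C[i][s] + C[r][j]
        for i in range(m) for r in range(i + 1, m)
        for j in range(n) for s in range(j + 1, n)
    )

# Distances between sorted points on a line form a Monge matrix.
xs, ys = (1, 3, 4), (0, 2, 5)
dist = [[abs(x - y) for y in ys] for x in xs]
```

This O(m²n²) check is only illustrative; recognizing Monge structure efficiently is itself topic (iii) of the survey.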
Improved Complexity Bounds For Location Problems On The Real Line
Operations Research Letters, 1991
Cited by 30 (3 self)
Abstract:
In this note we apply recent results in dynamic programming to improve the complexity bounds of several median and coverage location models on the real line.
Speeding up Dynamic Programming
In Proc. 29th Symp. Foundations of Computer Science, 1988
Cited by 19 (0 self)
Abstract:
In this paper we consider the problem of computing two similar recurrences: the one-dimensional case ...
Linear and O(n log n) Time Minimum-Cost Matching Algorithms for Quasiconvex Tours (Extended Abstract)
Cited by 17 (3 self)
Abstract:
Samuel R. Buss and Peter N. Yianilos. Let G be a complete, weighted, undirected, bipartite graph with n red nodes, n′ blue nodes, and symmetric cost function c(x, y). A maximum matching for G consists of min{n, n′} edges from distinct red nodes to distinct blue nodes. Our objective is to find a minimum-cost maximum matching, i.e., one for which the sum of the edge costs has minimal value. This is the weighted bipartite matching problem or, as it is sometimes called, the assignment problem.
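For very small instances, the assignment problem defined above can be solved by exhaustion. A hedged sketch, assuming n = n′ (the function name is ours; real solvers use Hungarian-style algorithms, not this exponential search):

```python
from itertools import permutations

def min_cost_assignment(cost):
    """Exhaustive minimum-cost perfect matching on an n x n cost matrix:
    try every assignment of red node i to blue node p[i].
    Exponential time; for illustration only."""
    n = len(cost)
    return min(sum(cost[i][p[i]] for i in range(n))
               for p in permutations(range(n)))
```

The paper's point is precisely that quasiconvex tour structure permits linear or O(n log n) time instead of general-purpose matching.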
Constructing Huffman Trees in Parallel
SIAM Journal on Computing, 1995
Cited by 9 (0 self)
Abstract:
We present a parallel algorithm for the Huffman Coding problem. We reduce the Huffman Coding problem to the Concave Least Weight Subsequence problem and give a parallel algorithm that solves the latter problem in O(√n log n) time with n processors on a CREW PRAM. This leads to the first sublinear-time, o(n²)-total-work parallel algorithm for Huffman Coding. This reduction of the Huffman Coding problem to the Concave Least Weight Subsequence problem also yields an alternative O(n log n)-time (or linear-time, for a sorted input sequence) algorithm for Huffman Coding.
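The linear-time claim for sorted input can be illustrated with the classical two-queue method (a folklore algorithm, shown here as a sketch; it is not the authors' reduction through the Concave Least Weight Subsequence problem):

```python
from collections import deque

def huffman_cost(sorted_weights):
    """Weighted cost of a Huffman tree (sum of all merge weights),
    assuming the weights arrive in nondecreasing order. Leaves and
    merged nodes live in two separately sorted queues, so every
    minimum extraction is O(1) and the whole run is O(n)."""
    leaves, merged = deque(sorted_weights), deque()

    def pop_min():
        if leaves and (not merged or leaves[0] <= merged[0]):
            return leaves.popleft()
        return merged.popleft()

    total = 0
    while len(leaves) + len(merged) > 1:
        a, b = pop_min(), pop_min()
        total += a + b
        merged.append(a + b)    # merge sums are produced in sorted order
    return total
```

The invariant is that merge sums are created in nondecreasing order, so the `merged` queue stays sorted without any heap.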
Approximate Regular Expression Pattern Matching with Concave Gap Penalties
Algorithmica, 1992
Cited by 9 (0 self)
Abstract:
Given a sequence A of length M and a regular expression R of length P, an approximate regular expression pattern matching algorithm computes the score of the optimal alignment between A and one of the sequences B exactly matched by R. An alignment between sequences A = a_1 a_2 … a_M and B = b_1 b_2 … b_N is a list of ordered pairs ⟨(i_1, j_1), (i_2, j_2), …, (i_t, j_t)⟩ such that i_k < i_{k+1} and j_k < j_{k+1}. In this case, the alignment aligns symbols a_{i_k} and b_{j_k}, and leaves blocks of unaligned symbols, or gaps, between them. A scoring scheme S associates costs for each aligned symbol pair and each gap. The alignment's score is the sum of the associated costs, and an optimal alignment is one of minimal score. There are a variety of schemes for scoring alignments. In a concave gap-penalty scoring scheme S = {δ, w}, a function δ(a, b) gives the score of each aligned pair of symbols a and b, and a concave function w(k) gives the score of a gap of length ...
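The scoring scheme S = {δ, w} described above can be sketched as a scorer for a fixed alignment (function name ours; treating unaligned blocks at the sequence ends as gaps is our assumption, since the abstract is truncated before settling that detail):

```python
def alignment_score(A, B, pairs, delta, w):
    """Score of a fixed alignment under scheme S = {delta, w}:
    delta(a, b) for each aligned pair, plus w(k) for each maximal
    block of k unaligned symbols. Indices in `pairs` are 1-based,
    matching the abstract's notation."""
    score = sum(delta(A[i - 1], B[j - 1]) for i, j in pairs)
    for seq, anchors in ((A, [i for i, _ in pairs]),
                         (B, [j for _, j in pairs])):
        bounds = [0] + anchors + [len(seq) + 1]
        for lo, hi in zip(bounds, bounds[1:]):
            k = hi - lo - 1      # unaligned symbols between consecutive anchors
            if k:
                score += w(k)
    return score
```

An example: aligning "acgt" with "agt" via pairs (1,1), (3,2), (4,3) leaves one gap of length 1 in the first sequence.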
Parallel Construction Of Optimal Alphabetic Trees
In Proceedings of the 5th ACM Symposium on Parallel Algorithms and Architectures, 1993
Cited by 6 (3 self)
Abstract:
A parallel algorithm is given which constructs an optimal alphabetic tree in O(log³ n) time with n² log n processors. The construction is basically a parallelization of the Garsia–Wachs version [5] of the Hu–Tucker algorithm [8]. The best previous NC algorithm for the problem uses n⁶/log^O(1) n processors [15]. Our method is an extension of techniques used first in [3] and later used in [13] for the Huffman coding problem, which can be viewed as the alphabetic tree problem for the special case of a monotone weight sequence. In this paper, we extend to the case of certain "almost monotone" sequences, which we call "sorted regular valleys." The processing of such subsequences depends on a quadrangle inequality, while the total number of global iterations depends on a kind of tree contraction. Altogether we can view our algorithmic approach as (quadrangle inequality + tree contraction). An optimal alphabetic tree is a special case of an optimal binary search tree where all...
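The quadrangle inequality the abstract leans on is the same fact behind Knuth's sequential speedup for optimal binary search trees, of which the alphabetic tree is a special case. A hedged sequential sketch (not the paper's parallel algorithm; successful searches only, and the function name is ours):

```python
def optimal_bst_cost(w):
    """O(n^2) DP for an optimal binary search tree over access weights
    w[0..n-1]. The quadrangle inequality implies the optimal roots are
    monotone, root[i][j-1] <= root[i][j] <= root[i+1][j], which shrinks
    the naive O(n^3) root search to O(n^2) total (Knuth's speedup)."""
    n = len(w)
    pref = [0]
    for x in w:
        pref.append(pref[-1] + x)
    cost = [[0] * (n + 2) for _ in range(n + 2)]
    root = [[0] * (n + 2) for _ in range(n + 2)]
    for i in range(1, n + 1):
        cost[i][i] = w[i - 1]
        root[i][i] = i
    for length in range(2, n + 1):
        for i in range(1, n - length + 2):
            j = i + length - 1
            best, arg = float("inf"), i
            for r in range(root[i][j - 1], root[i + 1][j] + 1):
                c = cost[i][r - 1] + cost[r + 1][j]
                if c < best:
                    best, arg = c, r
            cost[i][j] = best + pref[j] - pref[i - 1]  # every key pays one more level
            root[i][j] = arg
    return cost[1][n]
```

The monotone-root window is exactly the kind of structure that the paper's "quadrangle inequality + tree contraction" approach exploits in parallel.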
Approximation of Staircases By Staircases
1992
Cited by 5 (3 self)
Abstract:
The simplest nontrivial monotone functions are "staircases." The problem arises: what is the best approximation of some monotone function f(x) by a staircase with M jumps? In particular: what if f(x) is itself a staircase with N, N > M, steps? This paper considers algorithms for solving, and theorems relating to, this problem. All of the algorithms we propose are space-optimal up to a constant factor and also runtime-optimal except for at most a logarithmic factor. One application of our results is to "data compression" of probability distributions. We find yet another remarkable property of Monge's inequality, called the "concave cost as a function of zigzag number" theorem. This property leads to new ways to get speedups in certain one-dimensional dynamic programming problems satisfying this inequality. Keywords: Histograms, data compression, cumulative distribution functions, approximation, monotone functions, dynamic programming, Monge's quadrangle inequality, concave cost...
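The N-step-to-M-step problem above has a natural dynamic program. A sketch under stated assumptions (we minimize total absolute error, which may differ from the paper's exact measure; the naive O(M·N²) form shown is what the Monge/quadrangle structure would then speed up):

```python
def staircase_l1_error(values, M):
    """Approximate an N-step staircase (its step heights in `values`,
    nondecreasing) by a staircase with M steps, minimizing total
    absolute error. dp[m][j] = best error covering the first j steps
    with m new steps; segment medians are recomputed naively."""
    N = len(values)

    def seg_err(i, j):           # best single-step L1 error for values[i:j]
        block = sorted(values[i:j])
        med = block[(j - i) // 2]
        return sum(abs(v - med) for v in block)

    INF = float("inf")
    dp = [[INF] * (N + 1) for _ in range(M + 1)]
    dp[0][0] = 0
    for m in range(1, M + 1):
        for j in range(1, N + 1):
            dp[m][j] = min(dp[m - 1][i] + seg_err(i, j) for i in range(j))
    return dp[M][N]
```

For instance, compressing the four steps [1, 2, 8, 9] to two steps naturally splits them into {1, 2} and {8, 9}.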
Efficient Algorithms for Sequence Analysis with Concave and Convex Gap Costs
1989
Cited by 4 (0 self)
Abstract:
David A. Eppstein. We describe algorithms for two problems in sequence analysis: sequence alignment with gaps (multiple consecutive insertions and deletions treated as a unit) and RNA secondary structure with single loops only. We make the assumption that the gap cost or loop cost is a convex or concave function of the length of the gap or loop, and show how this assumption may be used to develop efficient algorithms for these problems. We show how the restriction to convex or concave functions may be relaxed, and give algorithms for solving the problems when the cost functions are neither convex nor concave, but can be split into a small number of convex or concave functions. Finally we point out some sparsity in the structure of our sequence analysis problems, and describe how we may take advantage of that sparsity to further speed up our algorithms.
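The baseline these methods improve is the naive gap-cost alignment recurrence. A hedged sketch of that baseline (function name ours; this O(M·N·(M+N)) form makes no convexity assumption, which is exactly the cost structure the paper's algorithms exploit):

```python
def align_gap_cost(A, B, delta, w):
    """Naive global alignment with arbitrary gap cost w(k).
    D[i][j] = optimal score aligning A[:i] with B[:j]; each cell tries
    a substitution plus every possible gap length ending there."""
    M, N = len(A), len(B)
    INF = float("inf")
    D = [[INF] * (N + 1) for _ in range(M + 1)]
    D[0][0] = 0
    for i in range(M + 1):
        for j in range(N + 1):
            if i and j:          # align A[i-1] with B[j-1]
                D[i][j] = min(D[i][j], D[i - 1][j - 1] + delta(A[i - 1], B[j - 1]))
            for k in range(1, i + 1):   # gap: skip k symbols of A
                D[i][j] = min(D[i][j], D[i - k][j] + w(k))
            for k in range(1, j + 1):   # gap: skip k symbols of B
                D[i][j] = min(D[i][j], D[i][j - k] + w(k))
    return D[M][N]
```

With concave or convex w, the inner minimizations over k collapse via matrix-searching techniques, which is the source of the paper's speedups.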
Online Maintenance of k-Medians and k-Covers on a Line
Cited by 4 (2 self)
Abstract:
The standard dynamic programming solution to finding k medians on a line with n nodes requires O(kn²) time. Dynamic programming speedup techniques, e.g., use of the quadrangle inequality or properties of totally monotone matrices, can reduce this to O(kn) time, but these techniques are inherently static. The major result of this paper is to show that we can maintain the dynamic programming speedup in an online setting where points are added from left to right on a line. Computing the ...
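The O(kn²) baseline the abstract refers to can be sketched directly (function name ours; points are assumed sorted, and one cluster of consecutive points is served from its median):

```python
def k_median_line(xs, k):
    """Standard O(k * n^2) DP for k-median on a sorted line of points.
    dp[m][j] = optimal cost of serving the first j points with m medians;
    prefix sums give each contiguous cluster's cost in O(1)."""
    n = len(xs)
    pref = [0]
    for x in xs:
        pref.append(pref[-1] + x)

    def cluster_cost(i, j):      # sum of distances in xs[i:j] to its median
        m = (i + j) // 2
        left = xs[m] * (m - i) - (pref[m] - pref[i])
        right = (pref[j] - pref[m + 1]) - xs[m] * (j - m - 1)
        return left + right

    INF = float("inf")
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0
    for m in range(1, k + 1):
        for j in range(1, n + 1):
            dp[m][j] = min(dp[m - 1][i] + cluster_cost(i, j) for i in range(j))
    return dp[k][n]
```

The inner minimization over the split point i is where quadrangle-inequality or totally-monotone-matrix techniques cut the work to O(kn); the paper's contribution is keeping that speedup as points arrive online.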