Results 1-10 of 62
Approximation Algorithms for Disjoint Paths Problems
1996
Cited by 139 (0 self)
Abstract: The construction of disjoint paths in a network is a basic issue in combinatorial optimization: given a network and specified pairs of nodes in it, we are interested in finding disjoint paths between as many of these pairs as possible. This leads to a variety of classical NP-complete problems for which very little is known from the point of view of approximation algorithms. It has recently been brought into focus in work on problems such as VLSI layout and routing in high-speed networks; in these settings, the current lack of understanding of the disjoint paths problem is often an obstacle to the design of practical heuristics.
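A natural baseline heuristic for this problem is to greedily route each pair along a shortest path and remove the used edges so later paths stay edge-disjoint. The sketch below illustrates that idea only; the function name and graph representation are made up here, and this is not the paper's algorithm:

```python
from collections import deque

def greedy_edge_disjoint_paths(adj, pairs):
    """Greedy heuristic: connect each pair via a BFS shortest path, then
    delete that path's edges so subsequent paths are edge-disjoint.
    `adj` maps node -> set of neighbours (undirected graph).
    Illustrative sketch only, not the paper's approximation algorithm."""
    adj = {u: set(vs) for u, vs in adj.items()}  # private mutable copy
    routed = []
    for s, t in pairs:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:           # BFS for a shortest s-t path
            u = q.popleft()
            for v in adj[u]:
                if v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            continue                           # pair not routable; skip it
        path, v = [], t
        while v is not None:                   # walk parents back to s
            path.append(v)
            v = parent[v]
        path.reverse()
        for a, b in zip(path, path[1:]):       # consume the path's edges
            adj[a].discard(b)
            adj[b].discard(a)
        routed.append((s, t, path))
    return routed
```

On a 4-cycle, for example, two edge-disjoint paths exist between opposite corners, and the greedy sweep finds both.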
Optimal and Sublogarithmic Time Randomized Parallel Sorting Algorithms
SIAM Journal on Computing, 1989
Cited by 61 (12 self)
Abstract: We assume a parallel RAM model which allows both concurrent reads and concurrent writes of a global memory. Our main result is an optimal randomized parallel algorithm for INTEGER SORT (i.e., for sorting n integers in the range [1, n]). Our algorithm costs only logarithmic time and is the first known to be optimal: the product of its time and processor bounds is upper-bounded by a linear function of the input size. We also give a deterministic sublogarithmic-time algorithm for prefix sum. In addition, we present a sublogarithmic-time algorithm for obtaining a random permutation of n elements in parallel. Finally, we present sublogarithmic-time algorithms for GENERAL SORT and INTEGER SORT. Our sublogarithmic GENERAL SORT algorithm is also optimal.
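For context on the optimality claim (time times processors linear in input size), the sequential baseline for sorting n integers in the range [1, n] is linear-time counting sort. A minimal sketch of that baseline, not of the paper's parallel algorithm:

```python
def counting_sort(keys, n):
    """Sort integers drawn from [1, n] in O(n + len(keys)) time by
    tallying occurrences of each value. Sequential baseline only;
    the paper's contribution is a parallel PRAM algorithm."""
    counts = [0] * (n + 1)
    for k in keys:
        counts[k] += 1
    out = []
    for value in range(1, n + 1):
        out.extend([value] * counts[value])
    return out
```

An optimal parallel algorithm matches this linear total work while running in logarithmic time.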
Ranking Queries on Uncertain Data: A Probabilistic Threshold Approach
Computer Science Department, Florida State University, 2008
Cited by 60 (14 self)
Abstract: Uncertain data is inherent in a few important applications such as environmental surveillance and mobile object tracking. Top-k queries (also known as ranking queries) are often natural and useful in analyzing uncertain data in those applications. In this paper, we study the problem of answering probabilistic threshold top-k queries on uncertain data, which compute the uncertain records that have a probability of at least p of being in the top-k list, where p is a user-specified probability threshold. We present an efficient exact algorithm, a fast sampling algorithm, and a Poisson-approximation-based algorithm. An empirical study using real and synthetic data sets verifies the effectiveness of probabilistic threshold top-k queries and the efficiency of our methods.
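The sampling idea can be illustrated by estimating, for each record, the probability that it lands in the top-k over sampled possible worlds. The sketch below assumes a deliberately simplified model in which each record carries a score and an independent existence probability; the paper's exact algorithm and its uncertainty model differ:

```python
import random

def topk_membership_prob(records, k, trials=20000, seed=0):
    """Estimate Pr(record is in the top-k) for each record by sampling
    possible worlds. `records` is a list of (score, existence_probability)
    pairs, assumed independent. Illustrative sketch only; not the paper's
    exact, sampling, or Poisson-approximation algorithm."""
    rng = random.Random(seed)
    hits = [0] * len(records)
    for _ in range(trials):
        # Sample one possible world: each record survives with its probability.
        world = [(s, i) for i, (s, p) in enumerate(records) if rng.random() < p]
        world.sort(reverse=True)               # rank surviving records by score
        for _, i in world[:k]:
            hits[i] += 1
    return [h / trials for h in hits]
```

A probabilistic threshold top-k query then returns the records whose estimated membership probability is at least the user-specified threshold p.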
Learning against opponents with bounded memory
In IJCAI, 2005
Cited by 40 (3 self)
Abstract: Recently, a number of authors have proposed criteria for evaluating learning algorithms in multiagent systems. While well-justified, each of these has generally given little attention to one of the main challenges of a multiagent setting: the capability of the other agents to adapt and learn as well. We propose extending existing criteria to apply to a class of adaptive opponents with bounded memory, which we describe. We then show an algorithm that provably achieves an ε-best response against this richer class of opponents while simultaneously guaranteeing a minimum payoff against any opponent and performing well in self-play. This new algorithm also demonstrates strong performance in empirical tests against a variety of opponents in a wide range of environments.
Runtime Analysis of a Simple Ant Colony Optimization Algorithm
Electronic Colloquium on Computational Complexity, Report No. 84, 2006
Cited by 31 (9 self)
Abstract: Ant Colony Optimization (ACO) has become quite popular in recent years. In contrast to its many successful applications, the theoretical foundation of this randomized search heuristic is rather weak. Building up such a theory is needed to understand how these heuristics work as well as to come up with better algorithms for certain problems. Up to now, only convergence results have been achieved, showing that optimal solutions can be obtained in finite time. We present the first runtime analysis of an ACO algorithm, which transfers many rigorous results on the runtime of a simple evolutionary algorithm to our algorithm. Moreover, we examine in detail the choice of the evaporation factor, a crucial parameter in ACO algorithms. By deriving new lower bounds on the tails of sums of independent Poisson trials, we determine the effect of the evaporation factor almost completely and prove a phase transition from exponential to polynomial runtime.
Computation in Noisy Radio Networks
In Proc. 9th Ann. ACM-SIAM Symp. on Discrete Algorithms
Cited by 29 (0 self)
Abstract: In this paper we examine noisy radio (broadcast) networks in which every transmitted bit has a certain probability of being flipped. Each processor has some initial input bit, and the goal is to compute a function of the initial inputs. In this model we show a protocol to compute any threshold function using only a linear number of transmissions.

1 Introduction

The influence of noise (or faults) on the complexity of computation has been studied in many contexts. In particular, people were interested in random noise. In a typical such scenario, it is assumed that the outcome of each operation is noisy with some fixed probability p and all the faults are independent. Usually, if t is the number of operations performed by the computation, then by repeating each operation O(log t) times and taking the majority of the results, one can ensure a constant probability of error at the cost of O(t log t) operations. It is desirable, however, to obtain a cost of O(t) (i.e., increase only by a constant fa...
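The repeat-and-take-majority argument from the introduction above can be simulated directly: repeating a noisy transmission and taking the majority drives the error probability down exponentially in the number of repetitions. A minimal simulation sketch (function names are illustrative; this is not the paper's linear-transmission protocol):

```python
import random

def noisy_send(bit, p, rng):
    """Transmit one bit over a channel that flips it with probability p."""
    return bit ^ (rng.random() < p)

def majority_send(bit, p, reps, rng):
    """Repeat the transmission `reps` times and decode by majority vote.
    The error probability falls exponentially in `reps` (Chernoff bound),
    so O(log t) repetitions suffice for t operations overall."""
    ones = sum(noisy_send(bit, p, rng) for _ in range(reps))
    return int(ones * 2 > reps)
```

With flip probability 0.2, a single transmission errs about 20% of the time, while 25-fold repetition makes decoding errors rare; the paper's point is that for threshold functions this logarithmic overhead can be avoided entirely.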
Conservative Statistical Post-Election Audits
The Annals of Applied Statistics, 2008
Cited by 24 (13 self)
Abstract: There are many sources of error in counting votes on election day: the apparent winner might not be the rightful winner. Hand tallies of the votes in a random sample of precincts can be used to test the hypothesis that a full manual recount would find a different outcome. This paper develops a conservative sequential test based on the vote-counting errors found in a hand tally of a simple or stratified random sample of precincts. The procedure includes a natural escalation: if the hypothesis that the apparent outcome is incorrect is not rejected at stage s, more precincts are audited. Eventually, either the hypothesis is rejected (and the apparent outcome is confirmed) or all precincts have been audited and the true outcome is known. The test uses a priori bounds on the overstatement of the margin that could result from error in each precinct. Such bounds can be derived from the reported counts in each precinct and upper bounds on the number of votes cast in each precinct. The test allows errors in different precincts to be treated differently to reflect voting technology or precinct sizes. It is not optimal, but it is conservative: the chance of erroneously confirming the outcome of a contest, if a full manual recount would show a different outcome, is no larger than the nominal significance level. The approach also gives a conservative P-value for the hypothesis that a full manual recount would find a different outcome, given the errors found in a fixed-size sample. This is illustrated with two contests from November 2006: the U.S. Senate race in Minnesota and a school board race for the Sausalito Marin City School District in California, a small contest in which voters could vote for up to three candidates.
On Hoeffding's Inequalities
2004
Cited by 18 (1 self)
Abstract: ... 13–30], several inequalities for tail probabilities of sums Mn = X1 + ··· + Xn of bounded independent random variables Xj were proved. These inequalities had a considerable impact on the development of probability and statistics, and remained unimproved until 1995, when Talagrand [Inst. Hautes Études Sci. Publ. Math. 81 (1995a) 73–205] inserted certain missing factors in the bounds of two theorems. A third theorem was refined by similar factors by Pinelis [Progress in Probability 43 (1998) 257–314] and refined (and extended) by me. In this article, I introduce a new type of inequality. Namely, I show that P{Mn ≥ x} ≤ c P{Sn ≥ x}, where c is an absolute constant and Sn = ε1 + ··· + εn is a sum of independent identically distributed Bernoulli random variables (a random variable is called Bernoulli if it assumes at most two values). The inequality holds for those x ∈ R where the survival function x ↦ P{Sn ≥ x} has a jump down. For the remaining x, the inequality still holds provided that the function between the adjacent jump points is interpolated linearly or log-linearly. If necessary, special bounds for binomial probabilities can be used to estimate P{Sn ≥ x}. The results extend to martingales with bounded differences. Theorem 1.1 of this article is the most important. The inequalities have applications to measure concentration, leading to results of the type where, up to an absolute constant, the measure concentration is dominated by the concentration in a simplest appropriate model; such results will be considered elsewhere.
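For reference, the 1963 bound that these refinements sharpen is Hoeffding's classical tail inequality, stated here from the standard literature (not from this article) for independent X_j with a_j ≤ X_j ≤ b_j:

```latex
\Pr\{M_n - \mathbb{E}\,M_n \ge x\}
  \;\le\; \exp\!\left( - \frac{2x^2}{\sum_{j=1}^{n} (b_j - a_j)^2} \right),
  \qquad x > 0.
```

The article's comparison inequality P{Mn ≥ x} ≤ c P{Sn ≥ x} instead bounds the tail of Mn by the tail of an explicit Bernoulli sum, which can be sharper than the exponential bound above.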
How Much Can Hardware Help Routing?
Cited by 15 (3 self)
Abstract: We study the extent to which complex hardware can speed up routing. Specifically, we consider the following questions. How much does adaptive routing improve over oblivious routing? How much does randomness help? How does it help if each node can have a large number of neighbors? What benefit is available if a node can send packets to several neighbors within a single time step? Some of these features require complex networking hardware, and it is thus important to investigate whether the performance justifies the investment. By varying these hardware parameters, we obtain a hierarchy of time bounds for worst-case permutation routing.
Hypercubic Sorting Networks
SIAM J. Comput., 1998
Cited by 14 (2 self)
Abstract: This paper provides an analysis of a natural d-round tournament over n = 2^d players, and demonstrates that the tournament possesses a surprisingly strong ranking property. The ranking property of this tournament is used to design efficient sorting algorithms for a variety of different models of parallel computation: (i) a comparator network of depth c · lg n, c ≈ 7.44, that sorts the vast majority of the n! possible input permutations; (ii) an O(lg n)-depth hypercubic comparator network that sorts the vast majority of permutations; (iii) a hypercubic sorting network with nearly logarithmic depth; (iv) an O(lg n)-time randomized sorting algorithm for any hypercubic machine (other such algorithms have been previously discovered, but this algorithm has a significantly smaller failure probability than any previously known algorithm); and (v) a randomized algorithm for sorting n O(m)-bit records on an (n lg n)-node omega machine in O(m + lg n) bit steps. Key words: parallel sort...