Results 1–10 of 32
Randomized Algorithms
, 1995
Abstract

Cited by 1884 (38 self)
Randomized algorithms, once viewed as a tool in computational number theory, have by now found widespread application. Growth has been fueled by the two major benefits of randomization: simplicity and speed. For many applications a randomized algorithm is the fastest algorithm available, or the simplest, or both. A randomized algorithm is an algorithm that uses random numbers to influence the choices it makes in the course of its computation. Thus its behavior (typically quantified as running time or quality of output) varies from ...
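As an illustration of the definition above (a minimal sketch, not taken from the cited book), randomized quicksort is perhaps the canonical example of randomness buying both simplicity and speed: choosing a uniformly random pivot makes the expected running time O(n log n) on every input, with the variation coming only from the algorithm's own coin flips.

```python
import random

def randomized_quicksort(items):
    """Sort by partitioning around a uniformly random pivot.

    The pivot choice is the algorithm's only use of randomness; it makes
    the expected running time O(n log n) for *every* input, so no fixed
    input order is consistently slow.
    """
    if len(items) <= 1:
        return list(items)
    pivot = random.choice(items)  # the randomized decision
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    return randomized_quicksort(smaller) + equal + randomized_quicksort(larger)
```

Two runs on the same input may partition differently, yet both return the same sorted output; only the running time varies.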
Static Scheduling Algorithms for Allocating Directed Task Graphs to Multiprocessors
, 1999
Abstract

Cited by 212 (4 self)
Devices]: Modes of Computation - Parallelism and concurrency. General Terms: Algorithms, Design, Performance, Theory. Additional Key Words and Phrases: Automatic parallelization, DAG, multiprocessors, parallel processing, software tools, static scheduling, task graphs. This research was supported by the Hong Kong Research Grants Council under contract numbers HKUST 734/96E, HKUST 6076/97E, and HKU 7124/99E. Authors' addresses: Y.-K. Kwok, Department of Electrical and Electronic Engineering, The University of Hong Kong, Pokfulam Road, Hong Kong; email: ykwok@eee.hku.hk; I. Ahmad, Department of Computer Science, The Hong Kong University of Science and Technology, Clear Water Bay, Hong Kong. ACM Computing Surveys, Vol. 31, No. 4, December 1999.
Optimal Composition of Real-Time Systems
 ARTIFICIAL INTELLIGENCE
, 1996
Abstract

Cited by 115 (22 self)
Real-time systems are designed for environments in which the utility of actions is strongly time-dependent. Recent work by Dean, Horvitz and others has shown that anytime algorithms are a useful tool for real-time system design, since they allow computation time to be traded for decision quality. In order to construct complex systems, however, we need to be able to compose larger systems from smaller, reusable anytime modules. This paper addresses two basic problems associated with composition: how to ensure the interruptibility of the composed system ...
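For a concrete picture of the anytime modules discussed above (an illustrative sketch, not the paper's actual construction), the code below models an interruptible module as a Python generator: every yielded value is a usable answer, so a caller may stop consuming at any deadline and keep the best result so far, trading computation time for decision quality.

```python
import itertools

def anytime_sqrt(x):
    """Anytime module: yields ever-better approximations of sqrt(x).

    Each yielded value is a valid (if rough) answer, so the computation
    is interruptible at any point. Newton's iteration here is purely a
    stand-in for a real anytime decision procedure.
    """
    guess = x if x > 1 else 1.0
    while True:
        yield guess
        guess = 0.5 * (guess + x / guess)

# Simulate an interruption after a budget of 6 refinement steps.
best = None
for best in itertools.islice(anytime_sqrt(2.0), 6):
    pass  # 'best' always holds the latest, best-so-far answer
```

Composing such modules (the paper's subject) then amounts to scheduling how the overall time budget is split among them while keeping the composite interruptible.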
Backwards Analysis of Randomized Geometric Algorithms
 Trends in Discrete and Computational Geometry, volume 10 of Algorithms and Combinatorics
, 1992
Abstract

Cited by 58 (0 self)
The theme of this paper is a rather simple method that has proved very potent in the analysis of the expected performance of various randomized algorithms and data structures in computational geometry. The method can be described as "analyze a randomized algorithm as if it were running backwards in time, from output to input." We apply this type of analysis to a variety of algorithms, old and new, and obtain solutions with optimal or near-optimal expected performance for a plethora of problems in computational geometry, such as computing Delaunay triangulations of convex polygons, computing convex hulls of point sets in the plane or in higher dimensions, sorting, intersecting line segments, linear programming with a fixed number of variables, and others.

1 Introduction. The curious phenomenon that randomness can be used profitably in the solution of computational tasks has attracted a lot of attention from researchers in recent years. The approach has proved useful in such diverse area ...
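A standard warm-up for backwards analysis (an illustrative example, not necessarily one of the paper's applications) is counting how often the running minimum changes while scanning a random permutation: viewed backwards in time, the i-th scanned element is a new minimum exactly when it is the smallest of the first i elements, which happens with probability 1/i, so the expected number of changes is the harmonic number H_n. The exhaustive check below confirms this exactly for a small n.

```python
import itertools
import math
from fractions import Fraction

def min_updates(perm):
    """Count how often the running minimum changes while scanning perm."""
    updates, current = 0, math.inf
    for x in perm:
        if x < current:
            current, updates = x, updates + 1
    return updates

# Average over all permutations of {0,...,n-1} equals H_n = 1 + 1/2 + ... + 1/n.
n = 6
total = sum(min_updates(p) for p in itertools.permutations(range(n)))
average = Fraction(total, math.factorial(n))
harmonic = sum(Fraction(1, i) for i in range(1, n + 1))
assert average == harmonic  # exact equality, not an approximation
```

The same 1/i argument, run over the insertion order of a randomized incremental construction, is what yields the expected bounds for structures such as Delaunay triangulations.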
FROM FINDING MAXIMUM FEASIBLE SUBSYSTEMS OF LINEAR SYSTEMS TO FEEDFORWARD NEURAL NETWORK DESIGN
, 1994
Derandomization in Computational Geometry
, 1996
Abstract

Cited by 17 (1 self)
We survey techniques for replacing randomized algorithms in computational geometry by deterministic ones with a similar asymptotic running time.

1 Randomized algorithms and derandomization. A rapid growth of knowledge about randomized algorithms stimulates research in derandomization, that is, replacing randomized algorithms by deterministic ones with as small a decrease in efficiency as possible. Related to the problem of derandomization is the question of reducing the number of random bits needed by a randomized algorithm while retaining its efficiency; derandomization can be viewed as the ultimate case. Randomized algorithms are also related to probabilistic proofs and constructions in combinatorics (which came first historically), whose development has similarly been accompanied by efforts to replace them by explicit, nonrandom constructions whenever possible. Derandomization of algorithms can be seen as part of an effort to map the power of randomness and explain its role. ...
Lower Space Bounds for Randomized Computation
, 1994
Abstract

Cited by 8 (0 self)
It is a fundamental open problem in randomized computation how to separate different randomized time or randomized small-space classes (cf., e.g., [KV 87], [KV 88]). In this paper we study lower space bounds for randomized computation, and prove lower space bounds up to log n for specific sets computed by Monte Carlo Turing machines. This enables us, for the first time, to separate randomized space classes below log n (cf. [KV 87], [KV 88]), allowing us to separate, say, randomized space O(1) from randomized space O(log n). We also prove lower space bounds up to log log n and log n, respectively, for specific sets computed by probabilistic Turing machines and one-way probabilistic Turing machines.

1 Department of Computer Science, University of Latvia, LV-1459 Riga. Research partially supported by Grant No. 93599 from the Latvian Council of Science. 2 Department of Computer Science, University of Bonn, 53117 Bonn. Research partially supported by ...
On the Power of Randomized Ordered Branching Programs
, 1997
Abstract

Cited by 7 (2 self)
We define the notion of a randomized branching program in the natural way, similar to the definition of a randomized circuit. We exhibit an explicit Boolean function f_n : {0,1}^n -> {0,1} for which we prove that: 1) f_n can be computed by a polynomial-size randomized read-once ordered branching program with small one-sided error; 2) f_n cannot be computed in polynomial size by a nondeterministic ordered read-k-times branching program for k = o(n / log n) (any nondeterministic ordered read-k-times branching program that computes f_n has size at least 2^((n-1)/(2k-1))). By a read-k-times branching program we mean a branching program with the property that no input variable appears more than k times on any consistent accepting computation path in the program.

1 Preliminaries and definitions. Different models of branching programs, introduced in [18, 19], have been studied extensively in the last decade (see [25]). A survey of known lower bounds for different ...
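To make the branching-program model above concrete (a hypothetical sketch, not the paper's construction f_n), the snippet below evaluates a deterministic ordered branching program represented as a node table; a randomized program in the sense defined above would additionally consume random bits as extra input variables.

```python
def evaluate_bp(program, start, inputs):
    """Evaluate a deterministic branching program on a 0/1 input vector.

    `program` maps a node id to (variable_index, low_successor, high_successor);
    the booleans True/False serve as the accepting/rejecting sinks. In an
    *ordered* program every path tests variables in one fixed order, and
    *read-once* means no variable is tested twice on any path.
    """
    node = start
    while not isinstance(node, bool):
        var, low, high = program[node]
        node = high if inputs[var] else low
    return node

# A tiny ordered read-once program computing x0 AND x1.
and_program = {
    "n0": (0, False, "n1"),  # test x0: on 0 reject, on 1 continue
    "n1": (1, False, True),  # test x1: on 0 reject, on 1 accept
}
```

The size of a program is its node count; the lower bound quoted above says that for the paper's f_n this count must blow up exponentially once the read-k-times and ordering restrictions are imposed on a nondeterministic program.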
Some Recent Human/Computer Discoveries in Science and What Accounts for Them
Abstract

Cited by 6 (2 self)
We have recently reported several human/computer discoveries in biology, chemistry and physics that have appeared in domain science journals. One may ask what accounts for these findings, e.g., whether they share a common pattern. My conclusion is that each finding involves a new representation of the scientific task: the problem spaces searched were unlike previous task problem spaces. Such new representations need not be wholly new to the history of science; rather, they can draw on useful representational pieces from elsewhere in natural or computer science. This account contrasts with earlier explanations of machine discovery based on the expert-systems view. My analysis also suggests a broader potential role for (AI) computer scientists in the practice of natural science.