Results 1–10 of 11
Optimal and Sublogarithmic Time Randomized Parallel Sorting Algorithms
SIAM Journal on Computing, 1989
Cited by 62 (12 self)
We assume a parallel RAM model which allows both concurrent reads and concurrent writes of a global memory. Our main result is an optimal randomized parallel algorithm for INTEGER SORT (i.e., for sorting n integers in the range [1, n]). Our algorithm costs only logarithmic time and is the first known that is optimal: the product of its time and processor bounds is upper bounded by a linear function of the input size. We also give a deterministic sublogarithmic time algorithm for prefix sum. In addition, we present a sublogarithmic time algorithm for obtaining a random permutation of n elements in parallel. Finally, we present sublogarithmic time algorithms for GENERAL SORT and INTEGER SORT. Our sublogarithmic GENERAL SORT algorithm is also optimal.
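The sequential baseline this abstract's optimality claim is measured against is linear-work integer sorting for keys in [1, n]; a minimal sequential sketch (counting sort, not the paper's parallel algorithm) illustrates what O(n) total work means here:

```python
def integer_sort(a):
    """Counting sort for n integers assumed to lie in [1, n], n = len(a).

    Total work is O(n): one pass to tally keys, one pass to emit them.
    An 'optimal' parallel algorithm must keep time * processors linear
    in n, matching this sequential bound.
    """
    n = len(a)
    counts = [0] * (n + 1)          # counts[v] = occurrences of key v
    for v in a:
        counts[v] += 1
    out = []
    for v in range(1, n + 1):
        out.extend([v] * counts[v])
    return out

print(integer_sort([3, 1, 4, 1, 5, 2]))  # [1, 1, 2, 3, 4, 5]
```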
How to make replicated data secure
Advances in Cryptology – CRYPTO, 1988
Cited by 44 (1 self)
Many distributed systems manage some form of long-lived data, such as files or databases. The performance and fault-tolerance of such systems may be enhanced if the repositories for the data are physically distributed. Nevertheless, distribution makes security more difficult, since it may be difficult to ensure that each repository is physically secure, particularly if the number of repositories is large. This paper proposes new techniques for ensuring the security of long-lived, physically distributed data. These techniques adapt replication protocols for fault-tolerance to the more demanding requirements of security. For a given threshold value, one set of protocols ensures that an adversary cannot ascertain the state of a data object by observing the contents of fewer than a threshold of repositories. These protocols are cheap; the message traffic needed to tolerate a given number of compromised repositories is only slightly more than the message traffic needed to tolerate the same number of failures. A second set of protocols ensures that an object's state cannot be altered by an adversary who can modify the contents of fewer than a threshold of repositories. These protocols are more expensive; to tolerate t compromised repositories, clients executing certain operations must communicate with t additional sites.
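The confidentiality guarantee described here (fewer than a threshold of repositories reveal nothing) is the property provided by threshold secret sharing. A minimal Shamir-style sketch, offered as a generic illustration of the threshold idea rather than the paper's own protocols:

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is over GF(P)

def share(secret, t, n):
    """Split `secret` into n shares: any t recover it, fewer than t
    are information-theoretically independent of it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the polynomial's
    constant term, i.e. the secret. Needs exactly t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P-2, P) is the modular inverse, since P is prime
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(123456789, t=3, n=5)
print(reconstruct(shares[:3]))  # 123456789
```

Each repository would hold one share; observing fewer than t of them gives an adversary no information about the object's state.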
Synthesizers and Their Application to the Parallel Construction of Pseudo-Random Functions
, 1995
Cited by 42 (10 self)
A pseudorandom function is a fundamental cryptographic primitive that is essential for encryption, identification and authentication. We present a new cryptographic primitive called pseudorandom synthesizer and show how to use it in order to get a parallel construction of a pseudorandom function. We show several NC¹ implementations of synthesizers based on concrete intractability assumptions such as factoring and the Diffie-Hellman assumption. This yields the first parallel pseudorandom functions (based on standard intractability assumptions) and the only alternative to the original construction of Goldreich, Goldwasser and Micali. In addition, we show parallel constructions of synthesizers based on other primitives such as weak pseudorandom functions or trapdoor one-way permutations. The security of all our constructions is similar to the security of the underlying assumptions. The connection with problems in Computational Learning Theory is discussed.
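The structural idea — select one seed per input bit, then collapse the seeds pairwise with the synthesizer in a balanced tree of logarithmic depth — can be sketched with a toy synthesizer built from SHA-256. This is purely illustrative: the paper's synthesizers come from number-theoretic assumptions (factoring, Diffie-Hellman), not hash functions, and all names here are hypothetical:

```python
import hashlib
import os

def synth(x, y):
    """Toy stand-in for a pseudorandom synthesizer S(x, y)."""
    return hashlib.sha256(x + y).digest()

def keygen(n):
    """Key: two random 32-byte seeds per input bit position.
    n must be a power of two for the balanced tree below."""
    return [(os.urandom(32), os.urandom(32)) for _ in range(n)]

def prf(key, bits):
    """Evaluate the PRF on an n-bit input: pick seed s_i^{b_i} at each
    position, then combine pairwise with synth() in a tree.  The tree
    has depth log2(n), which is what makes the construction parallel."""
    level = [s0 if b == 0 else s1 for (s0, s1), b in zip(key, bits)]
    while len(level) > 1:
        level = [synth(level[i], level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]
```

With n processors each tree level is one parallel step, so evaluation takes O(log n) rounds of synthesizer calls instead of the n sequential rounds of the GGM construction.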
Linear Congruential Generators over Elliptic Curves
Preprint CS-94-143, Dept. of Computer Science, Carnegie Mellon Univ.
, 1994
Cited by 8 (0 self)
Random numbers are useful in many applications such as Monte Carlo simulation, randomized algorithms, games, and password generation. It is important to be able to prove facts about pseudorandom number generators, both about the distribution and the predictability of the pseudorandom numbers. I discuss a pseudorandom number generator based on elliptic curves taken over finite fields. This class of generators can produce provably good pseudorandom numbers. Also, I prove that the analog of a faster pseudorandom number generator embedded in an elliptic curve fails to produce good pseudorandom numbers. This report was submitted in partial fulfillment of the requirements for the Senior Honors Research Program in the School of Computer Science at Carnegie Mellon University.
Keywords: cryptography, pseudorandom number generation, elliptic curves, linear congruential generators
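For reference, the classical (non-elliptic-curve) linear congruential generator the title alludes to is the recurrence x_{k+1} = (a·x_k + c) mod m; a minimal sketch with one common (glibc-style) parameter choice:

```python
def lcg(seed, a=1103515245, c=12345, m=2**31):
    """Classic linear congruential generator:
    x_{k+1} = (a * x_k + c) mod m, yielding each state in turn.
    Statistically usable with good parameters, but predictable:
    a few outputs suffice to recover a, c, m."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

g = lcg(42)
print([next(g) for _ in range(3)])
```

The paper's point is to transplant this recurrence into the group of points of an elliptic curve over a finite field, where stronger distribution properties can be proved.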
About Polynomial-Time "Unpredictable" Generators
Cited by 5 (2 self)
So-called "perfect" or "unpredictable" pseudorandom generators have been proposed recently by people from the area of cryptology. Many people became aware of them from an optimistic article in the New York Times (Gleick (1988)). These generators are usually based on nonlinear recurrences modulo some integer m. Under some (yet unproven) complexity assumptions, it has been proven that no polynomial-time statistical test can distinguish a sequence of bits produced by such a generator from a sequence of truly random bits. In this paper, we give some theoretical background concerning this class of generators and we look at the practicality of using them for simulation applications. We examine in particular their ease of implementation, their efficiency, periodicity, the ease of jumping ahead in the sequence, the minimum size of modulus that should be used, etc.
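The canonical example of such a nonlinear recurrence modulo m is the Blum-Blum-Shub generator, x_{k+1} = x_k² mod m with m a product of two primes congruent to 3 mod 4. A toy sketch (the primes here are far too small for any security, and are assumptions of this illustration only):

```python
def bbs_bits(seed, n_bits, p=1000003, q=1000039):
    """Blum-Blum-Shub-style generator: square the state modulo m = p*q
    and output the low bit of each state.  p and q are small
    illustrative primes (both = 3 mod 4); a real deployment needs
    large secret Blum primes, and the seed must be coprime to m."""
    m = p * q
    x = seed % m
    out = []
    for _ in range(n_bits):
        x = (x * x) % m
        out.append(x & 1)
    return out

print(bbs_bits(123456, 16))
```

The practicality questions the abstract raises are visible even in this sketch: each output bit costs a modular squaring at the full modulus size, far more than one step of a linear congruential generator.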
Parallel Complexity of Integer Coprimality
, 2000
Cited by 4 (0 self)
It is shown that integer coprimality testing is in NC. AMS classification codes: 68Q15, 68Q22, 68Q25.
1 Introduction
The object of this paper is to prove Theorem 1: Integer coprimality is in NC.
1.1 Background
The parallel complexity of basic arithmetic operations has been closely investigated since the 1960s. In the case of arithmetic, problem size is usually measured in terms of binary notation for the integer inputs. It is known that addition and multiplication of n-bit integers can be done in NC¹, i.e., by logspace-computable Boolean circuit families of O(log n) depth and with n^O(1) Boolean gates. Details about these classical results may be found in [12], and information about the parallel complexity class NC may be found in [11, 4]. It is also known that division can be done in the same time and size bounds, but slightly more than logspace is needed to build the requisite Boolean circuits. It is open whether or not division is in NC¹. See [2, 5, 7] for more information abou...
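For contrast, the easy sequential test the NC result parallelizes is coprimality via the Euclidean algorithm, which runs in polynomial time but is notoriously hard to parallelize because each remainder depends on the previous one:

```python
def coprime(a, b):
    """Coprimality test via the Euclidean algorithm: a and b are
    coprime iff gcd(a, b) == 1.  Polynomial time, but inherently
    sequential; the paper's contribution is doing this in NC
    (polylogarithmic parallel time)."""
    while b:
        a, b = b, a % b
    return a == 1

print(coprime(35, 64))  # True
print(coprime(21, 14))  # False
```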
Optimal and Sublogarithmic Time
1.1 Problem framework
, 1992
"... In general, manufacturing production lines can be classified as: ..."
unknown title
In the mind of the average computer user, the problem of generating uniform variates by computer was solved long ago. After all, every computer system offers one or more functions to do so. Many software products, like compilers, spreadsheets, statistical or numerical packages, etc., also offer their own. These functions supposedly return numbers that could be used, for all practical purposes, as if they were the values taken by independent random variables, with a ...