Results 1–10 of 27
On the distribution of spacings between zeros of the zeta function
 MATH. COMP.
, 1987
"... A numerical study of the distribution of spacings between zeros of the Riemann zeta function is presented. It is based on values for the first 10 5 zeros and for zeros number 10 12 + 1 to 10 12 + 10 5 that are accurate to within ± 10 − 8, and which were calculated on the Cray1 and Cray XMP compute ..."
Abstract

Cited by 86 (9 self)
A numerical study of the distribution of spacings between zeros of the Riemann zeta function is presented. It is based on values for the first 10^5 zeros and for zeros number 10^12 + 1 to 10^12 + 10^5 that are accurate to within ±10^-8, and which were calculated on the Cray-1 and Cray X-MP computers. This study tests the Montgomery pair correlation conjecture as well as some further conjectures that predict that the zeros of the zeta function behave similarly to eigenvalues of random Hermitian matrices. Matrices of this type are used in modeling energy levels in physics, and many statistical properties of their eigenvalues are known. The agreement between actual statistics for zeros of the zeta function and conjectured results is generally good, and improves at larger heights. Several initially unexpected phenomena were found in the data and some were explained by ...
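The "known statistical properties" for eigenvalues of random Hermitian (GUE) matrices mentioned above include the nearest-neighbor spacing law, commonly approximated by the Wigner surmise p(s) = (32/π²) s² exp(−4s²/π). The sketch below only verifies two standard properties of that density (it is normalized and has unit mean spacing); it is an illustration of the known identity, not of the paper's computations:

```python
import math

def gue_wigner_surmise(s):
    # Wigner surmise for the GUE nearest-neighbor spacing density:
    # p(s) = (32/pi^2) * s^2 * exp(-4 * s^2 / pi)
    return (32 / math.pi ** 2) * s * s * math.exp(-4 * s * s / math.pi)

# Sanity checks by Riemann summation on [0, 10] (the tail beyond 10
# is negligible): the density integrates to 1 and the mean spacing
# is normalized to 1.
ds = 1e-4
total = sum(gue_wigner_surmise(i * ds) * ds for i in range(1, 100000))
mean = sum(i * ds * gue_wigner_surmise(i * ds) * ds for i in range(1, 100000))
print(round(total, 3), round(mean, 3))  # both close to 1.0
```

The surmise is itself only a close approximation to the exact GUE spacing distribution, but it is the form usually compared against empirical zero-spacing histograms.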
LOW REDUNDANCY IN STATIC DICTIONARIES WITH CONSTANT QUERY TIME
 SIAM J. COMPUT.
, 2001
"... A static dictionary is a data structure storing subsets of a finite universe U, answering membership queries. We show that on a unit cost RAM with word size Θ(log U), a static dictionary for nelement sets with constant worst case query time can be obtained using B +O(log log U)+o(n) (U) bits ..."
Abstract

Cited by 50 (7 self)
A static dictionary is a data structure storing subsets of a finite universe U, answering membership queries. We show that on a unit cost RAM with word size Θ(log |U|), a static dictionary for n-element sets with constant worst case query time can be obtained using B + O(log log |U|) + o(n) bits of storage, where B = ⌈log₂ C(|U|, n)⌉ is the minimum number of bits needed to represent all n-element subsets of U.
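The gap between the information-theoretic minimum B and naive storage of n sorted elements can be seen numerically. A quick sketch (the universe and set sizes below are hypothetical, chosen only for illustration):

```python
import math

def information_theoretic_bits(u, n):
    # Minimum bits B = ceil(log2(C(u, n))) needed to distinguish
    # every n-element subset of a universe of size u.
    return math.ceil(math.log2(math.comb(u, n)))

def naive_bits(u, n):
    # Storing n elements explicitly as ceil(log2(u))-bit words.
    return n * math.ceil(math.log2(u))

u, n = 1 << 20, 1000  # hypothetical sizes for illustration
print(information_theoretic_bits(u, n), naive_bits(u, n))
```

Roughly, B ≈ n log₂(|U|/n) + O(n), so the naive representation overshoots the minimum by about n log₂ n bits; the paper's point is that the minimum can be approached with only O(log log |U|) + o(n) bits of redundancy while keeping O(1) queries.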
Primality testing using elliptic curves
 Journal of the ACM
, 1999
"... Abstract. We present a primality proving algorithm—a probabilistic primality test that produces short certificates of primality on prime inputs. We prove that the test runs in expected polynomial time for all but a vanishingly small fraction of the primes. As a corollary, we obtain an algorithm for ..."
Abstract

Cited by 22 (0 self)
We present a primality proving algorithm, a probabilistic primality test that produces short certificates of primality on prime inputs. We prove that the test runs in expected polynomial time for all but a vanishingly small fraction of the primes. As a corollary, we obtain an algorithm for generating large certified primes with distribution statistically close to uniform. Under the conjecture that the gap between consecutive primes is bounded by some polynomial in their size, the test is shown to run in expected polynomial time for all primes, yielding a Las Vegas primality test. Our test is based on a new methodology for applying group theory to the problem of prime certification, and the application of this methodology using groups generated by elliptic curves over finite fields. We note that our methodology and methods have been subsequently used and improved upon, most notably in the primality proving algorithm of Adleman and Huang using hyperelliptic curves and ...
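For contrast with the certificate-producing elliptic-curve method of this paper: the classical Miller-Rabin test (a different, standard algorithm, not the one described above) can prove compositeness by exhibiting a witness, but on a prime input it only reports "probably prime" and yields no short certificate. A minimal sketch:

```python
import random

def miller_rabin(n, rounds=20):
    # Probabilistic compositeness test. A return of False is proven
    # (a witness to compositeness was found); True means "probably
    # prime" with error probability at most 4**(-rounds). Unlike a
    # certifying test, a True answer carries no verifiable proof.
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a witnesses that n is composite
    return True
```

The asymmetry, easy disproof versus unproved belief, is exactly what the group-theoretic certification methodology above addresses.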
Low Redundancy in Static Dictionaries with O(1) Worst Case Lookup Time
 IN PROCEEDINGS OF THE 26TH INTERNATIONAL COLLOQUIUM ON AUTOMATA, LANGUAGES AND PROGRAMMING (ICALP '99
, 1999
"... A static dictionary is a data structure for storing subsets of a nite universe U , so that membership queries can be answered efficiently. We study this problem in a unit cost RAM model with word size (log jU j), and show that for nelement subsets, constant worst case query time can be obtained us ..."
Abstract

Cited by 21 (5 self)
A static dictionary is a data structure for storing subsets of a finite universe U, so that membership queries can be answered efficiently. We study this problem in a unit cost RAM model with word size Θ(log |U|), and show that for n-element subsets, constant worst case query time can be obtained using B + O(log log |U|) + o(n) bits of storage, where B = ⌈log₂ C(|U|, n)⌉ is the minimum number of bits needed to represent all such subsets. For |U| = n log^O(1) n the dictionary supports constant time rank queries.
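A rank query asks how many stored elements are smaller than a given x. The usual way to make it constant time over a bit vector is to precompute prefix popcounts at block boundaries; the sketch below shows what rank computes, but uses O(|U|) extra bits rather than the paper's succinct construction:

```python
class RankBitVector:
    # Membership bit vector over universe [0, u) with per-block
    # prefix popcounts, so rank(x) reads one precomputed count
    # plus the bits of a single block.
    def __init__(self, universe_size, elements, block=64):
        self.block = block
        self.bits = [0] * universe_size
        for e in elements:
            self.bits[e] = 1
        self.prefix = [0]
        for i in range(0, universe_size, block):
            self.prefix.append(self.prefix[-1] + sum(self.bits[i:i + block]))

    def rank(self, x):
        # Number of stored elements strictly less than x.
        q, r = divmod(x, self.block)
        return self.prefix[q] + sum(self.bits[q * self.block:q * self.block + r])

    def member(self, x):
        return self.bits[x] == 1
```

With word-parallel popcounts the per-block scan is O(1) on the RAM model assumed above; the paper's contribution is achieving this within o(n) redundancy over B.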
Low Redundancy in Dictionaries with O(1) Worst Case Lookup Time
 IN PROC. 26TH INTERNATIONAL COLLOQUIUM ON AUTOMATA, LANGUAGES AND PROGRAMMING (ICALP
, 1998
"... A static dictionary is a data structure for storing subsets of a finite universe U , so that membership queries can be answered efficiently. We study this problem in a unit cost RAM model with word size ze jU j), and show that for nelement subsets, constant worst case query time can be obtain ..."
Abstract

Cited by 18 (0 self)
A static dictionary is a data structure for storing subsets of a finite universe U, so that membership queries can be answered efficiently. We study this problem in a unit cost RAM model with word size Θ(log |U|), and show that for n-element subsets, constant worst case query time can be obtained using B + O(log log |U|) + o(n) bits of storage, where B = ⌈log₂ C(|U|, n)⌉ is the minimum number of bits needed to represent all such subsets. The solution for dense subsets uses B + O(|U| log log |U| / log |U|) bits of storage, and supports constant time rank queries. In a dynamic setting, allowing insertions and deletions, our techniques give an O(B) bit space usage.
Rectangle Free Coloring of Grids
"... A twodimensional grid is a set Gn,m = [n] × [m]. A grid Gn,m is ccolorable if there is a function χn,m: Gn,m → [c] such that there are no rectangles with all four corners the same color. We address the following question: for which values of n and m is Gn,m ccolorable? This problem can be viewed ..."
Abstract

Cited by 8 (2 self)
A two-dimensional grid is a set Gn,m = [n] × [m]. A grid Gn,m is c-colorable if there is a function χn,m: Gn,m → [c] such that there are no rectangles with all four corners the same color. We address the following question: for which values of n and m is Gn,m c-colorable? This problem can be viewed as a bipartite Ramsey problem and is related to the Gallai-Witt theorem (also called the multidimensional van der Waerden theorem). We determine (1) exactly which grids are 2-colorable, (2) exactly which grids are 3-colorable, and (3) (assuming a reasonable conjecture) exactly which grids are 4-colorable. We use combinatorics, finite fields, and tournament graphs.
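The rectangle-free property is easy to check by brute force over all pairs of rows and columns; a small sketch of the definition above (O(n²m²) time, fine for the grid sizes the paper classifies):

```python
from itertools import combinations

def is_rectangle_free(coloring):
    # coloring: 2-D list, coloring[i][j] is the color of cell (i, j).
    # Returns True iff no axis-aligned rectangle has all four
    # corners the same color.
    rows, cols = len(coloring), len(coloring[0])
    for r1, r2 in combinations(range(rows), 2):
        for c1, c2 in combinations(range(cols), 2):
            corners = {coloring[r1][c1], coloring[r1][c2],
                       coloring[r2][c1], coloring[r2][c2]}
            if len(corners) == 1:
                return False
    return True
```

For example, the checkerboard 2-coloring [[0, 1], [1, 0]] of G2,2 is rectangle-free, while the monochromatic [[0, 0], [0, 0]] is not.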
An invitation to additive prime number theory
, 2004
"... The main purpose of this survey is to introduce the inexperienced reader to additive prime number theory and some related branches of analytic number theory. We state the main problems in the field, sketch their history and the basic machinery used to study them, and try to give a representative sam ..."
Abstract

Cited by 4 (0 self)
The main purpose of this survey is to introduce the inexperienced reader to additive prime number theory and some related branches of analytic number theory. We state the main problems in the field, sketch their history and the basic machinery used to study them, and try to give a representative sample of the directions of current research.
A New Approach to Formal Language Theory by Kolmogorov Complexity
 Preprint, SIAM J. Comput
, 1995
"... We present a new approach to formal language theory using Kolmogorov complexity. The main results presented here are an alternative for pumping lemma(s), a new characterization for regular languages, and a new method to separate deterministic contextfree languages and nondeterministic contextfree ..."
Abstract

Cited by 3 (0 self)
We present a new approach to formal language theory using Kolmogorov complexity. The main results presented here are an alternative for pumping lemma(s), a new characterization for regular languages, and a new method to separate deterministic context-free languages and nondeterministic context-free languages. The use of the new 'incompressibility arguments' is illustrated by many examples. The approach is also successful at the high end of the Chomsky hierarchy since one can quantify nonrecursiveness in terms of Kolmogorov complexity. (This is a preliminary uncorrected version. The final version is the one published in SIAM J. Comput., 24:2 (1995), 398-410.)

1 Introduction

It is feasible to reconstruct parts of formal language theory using algorithmic information theory (Kolmogorov complexity). We provide theorems on how to use Kolmogorov complexity as a concrete and powerful tool. We do not just want ... (A preliminary version of part of this work was presented at the 16th International ...)