Results 1 - 4 of 4
Power from Random Strings
In Proceedings of the 43rd IEEE Symposium on Foundations of Computer Science
, 2002
Abstract

Cited by 43 (17 self)
We show that sets consisting of strings of high Kolmogorov complexity provide examples of sets that are complete for several complexity classes under probabilistic and nonuniform reductions. These sets are provably not complete under the usual many-one reductions. Let ...
Sampling Short Lattice Vectors and the Closest Lattice Vector Problem
, 2002
Abstract

Cited by 33 (0 self)
We present a 2^{O(n)} time Turing reduction from the closest lattice vector problem to the shortest lattice vector problem. Our reduction assumes access to a subroutine that solves SVP exactly and a subroutine to sample short vectors from a lattice, and computes a (1+epsilon)-approximation to CVP. As a consequence, using the SVP algorithm from [1], we obtain a randomized 2^{O(1+epsilon^{-1})n} algorithm to obtain a (1+epsilon)-approximation for the closest lattice vector problem in n dimensions. This improves the existing time bound of O(n!) for CVP (achieved by a deterministic algorithm in [2]).
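For contrast with the exact and (1+epsilon)-approximate algorithms discussed in this abstract, the classic polynomial-time way to approximate CVP is Babai's nearest-plane algorithm. A minimal sketch follows; this is only a baseline illustration of approximate closest-vector search, not the paper's 2^{O(n)} reduction:

```python
import numpy as np

def babai_nearest_plane(B, t):
    """Babai's nearest-plane algorithm: round coordinate by coordinate
    against the QR factorization of the basis B. Polynomial time, but
    only an approximation to the closest lattice vector."""
    Q, R = np.linalg.qr(B)
    s = np.sign(np.diag(R))          # normalize so diag(R) > 0
    Q, R = Q * s, s[:, None] * R
    x = Q.T @ t                       # rotate target into the R coordinates
    n = B.shape[1]
    c = np.zeros(n, dtype=int)
    for k in range(n - 1, -1, -1):    # back-substitute with rounding
        c[k] = int(np.rint((x[k] - R[k, k + 1:] @ c[k + 1:]) / R[k, k]))
    return B @ c                      # an approximately closest lattice vector
```

The approximation quality degrades exponentially with dimension in the worst case, which is exactly the gap the exponential-time algorithms above close.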
Statistical pruning for near-maximum likelihood decoding
 IEEE Transactions on Signal Processing
, 2007
Abstract

Cited by 14 (1 self)
In many communications problems, maximum-likelihood (ML) decoding reduces to finding the closest (skewed) lattice point in N dimensions to a given point x ∈ C^N. In its full generality, this problem is known to be NP-complete and requires complexity exponential in N. Recently, the expected complexity of the sphere decoder, a particular algorithm that solves the ML problem exactly, has been computed, where it is shown that over a wide range of rates, SNRs, and dimensions the expected computation involves no more than N^3 operations. In this paper, we propose an algorithm that, for large N, offers substantial computational savings over the sphere decoder, while maintaining performance arbitrarily close to ML. We statistically prune the search space to a subset that, with high probability, contains the optimal solution, thereby reducing the complexity of the search. Bounds on the error performance of the new method are proposed. The complexity of the new algorithm is analysed in three ways: an asymptotic analysis that holds for very large dimensions, and an upper bound and an approximation that are of interest in small to moderately large dimensions. Simulations are presented to compare the algorithm with the original sphere decoder.
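A minimal sketch of the basic sphere decoder that this paper prunes (illustrative only; the paper's statistical pruning rule is not reproduced here): depth-first enumeration of integer coordinates whose partial distance stays inside a search sphere, with the sphere shrinking whenever a better lattice point is found.

```python
import numpy as np

def sphere_decode(H, y, radius):
    """Basic sphere decoder: find the integer vector x minimizing
    ||y - H x||, enumerating only lattice points within `radius` of y."""
    Q, R = np.linalg.qr(H)
    s = np.sign(np.diag(R))          # make diag(R) positive for easy bounds
    Q, R = Q * s, s[:, None] * R
    z = Q.T @ y                      # rotated target; now minimize ||z - R x||
    n = R.shape[1]
    best = {"x": None, "d2": radius ** 2}

    def search(k, x, d2):
        if k < 0:                    # all coordinates fixed: candidate point
            if d2 < best["d2"]:
                best["x"], best["d2"] = x.copy(), d2
            return
        r = z[k] - R[k, k + 1:] @ x[k + 1:]   # residual target at level k
        bound = np.sqrt(max(best["d2"] - d2, 0.0))
        lo = int(np.ceil((r - bound) / R[k, k]))
        hi = int(np.floor((r + bound) / R[k, k]))
        for xk in range(lo, hi + 1): # only integers inside the sphere slice
            x[k] = xk
            search(k - 1, x, d2 + (r - R[k, k] * xk) ** 2)

    search(n - 1, np.zeros(n, dtype=int), 0.0)
    return best["x"]
```

The pruning in this sketch is purely geometric (the `bound` computation); the paper's contribution is to tighten that pruning statistically, keeping only a subset of the search tree that contains the ML solution with high probability.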