Results 1-10 of 1,491
An LLL Algorithm with Quadratic Complexity
© 2009 Society for Industrial and Applied Mathematics
"... is a fundamental tool in computational number theory and theoretical computer science, which can be viewed as an efficient algorithmic version of Hermite's inequality on Hermite's constant. Given an integer d-dimensional lattice basis with vectors of Euclidean norm less than B in an n-dimensional space, the L³ algorithm outputs a reduced basis in O(d³ n log B · M(d log B)) bit operations, where M(k) denotes the time required to multiply k-bit integers. This worst-case complexity is problematic for applications where d and/or log B are often large. As a result, the original L³ algorithm is almost ..."
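The textbook L³ procedure this entry builds on can be sketched directly. Below is a minimal, unoptimised rendition in exact rational arithmetic with the classical Lovász parameter delta = 3/4; the function names are ours, and this is the slow classical version, not the paper's quadratic-complexity variant:

```python
from fractions import Fraction

def gram_schmidt(basis):
    """Gram-Schmidt orthogonalisation over the rationals.
    Returns the orthogonal vectors b* and the projection coefficients mu."""
    n = len(basis)
    ortho, mu = [], [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        v = [Fraction(x) for x in basis[i]]
        for j in range(i):
            mu[i][j] = (sum(Fraction(basis[i][k]) * ortho[j][k] for k in range(len(v)))
                        / sum(x * x for x in ortho[j]))
            v = [v[k] - mu[i][j] * ortho[j][k] for k in range(len(v))]
        ortho.append(v)
    return ortho, mu

def sqnorm(v):
    return sum(x * x for x in v)

def lll_reduce(basis, delta=Fraction(3, 4)):
    """Textbook LLL reduction: size-reduce, test the Lovász condition, swap."""
    basis = [list(b) for b in basis]
    n = len(basis)
    ortho, mu = gram_schmidt(basis)
    k = 1
    while k < n:
        for j in range(k - 1, -1, -1):          # size-reduce b_k against b_j
            q = round(mu[k][j])
            if q:
                basis[k] = [basis[k][i] - q * basis[j][i] for i in range(len(basis[k]))]
                ortho, mu = gram_schmidt(basis)  # naive recomputation, kept simple
        if sqnorm(ortho[k]) >= (delta - mu[k][k - 1] ** 2) * sqnorm(ortho[k - 1]):
            k += 1                               # Lovász condition holds
        else:
            basis[k - 1], basis[k] = basis[k], basis[k - 1]
            ortho, mu = gram_schmidt(basis)
            k = max(k - 1, 1)
    return basis
```

Recomputing the full Gram-Schmidt after every update is what makes this version expensive; the paper's contribution is precisely avoiding that cost.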
Interior-point Methods
2000
Cited by 612 (15 self)
"... The modern era of interior-point methods dates to 1984, when Karmarkar proposed his algorithm for linear programming. In the years since then, algorithms and software for linear programming have become quite sophisticated, while extensions to more general classes of problems, such as convex quadratic programming, semidefinite programming, and nonconvex and nonlinear problems, have reached varying levels of maturity. We review some of the key developments in the area, including comments on both the complexity theory and practical algorithms for linear programming, semidefinite programming ..."
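The log-barrier idea at the heart of primal interior-point methods can be illustrated on a toy problem. This sketch is our own, not an algorithm from the survey: it minimises x^2 subject to x >= 1 by Newton steps on the barrier objective t*x^2 - log(x - 1), increasing t between outer iterations so the central path approaches the constrained optimum x = 1.

```python
import math

def barrier_minimize(t0=1.0, mu=10.0, tol=1e-8):
    """Log-barrier method for the toy problem  min x^2  s.t.  x >= 1."""
    t, x = t0, 2.0               # strictly feasible starting point
    while 1.0 / t > tol:         # duality-gap style stopping rule (one constraint)
        for _ in range(50):      # Newton's method on  t*x^2 - log(x - 1)
            g = 2 * t * x - 1.0 / (x - 1)        # gradient
            h = 2 * t + 1.0 / (x - 1) ** 2       # Hessian (always positive)
            step = g / h
            while x - step <= 1:                 # damp to stay strictly feasible
                step *= 0.5
            x -= step
            if abs(g) < 1e-12:
                break
        t *= mu                  # tighten the barrier
    return x
```

Real interior-point solvers apply the same outer/inner structure to linear and semidefinite programs, with the Newton system solved by sparse linear algebra.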
Efficient belief propagation for early vision
In CVPR, 2004
Cited by 515 (8 self)
"... Markov random field models provide a robust and unified framework for early vision problems such as stereo, optical flow and image restoration. Inference algorithms based on graph cuts and belief propagation yield accurate results, but despite recent advances are often still too slow for practical use. In this paper we present new algorithmic techniques that substantially improve the running time of the belief propagation approach. One of our techniques reduces the complexity of the inference algorithm to be linear rather than quadratic in the number of possible labels for each pixel, which ..."
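The linear-time message update mentioned here can be illustrated for the truncated-linear pairwise cost V(i, j) = min(c*|i - j|, trunc), one of the cost models treated by Felzenszwalb and Huttenlocher. This is a sketch under that assumption, not the paper's full implementation:

```python
def linear_time_message(f, c, trunc):
    """Min-sum message update  h[i] = min_j ( f[j] + min(c*|i-j|, trunc) )
    computed in O(L) with two sweeps instead of the naive O(L^2)."""
    h = list(f)
    L = len(h)
    for i in range(1, L):           # forward pass: allow moves from the left
        h[i] = min(h[i], h[i - 1] + c)
    for i in range(L - 2, -1, -1):  # backward pass: allow moves from the right
        h[i] = min(h[i], h[i + 1] + c)
    cap = min(f) + trunc            # truncation: jumping anywhere costs at most trunc
    return [min(v, cap) for v in h]
```

After the two sweeps, h[i] equals the 1-D distance transform of f under the linear cost; capping with min(f) + trunc then accounts for the truncation. The quadratic-cost case in the paper uses a lower-envelope computation with the same O(L) flavour.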
Large margin methods for structured and interdependent output variables
Journal of Machine Learning Research, 2005
Cited by 624 (12 self)
"... Learning general functional dependencies between arbitrary input and output spaces is one of the key challenges in computational intelligence. While recent progress in machine learning has mainly focused on designing flexible and powerful input representations, this paper addresses the complementary issue of designing classification algorithms that can deal with more complex outputs, such as trees, sequences, or sets. More generally, we consider problems involving multiple dependent output variables, structured output spaces, and classification problems with class attributes. In order ..."
Floating-Point LLL Revisited
2005
Cited by 50 (7 self)
"... The Lenstra-Lenstra-Lovász lattice basis reduction algorithm (LLL or L³) is a very popular tool in public-key cryptanalysis and in many other fields. Given an integer d-dimensional lattice basis with vectors of norm less than B in an n-dimensional space, L³ outputs a so-called L³-reduced basis in po ..."
A new alternating minimization algorithm for total variation image reconstruction
SIAM J. Imaging Sci., 2008
Cited by 224 (26 self)
"... We propose, analyze and test an alternating minimization algorithm for recovering images from blurry and noisy observations with total variation (TV) regularization. This algorithm arises from a new half-quadratic model applicable to not only the anisotropic but also isotropic forms of total variati ..."
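One building block of half-quadratic splitting schemes for isotropic TV is a closed-form shrinkage step for the auxiliary variables. The following is a sketch under standard assumptions; the function name and the 1e-12 division guard are ours, not from the paper:

```python
import numpy as np

def shrink_iso(dx, dy, beta):
    """Isotropic 2-D shrinkage: per-pixel closed-form solution of the
    auxiliary subproblem  min_w ||w||_2 + (beta/2) * ||w - d||^2,
    where d = (dx, dy) are local finite differences of the image."""
    mag = np.sqrt(dx ** 2 + dy ** 2)
    scale = np.maximum(mag - 1.0 / beta, 0) / np.maximum(mag, 1e-12)
    return dx * scale, dy * scale
```

In an alternating scheme this step alternates with a quadratic image-update subproblem (a least-squares solve, often done with FFTs under periodic boundary conditions), which is what makes each iteration cheap.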
The Ant System Applied To The Quadratic Assignment Problem
IEEE Transactions on Knowledge and Data Engineering, 1994
Cited by 176 (7 self)
"... In recent years there has been growing interest in algorithms inspired by the observation of natural phenomena to define computational procedures which can solve complex problems. In this article we describe a distributed heuristic algorithm which was inspired by the observation of the behavior of a ..."
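A minimal Ant System loop for the quadratic assignment problem might look as follows; the parameter values (`ants`, `rho`, `Q`) are illustrative defaults, not the paper's settings:

```python
import random

def ant_system_qap(D, F, ants=20, iters=50, rho=0.5, Q=100.0, seed=0):
    """Minimal Ant System for the QAP: cost(p) = sum_ij D[i][j] * F[p[i]][p[j]].
    tau[loc][fac] is the pheromone favouring placing facility fac at location loc."""
    rng = random.Random(seed)
    n = len(D)
    tau = [[1.0] * n for _ in range(n)]
    best_p, best_cost = None, float("inf")

    def cost(p):
        return sum(D[i][j] * F[p[i]][p[j]] for i in range(n) for j in range(n))

    for _ in range(iters):
        solutions = []
        for _ in range(ants):
            free, p = list(range(n)), []
            for loc in range(n):       # build an assignment pheromone-proportionally
                fac = rng.choices(free, weights=[tau[loc][f] for f in free])[0]
                free.remove(fac)
                p.append(fac)
            c = cost(p)
            solutions.append((p, c))
            if c < best_cost:
                best_p, best_cost = p, c
        for loc in range(n):           # evaporation
            for fac in range(n):
                tau[loc][fac] *= 1.0 - rho
        for p, c in solutions:         # deposit: cheaper solutions deposit more
            for loc, fac in enumerate(p):
                tau[loc][fac] += Q / c
    return best_p, best_cost
```

The full Ant System also mixes in a heuristic visibility term when sampling; it is omitted here to keep the pheromone mechanics visible.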
An LLL-reduction algorithm with quasi-linear time complexity
2010
Cited by 17 (6 self)
"... Abstract. We devise an algorithm, L̃¹, with the following specifications: It takes as input an arbitrary basis B = (b_i)_i ∈ Z^{d×d} of a Euclidean lattice L; It computes a basis of L which is reduced for a mild modification of the Lenstra-Lenstra-Lovász reduction; It terminates in time O(d^{5+ε} β + d^{ω+1+ε} β^{1+ε}), where β = log max ‖b_i‖ (for any ε > 0; ω is a valid exponent for matrix multiplication). This is the first LLL-reducing algorithm with a time complexity that is quasi-linear in β and polynomial in d. The backbone structure of L̃¹ is able to mimic the Knuth-Schönhage fast gcd ..."
Floating-Point LLL: Theoretical and Practical Aspects
"... The textbook LLL algorithm can be sped up considerably by replacing the underlying rational arithmetic used for the GramSchmidt orthogonalisation by floatingpoint approximations. We review how this modification has been and is currently implemented, both in theory and in practice. Using floating ..."
Abstract

Cited by 7 (2 self)
 Add to MetaCart
point approximations seems to be natural for LLL even from the theoretical point of view: it is the key to reach a bitcomplexity which is quadratic with respect to the bitlength of the input vectors entries, without fast integer multiplication. The latter bitcomplexity strengthens the connection between LLL
Worst-case complexity of the optimal LLL algorithm
In Proceedings of LATIN'2000, Punta del Este, LNCS 1776
Cited by 8 (2 self)
"... In this paper, we consider the open problem of the complexity of the LLL algorithm in the case when the approximation parameter t of the algorithm has its extreme value 1. This case is of interest because the output is then the strongest Lovász-reduced basis. Experiments reported by Lagarias and ..."