Results 1 – 10 of 753
Explicit Dimension Reduction and Its Applications
"... We construct a small set of explicit linear transformations mapping R^n to R^{O(log n)}, such that the L2 norm of any vector in R^n is distorted by at most 1 ± o(1) in at least a 1 − o(1) fraction of the transformations in the set. Albeit the tradeoff between the distortion and the success probability ..."
Abstract

Cited by 12 (0 self)
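The guarantee quoted above is the Johnson-Lindenstrauss property. The paper's point is an *explicit* construction; the sketch below only illustrates the norm-preservation behaviour with the classical random Gaussian map, and the dimensions and constant 24 are illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 10_000                      # ambient dimension
k = int(24 * np.log(n))         # target dimension, O(log n)

# Random Gaussian projection, scaled so that E[||Ax||^2] = ||x||^2.
# (The paper builds explicit maps; this random map is a stand-in
# that exhibits the same 1 ± o(1) norm distortion.)
A = rng.standard_normal((k, n)) / np.sqrt(k)

x = rng.standard_normal(n)
distortion = np.linalg.norm(A @ x) / np.linalg.norm(x)
print(f"k = {k}, distortion = {distortion:.3f}")
```

For a fixed vector, the distortion concentrates around 1 with fluctuations on the order of 1/sqrt(k).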
Bounded geometries, fractals, and low-distortion embeddings
"... The doubling constant of a metric space (X, d) is the smallest value λ such that every ball in X can be covered by λ balls of half the radius. The doubling dimension of X is then defined as dim(X) = log2 λ. A metric (or sequence of metrics) is called doubling precisely when its doubling dimension is ..."
Abstract

Cited by 198 (40 self)
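The definition in this snippet can be probed numerically on a finite point set. The sketch below upper-bounds the doubling constant by greedily covering each ball with half-radius balls centred at points of the set (the true definition allows arbitrary centres, so this is only an upper bound; the function name is mine):

```python
import numpy as np

def doubling_constant(points):
    """Upper-bound the doubling constant of a finite metric space:
    for every ball B(x, r), greedily cover it with balls of radius r/2
    and record the largest number of half-radius balls needed."""
    pts = np.asarray(points, dtype=float)
    dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    lam = 1
    radii = np.unique(dist[dist > 0])
    for i in range(len(pts)):
        for r in radii:
            ball = np.flatnonzero(dist[i] <= r)   # points of B(x, r)
            uncovered = set(ball.tolist())
            count = 0
            while uncovered:                      # greedy r/2 cover
                c = uncovered.pop()
                uncovered -= {j for j in uncovered if dist[c, j] <= r / 2}
                count += 1
            lam = max(lam, count)
    return lam

# Points on a line: the doubling dimension log2(lambda) stays small.
pts = [[float(i)] for i in range(16)]
lam = doubling_constant(pts)
print(lam, np.log2(lam))
```

A uniform grid on the line is the textbook doubling example; a set whose balls need ever more half-radius balls as it grows would not be doubling.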
Faster Dimension Reduction
, 2010
"... Data represented geometrically in high-dimensional vector spaces can be found in many applications. Images and videos are often represented by assigning a dimension to every pixel (and time). Text documents may be represented in a vector space where each word in the dictionary incurs a dimension ..."
Abstract

Cited by 18 (1 self)
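The "each word incurs a dimension" representation mentioned in this abstract is the plain bag-of-words model; a toy version (documents and vocabulary are made up for illustration):

```python
from collections import Counter

docs = [
    "dimension reduction for text documents",
    "images and videos in high dimensional spaces",
    "text documents as vectors",
]

# One coordinate per dictionary word: a document becomes the vector
# of its word counts over the shared vocabulary.
vocab = sorted({w for d in docs for w in d.split()})

def to_vector(doc):
    counts = Counter(doc.split())
    return [counts.get(w, 0) for w in vocab]

vectors = [to_vector(d) for d in docs]
print(len(vocab), vectors[0])
```

Even this tiny corpus already has more dimensions than documents, which is exactly the regime where the dimension-reduction techniques in these results become relevant.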
Fast Dimension Reduction Using Rademacher Series on Dual BCH Codes
"... The Fast Johnson-Lindenstrauss Transform (FJLT) was recently discovered by Ailon and Chazelle as a novel technique for performing fast dimension reduction with small distortion from ℓ_2^d to ℓ_2^k in time O(max{d log d, k^3}). For k in [Ω(log d), O(d^{1/2})] this beats time O(dk) achieved by naive mul ..."
Abstract

Cited by 75 (10 self)
of random bits used and reduction to ℓ1 space. The connection between geometry and discrete coding theory discussed here is interesting in its own right and may be useful in other algorithmic applications as well.
Filtered Matrix-Vector Products via the Lanczos Algorithm with Applications to Dimension Reduction
, 2007
"... This paper discusses an efficient technique for computing filtered matrix-vector (mat-vec) products by exploiting the Lanczos algorithm. The goal of the proposed method, which is the same as that of the truncated singular value decomposition (SVD), is to preserve the quality of the resulting mat-vec ..."
Abstract

Cited by 1 (1 self)
product in major singular directions of the matrix. Unlike the SVD-based techniques, the proposed algorithms achieve this goal by using a small number of Lanczos vectors, without explicitly computing the major singular values/vectors of the matrix. The main advantage of the proposed method is its low cost.
Dimension reduction in regression estimation with nearest neighbor
 Electronic Journal of Statistics
"... In regression with a high-dimensional predictor vector, dimension reduction methods aim at replacing the predictor by a lower-dimensional version without loss of information on the regression. In this context, the so-called central mean subspace is the key to dimension reduction. The last two decade ..."
Abstract

Cited by 2 (0 self)
to be consistent. Improvement due to the dimension reduction step is then observed in terms of its rate of convergence. All the results are distribution-free. As an application, we give an explicit rate of convergence using the SIR method.
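The SIR method named in this abstract (sliced inverse regression) is simple enough to sketch: slice the sorted responses, average the centred predictors within each slice, and take the dominant eigenvector of the slice-mean covariance as the estimated direction of the central mean subspace. Everything below (model, slice count, the crude 1-NN step) is an illustrative toy, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model: y depends on the 10-dimensional X only through one index
# b'X, so the central mean subspace is span{b}.
n, p = 2000, 10
b = np.zeros(p); b[0] = 1.0
X = rng.standard_normal((n, p))
y = np.sin(X @ b) + 0.1 * rng.standard_normal(n)

# Sliced inverse regression: slice sorted responses, average the
# centred predictors per slice, eigen-decompose the slice means.
Z = X - X.mean(0)                      # X is already near-isotropic here
slices = np.array_split(np.argsort(y), 10)
means = np.array([Z[s].mean(0) for s in slices])
_, V = np.linalg.eigh(means.T @ means / len(slices))
direction = V[:, -1]
direction /= np.sign(direction[0]) or 1.0   # fix the sign

# Nearest-neighbour regression on the 1-D reduced predictor.
t = Z @ direction
t_new = 0.5
pred = y[np.argmin(np.abs(t - t_new))]
print(direction[0], pred)
```

The nearest-neighbour step now runs in one dimension instead of ten, which is where the improved rate of convergence discussed in the abstract comes from.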
Structure preserving dimension reduction for clustered text data based on the generalized singular value decomposition
 SIAM Journal on Matrix Analysis and Applications
, 2003
"... Abstract. In today’s vector space information retrieval systems, dimension reduction is imperative for efficiently manipulating the massive quantity of data. To be useful, this lower-dimensional representation must be a good approximation of the full document set. To that end, we adapt and extend th ..."
Abstract

Cited by 53 (19 self)
be nonsingular, which restricts its application to document sets in which the number of terms does not exceed the number of documents. We show that by using the generalized singular value decomposition (GSVD), we can achieve the same goal regardless of the relative dimensions of the term-document matrix.
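The classical criterion whose restriction this entry removes maximises between-cluster scatter against within-cluster scatter. The sketch below implements that classical form on a toy term-document set chosen so that S_w is nonsingular; the paper's GSVD formulation, which drops this requirement, is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy clustered data with more documents than terms, so the
# within-cluster scatter S_w is invertible and the classical
# trace-optimising criterion applies.
terms, docs_per_cluster, k = 20, 50, 3
centroids = 3.0 * rng.standard_normal((k, terms))
X = np.vstack([c + rng.standard_normal((docs_per_cluster, terms))
               for c in centroids])
labels = np.repeat(np.arange(k), docs_per_cluster)

mean = X.mean(0)
Sw = np.zeros((terms, terms))          # within-cluster scatter
Sb = np.zeros((terms, terms))          # between-cluster scatter
for j in range(k):
    Xj = X[labels == j]
    cj = Xj.mean(0)
    Sw += (Xj - cj).T @ (Xj - cj)
    d = (cj - mean)[:, None]
    Sb += len(Xj) * (d @ d.T)

# Projection onto the top eigenvectors of S_w^{-1} S_b; this solve is
# exactly the step that fails when S_w is singular (terms > documents),
# which is the case the GSVD approach handles.
w, V = np.linalg.eig(np.linalg.solve(Sw, Sb))
order = np.argsort(-w.real)
G = V[:, order[:k - 1]].real           # reduce to k - 1 = 2 dimensions
Y = X @ G
print(Y.shape)
```

For k clusters the between-cluster scatter has rank at most k − 1, so the reduced representation needs only k − 1 dimensions regardless of the vocabulary size.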
Separable Nonlinear Least Squares: the Variable Projection Method and its Applications
 Institute of Physics, Inverse Problems
, 2002
"... this paper nonlinear data fitting problems which have as their underlying model a linear combination of nonlinear functions. More generally, one can also consider that there are two sets of unknown parameters, where one set is dependent on the other and can be explicitly eliminated. Models of this t ..."
Abstract

Cited by 96 (2 self)
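The elimination of one parameter set described in this abstract is easy to demonstrate on the smallest separable model. Below, for each fixed nonlinear parameter a the linear coefficients (c1, c2) are solved exactly by least squares, leaving a one-dimensional reduced problem; the model, data, and coarse grid search are all illustrative stand-ins (variable projection proper uses Gauss-Newton on the reduced functional):

```python
import numpy as np

rng = np.random.default_rng(5)

# Separable model y = c1 * exp(-a t) + c2: linear in (c1, c2),
# nonlinear only in a.
t = np.linspace(0.0, 4.0, 80)
y = 2.0 * np.exp(-1.5 * t) + 0.5 + 0.01 * rng.standard_normal(len(t))

def projected_residual(a):
    """Eliminate the linear coefficients for fixed a by least squares;
    return the residual norm of the reduced functional and the fit."""
    Phi = np.column_stack([np.exp(-a * t), np.ones_like(t)])
    c, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return np.linalg.norm(Phi @ c - y), c

# Minimise the reduced one-dimensional functional over a coarse grid.
grid = np.linspace(0.1, 5.0, 200)
best_a = min(grid, key=lambda a: projected_residual(a)[0])
_, (c1, c2) = projected_residual(best_a)
print(best_a, c1, c2)
```

The payoff is that the outer optimisation searches over a single nonlinear parameter instead of three, which is exactly the dimension reduction the variable projection method formalises.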
On dimensionality reduction for classification and its application
 in IEEE Int. Conf. Acoust., Speech. Signal Processing
, 2006
"... In this paper, we evaluate the contribution of the classification constrained dimensionality reduction (CCDR) algorithm to the performance of several classifiers. We present an extension of the previously introduced CCDR algorithm to multiple hypotheses. We investigate classification performance using t ..."
Abstract

Cited by 7 (3 self)