Results 1 - 10 of 955
Sublinear Time Orthogonal Tensor Decomposition *
"... A recent work (Wang et al., NIPS 2015) gives the fastest known algorithms for orthogonal tensor decomposition with provable guarantees. Their algorithm is based on computing sketches of the input tensor, which requires reading the entire input. We show in a number of cases one can achieve the same theoretical guarantees in sublinear time, i.e., even without reading most of the input tensor. Instead of using sketches to estimate inner products in tensor decomposition algorithms, we use importance sampling. To achieve sublinear time, we need to know the norms of tensor slices, and we ..."
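The snippet only gestures at the sampling scheme. As an illustrative sketch (not the authors' algorithm; the sampling distribution and function name below are assumptions), importance sampling can estimate an inner product without touching most entries:

```python
import random

def sampled_inner_product(a, b, num_samples=2000, seed=0):
    """Unbiased estimate of <a, b>: draw indices i with probability
    p_i = a_i^2 / ||a||^2, then average a_i * b_i / p_i over the draws.
    E[a_i * b_i / p_i] = sum_i a_i * b_i, the exact inner product."""
    rng = random.Random(seed)
    norm_sq = sum(x * x for x in a)
    probs = [x * x / norm_sq for x in a]
    draws = rng.choices(range(len(a)), weights=probs, k=num_samples)
    return sum(a[i] * b[i] / probs[i] for i in draws) / num_samples
```

With a = (1, 2, 3) and b = (4, 5, 6) the exact inner product is 32, and the estimate concentrates around it as num_samples grows; this is the kind of estimator that replaces sketch-based inner products in the abstract above.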
ATOMIC DECOMPOSITION BY BASIS PURSUIT
, 1995
"... The Time-Frequency and Time-Scale communities have recently developed a large number of overcomplete waveform dictionaries: stationary wavelets, wavelet packets, cosine packets, chirplets, and warplets, to name a few. Decomposition into overcomplete systems is not unique, and several methods for d ..."
Cited by 2728 (61 self)
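Basis pursuit resolves the non-uniqueness by picking the coefficient vector of minimum l1 norm subject to exact reconstruction. A minimal sketch via the standard linear-programming reformulation (solved here with SciPy's linprog; the toy dictionary and signal are assumptions for illustration):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Solve min ||x||_1 subject to A @ x = b by splitting x = p - q
    with p, q >= 0, which turns the problem into a linear program:
    minimize sum(p) + sum(q) subject to A @ (p - q) = b."""
    m, n = A.shape
    c = np.ones(2 * n)                     # objective: sum(p) + sum(q)
    A_eq = np.hstack([A, -A])              # encodes A @ (p - q) = b
    res = linprog(c, A_eq=A_eq, b_eq=b,
                  bounds=[(0, None)] * (2 * n), method="highs")
    p, q = res.x[:n], res.x[n:]
    return p - q

# Toy overcomplete system: among all solutions of A @ x = b,
# x = (0, 1, 0) is both the sparsest and the l1-minimal one.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([1.0, 1.0])
x = basis_pursuit(A, b)
```

The split x = p - q is the textbook trick for making the l1 objective linear; any LP solver then applies.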
Orthogonal Tensor Decompositions
 SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS
, 2001
"... We explore the orthogonal decomposition of tensors (also known as multidimensional arrays or n-way arrays) using two different definitions of orthogonality. We present numerous examples to illustrate the difficulties in understanding such decompositions. We conclude with a counterexample to a tensor ..."
Cited by 124 (9 self)
Orthogonal Rank Decompositions for Tensors
 In preparation
, 1999
"... order and all subdimensions are equal) then the inner product of A and B is defined as $A \cdot B \equiv \sum_{i_1=1}^{m_1} \sum_{i_2=1}^{m_2} \cdots \sum_{i_n=1}^{m_n} A_{i_1 i_2 \cdots i_n} B_{i_1 i_2 \cdots i_n}$. Correspondingly, the norm of A, $\|A\|$, is defined as $\|A\|^2 \equiv A \cdot A$. A tensor A is a unit tensor if $\|A\| = 1$. A decomposed tensor is a tensor that can be written as ..."
Cited by 3 (3 self)
Balanced model reduction via the proper orthogonal decomposition
 AIAA Journal
, 2002
"... A new method for performing a balanced reduction of a high-order linear system is presented. The technique combines the proper orthogonal decomposition and concepts from balanced realization theory. The method of snapshots is used to obtain low-rank, reduced-range approximations to the system control ..."
Cited by 134 (6 self)
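The method of snapshots computes POD modes from data alone. A minimal sketch of plain POD via the SVD of a snapshot matrix (not the paper's balanced variant; the data below is a toy assumption):

```python
import numpy as np

def pod_modes(X, r):
    """Method of snapshots, in SVD form: the columns of X are state
    snapshots; the leading r left singular vectors are the POD modes,
    and the singular values rank their energy content."""
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :r], s[:r]

# Rank-one snapshot data: every snapshot is a multiple of one mode,
# so a single POD mode captures the data exactly.
X = np.outer([1.0, 2.0, 2.0], [1.0, 1.0, 1.0, 1.0])
modes, energies = pod_modes(X, 1)
```

Balanced reduction, as in the paper, additionally balances these modes against observability information rather than using the state snapshots alone.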
Tensor decompositions for learning latent variable models
, 2014
"... This work considers a computationally and statistically efficient parameter estimation method for a wide class of latent variable models—including Gaussian mixture models, hidden Markov models, and latent Dirichlet allocation—which exploits a certain tensor structure in their low-order observable moments (typically, of second and third order). Specifically, parameter estimation is reduced to the problem of extracting a certain (orthogonal) decomposition of a symmetric tensor derived from the moments; this decomposition can be viewed as a natural generalization of the singular value decomposition ..."
Cited by 83 (7 self)
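The orthogonal decomposition referred to here is typically extracted one component at a time by the tensor power method. A sketch under the orthogonally-decomposable assumption (the weights and basis vectors are toy choices, not from the paper):

```python
import numpy as np

def tensor_power_method(T, n_iter=100, seed=0):
    """Iterate u <- T(I, u, u) / ||T(I, u, u)||; for an orthogonally
    decomposable symmetric 3-tensor this converges to one component
    v_k, with weight lambda_k = T(u, u, u)."""
    rng = np.random.default_rng(seed)
    u = rng.normal(size=T.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(n_iter):
        v = np.einsum('ijk,j,k->i', T, u, u)
        u = v / np.linalg.norm(v)
    return np.einsum('ijk,i,j,k->', T, u, u, u), u

# Symmetric tensor with orthogonal components e1 (weight 3) and e2 (weight 1).
e1, e2 = np.eye(3)[0], np.eye(3)[1]
T = 3.0 * np.einsum('i,j,k->ijk', e1, e1, e1) + np.einsum('i,j,k->ijk', e2, e2, e2)
lam, u = tensor_power_method(T)
```

After one component is found, it is deflated (subtracted off) and the iteration is repeated to recover the rest of the decomposition.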
Third-order Orthogonal Tensor Product Expansion
"... As a method of expanding higher-order tensor data into tensor products of vectors, we have proposed the Third-order Orthogonal Tensor Product Expansion (3OTPE), which performs an expansion similar to the Higher-Order Singular Value Decomposition (HOSVD). In this paper we provide a computation algorithm to ..."
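For comparison, the HOSVD that 3OTPE is measured against has a compact description: one factor matrix per mode from the SVD of that mode's unfolding, plus a core tensor. A third-order sketch (this is plain HOSVD, not the paper's 3OTPE algorithm):

```python
import numpy as np

def hosvd3(T):
    """HOSVD of a third-order tensor: each mode's factor is the left
    singular basis of that mode's unfolding; the core is T contracted
    with the factors. With full (square, orthogonal) factors, T is
    reproduced exactly from the core and the factors."""
    Us = []
    for k in range(3):
        unfolding = np.moveaxis(T, k, 0).reshape(T.shape[k], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        Us.append(U)
    core = np.einsum('ijk,ia,jb,kc->abc', T, Us[0], Us[1], Us[2])
    return core, Us

rng = np.random.default_rng(1)
T = rng.normal(size=(2, 3, 4))
core, (U1, U2, U3) = hosvd3(T)
T_rec = np.einsum('abc,ia,jb,kc->ijk', core, U1, U2, U3)
```

Truncating each factor to its leading columns gives the usual low-multilinear-rank approximation.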
Multilinear Subspace Regression: An Orthogonal Tensor Decomposition Approach
"... A multilinear subspace regression model based on so-called latent variable decomposition is introduced. Unlike standard regression methods, which typically employ matrix (2-D) data representations followed by vector subspace transformations, the proposed approach uses tensor subspace transformations t ..."
Cited by 2 (2 self)
Orthogonal Decomposition of Symmetric Tensors
"... A real symmetric tensor is orthogonally decomposable (or odeco) if it can be written as a linear combination of symmetric powers of n vectors which form an orthonormal basis of R^n. Motivated by the spectral theorem for real symmetric matrices, we study the properties of odeco tensors. We give a form ..."
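The definition can be made concrete in a few lines. A sketch that builds an odeco tensor and checks one easy consequence (for orthonormal v_k, the squared Frobenius norm equals the sum of squared coefficients); the weights and basis are toy assumptions:

```python
import numpy as np

def odeco_tensor(lams, V):
    """sum_k lams[k] * v_k (x) v_k (x) v_k, where the v_k are the
    columns of V; symmetric by construction."""
    return np.einsum('k,ik,jk,lk->ijl', np.asarray(lams, dtype=float), V, V, V)

# Orthonormal columns: the first two standard basis vectors of R^3.
V = np.eye(3)[:, :2]
T = odeco_tensor([2.0, 3.0], V)
# For orthonormal v_k: ||T||_F^2 = sum_k lams[k]^2 = 4 + 9 = 13.
frob_sq = (T ** 2).sum()
```

The cross terms vanish because distinct orthonormal vectors have inner product zero, which is exactly what makes odeco tensors behave like diagonalizable matrices.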
Beyond streams and graphs: Dynamic tensor analysis
 In KDD
, 2006
"... How do we find patterns in author-keyword associations, evolving over time? Or in DataCubes, with product-branch-customer sales information? Matrix decompositions, like principal component analysis (PCA) and variants, are invaluable tools for mining, dimensionality reduction, feature selection, rule ..."
Cited by 113 (16 self)