## Tensor Decompositions and Applications (2009)

Venue: SIAM Review

Citations: 228 (14 self)

### BibTeX

@ARTICLE{Kolda09tensordecompositions,
  author  = {Tamara G. Kolda and Brett W. Bader},
  title   = {Tensor Decompositions and Applications},
  journal = {SIAM Review},
  year    = {2009},
  volume  = {51},
  number  = {3},
  pages   = {455--500}
}

### Abstract

This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or N-way array. Decompositions of higher-order tensors (i.e., N-way arrays with N ≥ 3) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, and elsewhere. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal component analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2, as well as nonnegative variants of all of the above. The N-way Toolbox and Tensor Toolbox, both for MATLAB, and the Multilinear Engine are examples of software packages for working with tensors.
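The abstract's description of CP — a tensor written as a sum of rank-one terms — can be made concrete with a small NumPy sketch. This is an illustration only, not code from the survey; the factor matrices `A`, `B`, `C` are arbitrary synthetic data, and the rank and dimensions are made up:

```python
import numpy as np

# Hypothetical CP factors for an I x J x K tensor of rank R.
I, J, K, R = 4, 5, 6, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((I, R))  # mode-1 factor matrix
B = rng.standard_normal((J, R))  # mode-2 factor matrix
C = rng.standard_normal((K, R))  # mode-3 factor matrix

# X = sum_r  a_r (outer) b_r (outer) c_r, written as one einsum
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Equivalent explicit sum over R rank-one tensors
X_check = sum(np.multiply.outer(np.multiply.outer(A[:, r], B[:, r]), C[:, r])
              for r in range(R))
assert np.allclose(X, X_check)
```

The einsum form and the explicit sum of outer products are the same model; the former is simply the vectorized way to assemble it.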

### Citations

2721 | Indexing by Latent Semantic Analysis
- Deerwester, Dumais, et al.
- 1990
Citation Context: ...PARAFAC2 is that it can be used on the original data. Chew et al. [44] used PARAFAC2 for clustering documents across multiple languages. The idea is to extend the concept of latent semantic indexing [75, 72, 21] in a cross-language context by using multiple translations of the same collection of documents, i.e., a parallel corpus. In this case, Xk is a term-by-document matrix for the kth language in the para...

2197 | The art of computer programming - Knuth - 1973

1357 | Independent component analysis, a new concept
- Comon
- 1994
Citation Context: ...s that optimizes a function that measures the “simplicity” of the core as measured by some objective [120]. Another is to use a Jacobi-type algorithm to maximize the magnitude of the diagonal entries [46, 65, 162]. Finally, the HOSVD generates an all-orthogonal core, as mentioned previously, which is yet another type of special core structure that might be useful. 4.4. Applications of Tucker. Several examples ...

989 | Learning the parts of objects by non-negative matrix factorization
- Lee, Seung
- 1999
Citation Context: ...al indeterminacy. A restricted form of PARATUCK2, with two columns in A and three in B, is appropriate in this case. 5.6. Nonnegative Tensor Factorizations. Paatero and Tapper [181] and Lee and Seung [151] proposed using nonnegative matrix factorizations for analyzing nonnegative data, such as environmental models and grayscale images, because it is desirable for the decompositions to retain the nonneg...

533 | Using linear algebra for intelligent information retrieval
- Berry, Dumais, et al.
- 1995
Citation Context: ...PARAFAC2 is that it can be used on the original data. Chew et al. [44] used PARAFAC2 for clustering documents across multiple languages. The idea is to extend the concept of latent semantic indexing [75, 72, 21] in a cross-language context by using multiple translations of the same collection of documents, i.e., a parallel corpus. In this case, Xk is a term-by-document matrix for the kth language in the para...

373 | Gaussian elimination is not optimal
- Strassen
- 1969
Citation Context: ...uth [130, sec. 4.6.4]. The most interesting example of this is Strassen matrix multiplication, which is an application of a decomposition of a 4 × 4 × 4 tensor to describe 2 × 2 matrix multiplication [208, 141, 147, 24]. In the last ten years, interest in tensor decompositions has expanded to other fields. Examples include signal processing [62, 196, 47, 43, 68, 80, 173, 60, 61], numerical linear algebra [87, 63, 64...

307 | Analysis of individual differences in multidimensional scaling via an N-way generalization of Eckart-Young decomposition
- Carroll, Chang
- 1970
Citation Context: ..., 106], and the idea of a multiway model is attributed to Cattell in 1944 [40, 41]. These concepts received scant attention until the work of Tucker in the 1960s [224, 225, 226] and Carroll and Chang [38] and Harshman [90] in 1970, all of which appeared in psychometrics literature. Appellof and Davidson [13] are generally credited as being the first to use tensor decompositions (in 1981) in chemometri...

262 | Foundations of the PARAFAC procedure: Models and conditions for an explanatory multimodal factor analysis. UCLA Working Papers in Phonetics
- Harshman
- 1970
Citation Context: ...ea of a multiway model is attributed to Cattell in 1944 [40, 41]. These concepts received scant attention until the work of Tucker in the 1960s [224, 225, 226] and Carroll and Chang [38] and Harshman [90] in 1970, all of which appeared in psychometrics literature. Appellof and Davidson [13] are generally credited as being the first to use tensor decompositions (in 1981) in chemometrics, and tensors ha...

261 | Positive matrix factorization: A non-negative factor model with optimal utilization of error estimates of data values
- Paatero, Tapper
- 1994
Citation Context: ...would introduce rotational indeterminacy. A restricted form of PARATUCK2, with two columns in A and three in B, is appropriate in this case. 5.6. Nonnegative Tensor Factorizations. Paatero and Tapper [181] and Lee and Seung [151] proposed using nonnegative matrix factorizations for analyzing nonnegative data, such as environmental models and grayscale images, because it is desirable for the decompositi...

231 | A multilinear Singular Value Decomposition
- Lathauwer, Moor, et al.
- 2000
Citation Context: ...1, 147, 24]. In the last ten years, interest in tensor decompositions has expanded to other fields. Examples include signal processing [62, 196, 47, 43, 68, 80, 173, 60, 61], numerical linear algebra [87, 63, 64, 132, 244, 133, 149], computer vision [229, 230, 231, 190, 236, 237, 193, 102, 235], numerical analysis [22, 108, 23, 89, 114, 88, 115], data mining [190, 4, 157, 211, 5, 209, 210, 44, 14], graph analysis [136, 135, 15],...

230 | The approximation of one matrix by another of lower rank
- Eckart, Young
- 1936
Citation Context: ...R has a CP decomposition that is generically unique if R ≤ L and R(R − 1) ≤ IJK(3IJK − IJ − IK − JK − I − J − K + 3)/4. 3.3. Low-Rank Approximations and the Border Rank. For matrices, Eckart and Young [76] showed that a best rank-k approximation is given by the leading k factors of the SVD. In other words, let R be the rank of a matrix A and assume its SVD is given by A = ∑_{r=1}^{R} σ_r u_r ∘ v_r with σ1 ≥ σ2...
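The Eckart–Young result quoted in this excerpt can be checked numerically: keeping the k leading singular triplets gives the best rank-k approximation, and its Frobenius error equals the norm of the discarded singular values. A minimal NumPy sketch (the matrix is arbitrary test data, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 6))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2
# Truncated SVD: best rank-k approximation per Eckart-Young
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The Frobenius error equals sqrt(sum of squared discarded singular values)
err = np.linalg.norm(A - A_k, 'fro')
assert np.isclose(err, np.sqrt(np.sum(s[k:] ** 2)))
```

The survey's point is that this clean matrix story does not carry over to higher-order tensors, where the best low-rank approximation problem can be ill-posed (the "border rank" phenomenon).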

202 | Some mathematical notes on three-mode factor analysis
- Tucker
- 1966
Citation Context: ...originated with Hitchcock in 1927 [105, 106], and the idea of a multiway model is attributed to Cattell in 1944 [40, 41]. These concepts received scant attention until the work of Tucker in the 1960s [224, 225, 226] and Carroll and Chang [38] and Harshman [90] in 1970, all of which appeared in psychometrics literature. Appellof and Davidson [13] are generally credited as being the first to use tensor decompositi...

170 | Three-way arrays: rank and uniqueness of trilinear decompositions with application to arithmetic complexity and statistics
- Kruskal
- 1977
Citation Context: ...uth [130, sec. 4.6.4]. The most interesting example of this is Strassen matrix multiplication, which is an application of a decomposition of a 4 × 4 × 4 tensor to describe 2 × 2 matrix multiplication [208, 141, 147, 24]. In the last ten years, interest in tensor decompositions has expanded to other fields. Examples include signal processing [62, 196, 47, 43, 68, 80, 173, 60, 61], numerical linear algebra [87, 63, 64...

147 | Multilinear analysis of image ensembles: TensorFaces - Vasilescu, Terzopoulos - 2002

118 | The varimax criterion for analytic rotation in factor analysis - Kaiser - 1958

96 | Using latent semantic analysis to improve access to textual information
- Dumais, Furnas, et al.
- 1988
Citation Context: ...PARAFAC2 is that it can be used on the original data. Chew et al. [44] used PARAFAC2 for clustering documents across multiple languages. The idea is to extend the concept of latent semantic indexing [75, 72, 21] in a cross-language context by using multiple translations of the same collection of documents, i.e., a parallel corpus. In this case, Xk is a term-by-document matrix for the kth language in the para...

93 | Face transfer with multilinear models
- Vlasic, Brand, et al.
Citation Context: ...nsor decompositions has expanded to other fields. Examples include signal processing [62, 196, 47, 43, 68, 80, 173, 60, 61], numerical linear algebra [87, 63, 64, 132, 244, 133, 149], computer vision [229, 230, 231, 190, 236, 237, 193, 102, 235], numerical analysis [22, 108, 23, 89, 114, 88, 115], data mining [190, 4, 157, 211, 5, 209, 210, 44, 14], graph analysis [136, 135, 15], neuroscience [20, 163, 165, 167, 170, 168, 169, 2, 3, 70, 71],...

84 | Orthogonal Tensor Decompositions
- Kolda
- 2001
Citation Context: ...1, 147, 24]. In the last ten years, interest in tensor decompositions has expanded to other fields. Examples include signal processing [62, 196, 47, 43, 68, 80, 173, 60, 61], numerical linear algebra [87, 63, 64, 132, 244, 133, 149], computer vision [229, 230, 231, 190, 236, 237, 193, 102, 235], numerical analysis [22, 108, 23, 89, 114, 88, 115], data mining [190, 4, 157, 211, 5, 209, 210, 44, 14], graph analysis [136, 135, 15],...

79 | Non-negative tensor factorization with applications to statistics and computer vision
- Shashua, Hazan
- 2005
Citation Context: ...nsor decompositions has expanded to other fields. Examples include signal processing [62, 196, 47, 43, 68, 80, 173, 60, 61], numerical linear algebra [87, 63, 64, 132, 244, 133, 149], computer vision [229, 230, 231, 190, 236, 237, 193, 102, 235], numerical analysis [22, 108, 23, 89, 114, 88, 115], data mining [190, 4, 157, 211, 5, 209, 210, 44, 14], graph analysis [136, 135, 15], neuroscience [20, 163, 165, 167, 170, 168, 169, 2, 3, 70, 71],...

78 | CubeSVD: a novel approach to personalized Web search
- Sun, Zeng, et al.
Citation Context: ...173, 60, 61], numerical linear algebra [87, 63, 64, 132, 244, 133, 149], computer vision [229, 230, 231, 190, 236, 237, 193, 102, 235], numerical analysis [22, 108, 23, 89, 114, 88, 115], data mining [190, 4, 157, 211, 5, 209, 210, 44, 14], graph analysis [136, 135, 15], neuroscience [20, 163, 165, 167, 170, 168, 169, 2, 3, 70, 71], and more. Several surveys have been written in other fields [138, 52, 104, 27, 28, 47, 129, 78, 48, 200,...

75 | Principal component analysis of three-mode data by means of alternating least squares algorithms
- Kroonenberg, Leeuw
- 1980
Citation Context: ...cker decomposition (some specific to three-way and some for N-way). Name / proposed by: Three-mode factor analysis (3MFA/Tucker3), Tucker, 1966 [226]; Three-mode PCA (3MPCA), Kroonenberg and De Leeuw, 1980 [140]; N-mode PCA, Kapteyn et al., 1986 [113]; Higher-order SVD (HOSVD), De Lathauwer et al., 2000 [63]; N-mode SVD, Vasilescu and Terzopoulos, 2002 [229]. The Tucker decomposition is a form of higher-order PCA. ...

75 | Multilinear subspace analysis of image ensembles
- Vasilescu, Terzopoulos
- 2003
Citation Context: ...nsor decompositions has expanded to other fields. Examples include signal processing [62, 196, 47, 43, 68, 80, 173, 60, 61], numerical linear algebra [87, 63, 64, 132, 244, 133, 149], computer vision [229, 230, 231, 190, 236, 237, 193, 102, 235], numerical analysis [22, 108, 23, 89, 114, 88, 115], data mining [190, 4, 157, 211, 5, 209, 210, 44, 14], graph analysis [136, 135, 15], neuroscience [20, 163, 165, 167, 170, 168, 169, 2, 3, 70, 71],...

71 | Polynomial interpolation in several variables
- Alexander, Hirschowitz
- 1995
Citation Context: ...except for when (N, I) ∈ {(3, 5), (4, 3), (4, 4), (4, 5)}, in which case it should be increased by one. The result is due to Alexander and Hirschowitz [7, 49]. 3.2. Uniqueness. An interesting property of higher-order tensors is that their rank decompositions are often unique, whereas matrix decompositions are not. Sidiropoulos and Bro [199] and Ten Berge [...

71 | Beyond streams and graphs: dynamic tensor analysis
- Sun, Tao, et al.
Citation Context: ...173, 60, 61], numerical linear algebra [87, 63, 64, 132, 244, 133, 149], computer vision [229, 230, 231, 190, 236, 237, 193, 102, 235], numerical analysis [22, 108, 23, 89, 114, 88, 115], data mining [190, 4, 157, 211, 5, 209, 210, 44, 14], graph analysis [136, 135, 15], neuroscience [20, 163, 165, 167, 170, 168, 169, 2, 3, 70, 71], and more. Several surveys have been written in other fields [138, 52, 104, 27, 28, 47, 129, 78, 48, 200,...

69 | Tensor rank and the ill-posedness of the best low-rank approximation problem
- Silva, Lim
Citation Context: ...coordinate system. This notion of tensors is not to be confused with tensors in physics and engineering (such as stress tensors) [175], which are generally referred to as tensor fields in mathematics [69]. A third-order tensor has three indices, as shown in Figure 1.1. A first-order tensor is a vector, a second-order tensor is a matrix, and tensors of order three or higher are called higher-order tens...

69 | Blind PARAFAC receivers for DS-CDMA systems
- Sidiropoulos, Giannakis, et al.
- 2000
Citation Context: ...modeling of fluorescence excitation-emission data. Sidiropoulos, Bro, and Giannakis [196] considered the application of CP to sensor array processing. Other applications in telecommunications include [198, 197, 59]. CP also has important applications in independent component analysis (ICA); see [58] and references therein. Several authors have used CP decompositions in neuroscience. As mentioned previously, Möc...

69 | On the best rank-1 and rank-(R1, R2, ..., RN) approximation of higher-order tensors
- Lathauwer, Moor, et al.
Citation Context: ...TUCKALS2 that computed the Tucker2 decomposition of a three-way array.) Kapteyn, Neudecker, and Wansbeek [113] later extended TUCKALS3 to N-way arrays for N > 3. De Lathauwer, De Moor, and Vandewalle [64] proposed more efficient techniques for calculating the factor matrices (specifically, computing only the dominant singular vectors of X(n) and using an SVD rather than an eigenvalue decomposition or...
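The construction this excerpt describes — taking the dominant left singular vectors of each mode-n unfolding as the Tucker factor matrices — can be sketched in a few lines of NumPy. This is an illustrative truncated-HOSVD sketch, not the authors' code; the unfolding convention and the made-up tensor and ranks below are assumptions:

```python
import numpy as np

def unfold(X, mode):
    """Mode-n matricization: mode-n fibers become the columns."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

rng = np.random.default_rng(2)
X = rng.standard_normal((3, 4, 5))  # synthetic third-order tensor

# Truncated HOSVD: leading left singular vectors of each mode-n unfolding
ranks = (2, 2, 2)
factors = [np.linalg.svd(unfold(X, n), full_matrices=False)[0][:, :ranks[n]]
           for n in range(3)]

# Core tensor G = X  x_1 U1^T  x_2 U2^T  x_3 U3^T, then reassemble
G = np.einsum('ijk,ia,jb,kc->abc', X, *factors)
X_hat = np.einsum('abc,ia,jb,kc->ijk', G, *factors)  # Tucker approximation
```

Since each factor has orthonormal columns, `X_hat` is an orthogonal projection of `X`, so its Frobenius norm never exceeds that of `X`; ALS-type methods then refine these factors further.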

68 | The ubiquitous Kronecker product - Van Loan - 2000

67 | Multi-way Analysis
- Smilde, Bro, et al.
- 2004
Citation Context: ...st to use tensor decompositions (in 1981) in chemometrics, and tensors have since become extremely popular in that field [103, 201, 27, 28, 31, 152, 241, 121, 12, 9, 29], even spawning a book in 2004 [200]. In parallel to the developments in psychometrics and chemometrics, there was a great deal of interest in decompositions of bilinear forms in the field of algebraic complexity; see, e.g., Knuth [130,...

65 | Parallel factor analysis in sensor array processing
- Sidiropoulos, Bro, et al.
Citation Context: ...4 × 4 × 4 tensor to describe 2 × 2 matrix multiplication [208, 141, 147, 24]. In the last ten years, interest in tensor decompositions has expanded to other fields. Examples include signal processing [62, 196, 47, 43, 68, 80, 173, 60, 61], numerical linear algebra [87, 63, 64, 132, 244, 133, 149], computer vision [229, 230, 231, 190, 236, 237, 193, 102, 235], numerical analysis [22, 108, 23, 89, 114, 88, 115], data mining [190, 4, 157...

62 | Algorithm 862: MATLAB tensor classes for fast algorithm prototyping
- Bader, Kolda
Citation Context: ...ng CP and Tucker, with an emphasis toward data analysis in chemometrics. Like the N-way Toolbox, the PLS Toolbox can handle constraints and missing data. The MATLAB Tensor Toolbox, by Bader and Kolda [16, 17, 18], is a general-purpose set of classes that extends MATLAB’s core capabilities to support operations such as tensor multiplication and matricization. It comes with ALS-based algorithms for CP and Tucker...

62 | Decomposition of quantics in sums of powers of linear forms. Signal Processing
- Comon, Mourrain
- 1996
Citation Context: ...the elements of G are zero, thereby eliminating interactions between corresponding components and improving uniqueness. Superdiagonalization of the core is impossible (even in the symmetric case; see [50, 49]), but it is possible to try to make as many elements zero or very small as possible. This was first observed by Tucker [226] and has been studied by several authors; see, e.g., [117, 103, 127, 172, 1...

56 | Multilinear image analysis for facial recognition
- Vasilescu, Terzopoulos

54 | PARAFAC: Tutorial and applications
- Bro
- 1997
Citation Context: ...literature. Appellof and Davidson [13] are generally credited as being the first to use tensor decompositions (in 1981) in chemometrics, and tensors have since become extremely popular in that field [103, 201, 27, 28, 31, 152, 241, 121, 12, 9, 29], even spawning a book in 2004 [200]. In parallel to the developments in psychometrics and chemometrics, there was a great deal of interest in decompositions of bilinear forms in the field of algebrai...

53 |
Tensorial Extensions of Independent Component Analysis for Group FMRI Data Analysis. NeuroImage
- Beckmann, Smith
- 2005
Citation Context: ...ion [229, 230, 231, 190, 236, 237, 193, 102, 235], numerical analysis [22, 108, 23, 89, 114, 88, 115], data mining [190, 4, 157, 211, 5, 209, 210, 44, 14], graph analysis [136, 135, 15], neuroscience [20, 163, 165, 167, 170, 168, 169, 2, 3, 70, 71], and more. Several surveys have been written in other fields [138, 52, 104, 27, 28, 47, 129, 78, 48, 200, 69, 29, 6, 184], and a book has appeared very recently on multiway data analysis [139]. Moreo...

53 | A fast non-negativity-constrained least squares algorithm
- Bro, Jong
- 1997
Citation Context: ...negative tensor factorization generically as NTF but fail to differentiate between CP and Tucker. Hereafter, we use the terminology NNCP (nonnegative CP) and NNT (nonnegative Tucker). Bro and De Jong [32] consider NNCP and NNT. They solve the subproblems in CP-ALS and Tucker-ALS with a specially adapted version of the NNLS method of Lawson and Hanson [150]. In the case of NNCP for a third-order tensor...
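The Lawson–Hanson NNLS routine mentioned in this excerpt is available in SciPy, and each ALS step of NNCP/NNT reduces to such a problem. A minimal sketch of one nonnegatively constrained least-squares subproblem — the data are synthetic, and this is plain NNLS, not Bro and De Jong's specially adapted fast variant:

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic subproblem: min ||A x - b||_2  subject to  x >= 0,
# the building block of each NNCP/NNT alternating-least-squares step.
rng = np.random.default_rng(3)
A = rng.random((10, 4))
x_true = np.array([1.0, 0.0, 2.0, 0.5])  # nonnegative ground truth
b = A @ x_true

x, residual = nnls(A, b)  # Lawson-Hanson active-set NNLS
assert np.all(x >= 0)
assert np.allclose(x, x_true, atol=1e-8)
```

Because `b` is constructed from a nonnegative `x_true`, the constrained solution recovers it exactly here; in a real factorization the residual would generally be nonzero.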

51 | Facial expression decomposition
- Wang, Ahuja
- 2003

50 | The N-way toolbox for MATLAB
- Andersson, Bro
- 2000
Citation Context: ...27, 28, 47, 129, 78, 48, 200, 69, 29, 6, 184], and a book has appeared very recently on multiway data analysis [139]. Moreover, there are several software packages available for working with tensors [179, 11, 146, 85, 16, 17, 18, 239, 243]. Wherever possible, the titles in the references section of this review are hyperlinked to either the publisher web page for the paper or the author’s version. Many older papers are now available onl...

50 | Rank-one approximation to high order tensors - Zhang, Golub

50 | Multi-way analysis in the food industry: Models, algorithms and applications
- Bro
- 1998
Citation Context: ...literature. Appellof and Davidson [13] are generally credited as being the first to use tensor decompositions (in 1981) in chemometrics, and tensors have since become extremely popular in that field [103, 201, 27, 28, 31, 152, 241, 121, 12, 9, 29], even spawning a book in 2004 [200]. In parallel to the developments in psychometrics and chemometrics, there was a great deal of interest in decompositions of bilinear forms in the field of algebrai...

49 | Linear image coding for regression and classification using the tensor-rank principle
- Shashua, Levin
- 2001
Citation Context: ...tanglement in online chat rooms. In text analysis, Bader, Berry, and Browne [14] used CP for automatic conversation detection in email over time using a term-by-author-by-time array. Shashua and Levin [194] applied CP to image compression and classification. Furukawa et al. [82] applied a CP model to bidirectional texture functions in order to build a compressed texture database. Bauckhage [19] extended...

48 | A link between the canonical decomposition in multilinear algebra and simultaneous matrix diagonalization
- Lathauwer
Citation Context: ...A(N)) = R. They further observed that since rank(A ⊙ B) ≤ rank(A ⊗ B) ≤ rank(A) · rank(B), an even simpler necessary condition is min_{n=1,...,N} ∏_{m=1, m≠n}^{N} rank(A(m)) ≥ R. De Lathauwer [55] has looked at methods to determine the rank of a tensor and the question of when a given CP decomposition is deterministically or generically (i.e., with probability one) unique. The CP decomposition...

48 | Multilinear Independent Components Analysis
- Vasilescu, Terzopoulos
Citation Context: ...Mahoney, Maggioni, and Drineas [161] extend the matrix CUR decomposition to tensors; others have done related work using sampling methods for tensor approximation [74, 54]. Vasilescu and Terzopoulos [233] explored higher-order versions of ICA, a variation of PCA that, in some sense, rotates the principal components so that they are statistically independent. Beckmann and Smith [20] extended CP to deve...

47 | Efficient MATLAB computations with sparse and factored tensors
- Bader, Kolda
- 2006
Citation Context: ...27, 28, 47, 129, 78, 48, 200, 69, 29, 6, 184], and a book has appeared very recently on multiway data analysis [139]. Moreover, there are several software packages available for working with tensors [179, 11, 146, 85, 16, 17, 18, 239, 243]. Wherever possible, the titles in the references section of this review are hyperlinked to either the publisher web page for the paper or the author’s version. Many older papers are now available onl...

46 | On the best rank-1 approximation of higher-order supersymmetric tensors
- Kofidis, Regalia
Citation Context: ...hm in Figure 3.3 to sparse tensors. Zhang and Golub [244] proposed a generalized Rayleigh–Newton iteration to compute a rank-one factor, which is another way to compute greedy CP. Kofidis and Regalia [131] presented a higher-order power method for supersymmetric tensors. We conclude this section by noting that there have also been substantial developments on variations of CP to account for missing valu...

46 | On the uniqueness of multilinear decomposition of N-way arrays
- Sidiropoulos, Bro
- 2000
Citation Context: ...d Hirschowitz [7, 49]. 3.2. Uniqueness. An interesting property of higher-order tensors is that their rank decompositions are often unique, whereas matrix decompositions are not. Sidiropoulos and Bro [199] and Ten Berge [213] provide some history of uniqueness results for CP. The earliest uniqueness result is due to Harshman in 1970 [90], which he in turn credits to Dr. Robert Jennich. Harshman’s resul...

45 | Sparse image coding using a 3D non-negative tensor factorization
- Hazan, Polak, et al.
- 2005

45 | Higher-order web link analysis using multilinear algebra
- Kolda, Bader, et al.
- 2005
Citation Context: ...244, 133, 149], computer vision [229, 230, 231, 190, 236, 237, 193, 102, 235], numerical analysis [22, 108, 23, 89, 114, 88, 115], data mining [190, 4, 157, 211, 5, 209, 210, 44, 14], graph analysis [136, 135, 15], neuroscience [20, 163, 165, 167, 170, 168, 169, 2, 3, 70, 71], and more. Several surveys have been written in other fields [138, 52, 104, 27, 28, 47, 129, 78, 48, 200, 69, 29, 6, 184], and a book ha...

45 | Towards a standardized notation and terminology in multiway analysis
- Kiers
Citation Context: ...at would be familiar to applied mathematicians and with the terminology of previous publications in the area of tensor decompositions. The notation used here is very similar to that proposed by Kiers [122]. Other standards have been proposed as well; see Harshman [94] and Harshman and Hong [96]. The order of a tensor is the number of dimensions, also known as ways or modes. Vectors (tensors of order...

44 | Tensor rank is NP-complete - Håstad - 1990