ON THE TENSOR SVD AND OPTIMAL LOW RANK ORTHOGONAL APPROXIMATIONS OF TENSORS ∗
Abstract

It is known that a high order tensor does not necessarily have an optimal low rank approximation, and that a tensor might not be orthogonally decomposable (i.e., admit a tensor SVD). We provide several sufficient conditions that lead to the failure of the tensor SVD, and characterize the existence of the tensor SVD with respect to the Higher Order SVD (HOSVD) of a tensor. In the face of these difficulties in generalizing standard results known in the matrix case to tensors, we consider low rank orthogonal approximations of tensors. The existence of an optimal approximation is theoretically guaranteed under certain conditions, and this optimal approximation yields a tensor decomposition where the diagonal of the core is maximized. We present an algorithm to compute this approximation and analyze its convergence behavior.

Key words. multilinear algebra, singular value decomposition, tensor decomposition, low rank approximation

AMS subject classifications. 15A69, 15A18
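The HOSVD mentioned in the abstract computes one orthogonal factor per mode from the SVD of that mode's unfolding, then forms a core tensor by multiplying the transposed factors back in. A minimal NumPy sketch for third-order tensors, assuming the standard mode-n unfolding convention (the function name `hosvd` is an illustrative choice, not taken from the paper):

```python
import numpy as np

def hosvd(T):
    """Higher Order SVD: T = S x_1 U1 x_2 U2 x_3 U3, with orthogonal Un."""
    factors = []
    for mode in range(T.ndim):
        # Mode-n unfolding: bring the mode to the front, flatten the rest.
        unfold = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfold, full_matrices=False)
        factors.append(U)
    # Core tensor: S = T x_1 U1^T x_2 U2^T x_3 U3^T.
    S = T
    for mode, U in enumerate(factors):
        S = np.moveaxis(np.tensordot(U.T, np.moveaxis(S, mode, 0), axes=1), 0, mode)
    return S, factors
```

Multiplying the core back by the factors reconstructs the original tensor exactly; a low multilinear-rank approximation would instead truncate each `U` to its leading columns.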
FACTORIZATION STRATEGIES FOR THIRD-ORDER TENSORS ∗
Abstract
Operations with tensors, or multiway arrays, have become increasingly prevalent in recent years. Traditionally, tensors are represented or decomposed as a sum of rank-1 outer products using either the CANDECOMP/PARAFAC (CP) or the Tucker models, or some variation thereof. Such decompositions are motivated by specific applications where the goal is to find such an approximate representation for a given multiway array. The specifics of the approximate representation (such as how many terms to use in the sum, orthogonality constraints, etc.) depend on the application. In this paper, we explore an alternate representation of tensors which shows promise with respect to the tensor approximation problem. Reminiscent of matrix factorizations, we present a new factorization of a tensor as a product of tensors. To derive the new factorization, we define a closed multiplication operation between tensors. A major motivation for considering this new type of tensor multiplication is to devise new types of factorizations for tensors which can then be used in applications. Specifically, this new multiplication allows us to introduce concepts such as tensor transpose, inverse, and identity, which lead to the notion of an orthogonal tensor. The multiplication also gives rise to a linear operator, and the null space of the resulting operator is identified. We extend the concept of outer products of vectors to outer products of matrices. All derivations are presented for third-order tensors. However, they can be easily extended to the order-p (p > 3) case. We conclude with an application in image deblurring.

Key words. multilinear algebra, tensor decomposition, singular value decomposition, multidimensional arrays

AMS subject classifications. 15A69, 65F30
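The closed tensor-tensor multiplication described above (the t-product of Kilmer and Martin) is commonly computed facewise in the Fourier domain along the third mode. A minimal sketch, assuming NumPy; the name `t_product` is an illustrative choice:

```python
import numpy as np

def t_product(A, B):
    """t-product of third-order tensors A (n1 x n2 x n3) and B (n2 x m x n3):
    FFT along the third mode, frontal-slice matrix products, inverse FFT."""
    n3 = A.shape[2]
    Ah = np.fft.fft(A, axis=2)
    Bh = np.fft.fft(B, axis=2)
    Ch = np.empty((A.shape[0], B.shape[1], n3), dtype=complex)
    for k in range(n3):
        # Each frontal slice in the Fourier domain multiplies independently.
        Ch[:, :, k] = Ah[:, :, k] @ Bh[:, :, k]
    return np.real(np.fft.ifft(Ch, axis=2))
```

Under this product, the identity tensor has the identity matrix as its first frontal slice and zeros elsewhere, which makes the notions of tensor transpose, inverse, and orthogonality mentioned in the abstract behave analogously to the matrix case.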