Results 1–10 of 56
Tensor Decompositions and Applications
 SIAM Review
, 2009
Abstract
Cited by 512 (15 self)
This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or N-way array. Decompositions of higher-order tensors (i.e., N-way arrays with N ≥ 3) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, etc. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal components analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2, as well as nonnegative variants of all of the above. The N-way Toolbox and Tensor Toolbox, both for MATLAB, and the Multilinear Engine are examples of software packages for working with tensors.
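As an illustration of the CP model described above, here is a minimal alternating-least-squares (ALS) sketch for a 3-way tensor in NumPy. The function names and unfolding conventions are my own, not taken from the survey; this is a bare-bones sketch, not a substitute for the toolboxes the survey lists.

```python
import numpy as np

def khatri_rao(B, C):
    # Column-wise Khatri-Rao product: (J*K) x R from (J x R) and (K x R).
    R = B.shape[1]
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, R)

def cp_als(X, R, n_iter=100, seed=0):
    # Minimal CP (CANDECOMP/PARAFAC) fit via alternating least squares
    # for a 3-way tensor X; each factor matrix has R columns.
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.random((I, R)); B = rng.random((J, R)); C = rng.random((K, R))
    X0 = X.reshape(I, -1)                      # mode-1 unfolding (C order)
    X1 = np.moveaxis(X, 1, 0).reshape(J, -1)   # mode-2 unfolding
    X2 = np.moveaxis(X, 2, 0).reshape(K, -1)   # mode-3 unfolding
    for _ in range(n_iter):
        # Each unfolding satisfies X(n) ~ factor @ khatri_rao(others).T,
        # so each factor has a closed-form least-squares update.
        A = X0 @ np.linalg.pinv(khatri_rao(B, C)).T
        B = X1 @ np.linalg.pinv(khatri_rao(A, C)).T
        C = X2 @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C
```

On an exactly low-rank tensor, the reconstruction `np.einsum('ir,jr,kr->ijk', A, B, C)` converges to X.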
Fast Local Algorithms for Large Scale Nonnegative Matrix and Tensor Factorizations
, 2008
Abstract
Cited by 37 (10 self)
Nonnegative matrix factorization (NMF) and its extensions, such as Nonnegative Tensor Factorization (NTF), have become prominent techniques for blind source separation (BSS), analysis of image databases, data mining, and other information retrieval and clustering applications. In this paper we propose a family of efficient algorithms for NMF/NTF, as well as sparse nonnegative coding and representation, with many potential applications in computational neuroscience, multisensory processing, compressed sensing, and multidimensional data analysis. We have developed a class of optimized local algorithms referred to as Hierarchical Alternating Least Squares (HALS) algorithms, obtained by sequential constrained minimization of a set of squared Euclidean distances. We then extend this approach to robust cost functions using the Alpha and Beta divergences and derive flexible update rules. Our algorithms are locally stable and work well for NMF-based blind source separation not only in the overdetermined case but also in the underdetermined (overcomplete) case (i.e., for a system with fewer sensors than sources), provided the data are sufficiently sparse. The NMF learning rules are extended and generalized to N-th-order nonnegative tensor factorization (NTF). Moreover, these algorithms can be tuned to different noise statistics by adjusting a single parameter. Extensive experimental results confirm the accuracy and computational performance of the developed algorithms, especially with the multilayer hierarchical NMF approach [3].
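The HALS idea, updating one factor column (or row) at a time with a closed-form nonnegative step, can be sketched in the matrix (NMF) case as follows. This is a simplified illustration of the update pattern, not the authors' exact algorithm; the function name and the small `eps` floor are my own choices.

```python
import numpy as np

def nmf_hals(X, r, n_iter=200, eps=1e-9, seed=0):
    # HALS-style sketch for NMF: X (m x n, nonnegative) ~ W @ H.
    # Each column of W (row of H) has a closed-form nonnegative update
    # when all other columns (rows) are held fixed.
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r)); H = rng.random((r, n))
    for _ in range(n_iter):
        HHt = H @ H.T
        XHt = X @ H.T
        for j in range(r):
            W[:, j] = np.maximum(eps, W[:, j] + (XHt[:, j] - W @ HHt[:, j]) / HHt[j, j])
        WtW = W.T @ W
        WtX = W.T @ X
        for j in range(r):
            H[j, :] = np.maximum(eps, H[j, :] + (WtX[j, :] - WtW[j, :] @ H) / WtW[j, j])
    return W, H
```

The `eps` floor keeps every column strictly positive, so the diagonal scalings `HHt[j, j]` and `WtW[j, j]` never vanish.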
On the Uniqueness of Nonnegative Sparse Solutions to Underdetermined Systems of Equations
, 2008
Abstract
Cited by 34 (0 self)
An underdetermined linear system of equations Ax = b with the nonnegativity constraint x ≥ 0 is considered. It is shown that for matrices A whose row span intersects the positive orthant, if this problem admits a sufficiently sparse solution, that solution is necessarily unique. The bound on the required sparsity depends on a coherence property of the matrix A. This coherence measure can be improved by applying a conditioning stage to A, thereby strengthening the claimed result. The obtained uniqueness theorem relies on an extended theoretical analysis of the ℓ0–ℓ1 equivalence, developed here as well, considering a matrix A with arbitrary column norms and an arbitrary monotone elementwise concave penalty replacing the ℓ1-norm objective function. Finally, from a numerical point of view, a greedy algorithm (a variant of matching pursuit) is presented that is guaranteed to find this sparse solution. It is further shown how this algorithm can benefit from well-designed conditioning of A.
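The coherence property the abstract refers to can be computed directly. A minimal implementation of the standard mutual-coherence measure (assuming the usual definition: the largest absolute normalized inner product between distinct columns):

```python
import numpy as np

def mutual_coherence(A):
    # Mutual coherence: max |<a_i, a_j>| / (||a_i|| * ||a_j||) over i != j.
    G = A / np.linalg.norm(A, axis=0)   # normalize columns
    C = np.abs(G.T @ G)
    np.fill_diagonal(C, 0.0)            # ignore self-correlations
    return C.max()
```

An orthogonal matrix has coherence 0; the closer the coherence is to 1, the weaker sparsity-based uniqueness guarantees become.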
Hierarchical ALS Algorithms for Nonnegative Matrix and 3D Tensor Factorization
 In: Independent Component Analysis (ICA 2007)
Abstract
Cited by 33 (6 self)
Abstract. In this paper we present new Alternating Least Squares (ALS) algorithms for Nonnegative Matrix Factorization (NMF) and their extensions to 3D Nonnegative Tensor Factorization (NTF) that are robust in the presence of noise and have many potential applications, including multiway Blind Source Separation (BSS), multisensory or multidimensional data analysis, and nonnegative neural sparse coding. We propose to use local cost functions whose simultaneous or sequential (one-by-one) minimization leads to a very simple ALS algorithm that works, under some sparsity constraints, both for underdetermined models (systems with fewer sensors than sources) and overdetermined ones. Extensive experimental results confirm the validity and high performance of the developed algorithms, especially with the multilayer hierarchical NMF. Extension of the proposed algorithm to …
A Non-Negative and Sparse Enough Solution of an Underdetermined Linear System of Equations Is Unique
, 2007
Abstract
Cited by 32 (2 self)
In this paper we consider an underdetermined linear system of equations Ax = b with nonnegative entries of A and b, and with the solution x also required to be nonnegative. We show that if there exists a sufficiently sparse solution to this problem, it is necessarily unique. Furthermore, we present a greedy algorithm, a variant of matching pursuit, that is guaranteed to find this sparse solution. We also extend the existing theoretical analysis of the basis pursuit problem, i.e. min ‖x‖1 s.t. Ax = b, by studying conditions for perfect recovery of sparse enough solutions. Considering a matrix A with arbitrary column norms, and an arbitrary monotone elementwise concave penalty replacing the ℓ1-norm objective function, we generalize known equivalence results. Beyond the desirable generalization that this result introduces, we show how it is exploited to lead to the above-mentioned uniqueness claim.
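A greedy, matching-pursuit-style procedure of the kind described can be sketched as follows. This is an illustrative nonnegative variant (using SciPy's NNLS to refit on the chosen support), not the paper's exact algorithm, and on its own it carries none of the paper's guarantees.

```python
import numpy as np
from scipy.optimize import nnls

def nonneg_omp(A, b, k):
    # Greedy sketch: repeatedly pick the unused atom most positively
    # correlated with the residual, then refit the coefficients on the
    # current support by nonnegative least squares.
    m, n = A.shape
    support, x = [], np.zeros(n)
    r = b.copy()
    for _ in range(k):
        corr = A.T @ r
        corr[support] = -np.inf          # never reselect an atom
        j = int(np.argmax(corr))
        if corr[j] <= 0:
            break                        # no atom can reduce the residual
        support.append(j)
        xs, _ = nnls(A[:, support], b)   # refit on the support, x >= 0
        x = np.zeros(n); x[support] = xs
        r = b - A @ x
    return x
```

The NNLS refit makes this closer to orthogonal matching pursuit than to plain matching pursuit; either variant fits the greedy template the abstract describes.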
Nonnegative Tucker Decomposition
 In: Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR 2007)
, 2007
Abstract
Cited by 22 (2 self)
Nonnegative tensor factorization (NTF) is a recent multiway (multilinear) extension of nonnegative matrix factorization (NMF), in which nonnegativity constraints are imposed on the CANDECOMP/PARAFAC model. In this paper we consider the Tucker model with nonnegativity constraints and develop a new tensor factorization method, referred to as nonnegative Tucker decomposition (NTD). The main contributions of this paper are: (1) multiplicative updating algorithms for NTD; (2) an initialization method for speeding up convergence; and (3) a sparseness control method for tensor factorization. Through several computer vision examples, we show the useful behavior of NTD over existing NTF and NMF methods.
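For intuition about contribution (1), the multiplicative updating strategy works, in the simpler matrix (NMF) case, like the classic Lee-Seung rules sketched below; the NTD updates in the paper generalize this pattern to the Tucker core and factor matrices. This sketch is the standard matrix analogue, not the paper's NTD algorithm.

```python
import numpy as np

def nmf_multiplicative(X, r, n_iter=200, eps=1e-9, seed=0):
    # Lee-Seung multiplicative updates for X ~ W @ H under the
    # Frobenius norm: each factor is scaled elementwise by the ratio of
    # the gradient's positive and negative parts, preserving nonnegativity.
    rng = np.random.default_rng(seed)
    W = rng.random((X.shape[0], r)); H = rng.random((r, X.shape[1]))
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Because the updates are multiplicative, entries initialized positive stay positive, which is why no projection step is needed.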
Tensor Sparse Coding for Region Covariances
Abstract
Cited by 21 (5 self)
Abstract. Sparse representation of signals has been the focus of much research in recent years. The vast majority of existing algorithms deal with vectors, and higher-order data such as images are handled by vectorization. However, the structure of the data may be lost in the process, leading to a poorer representation and overall performance degradation. In this paper we propose a novel approach for sparse representation of positive definite matrices, where vectorization would destroy the inherent structure of the data. The sparse decomposition of a positive definite matrix is formulated as a convex optimization problem that falls under the category of determinant maximization (MAXDET) problems [1], for which efficient interior-point algorithms exist. Experimental results are shown with simulated examples as well as in real-world computer vision applications, demonstrating the suitability of the new model. This forms the first step toward extending the cornucopia of sparsity-based algorithms to positive definite matrices.
Controlling sparseness in nonnegative tensor factorization
 In: ECCV 2006
, 2006
Abstract
Cited by 20 (0 self)
Nonnegative tensor factorization (NTF) has recently been proposed as a sparse and efficient image representation (Welling and Weber, Patt. Rec. Let., 2001). Until now, sparsity of the tensor factorization has been empirically observed in many cases, but there was no systematic way to control it. In this work, we show that a sparseness measure recently proposed for nonnegative matrix factorization (Hoyer, J. Mach. Learn. Res., 2004) applies to NTF and allows precise control over the sparseness of the resulting factorization. We devise an algorithm based on sequential conic programming and show improved performance over classical NTF codes on artificial and real-world data sets.
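The Hoyer (2004) sparseness measure referenced here interpolates between 0 (all entries equal in magnitude) and 1 (a single nonzero entry), based on the ratio of the ℓ1 and ℓ2 norms. A direct implementation:

```python
import numpy as np

def hoyer_sparseness(x):
    # Hoyer's measure: (sqrt(n) - ||x||_1 / ||x||_2) / (sqrt(n) - 1).
    # Returns 0 for a uniform vector and 1 for a 1-sparse vector.
    x = np.asarray(x, dtype=float).ravel()
    n = x.size
    return (np.sqrt(n) - np.abs(x).sum() / np.linalg.norm(x)) / (np.sqrt(n) - 1)
```

Controlling sparseness then amounts to constraining this quantity for each factor during the factorization, which is what the paper's conic-programming steps enforce.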
Nonnegative Tensor Factorization for Continuous EEG Classification
, 2007
Abstract
Cited by 16 (6 self)
In this paper we present a method for continuous EEG classification, in which we employ nonnegative tensor factorization (NTF) to determine discriminative spectral features and use the Viterbi algorithm to continuously classify multiple mental tasks. This extends our previous work on the use of nonnegative matrix factorization (NMF) for EEG classification. Numerical experiments with two data sets from the BCI competition confirm the useful behavior of the method.
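The Viterbi step of the pipeline is the standard dynamic-programming decoder for a hidden Markov model. A generic log-space sketch (not the authors' EEG-specific implementation; parameter names are my own):

```python
import numpy as np

def viterbi(log_pi, log_A, log_B, obs):
    # Most likely state sequence for discrete observations `obs` under
    # an HMM with log initial probs, log transition matrix log_A[from, to],
    # and log emission matrix log_B[state, symbol].
    n_states = log_pi.shape[0]
    T = len(obs)
    delta = np.zeros((T, n_states))             # best log-prob ending in each state
    psi = np.zeros((T, n_states), dtype=int)    # backpointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A  # scores[from, to]
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):              # trace back the best path
        path[t] = psi[t + 1, path[t + 1]]
    return path
```

Working in log space avoids the numerical underflow that plain probability products suffer on long observation sequences.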
Hierarchical Tensor Approximation of Multidimensional Images
Abstract
Cited by 15 (1 self)
Visual data comprise multiscale and inhomogeneous signals. In this paper, we exploit these characteristics and develop an adaptive data approximation technique based on a hierarchical tensor-based transformation. In this technique, an original multidimensional image is transformed into a hierarchy of signals to expose its multiscale structures. The signal at each level of the hierarchy is further divided into a number of smaller tensors to expose its spatially inhomogeneous structures. These smaller tensors are then transformed and pruned using a collective tensor approximation technique. Experimental results indicate that our technique achieves higher compression ratios than existing functional approximation methods, including wavelet transforms, wavelet packet transforms, and single-level tensor approximation.
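The "single-level tensor approximation" baseline mentioned at the end corresponds, for a 3-way tensor, to a truncated higher-order SVD (Tucker) step. A minimal sketch in my own formulation, not the authors' hierarchical method:

```python
import numpy as np

def hosvd3(X, ranks):
    # Truncated HOSVD for a 3-way tensor: the leading left singular
    # vectors of each mode unfolding form the factor matrices, and
    # contracting X with them yields the core and a low-multilinear-rank
    # reconstruction.
    Us = []
    for mode, r in enumerate(ranks):
        unfold = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfold, full_matrices=False)
        Us.append(U[:, :r])
    U1, U2, U3 = Us
    core = np.einsum('ijk,ia,jb,kc->abc', X, U1, U2, U3)
    Xhat = np.einsum('abc,ia,jb,kc->ijk', core, U1, U2, U3)
    return core, Us, Xhat
```

The hierarchical scheme in the paper applies this kind of truncation adaptively to subdivided blocks at multiple scales rather than once to the whole image.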