Tensor Decompositions and Applications
SIAM Review, 2009
Cited by 237 (14 self)

Abstract:
This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or N-way array. Decompositions of higher-order tensors (i.e., N-way arrays with N ≥ 3) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, etc. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal components analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2, as well as nonnegative variants of all of the above. The N-way Toolbox and Tensor Toolbox, both for MATLAB, and the Multilinear Engine are examples of software packages for working with tensors.
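The CP model this survey describes, a tensor as a sum of rank-one terms, can be sketched in a few lines of numpy. The sizes and factor matrices below are arbitrary illustrations, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 3          # illustrative tensor sizes and CP rank
A = rng.random((I, R))           # one factor matrix per mode
B = rng.random((J, R))
C = rng.random((K, R))

# CP as an explicit sum of R rank-one tensors (outer products of columns)...
X_sum = sum(np.einsum('i,j,k->ijk', A[:, r], B[:, r], C[:, r])
            for r in range(R))

# ...which equals the standard CP reconstruction written as one einsum.
X_cp = np.einsum('ir,jr,kr->ijk', A, B, C)

assert np.allclose(X_sum, X_cp)
assert X_cp.shape == (I, J, K)
```

The equivalence of the two constructions is exactly the definition of CP rank: the smallest R for which such a sum reproduces the tensor.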
Sparse image coding using a 3D nonnegative tensor factorization
In: International Conference on Computer Vision (ICCV), 2005
Cited by 45 (2 self)

Abstract:
We introduce an algorithm for a nonnegative 3D tensor factorization for the purpose of establishing a local parts feature decomposition from an object class of images. In the past such a decomposition was obtained using nonnegative matrix factorization (NMF), where images were vectorized before being factored by NMF. A nonnegative tensor factorization (NTF), on the other hand, preserves the 2D representations of images and provides a unique factorization (unlike NMF, which is not unique). The resulting "factors" from the NTF are both sparse (as with NMF) and separable, allowing efficient convolution with the test image. Results show a decomposition superior to what NMF can provide on all fronts: degree of sparsity, lack of ghost residue due to invariant parts, and coding efficiency (around an order of magnitude better). Experiments on using the local parts decomposition for face detection with SVM and AdaBoost classifiers demonstrate that the recovered features are discriminatory and highly effective for classification.
Multiway clustering using supersymmetric nonnegative tensor factorization
In: Proc. of the European Conference on Computer Vision (ECCV), 2006
Cited by 34 (2 self)

Abstract:
We consider the problem of clustering data into k ≥ 2 clusters given complex relations (going beyond pairwise) between the data points. The complex n-wise relations are modeled by an n-way array where each entry corresponds to an affinity measure over an n-tuple of data points. We show that a probabilistic assignment of data points to clusters is equivalent, under mild conditional independence assumptions, to a supersymmetric nonnegative factorization of the closest hyperstochastic version of the input n-way affinity array. We derive an algorithm for finding a local minimum solution to the factorization problem whose computational complexity is proportional to the number of n-tuple samples drawn from the data. We apply the algorithm to a number of visual interpretation problems, including 3D multi-body segmentation and illumination-based clustering of human faces.
A Non-Negative and Sparse Enough Solution of an Underdetermined Linear System of Equations Is Unique
2007
Cited by 24 (2 self)

Abstract:
In this paper we consider an underdetermined linear system of equations Ax = b with nonnegative entries of A and b, and the solution x also required to be nonnegative. We show that if there exists a sufficiently sparse solution to this problem, it is necessarily unique. Furthermore, we present a greedy algorithm, a variant of matching pursuit, that is guaranteed to find this sparse solution. We also extend the existing theoretical analysis of the basis pursuit problem, i.e. min ‖x‖1 s.t. Ax = b, by studying conditions for perfect recovery of sparse enough solutions. Considering a matrix A with arbitrary column norms, and an arbitrary monotone elementwise concave penalty replacing the ℓ1-norm objective function, we generalize known equivalence results. Beyond the desirable generalization that this result introduces, we show how it is exploited to lead to the above-mentioned uniqueness claim.
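A generic nonnegative matching-pursuit loop of the kind this abstract alludes to can be sketched as follows. This is an illustrative variant, not the authors' exact algorithm: it greedily adds the atom most positively correlated with the residual, so every coefficient update is positive and x stays nonnegative.

```python
import numpy as np

def nn_matching_pursuit(A, b, n_iter=500):
    """Illustrative nonnegative matching pursuit: pick the atom with the
    largest positive correlation to the residual; stop when none remains."""
    x = np.zeros(A.shape[1])
    r = b.astype(float).copy()
    for _ in range(n_iter):
        corr = A.T @ r                       # correlations with the residual
        j = int(np.argmax(corr))
        if corr[j] <= 1e-12:                 # no positive improvement left
            break
        step = corr[j] / (A[:, j] @ A[:, j]) # optimal step along atom j
        x[j] += step
        r -= step * A[:, j]
    return x

rng = np.random.default_rng(1)
A = rng.random((8, 20)) + 0.01               # strictly positive dictionary
x_true = np.zeros(20)
x_true[[3, 7]] = [1.0, 2.0]                  # a sparse, nonnegative ground truth
b = A @ x_true
x = nn_matching_pursuit(A, b)
assert np.all(x >= 0)
assert np.linalg.norm(A @ x - b) < np.linalg.norm(b)
```

Since each step is the line-search minimizer along the chosen atom, the residual norm never increases; the paper's contribution is the condition under which such a greedy scheme provably recovers the unique sparse solution.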
Higher order learning with graphs
In ICML ’06: Proceedings of the 23rd International Conference on Machine Learning, 2006
Cited by 22 (0 self)

Abstract:
Recently there has been considerable interest in learning with higher-order relations (i.e., three-way or higher) in the unsupervised and semi-supervised settings. Hypergraphs and tensors have been proposed as the natural way of representing these relations, and their corresponding algebra as the natural tools for operating on them. In this paper we argue that hypergraphs are not a natural representation for higher-order relations; indeed, pairwise as well as higher-order relations can be handled using graphs. We show that various formulations of the semi-supervised and the unsupervised learning problem on hypergraphs result in the same graph-theoretic problem and can be analyzed using existing tools.
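One standard way to map a hypergraph problem onto an ordinary graph, clique expansion, gives some context for the paper's claim. The weighting scheme below (each hyperedge spreads its weight over the pairs it contains) is illustrative, not necessarily the paper's construction.

```python
from itertools import combinations
from collections import defaultdict

def clique_expand(hyperedges):
    """Clique expansion: map each weighted hyperedge onto all node pairs
    it contains, spreading the weight by hyperedge size (illustrative)."""
    W = defaultdict(float)
    for nodes, w in hyperedges:
        for u, v in combinations(sorted(nodes), 2):
            W[(u, v)] += w / (len(nodes) - 1)
    return dict(W)

# A 3-uniform hyperedge {0,1,2} plus an ordinary pairwise edge {1,2}.
G = clique_expand([({0, 1, 2}, 1.0), ({1, 2}, 2.0)])
print(G)   # {(0, 1): 0.5, (0, 2): 0.5, (1, 2): 2.5}
```

After such an expansion, spectral and random-walk machinery for graphs applies directly, which is the sense in which higher-order relations "can be handled using graphs".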
Probabilistic latent variable models as nonnegative factorizations
Computational Intelligence and Neuroscience, 2008
Cited by 21 (5 self)

Abstract:
In this paper we present a family of probabilistic latent variable models which can be used for analysis of nonnegative data. We show that there are strong ties between nonnegative matrix factorization and this family, and we also provide some straightforward extensions which can help in dealing with shift-invariances, higher-order decompositions, and sparsity constraints. Through these extensions we argue that the use of this approach allows for rapid development of complex statistical models for analyzing nonnegative data.
Learning optimal ranking with tensor factorization for tag recommendation
In: KDD ’09: Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2009
Cited by 20 (3 self)

Abstract:
Tag recommendation is the task of predicting a personalized list of tags for a user given an item. This is important for many websites with tagging capabilities like last.fm or delicious. In this paper, we propose a method for tag recommendation based on tensor factorization (TF). In contrast to other TF methods like higher-order singular value decomposition (HOSVD), our method RTF ('ranking with tensor factorization') directly optimizes the factorization model for the best personalized ranking. RTF handles missing values and learns from pairwise ranking constraints. Our optimization criterion for TF is motivated by a detailed analysis of the problem and of interpretation schemes for the observed data in tagging systems. In all, RTF directly optimizes for the actual problem using a correct interpretation of the data. We provide a gradient descent algorithm to solve our optimization problem. We also provide an improved learning and prediction method with runtime complexity analysis for RTF. The prediction runtime of RTF is independent of the number of observations and depends only on the factorization dimensions. Besides the theoretical analysis, we empirically show that our method outperforms other state-of-the-art tag recommendation methods like FolkRank, PageRank, and HOSVD in both quality and prediction runtime.
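The prediction-runtime claim is easy to see in any CP-style factorization over (user, item, tag) triples; the sketch below is a generic illustration of that scoring scheme, not the paper's RTF model. Scoring all tags for one (user, item) pair costs only a few k-dimensional operations, independent of how many taggings were observed.

```python
import numpy as np

rng = np.random.default_rng(3)
n_users, n_items, n_tags, k = 5, 7, 11, 4    # illustrative sizes
U = rng.standard_normal((n_users, k))        # user factors
M = rng.standard_normal((n_items, k))        # item factors
T = rng.standard_normal((n_tags, k))         # tag factors

def top_tags(u, i, n=3):
    """Score every tag as <U[u] * M[i], T[t]> and return the n best."""
    scores = T @ (U[u] * M[i])               # one k-vector, then n_tags dot products
    return np.argsort(-scores)[:n]

tags = top_tags(0, 0)
assert tags.shape == (3,)
```

Training differs (RTF fits the factors to pairwise ranking constraints rather than to reconstruction error), but prediction in any such model reduces to this constant-cost scoring.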
Controlling sparseness in nonnegative tensor factorization
In: ECCV, 2006
Cited by 17 (0 self)

Abstract:
Nonnegative tensor factorization (NTF) has recently been proposed as a sparse and efficient image representation (Welling and Weber, Patt. Rec. Lett., 2001). Until now, sparsity of the tensor factorization has been empirically observed in many cases, but there was no systematic way to control it. In this work, we show that a sparseness measure recently proposed for nonnegative matrix factorization (Hoyer, J. Mach. Learn. Res., 2004) applies to NTF and allows precise control over the sparseness of the resulting factorization. We devise an algorithm based on sequential conic programming and show improved performance over classical NTF codes on artificial and real-world data sets.
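The sparseness measure from Hoyer (2004) that this abstract builds on is a ratio of the ℓ1 and ℓ2 norms, scaled to lie in [0, 1]: it is 0 for a uniform vector and 1 for a vector with a single nonzero entry.

```python
import numpy as np

def hoyer_sparseness(x):
    """Hoyer's sparseness: (sqrt(n) - ||x||_1 / ||x||_2) / (sqrt(n) - 1)."""
    n = x.size
    l1 = np.abs(x).sum()
    l2 = np.sqrt((x ** 2).sum())
    return (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)

assert np.isclose(hoyer_sparseness(np.array([0.0, 0.0, 3.0])), 1.0)  # one nonzero
assert np.isclose(hoyer_sparseness(np.ones(4)), 0.0)                 # uniform
```

The paper's contribution is to enforce a target value of this measure on the NTF factors via sequential conic programming, rather than merely observing whatever sparsity the factorization happens to produce.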
Nonnegative matrix approximation: algorithms and applications
2006
Cited by 17 (3 self)

Abstract:
Low-dimensional data representations are crucial to numerous applications in machine learning, statistics, and signal processing. Nonnegative matrix approximation (NNMA) is a method for dimensionality reduction that respects the nonnegativity of the input data while constructing a low-dimensional approximation. NNMA has been used in a multitude of applications, though without commensurate theoretical development. In this report we describe generic methods for minimizing generalized divergences between the input and its low-rank approximant. Some of our general methods are even extensible to arbitrary convex penalties. Our methods yield efficient multiplicative iterative schemes for solving the proposed problems. We also consider interesting extensions such as the use of penalty functions, nonlinear relationships via "link" functions, weighted errors, and multifactor approximations. We present some experiments as an illustration of our algorithms. For completeness, the report also includes a brief literature survey of the various algorithms and the applications of NNMA. Keywords: Nonnegative matrix factorization, weighted approximation, Bregman divergence, multiplicative
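The best-known instance of the multiplicative schemes this report generalizes is Lee and Seung's update rule for NMF under squared Frobenius error, sketched here with illustrative dimensions:

```python
import numpy as np

rng = np.random.default_rng(4)
V = rng.random((12, 9))           # nonnegative data
W = rng.random((12, 3)) + 0.1     # nonnegative factors, initialized randomly
H = rng.random((3, 9)) + 0.1
eps = 1e-12                       # guards against division by zero

before = np.linalg.norm(V - W @ H)
for _ in range(200):
    # Multiplicative updates: ratios of positive terms keep W, H nonnegative.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
after = np.linalg.norm(V - W @ H)
assert after <= before            # the objective never increases
```

The report's generalized divergences replace the Frobenius error here with other Bregman divergences, which changes the numerator and denominator of these ratios but preserves the multiplicative, nonnegativity-preserving form.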
Nonnegative Tucker decomposition
In: Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR 2007), 2007
Cited by 16 (1 self)

Abstract:
Nonnegative tensor factorization (NTF) is a recent multiway (multilinear) extension of nonnegative matrix factorization (NMF), where nonnegativity constraints are imposed on the CANDECOMP/PARAFAC model. In this paper we consider the Tucker model with nonnegativity constraints and develop a new tensor factorization method, referred to as nonnegative Tucker decomposition (NTD). The main contributions of this paper include: (1) multiplicative updating algorithms for NTD; (2) an initialization method for speeding up convergence; (3) a sparseness control method in tensor factorization. Through several computer vision examples, we show the useful behavior of NTD over existing NTF and NMF methods.
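The Tucker model that NTD constrains can be written as a small core tensor multiplied by a factor matrix along each mode. A minimal sketch with illustrative sizes; with nonnegative core and factors, the reconstructed tensor is automatically nonnegative.

```python
import numpy as np

rng = np.random.default_rng(5)
G = rng.random((2, 3, 2))          # nonnegative core tensor
A = rng.random((6, 2))             # mode-1 factor matrix
B = rng.random((7, 3))             # mode-2 factor matrix
C = rng.random((8, 2))             # mode-3 factor matrix

# All three mode products at once: X_ijk = sum_pqr G_pqr A_ip B_jq C_kr
X = np.einsum('pqr,ip,jq,kr->ijk', G, A, B, C)
assert X.shape == (6, 7, 8)
assert np.all(X >= 0)              # nonnegative inputs give a nonnegative tensor
```

CP is the special case where the core is superdiagonal, which is why the abstract presents NTD as a Tucker-model counterpart to CP-based NTF.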