Results 1–10 of 6,232
Low-rank approximation of tensors
, 2014
"... In many applications such as data compression, imaging or genomic data analysis, it is important to approximate a given tensor by a tensor that is sparsely representable. For matrices, i.e. 2-tensors, such a representation can be obtained via the singular value decomposition, which allows one to compute ..."
Abstract
to compute best rank-k approximations. For very big matrices a low-rank approximation using the SVD is not computationally feasible. In this case different approximations are available. It seems that variants of the CUR decomposition are most suitable. For d-mode tensors T ∈ ⊗_{i=1}^{d} R^{n_i}, with d > 2, many
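The best rank-k approximation via SVD referred to above can be sketched in a few lines of NumPy. This is a generic textbook illustration (Eckart–Young truncation), not code from the paper:

```python
import numpy as np

def best_rank_k(A, k):
    """Best rank-k approximation of a matrix in the Frobenius norm.

    By the Eckart-Young theorem, truncating the SVD to the k largest
    singular values is optimal among all matrices of rank at most k.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 40))
A5 = best_rank_k(A, 5)
```

As the abstract notes, a full SVD becomes infeasible for very large matrices, which is what motivates randomized and CUR-type alternatives.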
Bundle Adjustment – A Modern Synthesis
 VISION ALGORITHMS: THEORY AND PRACTICE, LNCS
, 2000
"... This paper is a survey of the theory and methods of photogrammetric bundle adjustment, aimed at potential implementors in the computer vision community. Bundle adjustment is the problem of refining a visual reconstruction to produce jointly optimal structure and viewing parameter estimates. Topics c ..."
Abstract

Cited by 555 (12 self)
covered include: the choice of cost function and robustness; numerical optimization including sparse Newton methods, linearly convergent approximations, updating and recursive methods; gauge (datum) invariance; and quality control. The theory is developed for general robust cost functions rather than
Multi-Way Compressed Sensing for Sparse Low-Rank Tensors
, 2012
"... For linear models, compressed sensing theory and methods enable recovery of sparse signals of interest from few measurements—in the order of the number of nonzero entries as opposed to the length of the signal of interest. Results of similar flavor have more recently emerged for bilinear models, bu ..."
Abstract

Cited by 9 (1 self)
, but no results are available for multilinear models of tensor data. In this contribution, we consider compressed sensing for sparse and low-rank tensors. More specifically, we consider low-rank tensors synthesized as sums of outer products of sparse loading vectors, and a special class of linear dimensionality
Low-rank Tensor Recovery via Iterative Hard Thresholding
"... We study recovery of low-rank tensors from a small number of measurements. A version of the iterative hard thresholding algorithm (TIHT) for the higher-order singular value decomposition (HOSVD) is introduced. As a first step towards the analysis of the algorithm, we define a corresponding ..."
Abstract

Cited by 4 (1 self)
LOW-RANK TENSOR DECOMPOSITION BASED ANOMALY DETECTION FOR HYPERSPECTRAL IMAGERY
"... Anomaly detection becomes increasingly important in hyperspectral image analysis, since it can now uncover many material substances which were previously unresolved by multispectral sensors. In this paper, we propose a Low-rank Tensor Decomposition based anomaly Detection (LTDD) algorithm for Hyp ..."
Abstract
for Hyperspectral Imagery. The HSI data cube is first modeled as a dense low-rank tensor plus a sparse tensor. Based on the obtained low-rank tensor, LTDD further decomposes it using the Tucker decomposition to extract the core tensor, which is treated as the “support” of the anomaly spectral
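Extracting a core tensor via the Tucker decomposition, as LTDD does, can be done with the higher-order SVD. The following is a generic truncated HOSVD in NumPy, a sketch of the standard technique rather than the authors' implementation:

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: rows are indexed by the chosen mode,
    # columns by the remaining modes flattened together.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated higher-order SVD (Tucker decomposition via HOSVD).

    Returns the core tensor and one orthonormal factor matrix per mode.
    """
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, U in enumerate(factors):
        # Project the current mode onto the factor's column space.
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors
```

The core tensor has shape `ranks` and concentrates the energy of the data cube, which is why it can serve as the "support" of the anomalous spectra.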
New Ranks for Even-Order Tensors and Their Applications in Low-Rank Tensor Optimization
, 2015
"... In this paper, we propose three new tensor decompositions for even-order tensors corresponding respectively to the rank-one decompositions of some unfolded matrices. Consequently such new decompositions lead to three new notions of (even-order) tensor ranks, to be called the M-rank, the symmetric M ..."
Abstract
in the low-CP-rank tensor recovery model. Numerical results on both synthetic data and real data from colored video completion and decomposition problems show that the M-rank is indeed an effective and easily computable approximation of the CP-rank in the context of low-rank tensor recovery.
Giannakis, “Nonparametric low-rank tensor imputation”
 in Proc. IEEE Statistical Signal Process. Workshop, Ann Arbor
, 2012
"... Completion or imputation of three-way data arrays with missing entries is a basic problem encountered in various areas, including bioinformatics, image processing, and preference analysis. If available, prior information about the data at hand should be incorporated to enhance performance of the i ..."
Abstract

Cited by 2 (2 self)
of the imputation method adopted. This is the motivation behind the proposed low-rank tensor estimator which leverages the correlation across slices of the data cube in the form of reproducing kernels. The rank of the tensor estimate is controlled by a novel regularization on the factors of its PARAFAC decomposition
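The PARAFAC (CP) decomposition whose factors are regularized above is typically fit by alternating least squares. The following three-way sketch is the standard textbook ALS in NumPy, not the kernel-regularized estimator of the paper:

```python
import numpy as np

def khatri_rao(A, B):
    # Column-wise Kronecker product: row (j, k) equals A[j] * B[k].
    return np.einsum('jr,kr->jkr', A, B).reshape(-1, A.shape[1])

def unfold(T, mode):
    # Mode-n unfolding of a tensor into a matrix.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def cp_als(T, rank, n_iter=200, seed=0):
    """Rank-`rank` CP/PARAFAC of a 3-way tensor via alternating least squares."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, rank)) for s in T.shape)
    for _ in range(n_iter):
        # Each factor update solves a linear least-squares problem
        # with the other two factors held fixed.
        A = unfold(T, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(T, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(T, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

Controlling the norms of `A`, `B`, `C` (as the paper's regularizer does) is what bounds the rank of the resulting tensor estimate.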
Survey on Independent Component Analysis
 NEURAL COMPUTING SURVEYS
, 1999
"... A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research, is finding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the ..."
Abstract

Cited by 2241 (104 self)
A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research is finding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the original data. Well-known linear transformation methods include, for example, principal component analysis, factor analysis, and projection pursuit. A recently developed linear transformation method is independent component analysis (ICA), in which the desired representation is the one that minimizes the statistical dependence of the components of the representation. Such a representation seems to capture the essential structure of the data in many applications. In this paper, we survey the existing theory and methods for ICA.
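As a concrete illustration of the independence-maximizing transformation that ICA seeks, here is a minimal one-unit FastICA iteration with the tanh nonlinearity. It is a standard textbook scheme assuming centered, whitened input, not code from the survey:

```python
import numpy as np

def fastica_one_unit(X, n_iter=200, seed=0):
    """One independent direction via the FastICA fixed-point iteration.

    X is (n_features, n_samples), assumed centered and whitened.
    The update w+ = E[x g(w.x)] - E[g'(w.x)] w, with g = tanh,
    pushes w.X toward maximal non-Gaussianity.
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        wx = np.tanh(w @ X)                       # g(w.x) per sample
        w_new = (X * wx).mean(axis=1) - (1 - wx**2).mean() * w
        w = w_new / np.linalg.norm(w_new)         # keep unit norm
    return w
```

Running the unit several times with deflation (projecting out previously found directions) recovers a full unmixing matrix; full implementations also exist in standard libraries.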
Large margin methods for structured and interdependent output variables
 JOURNAL OF MACHINE LEARNING RESEARCH
, 2005
"... Learning general functional dependencies between arbitrary input and output spaces is one of the key challenges in computational intelligence. While recent progress in machine learning has mainly focused on designing flexible and powerful input representations, this paper addresses the complementary ..."
Abstract

Cited by 612 (12 self)
Learning general functional dependencies between arbitrary input and output spaces is one of the key challenges in computational intelligence. While recent progress in machine learning has mainly focused on designing flexible and powerful input representations, this paper addresses the complementary issue of designing classification algorithms that can deal with more complex outputs, such as trees, sequences, or sets. More generally, we consider problems involving multiple dependent output variables, structured output spaces, and classification problems with class attributes. In order to accomplish this, we propose to appropriately generalize the well-known notion of a separation margin and derive a corresponding maximum-margin formulation. While this leads to a quadratic program with a potentially prohibitive, i.e. exponential, number of constraints, we present a cutting plane algorithm that solves the optimization problem in polynomial time for a large class of problems. The proposed method has important applications in areas such as computational biology, natural language processing, information retrieval/extraction, and optical character recognition. Experiments from various domains involving different types of output spaces emphasize the breadth and generality of our approach.