S.: Iteration bounds for finding ε-stationary points of structured nonconvex optimization. Working Paper, 2014
"... In this paper we study proximal conditional-gradient (CG) and proximal gradient-projection type algorithms for a block-structured constrained nonconvex optimization model, which arises naturally from tensor data analysis. First, we introduce a new notion of -stationarity, which is suitable for the s ..."
Abstract
In this paper we study proximal conditional-gradient (CG) and proximal gradient-projection type algorithms for a block-structured constrained nonconvex optimization model, which arises naturally from tensor data analysis. First, we introduce a new notion of ε-stationarity, which is suitable for the structured problem under consideration. We then propose two types of first-order algorithms for the model, based on the proximal conditional-gradient method and the proximal gradient-projection method respectively. If the nonconvex objective function is in the form of a mathematical expectation, we discuss how to incorporate randomized sampling to avoid computing the expectations exactly. For the general block optimization model, the proximal subroutines are performed for each block according to either the block-coordinate-descent (BCD) or the maximum-block-improvement (MBI) updating rule. If the gradient of the nonconvex part of the objective f satisfies ‖∇f(x) − ∇f(y)‖_q ≤ M‖x − y‖_p^δ, where δ = p/q and 1/p + 1/q = 1, then we prove that the new algorithms have an overall iteration complexity bound of O(1/ε^q) for finding an ε-stationary solution. If f is concave, the iteration complexity reduces to O(1/ε). Our numerical experiments on tensor approximation problems show promising performance of the new algorithms.
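The abstract above distinguishes two ways of scheduling the per-block proximal subroutine: the BCD rule cycles through all blocks, while the MBI rule commits only the single block update that improves the objective most. The sketch below illustrates that distinction on a toy two-block problem with a projected-gradient subroutine; the objective, step size, feasible set (a unit ball per block), and all function names are assumptions made for illustration, not the paper's actual algorithm.

```python
# A minimal, illustrative sketch (not the paper's exact algorithm) of one pass of
# block updates with a projected-gradient subroutine, comparing the BCD rule
# (cycle through all blocks) with the MBI rule (update only the best block).
import numpy as np

def project_unit_ball(v):
    """Euclidean projection onto the unit ball {v : ||v|| <= 1}."""
    nrm = np.linalg.norm(v)
    return v if nrm <= 1.0 else v / nrm

def f(blocks):
    """Toy smooth nonconvex objective over two blocks x, y: f = -(x . A y)."""
    x, y = blocks
    return -x @ A @ y

def grad_block(blocks, i):
    """Gradient of f with respect to block i."""
    x, y = blocks
    return -(A @ y) if i == 0 else -(A.T @ x)

def prox_grad_step(blocks, i, step=0.1):
    """Projected-gradient update of block i, holding the other blocks fixed."""
    new = [b.copy() for b in blocks]
    new[i] = project_unit_ball(blocks[i] - step * grad_block(blocks, i))
    return new

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
blocks = [project_unit_ball(rng.standard_normal(5)) for _ in range(2)]

# BCD rule: update every block in turn within one sweep.
bcd = blocks
for i in range(len(bcd)):
    bcd = prox_grad_step(bcd, i)

# MBI rule: try every block, keep only the update that decreases f the most.
candidates = [prox_grad_step(blocks, i) for i in range(len(blocks))]
mbi = min(candidates, key=f)

print("f after one BCD sweep:", f(bcd))
print("f after one MBI step :", f(mbi))
```

A BCD sweep touches every block per iteration, whereas MBI pays for evaluating all candidate block updates but commits only the greediest one; the abstract's complexity bounds cover both scheduling rules.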
New Ranks for Even-Order Tensors and Their Applications in Low-Rank Tensor Optimization, 2015
"... In this paper, we propose three new tensor decompositions for even-order tensors correspond-ing respectively to the rank-one decompositions of some unfolded matrices. Consequently such new decompositions lead to three new notions of (even-order) tensor ranks, to be called the M-rank, the symmetric M ..."
Abstract
In this paper, we propose three new tensor decompositions for even-order tensors, corresponding respectively to the rank-one decompositions of some unfolded matrices. These new decompositions lead to three new notions of (even-order) tensor ranks, called the M-rank, the symmetric M-rank, and the strongly symmetric M-rank in this paper. We discuss the bounds between these new tensor ranks and the CP (CANDECOMP/PARAFAC)-rank and the symmetric CP-rank of an even-order tensor. In particular, we show: (1) these newly defined ranks actually coincide with each other if the even-order tensor in question is super-symmetric; (2) the CP-rank and symmetric CP-rank of a fourth-order tensor can be both lower and upper bounded (up to a constant factor) by the corresponding M-rank. Since the M-rank is much easier to compute than the CP-rank, we can replace the CP-rank by the M-rank in the low-CP-rank tensor recovery model. Numerical results on both synthetic data and real data from colored video completion and decomposition problems show that the M-rank is indeed an effective and easily computable approximation of the CP-rank in the context of low-rank tensor recovery.
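The M-rank and its variants are built from rank-one decompositions of matrix unfoldings of an even-order tensor, and the abstract stresses that they are much cheaper to compute than the CP-rank. The sketch below only illustrates the underlying square-unfolding idea, computing an ordinary matrix rank of a fourth-order tensor folded into a matrix; the function name square_unfolding_rank and the simplification to a plain matrix rank are assumptions made for illustration, not the paper's precise M-rank construction.

```python
# A minimal sketch of the "square unfolding" idea behind the M-rank: group the
# first half of the modes of an even-order tensor into rows and the second half
# into columns, then compute an ordinary matrix rank (cheap, unlike the CP-rank).
# This is an assumed simplification, not the paper's exact definition.
import numpy as np

def square_unfolding_rank(tensor):
    """Rank of the matrix obtained by folding the first half of the modes
    into rows and the second half into columns (even-order tensors only)."""
    order = tensor.ndim
    assert order % 2 == 0, "only defined here for even-order tensors"
    half = order // 2
    rows = int(np.prod(tensor.shape[:half]))
    cols = int(np.prod(tensor.shape[half:]))
    return np.linalg.matrix_rank(tensor.reshape(rows, cols))

# Example: a fourth-order tensor built from r rank-one (CP) terms.
rng = np.random.default_rng(1)
r, n = 3, 4
T = sum(np.einsum('i,j,k,l->ijkl',
                  rng.standard_normal(n), rng.standard_normal(n),
                  rng.standard_normal(n), rng.standard_normal(n))
        for _ in range(r))
print("square-unfolding rank:", square_unfolding_rank(T))  # at most r here
```

Each rank-one CP term unfolds to a rank-one matrix, so the unfolding rank never exceeds the number of CP terms, which is consistent with the abstract's claim that such unfolding-based ranks can bound the CP-rank.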