Results 1 - 4 of 4
S.: Iteration bounds for finding stationary points of structured nonconvex optimization. Working Paper, 2014
"... In this paper we study proximal conditionalgradient (CG) and proximal gradientprojection type algorithms for a blockstructured constrained nonconvex optimization model, which arises naturally from tensor data analysis. First, we introduce a new notion of stationarity, which is suitable for the s ..."
Abstract

Cited by 1 (0 self)
In this paper we study proximal conditional-gradient (CG) and proximal gradient-projection type algorithms for a block-structured constrained nonconvex optimization model, which arises naturally from tensor data analysis. First, we introduce a new notion of stationarity, which is suitable for the structured problem under consideration. We then propose two types of first-order algorithms for the model based on the proximal conditional-gradient (CG) method and the proximal gradient-projection method, respectively. If the nonconvex objective function is in the form of a mathematical expectation, we then discuss how to incorporate randomized sampling to avoid computing the expectations exactly. For the general block optimization model, the proximal subroutines are performed for each block according to either the block-coordinate-descent (BCD) or the maximum-block-improvement (MBI) updating rule. If the gradient of the nonconvex part of the objective f satisfies ‖∇f(x) − ∇f(y)‖_q ≤ M‖x − y‖_p^δ where δ = p/q with 1/p + 1/q = 1, then we prove that the new algorithms have an overall iteration complexity bound of O(1/ε^q) in finding an ε-stationary solution. If f is concave then the iteration complexity reduces to O(1/ε). Our numerical experiments for tensor approximation problems show promising performances of the new solution algorithms.
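The proximal gradient-projection scheme with block-coordinate-descent (BCD) updates described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the toy objective, block partition, step size, and box constraint sets are all placeholder assumptions.

```python
import numpy as np

def proj_box(x, lo=-1.0, hi=1.0):
    """Euclidean projection onto a box (placeholder for the paper's block constraint sets)."""
    return np.clip(x, lo, hi)

def prox_grad_projection_bcd(grad_f, blocks, x0, step=0.1, iters=100):
    """BCD-style gradient-projection sketch: cycle through the blocks,
    take a gradient step on one block at a time, and project it back
    onto its constraint set."""
    x = x0.copy()
    for _ in range(iters):
        for idx in blocks:
            g = grad_f(x)
            x[idx] = proj_box(x[idx] - step * g[idx])
    return x

# Toy nonconvex objective (an assumption for illustration):
# f(x) = -||x||^2/2 + sum(x^4)/4, whose gradient is -x + x^3.
grad_f = lambda x: -x + x**3
x0 = np.array([0.5, -0.3, 0.8, -0.9])
blocks = [slice(0, 2), slice(2, 4)]  # two coordinate blocks
x_star = prox_grad_projection_bcd(grad_f, blocks, x0)
```

For this toy objective the iterates approach the stationary points at ±1, so the gradient at `x_star` is near zero, which matches the notion of an approximate stationary solution the abstract is concerned with.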
New Ranks for Even-Order Tensors and Their Applications in Low-Rank Tensor Optimization, 2015
"... In this paper, we propose three new tensor decompositions for evenorder tensors corresponding respectively to the rankone decompositions of some unfolded matrices. Consequently such new decompositions lead to three new notions of (evenorder) tensor ranks, to be called the Mrank, the symmetric M ..."
Abstract
In this paper, we propose three new tensor decompositions for even-order tensors corresponding respectively to the rank-one decompositions of some unfolded matrices. Consequently, such new decompositions lead to three new notions of (even-order) tensor ranks, to be called the M-rank, the symmetric M-rank, and the strongly symmetric M-rank in this paper. We discuss the bounds between these new tensor ranks and the CP (CANDECOMP/PARAFAC) rank and the symmetric CP-rank of an even-order tensor. In particular, we show: (1) these newly defined ranks actually coincide with each other if the even-order tensor in question is supersymmetric; (2) the CP-rank and symmetric CP-rank of a fourth-order tensor can be both lower and upper bounded (up to a constant factor) by the corresponding M-rank. Since the M-rank is much easier to compute than the CP-rank, we can replace the CP-rank by the M-rank in the low-CP-rank tensor recovery model. Numerical results on both synthetic data and real data from colored video completion and decomposition problems show that the M-rank is indeed an effective and easily computable approximation of the CP-rank in the context of low-rank tensor recovery.
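The core idea the abstract describes, ranking an even-order tensor via a rank-one decomposition of an unfolded matrix, can be sketched for the fourth-order case. This is a hedged illustration only: the square unfolding choice and the use of plain matrix rank are my assumptions, not necessarily the paper's exact definitions of the M-rank variants.

```python
import numpy as np

def square_unfolding(T):
    """Unfold a fourth-order n x n x n x n tensor into an n^2 x n^2 matrix
    by grouping the first two and the last two modes (one common unfolding
    choice; the paper's precise unfolding may differ)."""
    n = T.shape[0]
    return T.reshape(n * n, n * n)

def m_rank_estimate(T, tol=1e-10):
    """Matrix rank of the square unfolding: a cheap, polynomial-time proxy,
    in contrast to the CP-rank, which is NP-hard to compute in general."""
    return np.linalg.matrix_rank(square_unfolding(T), tol=tol)

# A rank-one fourth-order tensor a (x) a (x) a (x) a unfolds to a
# rank-one matrix outer(kron(a, a), kron(a, a)).
a = np.array([1.0, 2.0, 3.0])
T = np.einsum('i,j,k,l->ijkl', a, a, a, a)
rank_T = m_rank_estimate(T)  # rank 1 for this rank-one example
```

The unfolding rank is computed from an ordinary SVD, which is why this kind of quantity is so much cheaper than the CP-rank and usable as its surrogate in low-rank recovery models, as the abstract argues.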