Results 1 - 2 of 2
Optimisation on Support Vector Machines
Abstract

Cited by 1 (0 self)
In this paper we deal with the optimisation problem involved in determining the maximal margin separation hyperplane in support vector machines. We consider three different formulations, based on the L2 norm distance (the standard case), the L1 norm, and the L∞ norm. We consider separation in the original space of the data (i.e., there are no kernel transformations). For each of these cases, we focus on the following problem: having the optimal solution for a given training data set, one is given a new training example. The purpose is to use the information about the solution of the problem without the additional example in order to speed up the new optimisation problem. We also consider the case of reoptimisation after removing an example from the data set. We report results obtained for some standard benchmark problems.

1 Introduction

The support vector machine standard formulation of the problem of optimal separation of two classes of points can be found, for example, in [9]. It consist...
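The warm-start idea the abstract describes can be illustrated with a minimal sketch. This is not the paper's algorithm: it uses plain sub-gradient descent on the L2-regularized hinge loss, and the function name and hyperparameters are hypothetical. The point is only that, when a new example arrives, re-optimisation can start from the previous solution (w, b) instead of from scratch.

```python
import numpy as np

def hinge_svm(X, y, w0=None, b0=0.0, lam=0.01, lr=0.1, epochs=200):
    """Sub-gradient descent on the L2-regularized hinge loss.
    Passing w0/b0 warm-starts the optimisation from a previous solution."""
    n, d = X.shape
    w = np.zeros(d) if w0 is None else w0.copy()
    b = b0
    for _ in range(epochs):
        for i in range(n):
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:
                # point violates the margin: hinge term contributes
                w -= lr * (lam * w - y[i] * X[i])
                b += lr * y[i]
            else:
                # only the regularizer contributes
                w -= lr * lam * w
    return w, b

# toy separable data, labels in {-1, +1}
X = np.array([[0., 0.], [1., 1.], [3., 3.], [4., 4.]])
y = np.array([-1., -1., 1., 1.])
w, b = hinge_svm(X, y)

# a new training example arrives; reuse (w, b) as the starting point
X2 = np.vstack([X, [[5., 5.]]])
y2 = np.append(y, 1.)
w2, b2 = hinge_svm(X2, y2, w0=w, b0=b)
```

Starting from the previous optimum typically means the new problem needs far fewer iterations, since only constraints near the added (or removed) example change.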
REGULAR PAPER Knowledge and Information Systems
Abstract
Tensor representation is helpful to reduce the small sample size problem in discriminative subspace selection. As pointed out by this paper, this is mainly because the structure information of objects in computer vision research is a reasonable constraint to reduce the number of unknown parameters used to represent a learning model. Therefore, we apply this information to vector-based learning and generalize vector-based learning to tensor-based learning as the supervised tensor learning (STL) framework, which accepts tensors as input. To obtain the solution of STL, the alternating projection optimization procedure is developed. The STL framework is a combination of convex optimization and operations in multilinear algebra. The tensor representation helps reduce the overfitting problem in vector-based learning. Based on STL and its alternating projection optimization procedure, we generalize support vector machines, the minimax probability machine, Fisher discriminant analysis, and distance metric learning to support tensor machines, the tensor minimax probability machine, tensor Fisher discriminant analysis, and multiple distance metrics learning, respectively. We also study the iterative procedure for feature extraction within STL. To examine the effectiveness of STL, we implement the tensor minimax probability machine for image classification. Compared with the minimax probability machine, the tensor version reduces the overfitting problem.
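The alternating optimization pattern behind STL can be sketched on the simplest case: a rank-1 bilinear classifier sign(uᵀXv) on matrix-valued inputs. This is a simplified illustration, not the paper's method: a ridge inner step stands in for the SVM/MPM inner solver, and all names and dimensions are hypothetical. Fixing v makes the problem linear in u, and vice versa, so the two linear sub-problems are solved alternately.

```python
import numpy as np

rng = np.random.default_rng(0)

def rank1_alternating_fit(Xs, y, iters=20, lam=1e-3):
    """Alternating optimization for a rank-1 bilinear classifier
    sign(u^T X v): fix v and solve a regularized linear problem in u,
    then fix u and solve in v (ridge inner step for brevity)."""
    d1, d2 = Xs[0].shape
    u = rng.normal(size=d1)
    v = rng.normal(size=d2)
    for _ in range(iters):
        # fix v: each sample reduces to the feature z_i = X_i v, linear in u
        Z = np.stack([X @ v for X in Xs])
        u = np.linalg.solve(Z.T @ Z + lam * np.eye(d1), Z.T @ y)
        # fix u: each sample reduces to the feature z_i = X_i^T u, linear in v
        Z = np.stack([X.T @ u for X in Xs])
        v = np.linalg.solve(Z.T @ Z + lam * np.eye(d2), Z.T @ y)
    return u, v

# toy matrix-valued data with a planted rank-1 decision direction
u_true, v_true = np.array([1., -1., 0.]), np.array([1., 1.])
Xs = [rng.normal(size=(3, 2)) for _ in range(80)]
y = np.array([np.sign(u_true @ X @ v_true) for X in Xs])

u, v = rank1_alternating_fit(Xs, y)
preds = np.array([np.sign(u @ X @ v) for X in Xs])
```

The tensor parameterization uses d1 + d2 parameters instead of the d1 · d2 a vectorized model would need, which is the parameter-reduction argument the abstract makes.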