Results 1 - 3 of 3
Robust Linear Programming Discrimination Of Two Linearly Inseparable Sets
, 1992
Abstract

Cited by 210 (33 self)
INTRODUCTION We consider the two point sets A and B in the n-dimensional real space R^n, represented by the m × n matrix A and the k × n matrix B respectively. Our principal objective here is to formulate a single linear program with the following properties: (i) If the convex hulls of A and B are disjoint, a strictly separating plane is obtained. (ii) If the convex hulls of A and B intersect, a plane is obtained that minimizes some measure of misclassified points, for all possible cases. (iii) No extraneous constraints are imposed on the linear program that rule out any specific case from consideration. Most linear programming formulations [6, 5, 12, 4] have property (i)
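The single linear program described above can be sketched roughly as follows. This is a minimal illustration of a robust-LP-style discrimination, assuming `scipy` is available; the function name, variable layout, and solver choice are ours, not the paper's. It seeks a plane x·w = gamma with A on the side A·w ≥ gamma + 1 and B on the side B·w ≤ gamma − 1, minimizing the average violation on each side:

```python
import numpy as np
from scipy.optimize import linprog

def rlp_separate(A, B):
    """Find a plane x.w = gamma that strictly separates A and B when their
    convex hulls are disjoint, and otherwise minimizes the average
    violation on each side (a robust-LP-style formulation, sketched)."""
    m, n = A.shape
    k = B.shape[0]
    # decision variables stacked as [w (n), gamma (1), y (m), z (k)];
    # y, z are nonnegative violations, averaged per set in the objective
    c = np.concatenate([np.zeros(n + 1),
                        np.full(m, 1.0 / m),
                        np.full(k, 1.0 / k)])
    # A_i . w - gamma + y_i >= 1   ->  -A_i . w + gamma - y_i <= -1
    ub_A = np.hstack([-A, np.ones((m, 1)), -np.eye(m), np.zeros((m, k))])
    # -B_j . w + gamma + z_j >= 1  ->   B_j . w - gamma - z_j <= -1
    ub_B = np.hstack([B, -np.ones((k, 1)), np.zeros((k, m)), -np.eye(k)])
    bounds = [(None, None)] * (n + 1) + [(0, None)] * (m + k)
    res = linprog(c, A_ub=np.vstack([ub_A, ub_B]), b_ub=-np.ones(m + k),
                  bounds=bounds, method="highs")
    return res.x[:n], res.x[n]
```

When the two sets are separable the optimal violations y and z are all zero and the returned plane strictly separates them; when they overlap, the plane trades off the averaged violations on the two sides.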
Feature Selection via Mathematical Programming
, 1997
Abstract

Cited by 60 (22 self)
The problem of discriminating between two finite point sets in n-dimensional feature space by a separating plane that utilizes as few of the features as possible is formulated as a mathematical program with a parametric objective function and linear constraints. The step function that appears in the objective function can be approximated by a sigmoid or by a concave exponential on the nonnegative real line, or it can be treated exactly by considering the equivalent linear program with equilibrium constraints (LPEC). Computational tests of these three approaches on publicly available real-world databases have been carried out and compared with an adaptation of the optimal brain damage (OBD) method for reducing neural network complexity. One feature selection algorithm via concave minimization (FSV) reduced cross-validation error on a cancer prognosis database by 35.4% while reducing problem features from 32 to 4. Feature selection is an important problem in machine learning [18, 15, 1...
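The concave-exponential approximation of the step function mentioned in the abstract can be illustrated with a small numeric check. This is a sketch only; the smoothing parameter `alpha` and the sample grid are our illustrative choices, not values from the paper:

```python
import numpy as np

def step(t):
    # exact step: 1 where t > 0, else 0 (it counts the nonzero
    # plane coefficients in the feature-selection objective)
    return (t > 0).astype(float)

def concave_exp(t, alpha):
    # smooth concave surrogate 1 - exp(-alpha * t), used on t >= 0;
    # it underestimates the step and tightens as alpha grows
    return 1.0 - np.exp(-alpha * t)

t = np.linspace(0.0, 2.0, 9)
for alpha in (1.0, 5.0, 50.0):
    gap = np.max(np.abs(concave_exp(t, alpha) - step(t))[t > 0])
    print(f"alpha={alpha:>5}: max gap on t > 0 is {gap:.4f}")
```

Because the surrogate is concave on the nonnegative reals, minimizing it admits the successive-linearization treatment that an exact step function does not.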
Back Propagation Neural Network by Comparing Hidden Neurons: Case study on Breast Cancer Diagnosis
Abstract
This paper investigates the potential of applying the feed-forward neural network architecture to the classification of breast cancer. The backpropagation algorithm is used for training the multilayer artificial neural network. Missing values are replaced by the median method before the construction of the network. This paper presents the results of a comparison among ten different hidden-neuron initialization methods. The classification results indicate that the network gave a good diagnostic performance of 99.28%.
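The median-imputation preprocessing step and a plain backpropagation network of the kind described can be sketched as follows. This is an illustrative toy, not the paper's implementation; the hidden-layer size, learning rate, and epoch count are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def median_impute(X):
    # replace missing values (NaN) column-wise with that column's
    # median, as described in the abstract
    X = X.copy()
    for j in range(X.shape[1]):
        med = np.nanmedian(X[:, j])
        X[np.isnan(X[:, j]), j] = med
    return X

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_mlp(X, y, hidden=4, lr=0.5, epochs=500):
    # one-hidden-layer network trained by plain backpropagation
    # on a binary cross-entropy objective
    n = X.shape[1]
    W1 = rng.normal(0.0, 0.5, (n, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, hidden)
    b2 = 0.0
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)            # forward pass, hidden layer
        p = sigmoid(h @ W2 + b2)            # forward pass, output
        g = p - y                           # dLoss/dlogit for cross-entropy
        W2 -= lr * h.T @ g / len(y)         # output-layer gradient step
        b2 -= lr * g.mean()
        gh = np.outer(g, W2) * h * (1 - h)  # backpropagate through hidden layer
        W1 -= lr * X.T @ gh / len(y)
        b1 -= lr * gh.mean(axis=0)
    return W1, b1, W2, b2
```

On an easy two-cluster toy dataset this converges to near-perfect training accuracy; the paper's 99.28% figure refers to its own breast-cancer experiments, not to this sketch.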