
## Research Article: Breast Cancer Detection with Reduced Feature Set

### Citations

13218 | Statistical Learning Theory
- Vapnik
- 2003
Citation Context ...oreover, the irrelevant components in the inputs will decrease the generalization performance of RBFNN [13]. Support vector machine (SVM) is an effective statistical learning method for classification [14]. SVM is based on finding optimal hyperplane to separate different classes mapping input data into higher-dimensional feature space. SVM has advantage of fast training technique, even with large numbe...

1863 | A training algorithm for optimal margin classifier
- Boser, Guyon, et al.
- 1992
Citation Context ...tion to the distance using the spread values. 2.4. Support Vector Machine (SVM). SVM is a supervised learning algorithm studied for data classification and regression. It was proposed by Boser et al. [30] and Vapnik [31]. SVM algorithm is used to find a hyperplane that separates the classes minimizing training error and maximizing the margin in order to increase generalization capability. When the datasets a...
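The separating-hyperplane idea in this context can be sketched in a few lines of plain Python (the weights below are hypothetical, not the paper's trained model): a linear SVM classifies a point by the sign of w·x + b, and the geometric margin between the two supporting hyperplanes is 2/‖w‖.

```python
import math

def decision(w, b, x):
    """Linear decision value w.x + b; the predicted class is its sign."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def geometric_margin(w):
    """Width of the margin between the supporting hyperplanes: 2 / ||w||."""
    return 2.0 / math.sqrt(sum(wi * wi for wi in w))

# Hypothetical separating hyperplane x1 + x2 - 3 = 0
w, b = [1.0, 1.0], -3.0
print(decision(w, b, [2.0, 2.0]))  # > 0: positive class
print(decision(w, b, [0.0, 0.0]))  # < 0: negative class
print(geometric_margin(w))
```

Training an SVM chooses w and b to maximize this margin subject to the training labels; here they are fixed by hand for illustration.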

847 | Independent component analysis: Algorithms and applications, Neural Networks, vol. 13, no. 4-5
- Hyvärinen, Oja
- 2000
Citation Context ...nique to reduce dimensionality using second order statistical information [20]. Independent component analysis (ICA) is a recently developed method in pattern recognition and signal processing fields [21, 22]. It involves higher order statistics to extract independent components that involve richer information than PCA. ICA can be used to reduce dimensionality before training ...

251 | Support Vector Machines for 3-D Object Recognition - Pontil, Verri - 1998 |

152 | Breast Cancer Diagnosis and prognosis via linear programming
- Mangasarian, Street, et al.
- 1995
Citation Context ...72% 94.50% Bagui et al. [37] 64% test data, ...

106 | Index for rating diagnostic tests
- Youden
- 1950
Citation Context ...hen “poor discriminant,” DP < 2 then “limited discriminant,” DP < 3 then “fair discriminant” and other cases then “good discriminant.” Youden’s index evaluates a classifier’s ability to avoid failure [33] and is described as ...
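Youden's index referenced here is J = sensitivity + specificity − 1, ranging from −1 to 1, with higher values indicating a classifier that better avoids failure. A minimal sketch with hypothetical confusion-matrix counts:

```python
def youden_index(tp, fn, tn, fp):
    """Youden's J = sensitivity + specificity - 1."""
    sensitivity = tp / (tp + fn)  # true positive rate
    specificity = tn / (tn + fp)  # true negative rate
    return sensitivity + specificity - 1.0

# Hypothetical counts: 90 true positives, 10 false negatives,
# 80 true negatives, 20 false positives
print(youden_index(tp=90, fn=10, tn=80, fp=20))  # ≈ 0.9 + 0.8 - 1 = 0.7
```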

53 | The Nature of Statistical Learning Theory
- Vapnik
- 1995
Citation Context ...ance using the spread values. 2.4. Support Vector Machine (SVM). SVM is a supervised learning algorithm studied for data classification and regression. It was proposed by Boser et al. [30] and Vapnik [31]. SVM algorithm is used to find a hyperplane that separates the classes minimizing training error and maximizing the margin in order to increase generalization capability. When the datasets are linearly sepa...

31 | Machine Learning Techniques to Diagnose Breast Cancer from
- Street, Wolberg, et al.
- 1994
Citation Context ... 1 calculated for each cell nucleus, and the mean, standard error, and “worst” or largest (mean of the three largest values) of these features were calculated for each image, resulting in 30 features [24]. 2.2. Independent Component Analysis. The basic model of ICA is as follows. Suppose that the observed signal is the linear combination of two independently distributed sources. The observed signal ca...
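The basic ICA model quoted here, x = As (observations as a linear mixture of independent sources), can be sketched for the two-source case. Real ICA estimates the unmixing matrix from higher-order statistics of the data alone; in this illustration the mixing matrix A and source values are made up and A is assumed known, so its inverse recovers the sources exactly.

```python
def mat2_inv(a):
    """Inverse of a 2x2 matrix [[p, q], [r, s]]."""
    (p, q), (r, s) = a
    det = p * s - q * r
    return [[s / det, -q / det], [-r / det, p / det]]

def mat2_vec(a, v):
    """2x2 matrix times 2-vector."""
    return [a[0][0] * v[0] + a[0][1] * v[1],
            a[1][0] * v[0] + a[1][1] * v[1]]

# One sample from two hypothetical independent sources, mixed by A
s = [1.0, -2.0]
A = [[2.0, 1.0], [1.0, 3.0]]
x = mat2_vec(A, s)                # observed signal x = A s
s_hat = mat2_vec(mat2_inv(A), x)  # unmixing recovers the sources
print(x, s_hat)
```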

28 | Support vector machines combined with feature selection for breast cancer diagnosis, Expert Systems with Applications
- Akay
- 2009
Citation Context ...on Computational and Mathematical Methods in Medicine Volume 2015, Article ID 265138, 11 pages http://dx.doi.org/10.1155/2015/265138 2 Computational and Mathematical Methods in Medicine of input data [15, 16]. Therefore it has been used for many recognition problems such as object recognition and face detection [17–19]. Principal component analysis (PCA) is a technique to reduce dimensionality using secon...

18 | Medical analysis and diagnosis by neural networks,”
- Brause
- 2001
Citation Context ...racy of diagnosis. In Brause’s work, the result shows that the most experienced physician can diagnose with 79.97% accuracy while 91.1% correct diagnosis is achieved with the help of machine learning [6]. Tumors are classified as benign and malignant. Benign tumors are not cancerous or life threatening. However these can increase the risk of getting breast cancer. Malignant tumors are cancerous and m...

17 | Breast cancer detection using rank nearest neighbor
- Bagui, Bagui, et al.
- 2003
Citation Context ...previous studies and this study. Author Method Feature number Accuracy Sensitivity Krishnan et al. [36] 40% test data, SVM (poly.) 30 92.62% 92.69% 40% test data, SVM (RBF) 93.72% 94.50% Bagui et al. [37] 64% test data, ...

13 | Breast mass classification based on cytological patterns using
- Subashini, Ramalingam, et al.
- 2009
Citation Context ...r. Malignant tumors are cancerous and more alarming than benign tumors. Although significant studies are performed for early detection, about 20% of all women with malignant tumors die from this disease [7]. In order to improve accuracy of breast mass classification as benign and malignant, the performance of backpropagation artificial neural network (ANN) was evaluated [8]. Moreover, the fast learning ...

9 | Fast detection of masses in computer-aided mammography
- Christoyianni
Citation Context ...presents an improvement in diagnostic decision support system, while reducing computational complexity. 1. Introduction Breast cancer is one of the leading causes of death among all cancers for women [1]. Early detection and correct diagnosis of cancer are essential for the treatment of the disease. However, the traditional approach to cancer diagnosis depends highly on the experience of doctors and ...

6 | Optimal neural network architecture selection: Improvement in computerized detection of microcalcifications
- Gurcan, Chan, et al.
- 2002
Citation Context ... tumors die from this disease [7]. In order to improve accuracy of breast mass classification as benign and malignant, the performance of backpropagation artificial neural network (ANN) was evaluated [8]. Moreover, the fast learning rates and generalization capabilities of radial basis function neural networks (RBFNN) have showed excellent accuracy in microcalcification detection task [9, 10]. The ad...

4 | Medical Diagnosis Using Neural Networks
- Salim
- 2004
Citation Context ...r visual inspections. Naturally, human beings can make mistakes due to their limitations. Humans can recognize patterns easily. However, they fail when probabilities have to be assigned to observations [2]. Although several tests are applied, exact diagnosis may be difficult even for an expert. That is why automatic diagnosis of breast cancer is investigated by many researchers. Computer aided diagnosti...

4 | An evaluation of dimension reduction techniques for one-class classification.
- Villalba, Cunningham
- 2007
Citation Context ...A. ICA can be used to reduce dimensionality before training ...

4 | Reliable and Computationally Efficient Maximum-Likelihood Estimation of "Proper
- Pesce, Metz
- 2007
Citation Context ...re of the receiver operating characteristic (ROC) curve. The diagnostic performance of a test or a classifier to distinguish diseased cases from normal cases is evaluated using the ROC curve analysis [34]. In this study, an attempt has been made to evaluate the performance of the classifiers computing the aforementioned measures for 5/10-fold cross-validations (CV) and 20% data partitioning. For 5-CV ...
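The area under the ROC curve used in this analysis equals the probability that a randomly chosen diseased case receives a higher classifier score than a randomly chosen normal case. That gives a compact way to compute an empirical AUC (the Mann–Whitney form; the scores below are hypothetical):

```python
def auc(pos_scores, neg_scores):
    """Empirical AUC: fraction of (positive, negative) pairs ranked
    correctly by the classifier score, counting ties as half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical scores for diseased (positive) and normal (negative) cases
print(auc([0.9, 0.8, 0.4], [0.5, 0.3, 0.2]))  # 8 of 9 pairs ranked correctly
```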

3 | Radial-basis-function based classification of mammographic microcalcifications using texture features
- Dhawan, Chitre, et al.
- 1995
Citation Context ...s evaluated [8]. Moreover, the fast learning rates and generalization capabilities of radial basis function neural networks (RBFNN) have showed excellent accuracy in microcalcification detection task [9, 10]. The advantages of RBFNN are simple structure, good performance with approaching nonlinear function, and fast convergence velocity. Thus, it has been widely used in pattern recognition and system mod...
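Each hidden unit of the RBFNN discussed in these contexts responds to the distance between the input and its center, scaled by a spread parameter. A minimal Gaussian basis-function sketch (center and spread are made-up values, not from the paper):

```python
import math

def rbf(x, center, spread):
    """Gaussian radial basis unit: exp(-||x - c||^2 / (2 * sigma^2))."""
    d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-d2 / (2.0 * spread ** 2))

# The response is 1 at the center and decays with distance
print(rbf([1.0, 2.0], [1.0, 2.0], 0.5))  # 1.0 at the center
print(rbf([2.0, 2.0], [1.0, 2.0], 0.5))  # smaller one unit away
```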

3 | High-speed face recognition using self-adaptive radial basis function neural networks
- Sing, Thakur, et al.
- 2009
Citation Context ...advantages of RBFNN are simple structure, good performance with approaching nonlinear function, and fast convergence velocity. Thus, it has been widely used in pattern recognition and system modeling [11, 12]. On the other hand, the structure of RBFNN increases when the net’s input dimension increases. Moreover, the irrelevant components in the inputs will decrease the generalization performance of RBFNN ...

3 | A face and fingerprint identity authentication system based on multi-route detection - Zhou, Su, et al. - 2007 |

3 | A comparative analysis of principal component and independent component techniques for electrocardiograms
- Chawla
- 2007
Citation Context ...nique to reduce dimensionality using second order statistical information [20]. Independent component analysis (ICA) is a recently developed method in pattern recognition and signal processing fields [21, 22]. It involves higher order statistics to extract independent components that involve richer information than PCA. ICA can be used to reduce dimensionality before training ...

3 | Microarray data classification based on ensemble independent component selection
- Liu, Li, et al.
- 2009
Citation Context ...bles have variances of one. PCA can be used for both these computations because it decorrelates the data and gives information on the variance of the decorrelated data in the form of the eigenvectors [25]. ICs are determined by applying a linear transformation to the uncorrelated data: ic ...
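The decorrelation step described in this context can be sketched in closed form for 2-D data: center the data, eigendecompose the 2x2 covariance matrix, and project onto the eigenvectors, after which the cross-covariance is zero. The toy data set is made up; a full whitening pipeline would also rescale each projected coordinate by its eigenvalue to get unit variances.

```python
import math

def cov2(data):
    """2x2 covariance (divisor n) of zero-mean 2-D data: (sxx, sxy, syy)."""
    n = len(data)
    sxx = sum(x * x for x, _ in data) / n
    syy = sum(y * y for _, y in data) / n
    sxy = sum(x * y for x, y in data) / n
    return sxx, sxy, syy

def pca_decorrelate(data):
    """Center the data and project it onto the eigenvectors of its
    covariance matrix, giving uncorrelated coordinates (the PCA step
    applied before extracting ICs)."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    d = [(x - mx, y - my) for x, y in data]
    a, b, c = cov2(d)
    # Larger eigenvalue of the symmetric matrix [[a, b], [b, c]]
    l1 = (a + c + math.sqrt((a - c) ** 2 + 4 * b * b)) / 2
    # Its eigenvector (b assumed nonzero for this sketch), normalized
    norm = math.hypot(b, l1 - a)
    v1 = (b / norm, (l1 - a) / norm)
    v2 = (-v1[1], v1[0])  # orthogonal second direction
    return [(x * v1[0] + y * v1[1], x * v2[0] + y * v2[1]) for x, y in d]

# Strongly correlated toy data; after projection the cross term vanishes
pts = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8)]
print(cov2(pca_decorrelate(pts)))
```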

3 | Breast cancer diagnosis based on feature extraction using a hybrid of K-means and support vector machine algorithms, Expert Systems with Applications
- Zheng, Yoon, et al.
- 2014
Citation Context ...8] PSO + SVM 30 93.52% 91.52% QPSO + SVM 93.06% 90.00% Mangasarian et al. [39] 10-CV, MSM-T Best 3 97.50% — Mert et al. [40] 10-CV, PNN 3 (2IC + DWT) 96.31% 98.88% LOO, PNN 97.01% 97.78% Zheng et al. [41] ...

2 | Colonic polyp detection in CT colonography with fuzzy rule based 3D template matching. - Kilic, ON, et al. - 2009 |

2 | A support vector machine based MSM model for financial short-term volatility forecasting
- Wang, Huang, et al.
- 2013
Citation Context ...of input data [15, 16]. Therefore it has been used for many recognition problems such as object recognition and face detection [17–19]. Principal component analysis (PCA) is a technique to reduce dimensionality using secon...

2 | Evaluation of face recognition techniques using - Gumus, Kilic, et al. - 2010 |

2 | The UD RLS algorithm for training feedforward neural networks
- Bilski
- 2005
Citation Context ...ctive mode [26]. 2.3. Artificial Neural Networks. Feedforward neural network (FFNN) is most popular ANN structure due to its simplicity in mathematical analysis and good representational capabilities [27, 28]. FFNN has been used successfully to various applications such as control, signal processing, and pattern classification. FFNN architecture is shown in Figure 2. ...
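A forward pass through the FFNN architecture described here (one hidden layer of sigmoid units feeding a linear output) can be sketched as follows; all weights are arbitrary placeholders, not trained values.

```python
import math

def sigmoid(z):
    """Logistic activation used by the hidden units."""
    return 1.0 / (1.0 + math.exp(-z))

def ffnn_forward(x, w_hidden, b_hidden, w_out, b_out):
    """One hidden layer of sigmoid units, then a linear output unit."""
    hidden = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b)
              for ws, b in zip(w_hidden, b_hidden)]
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out

# Placeholder network: 2 inputs -> 2 hidden units -> 1 output
y = ffnn_forward([1.0, 0.5],
                 w_hidden=[[0.4, -0.6], [0.3, 0.8]],
                 b_hidden=[0.0, -0.1],
                 w_out=[1.0, -1.0], b_out=0.2)
print(y)
```

Training (e.g., backpropagation, or the UD RLS algorithm the entry above cites) would adjust these weights to minimize classification error.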

2 | A comparison of feed-forward back-propagation and radial basis artificial neural networks: A Monte Carlo study
- Ahmad
- 2010
Citation Context ...igure 2: Architecture of feedforward neural network. the nonlinear activation function, ...

2 | Statistical analysis of mammographic features and its classification using support vector machine
- Krishnan, M, et al.
- 2010
Citation Context ...putational and Mathematical Methods in Medicine 9 Table 7: Comparison of the methods and accuracy of previous studies and this study. Author Method Feature number Accuracy Sensitivity Krishnan et al. [36] 40% test data, SVM (poly.) 30 92.62% 92.69% 40% test data, SVM (RBF) 93.72% 94.50% Bagui et al. [37] 64% test data, ...

2 | Support vector machine for diagnosis cancer disease: a comparative study
- Sweilam, Tharwat, et al.
- 2010
Citation Context ...36] 40% test data, SVM (poly.) 30 92.62% 92.69% 40% test data, SVM (RBF) 93.72% 94.50% Bagui et al. [37] 64% test data, ...

1 | Classification of pulmonary nodules by using hybrid features - Tartar, Kilic, et al. |

1 | Evaluation of bagging ensemble method with time-domain feature extraction for diagnosing of arrhythmia beats - Mert, Akan - 2014 |

1 | Superior neuro-fuzzy classification systems
- Azar, El-Said
- 2012
Citation Context ...s evaluated [8]. Moreover, the fast learning rates and generalization capabilities of radial basis function neural networks (RBFNN) have showed excellent accuracy in microcalcification detection task [9, 10]. The advantages of RBFNN are simple structure, good performance with approaching nonlinear function, and fast convergence velocity. Thus, it has been widely used in pattern recognition and system mod...

1 | A new method for decision on the structure of RBF neural network
- Jia, Zhao, et al.
- 2006
Citation Context ...advantages of RBFNN are simple structure, good performance with approaching nonlinear function, and fast convergence velocity. Thus, it has been widely used in pattern recognition and system modeling [11, 12]. On the other hand, the structure of RBFNN increases when the net’s input dimension increases. Moreover, the irrelevant components in the inputs will decrease the generalization performance of RBFNN ...

1 | An experimental study: on reducing RBF input dimension by ICA and PCA
- Huang, Law, et al.
- 2002
Citation Context .... On the other hand, the structure of RBFNN increases when the net’s input dimension increases. Moreover, the irrelevant components in the inputs will decrease the generalization performance of RBFNN [13]. Support vector machine (SVM) is an effective statistical learning method for classification [14]. SVM is based on finding optimal hyperplane to separate different classes mapping input data into high...

1 | Classification of macular and optic nerve disease by principal component analysis
- Kara, Güven, et al.
- 2007
Citation Context ... many recognition problems such as object recognition and face detection [17–19]. Principal component analysis (PCA) is a technique to reduce dimensionality using second order statistical information [20]. Independent component analysis (ICA) is a recently developed method in pattern recognition and signal processing fields [21, 22]. It involves higher order statistics to extract independent component...

1 | Estimation of stream temperature in Firtina Creek (Rize-Turkiye) using artificial neural network model
- Sivri, Kilic, et al.
- 2007
Citation Context ...ctive mode [26]. 2.3. Artificial Neural Networks. Feedforward neural network (FFNN) is most popular ANN structure due to its simplicity in mathematical analysis and good representational capabilities [27, 28]. FFNN has been used successfully to various applications such as control, signal processing, and pattern classification. FFNN architecture is shown in Figure 2. ...

1 | DDC: distance-based decision classifier
- Hamidzadeh, Monsefi, et al.
- 2012
Citation Context ...e 6: The distribution of computed IC (reduced feature vector). For training processes, ...

1 | An improved hybrid feature reduction for increased breast cancer diagnostic performance
- Mert, Kılıç, et al.
- 2014
Citation Context ...RNN 30 96.00% 95.09% 64% test data, ...