## Differential evolution methods for unsupervised image classification

Venue: Proc. 7th CEC, 2005

Citations: 16 (0 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Omran_differentialevolution,
  author    = {Mahamed G. H. Omran and Andries P. Engelbrecht and Ayed Salman},
  title     = {Differential evolution methods for unsupervised image classification},
  booktitle = {Proc. 7th CEC},
  year      = {2005},
  pages     = {966--973},
  publisher = {IEEE Press}
}
```

### Abstract

A clustering method that is based on ...

### Citations

1857 | Some Methods for Classification and Analysis of Multivariate Observations
- MacQueen
- 1967
Citation context: ...tting elongated classes. Another category of unsupervised partitional algorithms is the class of non-iterative algorithms. The most widely used non-iterative algorithm is MacQueen's K-means algorithm [8]. This algorithm works in two phases as follows: one phase to find the centroids of the classes and the second to classify the image pixels. Competitive Learning (CL) updates the centroids sequential...
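The sequential Competitive Learning update mentioned in this context can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the fixed learning rate `lr`, and the use of Euclidean distance are assumptions, and the presentation order is randomized as the context suggests.

```python
import numpy as np

def competitive_learning(pixels, k, lr=0.1, seed=0):
    """Sequential (non-iterative) centroid update: for each pixel in turn,
    move the closest centroid a fraction lr toward that pixel.
    The result depends on presentation order, so the order is randomized."""
    rng = np.random.default_rng(seed)
    # initialize centroids from k distinct pixels
    centroids = pixels[rng.choice(len(pixels), k, replace=False)].astype(float)
    for p in rng.permutation(len(pixels)):
        j = np.linalg.norm(centroids - pixels[p], axis=1).argmin()  # winner
        centroids[j] += lr * (pixels[p] - centroids[j])             # pull toward pixel
    return centroids
```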

240 | Remote Sensing and Image Interpretation
- Lillesand, Kiefer
- 1999
Citation context: ...step which is followed by the classification step. There are several popular supervised algorithms such as the minimum-distance-to-mean, parallelepiped and the Gaussian maximum likelihood classifiers [2]. In the unsupervised approach the classes are unknown and the approach starts by partitioning the image data into groups (or clusters), according to a similarity measure, which can be compared with r...

184 | Cluster analysis of multivariate data: Efficiency versus interpretability of classification
- Forgy
- 1965
Citation context: ...el in the image is then assigned to the closest cluster (i.e. closest centroid). Finally, the centroids are recalculated according to the associated pixels. This process is repeated until convergence [6]. The K-means algorithm suffers from the following drawbacks: • the algorithm is data-dependent; • it is a greedy algorithm that depends on the initial conditions, which may cause the algorithm to con...
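The two-step K-means loop described in this context (assign each pixel to the closest centroid, then recompute centroids from the assigned pixels, repeat until convergence) can be sketched as below. Function name, iteration cap, and initialization from random pixels are illustrative assumptions.

```python
import numpy as np

def kmeans(pixels, k, iters=100, seed=0):
    """Minimal K-means: assign each pixel to the closest centroid,
    recompute centroids from the associated pixels, repeat until stable."""
    rng = np.random.default_rng(seed)
    centroids = pixels[rng.choice(len(pixels), k, replace=False)].astype(float)
    for _ in range(iters):
        # distance of every pixel to every centroid, shape (n_pixels, k)
        d = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute each centroid; keep the old one if its cluster is empty
        new = np.array([pixels[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):   # converged
            break
        centroids = new
    return centroids, labels
```

As the cited drawback notes, the result depends on the initial centroids, which is exactly what population-based methods like DE try to avoid.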

150 | Machine Vision: Theory, Algorithms, Practicalities, 2nd Ed.
- Davies
- 1997
Citation context: ...y an analyst [2]. Therefore, unsupervised classification is also referred to as a clustering problem. In general, the unsupervised approach has several advantages over the supervised approach, namely [3]: • For unsupervised approaches, there is no need for an analyst to specify in advance all the classes in the image data set. The clustering algorithm automatically finds distinct classes, which dramat...

86 | A Clustering Technique for Summarizing Multivariate Data
- Ball, Hall
- 1967
Citation context: ...the initial conditions, which may cause the algorithm to converge to suboptimal solutions; and • the user needs to specify the number of classes in advance [3]. ISODATA is an enhancement proposed by [7] that operates on the same concept as the K-means algorithm with the addition of the possibility of merging classes and splitting elongated classes. Another category of unsupervised partitional algori...

82 | A Robust Competitive Clustering Algorithm with Applications in Computer Vision
- Frigui, Krishnapuram
- 1999
Citation context: ...The focus of this paper is on the unsupervised approach. There are several algorithms that belong to this approach. These algorithms can be categorized into two groups: hierarchical and partitional [4, 5]. In hierarchical clustering, the output is "a tree showing a sequence of clustering with each clustering being a partition of the data set" [5]. This type of algorithm has the following advantages:...

69 | A Convergence Theorem for the Fuzzy ISODATA Clustering Algorithm
- Bezdek
- 1980
Citation context: ...multispectral scanner data sets obtained from the U.S. Geological Survey (USGS). 4.1 DE versus state-of-the-art clustering algorithms This section compares the performance of the DE with K-means, FCM [14], KHM [15], H2 [16], a GA clustering algorithm and a PSO clustering algorithm [17]. In all cases, for DE, PSO and GA, 50 individuals were trained for 100 iterations; for the other algorithms 5000 iter...

44 | Alternatives to the k-means algorithm that find better clusterings
- Hamerly, Elkan
- 2002
Citation context: ...ner data sets obtained from the U.S. Geological Survey (USGS). 4.1 DE versus state-of-the-art clustering algorithms This section compares the performance of the DE with K-means, FCM [14], KHM [15], H2 [16], a GA clustering algorithm and a PSO clustering algorithm [17]. In all cases, for DE, PSO and GA, 50 individuals were trained for 100 iterations; for the other algorithms 5000 iterations were used (i...

35 | An Empirical Study of Evolutionary Techniques for Multiobjective Optimization in Engineering Design
- Coello
- 1996
Citation context: ...lusters (i.e. good clustering). The fitness function is thus a multi-objective problem. Approaches to solve multi-objective problems have been developed mostly for evolutionary computation approaches [13]. Since our scope is to illustrate the applicability of DE to unsupervised image classification, and not multi-objective optimization, a simple weighted approach is used to cope with multiple objec...
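A weighted-sum scalarization of the kind this context describes can be sketched as below. This is not the paper's actual fitness function: the three objectives (worst intra-cluster spread, minimum centroid separation, quantization error), the weights `w`, and the pixel range `zmax` are all illustrative assumptions.

```python
import numpy as np

def weighted_fitness(centroids, pixels, labels, w=(0.5, 0.3, 0.2), zmax=255.0):
    """Hedged sketch of a weighted-sum clustering fitness (smaller is better):
    combine the largest mean intra-cluster distance (compactness, minimize),
    the smallest inter-centroid distance (separation, maximize, so it enters
    as zmax - sep), and a quantization-error term."""
    k = len(centroids)
    intra = []
    for j in range(k):
        members = pixels[labels == j]
        if len(members):
            intra.append(np.linalg.norm(members - centroids[j], axis=1).mean())
    d_max = max(intra)                       # worst intra-cluster compactness
    sep = min(np.linalg.norm(centroids[a] - centroids[b])
              for a in range(k) for b in range(a + 1, k))
    q_err = sum(intra) / len(intra)          # mean of per-cluster mean distances
    return w[0] * d_max + w[1] * (zmax - sep) + w[2] * q_err
```

The point of the weighted form is that a single scalar lets a standard single-objective optimizer (DE, GA, PSO) rank candidate centroid sets directly.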

27 | A Genetic C-means Clustering Algorithm Applied to Image Quantization
- Scheunders
- 1997
Citation context: ...he centroids of the classes and the second to classify the image pixels. Competitive Learning (CL) updates the centroids sequentially by moving the closest centroid toward the pixel being classified [9]. Non-iterative algorithms suffer the drawback of being dependent on the order in which the data points are presented. To overcome this problem, the choice of data points can be randomized [3]. Lilles...

23 | Histogram Clustering for Unsupervised Image Segmentation
- Puzicha, Hofmann, et al.
- 2000
Citation context: ...to eliminate the need to tune objective weights. A gbest DE is also proposed with encouraging results. 1 Introduction Image clustering is the process of identifying groups of similar image primitives [1]. These image primitives can be pixels, regions, line elements and so on, depending on the problem encountered. Many basic image processing techniques such as quantization, segmentation and coarsening...

18 | A Simple and Global Optimization Algorithm for Engineering Problems: Differential Evolution Algorithm
- Karaboga, Okdem
- 2004
Citation context: ...ried over to the next generation. DE is easy to implement, requires little parameter tuning [11], can find the global optimum regardless of the initial parameter values and exhibits fast convergence [12]. 3 DE-Based Clustering Algorithm This section defines the terminology used throughout the rest of the paper. A measure is given to quantify the quality of a clustering algorithm, after which the DE-ba...

18 | Particle Swarm Optimization Method for Image Clustering
- Omran, Engelbrecht
- 2005
Citation context: ...4.1 DE versus state-of-the-art clustering algorithms This section compares the performance of the DE with K-means, FCM [14], KHM [15], H2 [16], a GA clustering algorithm and a PSO clustering algorithm [17]. In all cases, for DE, PSO and GA, 50 individuals were trained for 100 iterations; for the other algorithms 5000 iterations were used (i.e. all algorithms have performed 5000 function evaluations). F...

16 | Generalized k-harmonic means - boosting in unsupervised learning
- Zhang
- 2000
Citation context: ...tral scanner data sets obtained from the U.S. Geological Survey (USGS). 4.1 DE versus state-of-the-art clustering algorithms This section compares the performance of the DE with K-means, FCM [14], KHM [15], H2 [16], a GA clustering algorithm and a PSO clustering algorithm [17]. In all cases, for DE, PSO and GA, 50 individuals were trained for 100 iterations; for the other algorithms 5000 iterations wer...

13 | Particle Swarm Optimization Methods for Pattern Recognition and Image Processing
- Omran
- 2005
Citation context: ...100 iterations; for the other algorithms 5000 iterations were used (i.e. all algorithms have performed 5000 function evaluations). For K-means, FCM, KHM, H2, GA and PSO, the parameters were set as in [18]. For the DE, Pr was set to 0.9 as suggested by [19]. According to [19], γ is generally in the range [0.5, 1]. Therefore, in order to free the user from specifying a value for γ, in this paper, γ star...

11 | A Comparative Study of Differential Evolution, Particle Swarm Optimization, and Evolutionary Algorithms on Numerical Benchmark Problems
- Vesterstroem, Thomsen
- 2004
Citation context: ..., PSO and GA showed similar performance, with no significant difference. However, due to the advantages of DE as shown in section 2, DE is the best choice to use. These results confirm the results of [20] and [21]. The segmented images resulting from the DE-based clustering algorithm are shown in Figure 2. These results show that the DE-based clustering algorithm is a viable alternative that merit fur...

10 | Clustering by Scale-Space Filtering
- Leung, Zhang, et al.
- 2000
Citation context: ...The focus of this paper is on the unsupervised approach. There are several algorithms that belong to this approach. These algorithms can be categorized into two groups: hierarchical and partitional [4, 5]. In hierarchical clustering, the output is "a tree showing a sequence of clustering with each clustering being a partition of the data set" [5]. This type of algorithm has the following advantages:...

7 | High Performance Clustering with Differential Evolution
- Paterlini, Krink
Citation context: ...arent is replaced with its offspring if the fitness of the offspring is better; otherwise the parent is carried over to the next generation. DE is easy to implement, requires little parameter tuning [11], can find the global optimum regardless of the initial parameter values and exhibits fast convergence [12]. 3 DE-Based Clustering Algorithm This section defines the terminology used throughout the re...

2 | Differential Evolution – A Simple and Efficient Adaptive Scheme for Global Optimization over Continuous Spaces
- Storn, Price
- 1995
Citation context: ...ection 4 presents experimental results to illustrate the efficiency of the algorithm. Section 5 concludes the paper, and outlines future research. 2 Differential Evolution Differential evolution (DE) [10] is a population-based search strategy very similar to standard evolutionary algorithms. The main difference is in the reproduction step where offspring is created from three parents using an arithmet...
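The reproduction step this context describes, with offspring built from three parents via an arithmetic difference, followed by greedy parent-versus-trial selection, corresponds to the classic DE/rand/1/bin scheme. A minimal sketch follows; the parameter names `beta` (scale factor, the paper's γ) and `pr` (crossover probability) and their defaults are illustrative, and the paper's own operator may differ in detail.

```python
import numpy as np

def de_step(pop, fitness, beta=0.7, pr=0.9, rng=None):
    """One generation of DE/rand/1/bin. For each parent i: pick three other
    individuals a, b, c; form the mutant x_a + beta * (x_b - x_c); binomially
    cross it with the parent (probability pr per gene); keep the better of
    parent and trial (greedy selection)."""
    if rng is None:
        rng = np.random.default_rng()
    n, d = pop.shape
    new_pop = pop.copy()
    for i in range(n):
        a, b, c = rng.choice([j for j in range(n) if j != i], 3, replace=False)
        mutant = pop[a] + beta * (pop[b] - pop[c])
        cross = rng.random(d) < pr
        cross[rng.integers(d)] = True        # ensure at least one mutant gene
        trial = np.where(cross, mutant, pop[i])
        if fitness(trial) <= fitness(pop[i]):  # parent replaced only if not worse
            new_pop[i] = trial
    return new_pop
```

Because selection is greedy, the best fitness in the population can never get worse from one generation to the next, which is one reason DE converges reliably.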