Results 1 - 8 of 8
Parameter Tuning for Configuring and Analyzing Evolutionary Algorithms - Swarm and Evolutionary Computation, 2011
"... In this paper we present a conceptual framework for parameter tuning, provide a survey of tuning methods, and discuss related methodological issues. The framework is based on a three-tier hierarchy of a problem, an evolutionary algorithm (EA), and a tuner. Furthermore, we distinguish problem instanc ..."
Abstract
-
Cited by 25 (1 self)
- Add to MetaCart
In this paper we present a conceptual framework for parameter tuning, provide a survey of tuning methods, and discuss related methodological issues. The framework is based on a three-tier hierarchy of a problem, an evolutionary algorithm (EA), and a tuner. Furthermore, we distinguish problem instances, parameters, and EA performance measures as major factors, and discuss how tuning can be directed at algorithm performance and/or robustness. For the survey part we establish different taxonomies to categorize tuning methods and review existing work. Finally, we elaborate on how tuning can improve methodology by facilitating well-founded experimental comparisons and algorithm analysis.
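To make the three-tier hierarchy concrete, here is a minimal Python sketch (illustrative only, not code from the paper; the toy sphere problem, the (1+1)-EA, and the grid of candidate step sizes are all assumptions). The tuner scores each candidate parameter value by the mean performance of the EA over repeated runs on the problem:

import random

def problem(x):
    """Bottom tier: the problem -- here a toy 1-D sphere function."""
    return x * x

def ea(mutation_step, generations, seed):
    """Middle tier: a (1+1)-EA whose behaviour depends on its parameter."""
    rng = random.Random(seed)
    x = rng.uniform(-10, 10)
    fx = problem(x)
    for _ in range(generations):
        y = x + rng.gauss(0, mutation_step)
        fy = problem(y)
        if fy < fx:
            x, fx = y, fy
    return fx

def tuner(candidate_steps, runs=20):
    """Top tier: the tuner -- picks the parameter value with the best
    mean EA performance (utility) over repeated runs."""
    def utility(step):
        return sum(ea(step, 100, seed) for seed in range(runs)) / runs
    return min(candidate_steps, key=utility)

print("best mutation step:", tuner([0.01, 0.1, 1.0, 5.0]))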
Screening the Parameters Affecting Heuristic Performance - In Proceedings of the Genetic and Evolutionary Computation Conference, 2007
"... This research screens the tuning parameters of a combinatorial optimization heuristic. Specifically, it presents a Design of Experiments (DOE) approach that uses a Fractional Factorial Design to screen the tuning parameters of Ant Colony System (ACS) for the Travelling Salesperson problem. Screening ..."
Abstract
-
Cited by 9 (4 self)
- Add to MetaCart
(Show Context)
This research screens the tuning parameters of a combinatorial optimization heuristic. Specifically, it presents a Design of Experiments (DOE) approach that uses a Fractional Factorial Design to screen the tuning parameters of Ant Colony System (ACS) for the Travelling Salesperson problem. Screening is a preliminary step to building a Response Surface Model (RSM) [20, 18]. It identifies those parameters that need not be included in a Response Surface Model, thus reducing the complexity and expense of the RSM design. Ten algorithm parameters and two problem characteristics are considered. Open questions on the effect of three parameters on performance are answered. Ant placement and choice of ant for pheromone update have no effect. However, the choice of parallel or sequential solution construction does indeed influence performance. A further parameter, sometimes assumed important, was shown to have no effect on performance. A new problem characteristic that affects performance was identified. Measuring solution time proved important: it helped identify the prohibitive cost of non-integer parameters that serve as exponents in the ACS algorithm's computations. All results are obtained with a publicly available algorithm and problem generator.
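As a hedged illustration of the screening idea (not the paper's actual design, which covers ten ACS parameters; the three factor names, their levels, and the stand-in response function below are made up), here is a half-fraction 2^(3-1) design in Python, with main effects estimated by the usual high-minus-low contrasts:

import itertools, random

random.seed(0)

def run_heuristic(alpha, beta, rho):
    """Stand-in for one run of the heuristic; in the paper this would be
    an ACS run on a TSP instance. Here only beta matters, by construction."""
    return 10 - 2.0 * beta + random.gauss(0, 0.3)

# Half-fraction 2^(3-1) design: alpha and beta form a full 2^2 factorial;
# the third factor is confounded via the generator rho = alpha*beta.
low_high = {'alpha': (0.5, 2.0), 'beta': (1.0, 5.0), 'rho': (0.1, 0.9)}
design, responses = [], []
for a, b in itertools.product((-1, 1), repeat=2):
    coded = {'alpha': a, 'beta': b, 'rho': a * b}
    actual = {k: low_high[k][0] if v < 0 else low_high[k][1]
              for k, v in coded.items()}
    design.append(coded)
    responses.append(run_heuristic(**actual))

# Main effect of a factor = mean response at +1 minus mean response at -1.
for factor in low_high:
    hi = [y for row, y in zip(design, responses) if row[factor] == 1]
    lo = [y for row, y in zip(design, responses) if row[factor] == -1]
    print(f"{factor}: main effect = {sum(hi)/len(hi) - sum(lo)/len(lo):+.2f}")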
Identifying Key Algorithm Parameters and Instance Features using Forward Selection
"... Abstract. Most state-of-the-art algorithms for large scale optimization expose free parameters, giving rise to combinatorial spaces of possible configurations. Typically, these spaces are hard for humans to understand. In this work, we study a model-based approach for identifying a small set of both ..."
Abstract
-
Cited by 7 (5 self)
- Add to MetaCart
(Show Context)
Most state-of-the-art algorithms for large-scale optimization expose free parameters, giving rise to combinatorial spaces of possible configurations. Typically, these spaces are hard for humans to understand. In this work, we study a model-based approach for identifying a small set of both algorithm parameters and instance features that suffices for predicting empirical algorithm performance well. Our empirical analyses on a wide variety of hard combinatorial problem benchmarks (spanning SAT, MIP, and TSP) show that, for parameter configurations sampled uniformly at random, very good performance predictions can typically be obtained based on just two key parameters, and that, similarly, few instance features and algorithm parameters suffice to predict the most salient algorithm performance characteristics in the combined configuration/feature space. We also use these models to identify settings of these key parameters that are predicted to achieve the best overall performance, both on average across instances and in an instance-specific way. This serves as a further way of evaluating model quality and also provides a tool for further understanding the parameter space. We provide software for carrying out this analysis on arbitrary problem domains and hope that it will help algorithm developers gain insights into the key parameters of their algorithms, the key features of their instances, and their interactions.
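The sketch below shows the forward-selection loop on synthetic data (the data, column roles, and holdout split are assumptions, and a plain linear model stands in for the paper's more elaborate performance models). It greedily adds whichever parameter or feature column most reduces holdout prediction error:

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: rows are (configuration, instance) pairs; columns mix
# algorithm parameters and instance features; only columns 0 and 3 matter.
X = rng.normal(size=(200, 8))
y = 3.0 * X[:, 0] - 2.0 * X[:, 3] + rng.normal(0, 0.5, size=200)

def holdout_mse(cols):
    """Mean squared error of a linear model fit on the first 150 rows,
    evaluated on the last 50, using only the selected columns."""
    tr, te = slice(0, 150), slice(150, 200)
    A = np.column_stack([X[tr][:, cols], np.ones(150)])
    w, *_ = np.linalg.lstsq(A, y[tr], rcond=None)
    B = np.column_stack([X[te][:, cols], np.ones(50)])
    return np.mean((B @ w - y[te]) ** 2)

selected, remaining = [], list(range(X.shape[1]))
for _ in range(3):  # greedily add the column that most improves prediction
    best = min(remaining, key=lambda c: holdout_mse(selected + [c]))
    selected.append(best)
    remaining.remove(best)
    print(f"selected {selected}, holdout MSE = {holdout_mse(selected):.3f}")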
Experiments on Metaheuristics: Methodological Overview and Open Issues
"... Metaheuristics are a wide class of solution methods that have been successfully applied to many optimization problems. The assessment of these methods is commonly based on experimental analysis but the lack of a methodology in these analyses limits the scientific value of their results. In this pape ..."
Abstract
-
Cited by 6 (0 self)
- Add to MetaCart
Metaheuristics are a broad class of solution methods that have been successfully applied to many optimization problems. The assessment of these methods is commonly based on experimental analysis, but the lack of a sound methodology in such analyses limits the scientific value of their results. In this paper we formalize different scenarios for the analysis and comparison of metaheuristics by experimentation. For each scenario we give pointers to the existing statistical methodology for carrying out a sound analysis. Finally, we provide a set of open issues and further research directions.
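One such scenario is comparing two metaheuristics run on a common set of instances. Here is a minimal sketch of a paired non-parametric comparison (the data are synthetic, and the Wilcoxon signed-rank test is an illustrative choice, not necessarily the paper's recommendation for every scenario):

import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)

# Hypothetical setup: metaheuristics A and B, each run once per instance
# on the same 30 instances (paired design). Values are best costs found.
costs_a = rng.normal(100, 5, size=30)
costs_b = costs_a - rng.normal(1.0, 2.0, size=30)  # B slightly better on average

# Paired non-parametric test: per-instance costs are paired and rarely normal.
stat, p = wilcoxon(costs_a, costs_b)
print(f"Wilcoxon signed-rank: statistic={stat:.1f}, p={p:.4f}")
if p < 0.05:
    print("Reject the hypothesis that A and B perform identically.")
else:
    print("No significant difference detected on these instances.")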
Tuning the Performance of the MMAS Heuristic - In Engineering Stochastic Local Search Algorithms. Designing, Implementing and Analyzing Effective Heuristics, volume 4638 of Lecture Notes in Computer Science, 2007
"... Abstract. This paper presents an in-depth Design of Experiments (DOE) methodology for the performance analysis of a stochastic heuristic. The heuristic under investigation is Max-Min Ant System (MMAS) for the Travelling Salesperson Problem (TSP). Specifically, the Response Surface Methodology is use ..."
Abstract
-
Cited by 4 (1 self)
- Add to MetaCart
This paper presents an in-depth Design of Experiments (DOE) methodology for the performance analysis of a stochastic heuristic. The heuristic under investigation is Max-Min Ant System (MMAS) for the Travelling Salesperson Problem (TSP). Specifically, the Response Surface Methodology is used to model and tune MMAS performance with regard to 10 tuning parameters, 2 problem characteristics, and 2 performance metrics: solution quality and solution time. The accuracy of these predictions is methodically verified in a separate series of confirmation experiments. The two conflicting responses are simultaneously optimised using desirability functions, and recommendations on optimal parameter settings are made and methodically verified. The large number of degrees of freedom in the MMAS design is overcome with a Minimum Run Resolution V design. Publicly available algorithm and problem generator implementations are used throughout. The paper should therefore serve as an illustrative case study of the principled engineering of a stochastic heuristic.
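To illustrate how desirability functions reconcile the two conflicting responses, here is a minimal Python sketch (the desirability limits, candidate settings, and response values are invented for illustration; they are not MMAS results):

import math

def desirability_smaller_is_better(y, target, worst, shape=1.0):
    """Derringer-Suich style one-sided desirability: 1 at or below the
    target value, 0 at or beyond the worst acceptable value."""
    if y <= target:
        return 1.0
    if y >= worst:
        return 0.0
    return ((worst - y) / (worst - target)) ** shape

def overall_desirability(quality, time):
    # Both responses are minimised; the limits below are made up.
    d_q = desirability_smaller_is_better(quality, target=0.0, worst=5.0)  # % above optimum
    d_t = desirability_smaller_is_better(time, target=1.0, worst=60.0)    # seconds
    return math.sqrt(d_q * d_t)  # geometric mean of the two desirabilities

# Candidate parameter settings trade quality against time; pick the best compromise.
candidates = {"fast-but-rough": (4.0, 2.0), "slow-but-good": (0.5, 30.0),
              "balanced": (1.5, 8.0)}
for name, (q, t) in candidates.items():
    print(f"{name}: D = {overall_desirability(q, t):.3f}")
print("chosen compromise:", max(candidates,
      key=lambda k: overall_desirability(*candidates[k])))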
Parameter tuning versus adaptation: proof of principle study on differential evolution, 2008
"... The efficacy of an optimization method often depends on the choosing of a number of control parameters. Practitioners have traditionally chosen these control parameters manually, often according to elaborate guidelines or in a trial-and-error manner, which is laborious and susceptible to human misco ..."
Abstract
-
Cited by 3 (1 self)
- Add to MetaCart
The efficacy of an optimization method often depends on the choice of a number of control parameters. Practitioners have traditionally chosen these control parameters manually, often according to elaborate guidelines or in a trial-and-error manner, which is laborious and susceptible to human misconceptions about what causes an optimizer to perform well. In recent years many variants of the original optimization methods have appeared that seek to adapt the control parameters during optimization, so as to remove the need for a practitioner to determine good control parameters for the problem at hand. Ironically, however, these variants typically just introduce new and additional parameters that must be chosen by the practitioner. Despite this obvious paradox, these optimizer variants are still considered state-of-the-art because they do show empirical performance improvements. In this paper, such variants of the general-purpose optimization method known as Differential Evolution (DE) are studied with the intent of determining whether their schemes for adapting control parameters yield an actual performance advantage over the basic form of DE, which keeps the control parameters fixed during optimization. To fairly compare the performance of these optimizer variants against each other, their control parameters are all tuned by an automated approach. This unveils their true performance capabilities, and the results show that the DE variants generally have comparable performance; hence, adaptive parameter schemes do not appear to yield a general and consistent performance improvement, as previously believed.
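For reference, here is a minimal sketch of a common basic, non-adaptive DE variant (DE/rand/1/bin) with the control parameters F and CR held fixed for the whole run (the toy sphere objective and all parameter values are illustrative defaults, not the paper's tuned settings):

import random

def de(f, dim, bounds, pop_size=30, F=0.5, CR=0.9, gens=200, seed=0):
    """Basic DE (DE/rand/1/bin): F and CR stay fixed throughout the run,
    as in the non-adaptive baseline."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Three distinct individuals, all different from the target i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                     if (rng.random() < CR or k == jrand) else pop[i][k]
                     for k in range(dim)]
            ft = f(trial)
            if ft <= fit[i]:            # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    return min(fit)

sphere = lambda x: sum(v * v for v in x)
print("best fitness:", de(sphere, dim=10, bounds=(-5, 5)))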
HEALTHSYSTEM CONSORTIUM HOSPITALS, 2012
"... This Dissertation is brought to you for free and open access by the Graduate School at VCU Scholars Compass. It has been accepted for inclusion in Theses and Dissertations by an authorized administrator of VCU Scholars Compass. For more information, please contact libcompass@vcu.edu. ..."
Abstract
- Add to MetaCart
(Show Context)
This Dissertation is brought to you for free and open access by the Graduate School at VCU Scholars Compass. It has been accepted for inclusion in Theses and Dissertations by an authorized administrator of VCU Scholars Compass. For more information, please contact libcompass@vcu.edu.
A Roadmap of Nature-Inspired Systems Research and Development
"... Abstract. Nature-inspired algorithms such as genetic algorithms, particle swarm optimisation and ant colony algorithms have successfully solved computer science problems of search and optimisation. The initial implementations of these techniques focused on static problems solved on single machines. ..."
Abstract
- Add to MetaCart
(Show Context)
Nature-inspired algorithms such as genetic algorithms, particle swarm optimisation and ant colony algorithms have successfully solved computer science problems of search and optimisation. The initial implementations of these techniques focused on static problems solved on single machines. These have been extended ...