Results 1–10 of 11
Old Bachelor Acceptance: A New Class of Non-Monotone Threshold Accepting Methods
Abstract
Cited by 16 (0 self)
Stochastic hill-climbing algorithms, particularly simulated annealing (SA) and threshold acceptance (TA), have become very popular for global optimization applications. Typical implementations of SA or TA use monotone temperature or threshold schedules, and are not formulated to accommodate practical time limits. We present a new threshold acceptance strategy called Old Bachelor Acceptance (OBA) which has three distinguishing features: (i) it is specifically motivated by the practical requirement of optimization within a prescribed time bound, (ii) the threshold schedule is self-tuning, and (iii) the threshold schedule is non-monotone, with threshold values even allowed to become negative. The standard implementation of the TA method of Dueck and Scheuer is a special case of OBA. Experiments using several classes of symmetric traveling salesman problem instances show that OBA can outperform previous hill-climbing methods for time-critical optimizations. A number of directions for future work are suggested.
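The self-tuning, non-monotone schedule described in the abstract can be sketched in a few lines. This is an illustrative reading of a threshold that rises after rejections and falls after acceptances (and may go negative), not the authors' published update rule; the function names and update constants are assumptions:

```python
import random

def old_bachelor_accept(cost, neighbor, x0, steps, incr=1.0, decr=1.0):
    """Threshold accepting with a self-tuning, non-monotone threshold.

    Illustrative sketch: the threshold loosens after each rejection and
    tightens after each acceptance, and is allowed to become negative.
    The update constants are assumptions, not the published OBA schedule.
    """
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    threshold = 0.0
    for _ in range(steps):
        y = neighbor(x)
        fy = cost(y)
        if fy - fx < threshold:      # accepts some uphill moves too
            x, fx = y, fy
            threshold -= decr        # tighten after an acceptance
            if fx < fbest:
                best, fbest = x, fx
        else:
            threshold += incr        # loosen after a rejection
    return best, fbest
```

For example, minimizing a one-dimensional quadratic with a random-step neighbor converges toward the minimum while never exceeding the starting cost in its reported best.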
A label field fusion Bayesian model and its penalized maximum rand estimator for image segmentation
 IEEE Trans. Image Process
, 2010
Abstract
Cited by 15 (5 self)
Abstract—This paper presents a novel segmentation approach based on a Markov random field (MRF) fusion model which aims at combining several segmentation results associated with simpler clustering models in order to achieve a more reliable and accurate segmentation result. The proposed fusion model is derived from the recently introduced probabilistic Rand measure for comparing one segmentation result to one or more manual segmentations of the same image. This nonparametric measure allows us to easily derive an appealing fusion model of label fields, easily expressed as a Gibbs distribution, or as a non-stationary MRF model defined on a complete graph. Concretely, this Gibbs energy model encodes the set of binary constraints, in terms of pairs of pixel labels, provided by each segmentation result to be fused. Combined with a prior distribution, this energy-based Gibbs model also allows for definition of an interesting penalized maximum probabilistic Rand estimator with which the fusion of simple, quickly estimated segmentation results appears as an interesting alternative to the complex segmentation models existing in the literature. This fusion framework has been successfully applied on the Berkeley image database. The experiments reported in this paper demonstrate that the proposed method is efficient in terms of visual evaluation and quantitative performance measures and performs well compared to the best existing state-of-the-art segmentation methods recently proposed in the literature. Index Terms—Bayesian model, Berkeley image database, color textured image segmentation, energy-based model, label field fusion, Markovian (MRF) model, probabilistic Rand index.
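The pairwise-constraint idea can be illustrated with a deliberately simplified energy: for every pair of pixels, penalize a candidate labeling that disagrees with how each input segmentation co-labels that pair. The 0/1 penalty and function names below are assumptions, not the paper's probabilistic-Rand-derived Gibbs model:

```python
from itertools import combinations

def fusion_energy(candidate, segmentations):
    """Gibbs-style pairwise energy on a complete graph of pixels.

    candidate: list of labels, one per pixel.
    segmentations: list of label fields (same length) to be fused.
    For each pixel pair (i, j) and each input segmentation, add 1
    whenever the candidate disagrees about whether i and j share a
    label.  (Simplified 0/1 penalty for illustration only.)
    """
    n = len(candidate)
    energy = 0
    for i, j in combinations(range(n), 2):
        same_cand = candidate[i] == candidate[j]
        for seg in segmentations:
            if same_cand != (seg[i] == seg[j]):
                energy += 1
    return energy
```

A candidate that reproduces the input segmentations' pairwise agreements has zero energy; any disagreement raises it, which is what a fusion estimator would minimize.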
Reactive search: machine learning for memory-based heuristics
 Teofilo F. Gonzalez (Ed.), Approximation Algorithms and Metaheuristics, Taylor & Francis Books (CRC Press)
, 2005
Abstract
Cited by 13 (5 self)
1 Introduction: the role of the user in heuristics Most state-of-the-art heuristics are characterized by a certain number of choices and free parameters, whose appropriate setting is a subject that raises issues of research methodology [5, 41, 51]. In some cases, these parameters are tuned through a feedback loop that includes the user as a crucial learning component: depending on preliminary algorithm tests some parameter values are changed by the
Information-Conserving Object Recognition
 in Noisy Images Using Simulated Annealing, http://www.umiacs.umd.edu:80//users/betke/iccv95.ps
, 1997
Abstract
Cited by 12 (4 self)
The problem of recognizing objects imaged in complex real-world scenes is examined from a parametric perspective using the theory of statistical estimation. A scalar measure of an object's complexity, which is invariant under affine transformation and changes in image noise level, is extracted from the object's Fisher information. The volume of Fisher information is shown to provide an overall statistical measure of the object's recognizability in a particular image, while the complexity provides an intrinsically physical measure that characterizes the object in any image. An information-conserving method is then developed for recognizing an object imaged in a complex scene. Here the term "information-conserving" means that the method uses all the measured data pertinent to the object's recognizability, attains the theoretical lower bound on estimation error for any unbiased estimate of the parameter vector describing the object, and therefore is statistically optimal. This method is t...
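As a toy illustration of the Fisher-information viewpoint (not the paper's model): for a 1-D template observed in i.i.d. Gaussian noise of variance σ², the Fisher information about its location parameter is the sum of squared template gradients divided by σ², so sharper templates are more "recognizable" in the Cramér–Rao sense:

```python
def fisher_information(template_grad, sigma):
    """Fisher information for a location parameter of a 1-D template
    observed in i.i.d. Gaussian noise (toy model, not the paper's).

    template_grad: derivative of the template w.r.t. the parameter,
    sampled at each pixel.  I = sum(g^2) / sigma^2; a larger I means
    a lower Cramer-Rao bound on localization error.
    """
    return sum(g * g for g in template_grad) / (sigma * sigma)
```

Under this toy model, a high-contrast edge carries more information about position than a blurred one, and raising the noise level lowers the information for every object, consistent with the abstract's noise-invariant complexity being a separate quantity.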
Best-So-Far vs. Where-You-Are: Implications for Optimal Finite-Time Annealing
 Systems and Control Letters
, 1994
Abstract
Cited by 9 (1 self)
The simulated annealing (SA) algorithm is widely used for heuristic global optimization due to its high-quality results and its ability, in theory, to yield optimal solutions with probability one. Standard SA implementations use monotone decreasing, or "cooling", temperature schedules that are motivated by the algorithm's proof of optimality as well as by an analogy with statistical thermodynamics. In this paper, we challenge this motivation. The theoretical framework under which monotone cooling schedules are "optimal" fails to capture the practical performance of the algorithm; we therefore propose a "best-so-far" (BSF) criterion that measures the practical utility of a given annealing schedule. For small instances of two classic combinatorial problems, we determine annealing schedules that are optimal in terms of expected cost of the output solution. When the goal is to optimize the cost of the last solution seen by the algorithm (the "where-you-are" criterion used in previous theor...
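The distinction between the two criteria is concrete in any standard SA loop: the last solution visited and the best solution seen are tracked separately and generally differ. A minimal sketch (the geometric cooling schedule and parameter names are illustrative assumptions, not tied to any of the papers above):

```python
import math
import random

def anneal(cost, neighbor, x0, schedule):
    """Simulated annealing that reports both evaluation criteria.

    Returns ((where_you_are, its cost), (best_so_far, its cost)):
    the last solution visited versus the best solution seen anywhere
    along the run.  `schedule` is an iterable of positive temperatures
    and need not be monotone.
    """
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    for temp in schedule:
        y = neighbor(x)
        delta = cost(y) - fx
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            x, fx = y, fx + delta        # Metropolis acceptance
            if fx < fbest:
                best, fbest = x, fx      # track best-so-far separately
    return (x, fx), (best, fbest)
```

By construction the best-so-far cost never exceeds the where-you-are cost, which is why a schedule optimized for one criterion need not be optimal for the other.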
Simulated Annealing with Inaccurate Cost Functions
 in Proceedings of the IMACS International Congress of Mathematics and Computer Science
, 1994
Abstract
Cited by 4 (1 self)
Simulated annealing is an algorithm which generates near-optimal outcomes to combinatorial optimization problems. It is commonly thought to be slow. Cost-function approximation and parallel processing increase simulated annealing speed, but they can cause inaccuracies that degrade the outcome. Prior theoretical work has not adequately related cost-function inaccuracy to the runtime or quality of the outcome. We prove these results about annealing with inaccurate cost functions: 1) Expected cost at equilibrium is exponentially affected by γ/T, where γ limits cost-function range errors and T gives the temperature. 2) Expected cost at equilibrium is exponentially affected by (σ̄² − σ²)/2T², when the errors have a Gaussian distribution. 3) Constraining γ to a constant factor of T guarantees convergence under a 1/log t temperature schedule. 4) A similar constraint guarantees convergence for a fractal space with a geometric temperature schedule. 5) Inaccuracies worse...
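Result (3) above couples the permitted error to the temperature. A minimal sketch of annealing driven by a noisy cost measurement whose error bound is kept at a constant factor of T, under a 1/log t schedule (all names, constants, and the uniform error model are assumptions for illustration):

```python
import math
import random

def noisy_anneal(true_cost, neighbor, x0, steps, c=2.0, err_factor=0.1):
    """Annealing with an inaccurate cost function.

    At step t the temperature follows a c/log(t) schedule, and the
    simulated measurement error is bounded by err_factor * T, i.e. the
    range-error bound stays a constant factor of the temperature
    (illustrating result 3; the constants are arbitrary choices).
    """
    def measured(x, temp):
        # range error bounded by err_factor * temp
        return true_cost(x) + random.uniform(-err_factor * temp,
                                             err_factor * temp)

    x = x0
    best, fbest = x, true_cost(x)
    for t in range(2, steps + 2):
        temp = c / math.log(t)
        y = neighbor(x)
        delta = measured(y, temp) - measured(x, temp)
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            x = y
            fx = true_cost(x)
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest
```

Note that the accept/reject decision sees only the noisy measurements; the true cost is used here solely to report the best solution found.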
Best-So-Far vs. Where-You-Are: New Perspectives on Simulated Annealing for CAD
 in Proc. European Design Automation Conf
, 1993
Abstract
Cited by 3 (1 self)
The simulated annealing (SA) algorithm [14] [5] has been applied to every difficult optimization problem in VLSI CAD. Existing SA implementations use monotone decreasing, or cooling, temperature schedules motivated by the algorithm's proof of optimality as well as by an analogy with statistical thermodynamics. This paper gives strong evidence that challenges the correctness of using such schedules. Specifically, the theoretical framework under which monotone cooling schedules are proved optimal fails to capture the practical application of simulated annealing: in practice, the algorithm runs for a finite rather than an infinite amount of time, and it returns the best solution visited during the entire run ("best-so-far") rather than the last solution visited ("where-you-are"). For small instances of classic VLSI CAD problems, we determine annealing schedules that are optimal in terms of the expected quality of the best-so-far solution. These optimal schedules do not decrease mo...
Simulated Annealing of Neural Networks: the "Cooling" Strategy Reconsidered
, 1993
Abstract
Cited by 3 (1 self)
The simulated annealing (SA) algorithm [12] [5] has been widely used to address intractable global optimizations in many fields, including training of artificial neural networks. Implementations of annealing universally use a monotone decreasing, or "cooling", temperature schedule which is motivated by the algorithm's proof of optimality as well as analogies with statistical thermodynamics. In this paper, we challenge this motivation: the fact that cooling schedules are "optimal" in theory is not related to the practical performance of the algorithm. Our finding is based on a new "best-so-far" criterion for measuring the quality of annealing schedules. Motivated by studies of optimal schedules for small problems, we study highly non-standard annealing schedules for training of feedforward perceptron networks on a real-world sensor classification benchmark. We find clear evidence that optimal schedules do not necessarily decrease monotonically to zero. 1 Introduction A typical applic...