## Trace-Based Methods for Solving Nonlinear Global Optimization and Satisfiability Problems (1996)

Venue: J. of Global Optimization

Citations: 15 (5 self)

### BibTeX

```bibtex
@ARTICLE{Wah96trace-basedmethods,
  author  = {Benjamin W. Wah and Yao-jen Chang},
  title   = {Trace-Based Methods for Solving Nonlinear Global Optimization and Satisfiability Problems},
  journal = {J. of Global Optimization},
  year    = {1996},
  volume  = {10}
}
```

### Abstract

In this paper we present a method called NOVEL (Nonlinear Optimization via External Lead) for solving continuous and discrete global optimization problems. NOVEL balances global search and local search, using a trace to identify promising regions before committing to local searches. We discuss NOVEL for solving continuous constrained optimization problems and show how it can be extended to solve constraint-satisfaction and discrete satisfiability problems. We first transform the problem into an unconstrained version using Lagrange multipliers. Since a stable solution in a Lagrangian formulation guarantees only a local optimum satisfying the constraints, we propose a global-search phase in which an aperiodic and bounded trace function is added to the search to first identify promising regions for local search. The trace generates an information-bearing trajectory from which good starting points are identified for further local searches. Taking only a sm...
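The abstract's core idea can be sketched numerically. The following is an illustrative toy of our own construction, not the paper's actual NOVEL dynamics or trace function: a Lagrangian saddle-point iteration for an equality-constrained quadratic, with a bounded, aperiodic forcing term playing the role of the trace.

```python
import numpy as np

# Illustrative sketch only -- not the paper's exact NOVEL formulation.
# Problem: minimize f(x) subject to g(x) = 0, via the Lagrangian
#   L(x, lam) = f(x) + lam * g(x),
# with an aperiodic, bounded "trace" term added to the descent dynamics.
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
g = lambda x: x[0] + x[1]          # single equality constraint (assumed example)

def grad_L(x, lam, h=1e-6):
    """Central-difference gradient of L(., lam) at x."""
    L = lambda y: f(y) + lam * g(y)
    gr = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        gr[i] = (L(x + e) - L(x - e)) / (2.0 * h)
    return gr

def trace(t):
    """Bounded, aperiodic forcing (incommensurate frequencies)."""
    return 0.5 * np.array([np.sin(t), np.sin(np.sqrt(2.0) * t)])

x, lam, dt = np.array([3.0, 3.0]), 0.0, 0.01
starts = []                                    # candidate local-search starts
for step in range(5000):
    t = step * dt
    x = x - dt * (grad_L(x, lam) - trace(t))   # descent on x, plus trace
    lam = lam + dt * g(x)                      # ascent on the multiplier
    if step % 500 == 0:
        starts.append(x.copy())                # sample the trajectory
```

The points collected in `starts` play the role of samples from the information-bearing trajectory, from which local searches would then be started.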

### Citations

3857 | Optimization by simulated annealing
- Kirkpatrick, Gelatt, et al.
- 1983
Citation Context: ...mber of constraints in each problem. Application Problem No. Variables No. Constraints Pooling/Blending [5,10] [5,10] VLSI Compaction Design [10^2,10^5] [10^3,10^6] Pressure Vessel Design [15,20] [40,50] Distillation Column Sequencing [30,90] [30,70] Reactor-Separator-Recycle System [100-120] [80,100] Complex Chemical Reactor Network [40,110] [30,100] Heat Exchanger Network Synthesis [10,60] [10,40] ...

2153 | Genetic Algorithms + Data Structures = Evolution Programs
- Michalewicz
- 1996
Citation Context: ...imization problems. They can be adopted to handle constraints in constrained global optimization. Non-transformational approaches include discarding and back-to-feasible-regions methods. The former [47, 55] drop solutions once they were found to be infeasible, and the latter [48] attempt to maintain feasibility by reflecting moves from boundaries if such moves go off the current feasible region. Both me...

1133 | Linear and Nonlinear Programming
- Luenberger
- 1984
Citation Context: ...aints. Transformational approaches, on the other hand, convert the original problem into another form before solving them. Well-known methods include penalty, barrier, and Lagrange-multiplier methods [54]. Penalty methods transform constraints into part of the objective function and require tuning penalty coefficients either before or during the run. Barrier methods are similar except that barriers ar...
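The penalty transformation described in this excerpt can be illustrated with a one-variable toy problem; the function, step size, and penalty schedule below are our own arbitrary choices, not taken from the cited work.

```python
# Minimal quadratic-penalty sketch (illustrative, not from the paper):
#   min f(x) = x^2  subject to  g(x) = 1 - x <= 0  (i.e., x >= 1).
def penalized(x, c):
    """Objective with the constraint folded in as a quadratic penalty."""
    f, g = x * x, 1.0 - x
    return f + c * max(0.0, g) ** 2

def d_penalized(x, c, h=1e-6):
    """Central-difference derivative of the penalized objective."""
    return (penalized(x + h, c) - penalized(x - h, c)) / (2.0 * h)

x = 0.0
for c in (1.0, 10.0, 100.0):        # tighten the penalty gradually
    for _ in range(5000):           # plain gradient descent at fixed c
        x -= 0.004 * d_penalized(x, c)
```

As the coefficient `c` grows, the minimizer of the penalized function (here `x = c / (1 + c)`) approaches the constrained minimizer `x* = 1`, which is why penalty methods require tuning the coefficients before or during the run.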

1107 | A computing procedure for quantification theory
- Davis, Putnam
- 1960
Citation Context: ... backtracking to search the space systematically, whereas incomplete methods usually rely on ad hoc heuristics. Complete search methods for solving SAT problems include resolution and the Davis-Putnam [18] procedure. They are computationally intensive because they are enumerative in nature. For instance, Selman et al. [75] and Gu [35] have reported that the Davis-Putnam procedure cannot handle SAT problems...

698 | A New Method for Solving Hard Satisfiability Problems
- Selman, Levesque, et al.
- 1996
Citation Context: ...[30, 83, 36, 84, 82, 37, 33, 31, 38, 34, 35]. These simple and effective heuristics significantly improve the performance of local search algorithms by many orders of magnitude. Selman developed GSAT [79, 75, 76, 78, 73, 77] that starts from a randomly generated assignment and performs local search iteratively by flipping variables. Such flipping is repeated until either a satisfiable assignment is found or a pre-set max...
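The GSAT loop described here (random initial assignment, greedy variable flips, restarts after a flip budget) can be sketched as follows; the clause encoding and parameter values are our own illustrative choices.

```python
import random

def num_unsat(clauses, assign):
    """Count clauses with no satisfied literal (literal k means var |k|, sign)."""
    return sum(all(assign[abs(l)] != (l > 0) for l in cl) for cl in clauses)

def gsat(clauses, n_vars, max_tries=10, max_flips=200, seed=0):
    """Minimal GSAT-style local search: greedy flipping with random restarts."""
    rng = random.Random(seed)
    for _ in range(max_tries):
        assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
        for _ in range(max_flips):
            if num_unsat(clauses, assign) == 0:
                return assign                  # satisfying assignment found

            def score(v):
                # unsatisfied-clause count if variable v were flipped
                assign[v] = not assign[v]
                s = num_unsat(clauses, assign)
                assign[v] = not assign[v]
                return s

            best = min(range(1, n_vars + 1), key=score)
            assign[best] = not assign[best]    # greedy flip
    return None                                # flip/restart budget exhausted

# e.g. (x1 v -x2) ^ (x2 v x3) ^ (-x1 v -x3)
model = gsat([[1, -2], [2, 3], [-1, -3]], n_vars=3)
```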

527 | Neural computation of decisions in optimization problems
- Hopfield, Tank
- 1985
Citation Context: ... include simplex methods for solving linear programming problems, steepest descent, conjugate gradient, quasi-Newton, and Lagrange-multiplier methods [54]. For instance, Hopfield-type neural networks [44] are a steepest-descent implementation based on a set of differential equations. Another approach proposed by Gu [33, 34, 35] is to use direct transformation and apply descent methods such as steepest...

484 | Global Optimization Using Interval Analysis
- Hansen
Citation Context: ...mber of constraints in each problem. Application Problem No. Variables No. Constraints Pooling/Blending [5,10] [5,10] VLSI Compaction Design [10^2,10^5] [10^3,10^6] Pressure Vessel Design [15,20] [40,50] Distillation Column Sequencing [30,90] [30,70] Reactor-Separator-Recycle System [100-120] [80,100] Complex Chemical Reactor Network [40,110] [30,100] Heat Exchanger Network Synthesis [10,60] [10,40] ...

421 | Minimizing Conflicts: A Heuristic Repair Method for Constraint-Satisfaction and Scheduling Problems
- Minton, Johnston, et al.
- 1992
Citation Context: ...quals 0 if the logical assignment x satisfies C_i and 1 otherwise. In this case, N(x) equals 0 when all the clauses are satisfied. Unfortunately, N(x) defined in (6) has many dent-like local optima [56, 75, 76, 33, 34, 35], where a local minimum is a state whose local neighborhood does not include any state that is strictly better. Consequently, descent or hill-climbing methods can get trapped at local minima, and ...

374 | Noise strategies for improving local search
- Selman, Kautz, et al.
- 1994
Citation Context: ...[30, 83, 36, 84, 82, 37, 33, 31, 38, 34, 35]. These simple and effective heuristics significantly improve the performance of local search algorithms by many orders of magnitude. Selman developed GSAT [79, 75, 76, 78, 73, 77] that starts from a randomly generated assignment and performs local search iteratively by flipping variables. Such flipping is repeated until either a satisfiable assignment is found or a pre-set max...

338 | Thermodynamical approach to the traveling salesman problem: an efficient simulation algorithm
- Cerny
- 1985
Citation Context: ... is in its controlled strategy for adaptive search that samples many regions of attraction before ending up as a greedy search. SA was originally designed to solve combinatorial optimization problems [50, 14]; for an extensive survey, see [1, 2]. The application of SA and related techniques to solve continuous global optimization problems can be found in [95, 13, 16, 65, 53, 47]. Recently, Romeijn and Smi...
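A minimal, textbook-style SA loop matching this description uses geometric cooling and the Boltzmann acceptance rule; the test function and all parameters below are illustrative choices of ours, not taken from the paper.

```python
import math
import random

# Textbook simulated-annealing sketch (Kirkpatrick-style acceptance rule);
# function and parameters are illustrative, not from the surveyed works.
def simulated_annealing(f, x0, neighbor, t0=10.0, cooling=0.995,
                        steps=4000, seed=0):
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = f(y)
        # always accept improvements; accept uphill moves with Boltzmann prob.
        if fy <= fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling                 # geometric cooling schedule
    return best, fbest

# multimodal 1-D example with many local minima
f = lambda x: x * x + 10.0 * math.sin(3.0 * x)
best, fbest = simulated_annealing(f, 4.0, lambda x, r: x + r.gauss(0.0, 0.5))
```

Early on, the high temperature lets the search sample several regions of attraction; as `t` shrinks, the loop degenerates into the greedy search the excerpt describes.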

290 | Hard and Easy Distributions of SAT Problems
- Mitchell, Selman, et al.
- 1992
Citation Context: ...SP). Existing approaches to solve constraint-satisfaction problems include backtracking [56, 34], best-first search, most-constrained-first search [87], and local-search methods such as hill-climbing [57, 76, 75, 56, 34]. These methods are generally combined with heuristic guidance such as conflict minimization [56, 85]. 3.2.2. Continuous methods. In the continuous approach, discrete variables in the original SAT prob...

284 | Local search strategies for satisfiability testing
- Selman, Kautz, et al.
- 1996

279 | Global optimization: deterministic approaches
- Horst, Tuy
- 1996
Citation Context: ...ter case, we use satisfiability problems (SATs) to demonstrate the effectiveness of our proposed method. Global minimization looks for the minimizer x that is no larger than any other local minimum x [45, 23, 94, 63], whereas local minimization aims at finding a local minimum x. Finding the global optimum x is a challenging problem as there may not be enough time to find a feasible solution, and even when a feas...

223 | Global Optimization
- Törn, Žilinkas
- 1989
Citation Context: ...cond [91] pushes each point towards a local minimum by performing a few steps of local descent. Classical treatments of clustering methods include [94, 52, 91], and more recent papers can be found in [92, 12, 90, 68, 94, 93]. Historically, clustering algorithms have been proposed to improve multi-start methods. Although they have some success in solving global optimization problems [90], they depend heavily on when to st...
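The first clustering strategy mentioned here (retain points with low function values, then start one local search per cluster) might be sketched in one dimension as follows; the radius-based clustering and every parameter are our own simplifications, not the cited algorithms.

```python
import random

# Hypothetical sketch of the "retain low-value points, one local-search
# start per cluster" strategy (1-D, greedy radius-based clustering).
def cluster_starts(f, lo=-10.0, hi=10.0, n_samples=200, keep_frac=0.2,
                   radius=0.5, seed=0):
    rng = random.Random(seed)
    pts = [rng.uniform(lo, hi) for _ in range(n_samples)]
    pts.sort(key=f)                            # lowest function values first
    kept = pts[: int(keep_frac * n_samples)]   # retain low-value points
    starts = []
    for p in kept:                             # greedy clustering by radius
        if all(abs(p - s) > radius for s in starts):
            starts.append(p)                   # one representative per cluster
    return starts

# two regions of attraction: minima of (x^2 - 4)^2 at x = -2 and x = +2
starts = cluster_starts(lambda x: (x * x - 4.0) ** 2)
```

Each element of `starts` would seed one local search, so both regions of attraction receive a start without launching a descent from every sample.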

221 | Domain-independent extensions to GSAT: solving large structured satisfiability problems
- Selman, Kautz
- 1993
Citation Context: ...quals 0 if the logical assignment x satisfies C_i and 1 otherwise. In this case, N(x) equals 0 when all the clauses are satisfied. Unfortunately, N(x) defined in (6) has many dent-like local optima [56, 75, 76, 33, 34, 35], where a local minimum is a state whose local neighborhood does not include any state that is strictly better. Consequently, descent or hill-climbing methods can get trapped at local minima, and ...

202 | An introduction to simulated evolutionary optimization
- Fogel
Citation Context: ...mentations [10, 8, 11, 9]. Genetic Algorithms (GA). Global search in GA involves the rational generation of new sample points based on performance of the whole population of samples in each iteration [25]. GA arrives at local solutions either as a result of gene drift at the end of the reproduction process or through local steps that are performed off-line. Gradient-like information is not used in the...

202 | The Breakout Method for Escaping From Local Minima
- Morris
- 1993
Citation Context: ...To the best of our knowledge, there is no successful application of genetic algorithms to solve SAT problems. Recently, some local search methods were proposed and applied to solve large SAT problems [61, 27, 17]. The most notable ones are those developed independently by Gu and Selman. Gu developed a group of local search methods for solving SAT and CSP problems. In his Ph.D. thesis [29], he first formulated ...

194 | Global Optimization
- Horst, Tuy
- 1990
Citation Context: ...ter case, we use satisfiability problems (SATs) to demonstrate the effectiveness of our proposed method. Global minimization looks for the minimizer x that is no larger than any other local minimum x [45, 23, 94, 63], whereas local minimization aims at finding a local minimum x. Finding the global optimum x is a challenging problem as there may not be enough time to find a feasible solution, and even when a feas...

179 | Minimizing multimodal functions of continuous variables with the "simulated annealing" algorithm
- Corana, Marchesi, et al.
- 1987
Citation Context: ... solve combinatorial optimization problems [50, 14]; for an extensive survey, see [1, 2]. The application of SA and related techniques to solve continuous global optimization problems can be found in [95, 13, 16, 65, 53, 47]. Recently, Romeijn and Smith [67] used SA to solve constrained continuous global optimization problems. Their results are comparable in quality to existing solutions on a collection of classical test...

138 | Towards an understanding of hill-climbing procedures for SAT
- Gent, Walsh
- 1993
Citation Context: ...To the best of our knowledge, there is no successful application of genetic algorithms to solve SAT problems. Recently, some local search methods were proposed and applied to solve large SAT problems [61, 27, 17]. The most notable ones are those developed independently by Gu and Selman. Gu developed a group of local search methods for solving SAT and CSP problems. In his Ph.D. thesis [29], he first formulated ...

137 | ODEPACK: A Systematized Collection of ODE Solvers
- Hindmarsh
- 1983

126 | A Collection of Test Problems for Constrained Global Optimization Algorithms
- Floudas, Pardalos
- 1990
Citation Context: ...ter case, we use satisfiability problems (SATs) to demonstrate the effectiveness of our proposed method. Global minimization looks for the minimizer x that is no larger than any other local minimum x [45, 23, 94, 63], whereas local minimization aims at finding a local minimum x. Finding the global optimum x is a challenging problem as there may not be enough time to find a feasible solution, and even when a feas...

122 | Handbook of Global Optimization
- Horst, Pardalos
- 1995
Citation Context: ...s [26, 15, 94]. 3.1.3. Complexities of Global Optimization Methods over Continuous Variables. Global optimization of continuous nonlinear problems has been recognized as very difficult and intractable [62, 46]. The previous work surveyed in this subsection is generally heuristic in nature and depends heavily on problem formulations, initial starting points, and the amount of time allowed. Stochastic optimizat...

104 | Efficient local search for very large-scale satisfiability problems
- Gu
- 1992
Citation Context: ...rovement in solving large-size SAT, n-queen, and graph coloring problems [29, 83, 84, 82, 85]. His methods use various local handlers to escape from local traps when a greedy search stops progressing [30, 36, 37, 31, 38, 32]. Here, a search can continue without improvement when it reaches a local minimum [36] and can escape from it by a combination of backtracking, restarts, and random swaps. In variable selection and va...

101 | An empirical study of greedy local search for satisfiability testing
- Selman, Kautz
- 1993
Citation Context: ...[30, 83, 36, 84, 82, 37, 33, 31, 38, 34, 35]. These simple and effective heuristics significantly improve the performance of local search algorithms by many orders of magnitude. Selman developed GSAT [79, 75, 76, 78, 73, 77] that starts from a randomly generated assignment and performs local search iteratively by flipping variables. Such flipping is repeated until either a satisfiable assignment is found or a pre-set max...

97 | GENET: A Connectionist Architecture for Solving Constraint Satisfaction Problems by Iterative Improvement
- Davenport, Tsang, et al.
- 1994
Citation Context: ...To the best of our knowledge, there is no successful application of genetic algorithms to solve SAT problems. Recently, some local search methods were proposed and applied to solve large SAT problems [61, 27, 17]. The most notable ones are those developed independently by Gu and Selman. Gu developed a group of local search methods for solving SAT and CSP problems. In his Ph.D. thesis [29], he first formulated ...

89 | Using Genetic Algorithms in Engineering Design Optimization with Non-Linear Constraints
- Powell, Skolnick
Citation Context: ...differentiable functions. Efforts were reported in combining GA for global search and penalty functions for relaxing constraints to solve real-world engineering problems with closed-form functions [66]. Using ten constrained optimization problems arising from mechanical, chemical, and electrical engineering areas, the results show that GA converges slowly and has difficulty in satisfying constraint...

71 | Generalized Simulated Annealing for Function Optimization
- Bohachevsky, Johnson, et al.
- 1986
Citation Context: ...00] Complex Chemical Reactor Network [40,110] [30,100] Heat Exchanger Network Synthesis [10,60] [10,40] Speed Reducer Weight Minimization [5,10] [10,20] Phase and Chemical Reaction Equilibrium [7,10] [4,13] The second class of application problems we have studied are the satisfiability (SAT) problems described in the DIMACS benchmark suite. NOVEL, as a general method for global optimization, shows compet...

61 | A discrete Lagrangian-based global search method for solving satisfiability problems
- Shang, Wah
- 1998
Citation Context: ...tween 0 and 9. Since our algorithm has less dependence on initial points, all ten runs have found satisfiable assignments. More extensive results on applying this method have been discussed elsewhere [98]. One of the major advantages of NOVEL is its ability to escape from local minima without restarts. Consequently, it is able to find feasible assignments irrespective of its initial points. In contras...

54 | A Stochastic Method for Global Optimization
- Boender, Kan, et al.
- 1982
Citation Context: ...cond [91] pushes each point towards a local minimum by performing a few steps of local descent. Classical treatments of clustering methods include [94, 52, 91], and more recent papers can be found in [92, 12, 90, 68, 94, 93]. Historically, clustering algorithms have been proposed to improve multi-start methods. Although they have some success in solving global optimization problems [90], they depend heavily on when to st...

51 | Object-oriented implementation of heuristic search methods for graph coloring, maximum clique, and satisfiability
- Fleurent, Ferland
- 1996
Citation Context: ...x L(x; λ) differs ... Table 4. Results on circuit diagnosis problems comparing NOVEL's results on Sun SS 10/51, GSAT's results on SGI Challenge [76, 74], and Fleurent et al.'s tabu search on Sun SS 10/50 [22]. Problem NOVEL (Max-Flips=625 per trial) GS [76] GS [74] Tabu ID Var. Clauses Trials Time (sec.) Clauses Time Time Time avg max min avg Unsat'd (sec.) (sec.) (sec.) ssa7552-158 1363 3034 428 25 7 15 ...

45 | Bayesian Approach to Global Optimization
- Mockus
- 1989
Citation Context: ...ied. The rim of a region is a divide that separates it from others. Due to nonlinearity, global optimization is often performed without a priori knowledge of problem terrains or regions of attraction [94, 97, 72, 58]. Therefore, global optimization algorithms use heuristic global measures to search for new regions of attraction at run time. Promising regions identified are further optimized by local refinement pr...

44 | Global optimization for neural network training
- Shang, Wah
- 1996
Citation Context: ...nding [5,10] [5,10] VLSI Compaction Design [10^2,10^5] [10^3,10^6] Pressure Vessel Design [15,20] [40,50] Distillation Column Sequencing [30,90] [30,70] Reactor-Separator-Recycle System [100-120] [80,100] Complex Chemical Reactor Network [40,110] [30,100] Heat Exchanger Network Synthesis [10,60] [10,40] Speed Reducer Weight Minimization [5,10] [10,20] Phase and Chemical Reaction Equilibrium [7,10] [4,...

41 | Simulated Annealing and Boltzmann Machines
- Aarts, Korst
- 1989
Citation Context: ...tive search that samples many regions of attraction before ending up as a greedy search. SA was originally designed to solve combinatorial optimization problems [50, 14]; for an extensive survey, see [1, 2]. The application of SA and related techniques to solve continuous global optimization problems can be found in [95, 13, 16, 65, 53, 47]. Recently, Romeijn and Smith [67] used SA to solve constrained ...

39 | A continuous approach to inductive inference
- Kamath, Karmarkar, et al.
Citation Context: ... following benchmarks. Circuit Diagnosis Problems: Alan Van Gelder and Yumi Tsjuji contributed a set of SAT formulas based on circuit fault analysis. Boolean Inductive Problems: Kamath et al. [49] developed a set of SAT encodings of Boolean induction problems. The task is to synthesize (or induce) a logical circuit from its input-output behavior. Figure 9 shows the solution process of NOVEL in...

36 | A Polynomial Time Algorithm for the N-Queens Problem
- Sosič, Gu
- 1990
Citation Context: ...t. In the first component, he first developed the so-called min-conflicts heuristic [29] and showed significant performance improvement in solving large-size SAT, n-queen, and graph coloring problems [29, 83, 84, 82, 85]. His methods use various local handlers to escape from local traps when a greedy search stops progressing [30, 36, 37, 31, 38, 32]. Here, a search can continue without improvement when it reaches a l...

35 | Stochastic Techniques for Global Optimization: A Survey of Recent Advances
- Schoen
- 1990
Citation Context: ...ied. The rim of a region is a divide that separates it from others. Due to nonlinearity, global optimization is often performed without a priori knowledge of problem terrains or regions of attraction [94, 97, 72, 58]. Therefore, global optimization algorithms use heuristic global measures to search for new regions of attraction at run time. Promising regions identified are further optimized by local refinement pr...

34 | Generalized descent for global optimization
- Griewank
- 1981
Citation Context: ...x is a challenging problem as there may not be enough time to find a feasible solution, and even when a feasible solution is found, we have no way of showing that it is optimal. As stated by Griewank [28], global optimization is mathematically ill-posed in the sense that a lower bound for f(x) cannot be given after any finite number of evaluations, unless f satisfies certain subsidiary conditions such...

34 | Local search for satisfiability (SAT) problem
- Gu
Citation Context: ...quals 0 if the logical assignment x satisfies C_i and 1 otherwise. In this case, N(x) equals 0 when all the clauses are satisfied. Unfortunately, N(x) defined in (6) has many dent-like local optima [56, 75, 76, 33, 34, 35], where a local minimum is a state whose local neighborhood does not include any state that is strictly better. Consequently, descent or hill-climbing methods can get trapped at local minima, and ...

32 | Simulated annealing for constrained global optimization
- Romeijn, Smith
- 1994
Citation Context: ...or an extensive survey, see [1, 2]. The application of SA and related techniques to solve continuous global optimization problems can be found in [95, 13, 16, 65, 53, 47]. Recently, Romeijn and Smith [67] used SA to solve constrained continuous global optimization problems. Their results are comparable in quality to existing solutions on a collection of classical test problems. Clustering Methods. By ...

32 | Efficient Search Techniques: An Empirical Study of the N-Queens Problem
- Stone, Stone
- 1986
Citation Context: ...be considered as a constraint-satisfaction problem (CSP). Existing approaches to solve constraint-satisfaction problems include backtracking [56, 34], best-first search, most-constrained-first search [87], and local-search methods such as hill-climbing [57, 76, 75, 56, 34]. These methods are generally combined with heuristic guidance such as conflict minimization [56, 85]. 3.2.2. Continuous methods. In...

29 | A global optimization algorithm
- Becker, Lago
- 1970
Citation Context: ...n random sample points, these algorithms try to start just one local search in each cluster in order to identify its local minimum. Two global strategies have been used for clustering [94]. The first [52] retains only points with relatively low function values to form clusters that correspond respectively to regions of attraction. The second [91] pushes each point towards a local minimum by performing...

29 | Efficient local search with conflict minimization: A case study of the N-queen problem
- Sosic, Gu
- 1994
Citation Context: ...t. In the first component, he first developed the so-called min-conflicts heuristic [29] and showed significant performance improvement in solving large-size SAT, n-queen, and graph coloring problems [29, 83, 84, 82, 85]. His methods use various local handlers to escape from local traps when a greedy search stops progressing [30, 36, 37, 31, 38, 32]. Here, a search can continue without improvement when it reaches a l...

27 | Global optimization and stochastic differential equations
- Aluffi-Pentini, Parisi, et al.
- 1985
Citation Context: ...n this paper. The first class is a collection of constrained global optimization benchmark problems [23] derived from a variety of engineering applications. Unlike small artificial benchmark problems [3, 90, 5, 20], most of these problems are non-convex and have sizes ranging from small (tens of variables) to medium to large (hundreds of variables). Many of them have their best known solutions reported by other...

26 | Terminal repeller unconstrained subenergy tunneling (TRUST) for fast global optimization
- Cetin, Barhen, et al.
- 1993
Citation Context: ...d the number of constraints in each problem. Application Problem No. Variables No. Constraints Pooling/Blending [5,10] [5,10] VLSI Compaction Design [10^2,10^5] [10^3,10^6] Pressure Vessel Design [15,20] [40,50] Distillation Column Sequencing [30,90] [30,70] Reactor-Separator-Recycle System [100-120] [80,100] Complex Chemical Reactor Network [40,110] [30,100] Heat Exchanger Network Synthesis [10,60] ...

25 | Parallel Algorithms and Architectures for Very Fast AI Search
- Gu
- 1989
Citation Context: ... SAT problems [61, 27, 17]. The most notable ones are those developed independently by Gu and Selman. Gu developed a group of local search methods for solving SAT and CSP problems. In his Ph.D. thesis [29], he first formulated conflicts in the objective function and proposed a discrete relaxation algorithm (a class of deterministic local search) to minimize the number of conflicts in these problems. Th...

25 | Adaptive Simulated Annealing (ASA)
- Ingber
Citation Context: ...imization problems. They can be adopted to handle constraints in constrained global optimization. Non-transformational approaches include discarding and back-to-feasible-regions methods. The former [47, 55] drop solutions once they were found to be infeasible, and the latter [48] attempt to maintain feasibility by reflecting moves from boundaries if such moves go off the current feasible region. Both me...

24 | Simulated Annealing: Theory and Practice
- Laarhoven, Aarts
- 1987
Citation Context: ...tive search that samples many regions of attraction before ending up as a greedy search. SA was originally designed to solve combinatorial optimization problems [50, 14]; for an extensive survey, see [1, 2]. The application of SA and related techniques to solve continuous global optimization problems can be found in [95, 13, 16, 65, 53, 47]. Recently, Romeijn and Smith [67] used SA to solve constrained ...

23 | A new method of locating the maximum of an arbitrary multipeak curve in the presence of noise
- Kushner
- 1964
Citation Context: ...tion values. Schagen [71] used a stationary stochastic process model to represent internally an objective function of success in a reasonable number of function evaluations. Based on Kushner's method [51] in one dimension, a global search algorithm for optimization in n dimensions is presented in [88]. More introductions can be found in [94, 58, 97, 59]. A major drawback of Bayesian methods is their co...

23 | Application of Bayesian approach to numerical methods of global and stochastic optimization
- Mockus
- 1994
Citation Context: ...ble number of function evaluations. Based on Kushner's method [51] in one dimension, a global search algorithm for optimization in n dimensions is presented in [88]. More introductions can be found in [94, 58, 97, 59]. A major drawback of Bayesian methods is their computational complexity that grows exponentially with the number of problem dimensions [88, 59]. Therefore, their use for solving multi-dimensional pro...