Results 1 - 10 of 21,256

Global Optimization with Polynomials and the Problem of Moments

by Jean B. Lasserre - SIAM JOURNAL ON OPTIMIZATION, 2001
"... We consider the problem of finding the unconstrained global minimum of a real-valued polynomial p(x): R^n → R, as well as the global minimum of p(x) in a compact set K defined by polynomial inequalities. It is shown that this problem reduces to solving an (often finite) sequence of convex linear matrix inequality (LMI) problems. A notion of Karush-Kuhn-Tucker polynomials is introduced in a global optimality condition. Some illustrative examples are provided. ..."
Abstract - Cited by 577 (48 self)
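For readers unfamiliar with the reduction, the standard sum-of-squares lower bound shows how minimizing a polynomial turns into a linear matrix inequality; this is background notation for the unconstrained case, not a formula quoted from the paper:

    p^{*} = \min_{x \in \mathbb{R}^n} p(x)
          \;\ge\; \max\{\lambda : p(x) - \lambda \text{ is a sum of squares}\}
          \;=\; \max\{\lambda : p(x) - \lambda = z_d(x)^{\top} Q \, z_d(x),\; Q \succeq 0\},

where z_d(x) collects the monomials of degree at most d = deg(p)/2 (rounded up) and Q is a symmetric Gram matrix; the constraint Q ⪰ 0, together with the linear coefficient-matching equations, is an LMI, so each bound is computable by semidefinite programming.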

Stochastic global optimization

by Anatoly Zhigljavsky, 2008
"... Stochastic global optimization methods are methods for solving a global optimization problem incorporating probabilistic (stochastic) elements, either in the problem data (the objective function, the constraints, etc.), or in the algorithm itself, or in both. Global optimization is a very important ..."
Abstract - Cited by 289 (6 self)
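The simplest member of this family keeps the problem data deterministic and puts all of the randomness in the algorithm: pure random search. A minimal sketch follows; the box bounds, sample budget, and Rastrigin test function are illustrative choices of mine, not an example taken from the survey.

    import numpy as np

    def pure_random_search(f, lower, upper, n_samples=10_000, seed=0):
        """Sample points uniformly in the box [lower, upper] and keep the best."""
        rng = np.random.default_rng(seed)
        lower, upper = np.asarray(lower, float), np.asarray(upper, float)
        best_x, best_f = None, np.inf
        for _ in range(n_samples):
            x = rng.uniform(lower, upper)
            fx = f(x)
            if fx < best_f:
                best_x, best_f = x, fx
        return best_x, best_f

    # Example: a multimodal test function on the box [-5, 5]^2.
    rastrigin = lambda x: 10 * x.size + float(np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))
    print(pure_random_search(rastrigin, [-5, -5], [5, 5]))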

Efficient Algorithms for Globally Optimal Trajectories

by John N. Tsitsiklis - IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 1995
"... We present serial and parallel algorithms for solving ..."
Abstract - Cited by 376 (1 self)

Differential Evolution - A simple and efficient adaptive scheme for global optimization over continuous spaces

by Rainer Storn, Kenneth Price, 1995
"... A new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented. By means of an extensive testbed, which includes the De Jong functions, it will be demonstrated that the new method converges faster and with more certainty than Adaptive Simulated Annealing ..."
Abstract - Cited by 427 (5 self)
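The scheme proposed in the paper is usually written DE/rand/1/bin: each population member is challenged by a trial vector built from the scaled difference of two others, mixed in by binomial crossover, and kept only if it is at least as good. The sketch below follows that structure; the control parameters F and CR, the population size, and the sphere test function are illustrative defaults of mine rather than the paper's recommended settings.

    import numpy as np

    def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9,
                               generations=200, seed=0):
        """Minimal DE/rand/1/bin: mutation, binomial crossover, greedy selection."""
        rng = np.random.default_rng(seed)
        lo, hi = np.asarray(bounds, float).T          # bounds: one (low, high) pair per dimension
        dim = lo.size
        pop = rng.uniform(lo, hi, size=(pop_size, dim))
        fitness = np.array([f(x) for x in pop])
        for _ in range(generations):
            for i in range(pop_size):
                # Mutation: combine three distinct members other than i.
                a, b, c = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
                mutant = np.clip(pop[a] + F * (pop[b] - pop[c]), lo, hi)
                # Binomial crossover: take each coordinate from the mutant with probability CR.
                cross = rng.random(dim) < CR
                cross[rng.integers(dim)] = True       # guarantee at least one mutant coordinate
                trial = np.where(cross, mutant, pop[i])
                f_trial = f(trial)
                if f_trial <= fitness[i]:             # greedy selection
                    pop[i], fitness[i] = trial, f_trial
        best = int(np.argmin(fitness))
        return pop[best], fitness[best]

    # Example: the 2-D sphere function.
    print(differential_evolution(lambda x: float(np.sum(x**2)), [(-5, 5), (-5, 5)]))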

Global Optimization of Statistical Functions with Simulated Annealing

by William L. Goffe, Gary D. Ferrier, John Rogers - Journal of Econometrics, 1994
"... Many statistical methods rely on numerical optimization to estimate a model’s parameters. Unfortunately, conventional algorithms sometimes fail. Even when they do converge, there is no assurance that they have found the global, rather than a local, optimum. We test a new optimization algorithm, simulated annealing ..."
Abstract - Cited by 299 (2 self)
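For orientation, the acceptance rule at the heart of simulated annealing fits in a few lines: always accept an improving move, accept a worsening move with probability exp(-Δ/T), and lower the temperature T as the search proceeds. The sketch below is a generic version with a geometric cooling schedule and made-up parameter values; the paper tests a more elaborate variant of the algorithm, so this only illustrates the principle.

    import math
    import numpy as np

    def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995,
                            n_iter=20_000, seed=0):
        """Generic simulated annealing with Gaussian proposals and geometric cooling."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, float)
        fx = f(x)
        best_x, best_f = x.copy(), fx
        t = t0
        for _ in range(n_iter):
            candidate = x + rng.normal(scale=step, size=x.size)
            fc = f(candidate)
            # Accept downhill moves always; uphill moves with probability exp(-Δ/T).
            if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
                x, fx = candidate, fc
                if fx < best_f:
                    best_x, best_f = x.copy(), fx
            t *= cooling  # geometric cooling schedule
        return best_x, best_f

    # Example: a multimodal 1-D objective with many local minima.
    print(simulated_annealing(lambda x: float(np.sum(x**2) + 10 * np.sin(3 * x).sum()), [2.0]))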

Generating Accurate Rule Sets Without Global Optimization

by Eibe Frank, Ian H. Witten - IN: PROC. OF THE 15TH INT. CONFERENCE ON MACHINE LEARNING, 1998
"... The two dominant schemes for rule-learning, C4.5 and RIPPER, both operate in two stages. First they induce an initial rule set and then they refine it using a rather complex optimization stage that discards (C4.5) or adjusts (RIPPER) individual rules to make them work better together. In contrast, this paper shows how good rule sets can be learned one rule at a time, without any need for global optimization. We present an algorithm for inferring rules by repeatedly generating partial decision trees, thus combining the two major paradigms for rule generation -- creating rules from decision trees ..."
Abstract - Cited by 269 (7 self)
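The "one rule at a time" idea is the classical separate-and-conquer loop: learn a rule, remove the examples it covers, repeat. The toy sketch below uses a single-attribute test per rule purely to illustrate that loop; the paper's actual method derives each rule from a partial C4.5-style decision tree, which is not reproduced here.

    import numpy as np

    def separate_and_conquer(X, y, min_coverage=3):
        """Toy rule learner: repeatedly pick the purest single-attribute test,
        turn it into a rule, drop the covered examples, and continue."""
        X, y = np.asarray(X, float), np.asarray(y)
        rules = []
        while len(y) >= min_coverage and len(np.unique(y)) > 1:
            best = None
            for j in range(X.shape[1]):
                for thr in np.unique(X[:, j]):
                    covered = X[:, j] <= thr
                    if covered.sum() < min_coverage:
                        continue
                    labels, counts = np.unique(y[covered], return_counts=True)
                    purity = counts.max() / covered.sum()
                    if best is None or purity > best[0]:
                        best = (purity, j, thr, labels[counts.argmax()], covered)
            if best is None:
                break
            _, j, thr, label, covered = best
            rules.append((j, float(thr), str(label)))  # "if x[j] <= thr then predict label"
            X, y = X[~covered], y[~covered]            # "separate": remove covered examples
        default = str(y[0]) if len(y) else None        # fallback class for uncovered examples
        return rules, default

    # Example: a tiny numeric dataset with two classes.
    X = [[1.0], [2.0], [3.0], [7.0], [8.0], [9.0]]
    y = ["a", "a", "a", "b", "b", "b"]
    print(separate_and_conquer(X, y))   # ([(0, 3.0, 'a')], 'b')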

Global Optimizations for Parallelism and Locality on Scalable Parallel Machines

by Jennifer M. Anderson, Monica S. Lam - IN PROCEEDINGS OF THE SIGPLAN '93 CONFERENCE ON PROGRAMMING LANGUAGE DESIGN AND IMPLEMENTATION, 1993
"... Data locality is critical to achieving high performance on large-scale parallel machines. Non-local data accesses result in communication that can greatly impact performance. Thus the mapping, or decomposition, of the computation and data onto the processors of a scalable parallel machine is a key issue in compiling programs for these architectures. ..."
Abstract - Cited by 256 (20 self)

Interactive Graph Cuts for Optimal Boundary & Region Segmentation of Objects in N-D Images

by Yuri Y. Boykov, Marie-Pierre Jolly, 2001
"... In this paper we describe a new technique for general purpose interactive segmentation of N-dimensional images. The user marks certain pixels as “object” or “background” to provide hard constraints for segmentation. Additional soft constraints incorporate both boundary and region information. Graph cuts are used to find the globally optimal segmentation of the N-dimensional image. The obtained solution gives the best balance of boundary and region properties among all segmentations satisfying the constraints. The topology of our segmentation is unrestricted and both “object” and “background” ..."
Abstract - Cited by 1010 (20 self)
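To make the construction concrete, here is a toy version on a one-dimensional "image" using networkx's minimum cut. The seed handling mirrors the hard constraints described in the abstract; the particular region and boundary weights are simple illustrative choices of mine, not the paper's energy terms.

    import math
    import networkx as nx

    # Toy 1-D "image": one row of intensities. The user marks pixel 0 as object
    # and pixel 5 as background; these seeds become hard constraints.
    pixels = [200, 190, 180, 40, 30, 20]
    object_seeds, background_seeds = {0}, {5}
    obj_mean = sum(pixels[i] for i in object_seeds) / len(object_seeds)
    bkg_mean = sum(pixels[i] for i in background_seeds) / len(background_seeds)
    INF, sigma = 10**9, 30.0

    G = nx.DiGraph()
    for i, v in enumerate(pixels):
        # Terminal links (region term). The edge S->i is cut exactly when pixel i
        # ends up labelled background, so its capacity is the cost of that label;
        # seeds get infinite/zero capacities to enforce the hard constraints.
        cost_if_background = INF if i in object_seeds else (0 if i in background_seeds else abs(v - bkg_mean))
        cost_if_object = INF if i in background_seeds else (0 if i in object_seeds else abs(v - obj_mean))
        G.add_edge("S", i, capacity=cost_if_background)
        G.add_edge(i, "T", capacity=cost_if_object)
    for i in range(len(pixels) - 1):
        # Neighbour links (boundary term): cheap to cut across a strong intensity edge.
        w = 100.0 * math.exp(-((pixels[i] - pixels[i + 1]) ** 2) / (2 * sigma ** 2))
        G.add_edge(i, i + 1, capacity=w)
        G.add_edge(i + 1, i, capacity=w)

    cut_value, (source_side, _) = nx.minimum_cut(G, "S", "T")
    labels = ["object" if i in source_side else "background" for i in range(len(pixels))]
    print(labels)  # expected: three "object" pixels followed by three "background" pixels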

A Taxonomy of Global Optimization Methods Based on Response Surfaces

by Donald R. Jones - Journal of Global Optimization, 2001
"... This paper presents a taxonomy of existing approaches for using response surfaces for global optimization. Each method is illustrated with a simple numerical example that brings out its advantages and disadvantages. The central theme is that methods that seem quite reasonable often have no ..."
Abstract - Cited by 235 (1 self)
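As a baseline, the most naive response-surface strategy can be sketched in a few lines: fit a surrogate to the points evaluated so far, jump to the surrogate's minimizer, evaluate the expensive function there, and refit. The 1-D polynomial version below is my own illustration, not one of the paper's methods; it is exactly the kind of reasonable-looking procedure the taxonomy scrutinizes.

    import numpy as np

    def surrogate_search(f, lower, upper, n_init=5, n_iter=10, degree=3, seed=0):
        """Naive 1-D response-surface loop: fit a polynomial surrogate to all points
        evaluated so far, move to its minimizer on a grid, evaluate f there, refit."""
        rng = np.random.default_rng(seed)
        xs = list(rng.uniform(lower, upper, n_init))   # initial design points
        ys = [f(x) for x in xs]
        grid = np.linspace(lower, upper, 1001)
        for _ in range(n_iter):
            coeffs = np.polyfit(xs, ys, deg=min(degree, len(xs) - 1))
            x_next = float(grid[np.argmin(np.polyval(coeffs, grid))])  # minimize the surrogate
            xs.append(x_next)
            ys.append(f(x_next))
        best = int(np.argmin(ys))
        return xs[best], ys[best]

    # Example: a wiggly 1-D objective on [0, 10].
    print(surrogate_search(lambda x: float(np.sin(x) + 0.1 * (x - 5.0) ** 2), 0.0, 10.0))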

Global optimization by continuous GRASP

by M. J. Hirsch, C. N. Meneses, P. M. Pardalos, M. G. C. Resende - Optimization Letters
"... ABSTRACT. We introduce a novel global optimization method called Continuous GRASP (C-GRASP) which extends Feo and Resende’s greedy randomized adaptive search procedure (GRASP) from the domain of discrete optimization to that of continuous global optimization. This stochastic local search method is s ..."
Abstract - Cited by 33 (10 self) - Add to MetaCart
ABSTRACT. We introduce a novel global optimization method called Continuous GRASP (C-GRASP) which extends Feo and Resende’s greedy randomized adaptive search procedure (GRASP) from the domain of discrete optimization to that of continuous global optimization. This stochastic local search method
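GRASP alternates a greedy randomized construction phase with a local search phase and keeps the best solution found. The sketch below mirrors only that two-phase structure for a box-constrained problem; the construction and line-search details of C-GRASP itself differ, and all parameter values and the test function are illustrative choices of mine.

    import numpy as np

    def grasp_style_search(f, lower, upper, n_restarts=50, n_candidates=20,
                           alpha=0.3, ls_step=0.1, ls_iters=200, seed=0):
        """Simplified GRASP-style loop: pick a start point uniformly from the best
        alpha-fraction of random candidates (the restricted candidate list), then
        improve it with a simple perturbation-based local search."""
        rng = np.random.default_rng(seed)
        lower, upper = np.asarray(lower, float), np.asarray(upper, float)
        best_x, best_f = None, np.inf
        for _ in range(n_restarts):
            # Construction phase: restricted candidate list over random samples.
            cands = rng.uniform(lower, upper, size=(n_candidates, lower.size))
            vals = np.array([f(c) for c in cands])
            rcl = np.argsort(vals)[:max(1, int(alpha * n_candidates))]
            x = cands[rng.choice(rcl)]
            fx = f(x)
            # Local search phase: accept improving random perturbations.
            for _ in range(ls_iters):
                y = np.clip(x + rng.normal(scale=ls_step * (upper - lower)), lower, upper)
                fy = f(y)
                if fy < fx:
                    x, fx = y, fy
            if fx < best_f:
                best_x, best_f = x, fx
        return best_x, best_f

    # Example: a rippled 2-D bowl.
    print(grasp_style_search(lambda x: float(np.sum(x**2) / 10 + 1 - np.prod(np.cos(x))), [-5, -5], [5, 5]))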