Results 1–10 of 62
Bundle Adjustment – A Modern Synthesis
 VISION ALGORITHMS: THEORY AND PRACTICE, LNCS
, 2000
"... This paper is a survey of the theory and methods of photogrammetric bundle adjustment, aimed at potential implementors in the computer vision community. Bundle adjustment is the problem of refining a visual reconstruction to produce jointly optimal structure and viewing parameter estimates. Topics c ..."
Abstract

Cited by 555 (12 self)
This paper is a survey of the theory and methods of photogrammetric bundle adjustment, aimed at potential implementors in the computer vision community. Bundle adjustment is the problem of refining a visual reconstruction to produce jointly optimal structure and viewing parameter estimates. Topics covered include: the choice of cost function and robustness; numerical optimization including sparse Newton methods, linearly convergent approximations, updating and recursive methods; gauge (datum) invariance; and quality control. The theory is developed for general robust cost functions rather than restricting attention to traditional nonlinear least squares.
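To make the "sparse Newton methods" topic concrete, here is a minimal sketch of one damped Gauss-Newton (Levenberg-Marquardt) step of the kind bundle adjusters iterate. The toy exponential model, function names, starting guess, and damping value are illustrative choices of ours, not from the paper, and a real bundle adjuster would exploit the sparse camera/point block structure rather than forming dense normal equations:

```python
import numpy as np

# Hypothetical toy problem: fit y = a * exp(b * t) by repeated
# Levenberg-Marquardt steps, i.e. solve (J^T J + lam I) delta = -J^T r.

def residuals(params, t, y):
    a, b = params
    return a * np.exp(b * t) - y

def jacobian(params, t):
    a, b = params
    e = np.exp(b * t)
    return np.column_stack([e, a * t * e])  # d r / d a, d r / d b

def lm_step(params, t, y, lam=1e-3):
    """One damped Gauss-Newton update of the parameter vector."""
    r = residuals(params, t, y)
    J = jacobian(params, t)
    A = J.T @ J + lam * np.eye(len(params))
    delta = np.linalg.solve(A, -J.T @ r)
    return params + delta

t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * t)            # synthetic noise-free data
params = np.array([1.5, 1.2])        # deliberately perturbed start
for _ in range(100):
    params = lm_step(params, t, y)
```

Because the data are noise-free, the iteration drives the residual to zero and `params` converges to the true values (2.0, 1.5).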
OPT++: An object-oriented class library for nonlinear optimization
 Sandia Report SAND94-8225, Sandia National Laboratories
, 1994
"... Issued by Sandia National Laboratories, operated for the United States Department of Energy by Sandia Corporation. NOTICE: This report was prepared as an account of work sponsored by an agency of the United States Government. Neither the United States Government nor any agency thereof, nor any of th ..."
Abstract

Cited by 45 (1 self)
Issued by Sandia National Laboratories, operated for the United States Department of Energy by Sandia Corporation. NOTICE: This report was prepared as an account of work sponsored by an agency of the United States Government. Neither the United States Government nor any agency thereof, nor any of their employees, nor any of the contractors, subcontractors, or their employees, makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise, does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States Government, any agency thereof, or any of their contractors or subcontractors. The views and opinions expressed herein do not necessarily state or reflect those of the United States Government, any agency thereof, or any of their contractors or subcontractors. This report has been reproduced from the best available copy. Available to DOE and DOE contractors from:
A Review Of Techniques In The Verified Solution Of Constrained Global Optimization Problems
, 1996
"... Elements and techniques of stateoftheart automatically verified constrained global optimization algorithms are reviewed, including a description of ways of rigorously verifying feasibility for equality constraints and a careful consideration of the role of active inequality constraints. Previousl ..."
Abstract

Cited by 25 (6 self)
Elements and techniques of state-of-the-art automatically verified constrained global optimization algorithms are reviewed, including a description of ways of rigorously verifying feasibility for equality constraints and a careful consideration of the role of active inequality constraints. Previously developed algorithms and general work on the subject are also listed. Limitations of present knowledge are mentioned, and advice is given on which techniques to use in various contexts. Applications are discussed. 1 INTRODUCTION, BASIC IDEAS AND LITERATURE We consider the constrained global optimization problem: minimize φ(X) subject to c_i(X) = 0, i = 1, ..., m, and a_{i_j} ≤ x_{i_j} ≤ b_{i_j}, j = 1, ..., q, where X = (x_1, ..., x_n)^T (1.1). A general constrained optimization problem, including inequality constraints g(X) ≤ 0, can be put into this form by introducing slack variables s, replacing g(X) ≤ 0 by s + g(X) = 0, and appending the bound constraint 0 ≤ s < ∞; see §2.2. ...
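The slack-variable device described in that introduction can be checked numerically. The toy problem below (minimize x² subject to x ≥ 1) and the brute-force scan are our own illustrative choices, not from the paper; the point is only that the inequality form and the slack-equality form admit the same feasible set and the same minimizer:

```python
# Inequality constraint g(x) <= 0 rewritten as the equality
# g(x) + s = 0 with the bound s >= 0 (so s = -g(x) must be nonnegative).
# Toy problem: minimize x^2 subject to g(x) = 1 - x <= 0, i.e. x >= 1.

def g(x):
    return 1.0 - x

xs = [i / 1000.0 for i in range(-2000, 4001)]  # grid over [-2, 4]

# Original inequality-constrained formulation.
best_ineq = min((x for x in xs if g(x) <= 0.0), key=lambda x: x * x)

# Slack formulation: feasible iff the slack s = -g(x) satisfies s >= 0.
best_slack = min((x for x in xs if -g(x) >= 0.0), key=lambda x: x * x)
```

Both scans pick out the same constrained minimizer, x = 1.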
Modular modeling of cellular systems with ProMoT/Diva
 Bioinformatics
, 2003
"... Motivation: Need for software to setup and analyze complex mathematical models for cellular systems in a modular way, that also integrates the experimental environment of the cells. Results: A computer framework is described which allows the building of modularly structured models using an abstract, ..."
Abstract

Cited by 24 (5 self)
Motivation: There is a need for software to set up and analyze complex mathematical models of cellular systems in a modular way that also integrates the experimental environment of the cells. Results: A computer framework is described which allows the building of modularly structured models using an abstract, modular and general modeling methodology. With this methodology, reusable modeling entities are introduced which lead to the development of a modeling library within the modeling tool ProMoT. The simulation environment Diva is used for numerical analysis and parameter identification of the models. The simulation environment provides a number of tools and algorithms to simulate and analyze complex biochemical networks. The described tools are the first steps towards an integrated computer-based modeling, simulation and visualization environment. Availability: Available on request to the authors. The software itself is free for scientific purposes but requires commercial libraries. Contact:
Cutoff Rate and Signal Design for the Quasi-Static Rayleigh Fading Space-Time Channel
, 2001
"... We consider the computational cutoff rate and its implications on signal design for the complex quasistatic Rayleigh at fading spatiotemporal channel under a peak power constraint where neither transmitter nor receiver know the channel matrix. The cutoff rate has an integral representation which ..."
Abstract

Cited by 14 (1 self)
We consider the computational cutoff rate and its implications on signal design for the complex quasistatic Rayleigh at fading spatiotemporal channel under a peak power constraint where neither transmitter nor receiver know the channel matrix. The cutoff rate has an integral representation which is an increasing function of the distance between pairs of complex signal matrices. When the analysis is restricted to finite dimensional sets of signals interesting characterizations of the optimal rateachieving signal constellation can be obtained. For arbitrary finite dimension, the rateoptimal constellation must admit an equalizer distribution, i.e., a positive set of signal probabilities which equalizes the average distance between signal matrices in the constellation. When the number N of receive antennas is large the distanceoptimal constellation is nearly rateoptimal. When the number of matrices in the constellation is less than the ratio of the number of time samples to the number of transmit antennas, the rateoptimal cutoff rate attaining constellation is a set of equiprobable mutuallyorthogonal unitary matrices. When the SNR is below a specified threshold the matrices in the constellation are rank one and the cutoff rate is achieved by applying all transmit power to a single antenna and using orthogonal signaling. Finally, we derive recursive necessary conditions and sucient conditions for a constellation to lie in the feasible set.
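A small numeric illustration of the "equiprobable mutually orthogonal" structure mentioned above. The correlation measure used here (the Frobenius norm of S_i^H S_j) is our own simplification, not the paper's cutoff-rate integrand; it just shows that for mutually orthogonal signal matrices every pairwise correlation is identical (zero), so the uniform distribution trivially equalizes the average inter-signal distance:

```python
import numpy as np

# T = 4 time samples, M = 1 transmit antenna (the rank-one, low-SNR
# regime described in the abstract): the four signals are the columns
# of the 4x4 identity, i.e. orthogonal signaling on a single antenna.
T = 4
signals = [np.eye(T)[:, [k]] for k in range(T)]  # four T x 1 matrices

# Pairwise correlations ||S_i^H S_j||_F for all i < j.
corr = [np.linalg.norm(si.conj().T @ sj)
        for i, si in enumerate(signals)
        for j, sj in enumerate(signals) if i < j]
```

All 6 pairwise correlations come out equal (to zero), as mutual orthogonality requires.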
Component-based integration of chemistry and optimization software
 Journal of Computational Chemistry
, 2004
"... Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Componentbased software engineering is an emerging approach to managing the increasing complexity of scientific software. ..."
Abstract

Cited by 13 (8 self)
Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance and highlight potential research enabled by this platform. Key words: electronic structure, component, software development, optimization
Optimization techniques for high-performance digital circuits
 in Proc. IEEE Int. Conf. Computer-Aided Design (ICCAD)
, 1997
"... The relentless push for high performance in custom digital circuits has led to renewed emphasis on circuit optimization or tuning. The parameters of the optimization are typically transistor and interconnect sizes. The design metrics are not just delay, transition times, power and area, but also ..."
Abstract

Cited by 11 (2 self)
The relentless push for high performance in custom digital circuits has led to renewed emphasis on circuit optimization or tuning. The parameters of the optimization are typically transistor and interconnect sizes. The design metrics are not just delay, transition times, power and area, but also signal integrity and manufacturability. This tutorial paper discusses some of the recently proposed methods of circuit optimization, with an emphasis on practical application and methodology impact. Circuit optimization techniques fall into three broad categories. The first is dynamic tuning, based on time-domain simulation of the underlying circuit, typically combined with adjoint sensitivity computation. These methods are accurate but require the specification of input signals, and are best applied to small dataflow circuits and "cross-sections" of larger circuits. Efficient sensitivity computation renders feasible the tuning of circuits with a few thousand transistors. Second, static tuners employ static timing analysis to evaluate the performance of the circuit. All paths through the logic are simultaneously tuned, and no input vectors are required. Large control macros are best tuned by these methods. However, in the context of deep submicron custom design, the inaccuracy of the delay models employed by these methods often limits their utility. Aggressive dynamic or static tuning can push a circuit into a precipitous corner of the manufacturing process space, which is a problem addressed by the third class of circuit optimization tools, statistical tuners. Statistical techniques are used to enhance manufacturability or maximize yield. In addition to surveying the above techniques, topics such as the use of state-of-the-art nonlinear optimization methods and special considerations for interconnect sizing, clock tree optimization and noise-aware tuning will be briefly considered.
Multilevel simulation and numerical optimization of complex engineering designs
, 1996
"... Multilevel representations have been studied extensively by arti cial intelligence researchers. We utilize the multilevel paradigm to attack the problem of performing multidiscipline engineering design optimization in the presence of many local optima. We use amultidisciplinary simulator at multip ..."
Abstract

Cited by 10 (1 self)
Multilevel representations have been studied extensively by artificial intelligence researchers. We utilize the multilevel paradigm to attack the problem of performing multidiscipline engineering design optimization in the presence of many local optima. We use a multidisciplinary simulator at multiple levels of abstraction, paired with a multilevel search space. We tested the resulting system in the domain of conceptual design of supersonic transport aircraft, and found that using multilevel simulation and optimization can decrease the cost of design space search by one or more orders of magnitude.
Using the GA and TAO toolkits for solving large-scale optimization problems on parallel computers
"... Challenges in the scalable solution of largescale optimization problems include the development of innovative algorithms and efficient tools for parallel data manipulation. This paper discusses two complementary toolkits from the collection of Advanced CompuTational Software (ACTS), namely, Global ..."
Abstract

Cited by 7 (5 self)
Challenges in the scalable solution of large-scale optimization problems include the development of innovative algorithms and efficient tools for parallel data manipulation. This paper discusses two complementary toolkits from the collection of Advanced CompuTational Software (ACTS), namely, Global Arrays (GA) for parallel data management and the Toolkit for Advanced Optimization (TAO), which have been integrated to support large-scale scientific applications of unconstrained and bound-constrained minimization problems. Most likely to benefit are minimization problems arising in classical molecular dynamics, free energy simulations, and other applications where the coupling among variables requires dense data structures. TAO uses abstractions for vectors and matrices so that its optimization algorithms can easily interface to distributed data management and linear algebra capabilities implemented in the GA library. The GA/TAO interfaces are available both in the traditional library mode and as components compliant with the Common Component Architecture (CCA). We highlight the design of each toolkit, describe the interfaces between them, and demonstrate their use. Categories and Subject Descriptors:
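The design idea the abstract describes (an optimizer that touches vectors only through a small abstract interface, so a distributed backend such as one built on GA can be swapped in) can be sketched as follows. This is not TAO's real API; the class, method names, and toy objective are all hypothetical stand-ins:

```python
import numpy as np

class NumpyVec:
    """Serial stand-in for a distributed vector backend."""
    def __init__(self, data):
        self.data = np.asarray(data, dtype=float)
    def axpy(self, alpha, other):       # self += alpha * other
        self.data += alpha * other.data
    def dot(self, other):
        return float(self.data @ other.data)
    def copy(self):
        return NumpyVec(self.data.copy())

def steepest_descent(grad, x, step=0.1, iters=200):
    """Fixed-step gradient descent written only against the interface."""
    for _ in range(iters):
        g = grad(x)
        x.axpy(-step, g)
    return x

# Toy objective ||x - c||^2 with gradient 2 (x - c); minimizer is x = c.
c = NumpyVec([1.0, -2.0, 3.0])
def grad(x):
    g = x.copy()
    g.axpy(-1.0, c)   # g = x - c
    g.axpy(1.0, g)    # g = 2 (x - c)
    return g

x = steepest_descent(grad, NumpyVec([0.0, 0.0, 0.0]))
```

Because `steepest_descent` uses only `axpy`, `dot`, and `copy`, replacing `NumpyVec` with a class backed by distributed arrays would leave the algorithm unchanged, which is the interoperability point the paper makes.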