Variational Surface Modeling
Computer Graphics, 1992
Cited by 172 (4 self)
We present a new approach to interactive modeling of free-form surfaces. Instead of a fixed mesh of control points, the model presented to the user is that of an infinitely malleable surface, with no fixed controls. The user is free to apply control points and curves which are then available as handles for direct manipulation. The complexity of the surface's shape may be increased by adding more control points and curves, without apparent limit. Within the constraints imposed by the controls, the shape of the surface is fully determined by one or more simple criteria, such as smoothness. Our method for solving the resulting constrained variational optimization problems rests on a surface representation scheme allowing non-uniform subdivision of B-spline surfaces. Automatic subdivision is used to ensure that constraints are met, and to enforce error bounds. Efficient numerical solutions are obtained by exploiting linearities in the problem formulation and the representation. Keywords: sur...
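In the quadratic/linear case the abstract describes, the constrained variational problem reduces to a single linear KKT solve. A minimal one-dimensional sketch of that idea, not the paper's actual surface formulation: the handle positions and second-difference bending energy below are illustrative choices.

```python
import numpy as np

n = 11                      # control values u_0 .. u_10 along a curve
# Second-difference operator: E(u) = ||D u||^2 penalizes bending.
D = np.zeros((n - 2, n))
for i in range(n - 2):
    D[i, i:i + 3] = [1.0, -2.0, 1.0]
H = D.T @ D                 # Hessian of the quadratic smoothness energy

# Interpolation constraints A u = b: pin three "handle" values.
handles = {0: 0.0, 5: 1.0, 10: 0.0}
A = np.zeros((len(handles), n))
b = np.zeros(len(handles))
for row, (idx, val) in enumerate(handles.items()):
    A[row, idx] = 1.0
    b[row] = val

# KKT system for  min 1/2 u^T H u  subject to  A u = b.
m = len(handles)
K = np.block([[H, A.T], [A, np.zeros((m, m))]])
rhs = np.concatenate([np.zeros(n), b])
sol = np.linalg.solve(K, rhs)
u = sol[:n]                 # smoothest curve through the handles
print(np.allclose(A @ u, b))   # → True: handles interpolated exactly
```

Because both the energy and the constraints are linear in the unknowns, the whole shape is determined by one sparse linear solve; this is the "exploiting linearities" the abstract refers to.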
LARGE-SCALE LINEARLY CONSTRAINED OPTIMIZATION
, 1978
Cited by 93 (15 self)
An algorithm for solving large-scale nonlinear programs with linear constraints is presented. The method combines efficient sparse-matrix techniques as in the revised simplex method with stable quasi-Newton methods for handling the nonlinearities. A general-purpose production code (MINOS) is described, along with computational experience on a wide variety of problems.
Iterative Linear Algebra for Constrained Optimization
, 2005
Cited by 6 (2 self)
Each step of an interior point method for nonlinear optimization requires the solution of a symmetric indefinite linear system known as a KKT system, or more generally, a saddle point problem. As the problem size increases, direct methods become prohibitively expensive to use for solving these problems; this leads to iterative solvers being the only viable alternative. In this thesis we consider iterative methods for solving saddle point systems and show that a projected preconditioned conjugate gradient method can be applied to these indefinite systems. Such a method requires the use of a specific class of preconditioners, (extended) constraint preconditioners, which exactly replicate some parts of the saddle point system that we wish to solve. The standard method for using constraint preconditioners, at least in the optimization community, has been to choose the constraint ...
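The defining feature of a constraint preconditioner, replicating the constraint blocks of the saddle point matrix exactly while approximating the (1,1) block, has a checkable consequence: with a full-rank constraint Jacobian A (m rows), the preconditioned matrix has at least 2m unit eigenvalues. A hedged numerical sketch; the diagonal choice of G below is one illustrative approximation, not the thesis's recommendation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 3
M = rng.standard_normal((n, n))
H = M @ M.T + n * np.eye(n)          # SPD (1,1) block
A = rng.standard_normal((m, n))      # full-rank constraint Jacobian
Z = np.zeros((m, m))

K = np.block([[H, A.T], [A, Z]])     # saddle point (KKT) matrix

# Constraint preconditioner: replace H by a cheap approximation G,
# but replicate the constraint blocks A and A^T exactly.
G = np.diag(np.diag(H))
P = np.block([[G, A.T], [A, Z]])

eigs = np.linalg.eigvals(np.linalg.solve(P, K))
unit = int(np.sum(np.isclose(eigs, 1.0, atol=1e-6)))
print(unit >= 2 * m)                 # → True: at least 2m unit eigenvalues
```

Clustering eigenvalues of the preconditioned system at 1 is what makes Krylov methods such as projected preconditioned CG converge in few iterations on these indefinite systems.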
An Efficient Steepest-Edge Simplex Algorithm for SIMD Computers
Cited by 2 (0 self)
This paper proposes a new parallelization of the Primal and Dual Simplex algorithms for Linear Programming (LP) problems on massively parallel Single-Instruction Multiple-Data (SIMD) computers. The algorithms are based on the Steepest-Edge pivot selection method and the tableau representation of the constraint matrix. The initial canonical tableau is formed on an attached scalar host unit, and then partitioned into a rectangular grid of submatrices and distributed to the individual Processor Element (PE) memories. In the beginning of the parallel algorithm, key portions of the simplex tableau are partially replicated and stored along with the submatrices on each one of the PEs. The SIMD simplex algorithm iteratively selects a pivot element and carries out a simplex computation step until the optimal solution is found or unboundedness of the LP is established. The Steepest-Edge pivot selection technique utilizes information mainly from local replicas to search for the next pivot element. The pivot row and column are selectively broadcast to the PEs before a pivot computation step, by efficiently utilizing the geometry of the toroidal mesh interconnection network. Every individual PE maintains its replicas locally and keeps them consistent so that interprocessor communication due to data dependencies is further reduced. The presence of a pipelined interconnection network, like the mesh network of the MP1 and MP2 MasPar models, allows the global reduction operations necessary in the selection of pivot columns and rows to be performed in time O(log n_R + log n_C) on (n_R × n_C) PE arrays. This particular combination of pivot selection, matrix representation, and selective data replication is shown to be highly efficient in the solution of linear prog...
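Setting aside the SIMD tableau mechanics, the steepest-edge pivot selection the abstract builds on can be sketched serially: among columns with negative reduced cost, pick the one maximizing the squared reduced cost normalized by the edge length. A hedged sketch; the function name, the 1 + ||·||² weight convention, and the tolerance are illustrative choices, not the paper's code.

```python
import numpy as np

def steepest_edge_entering(B, N, c_B, c_N):
    """Pick the entering column by the steepest-edge criterion:
    maximize dbar_j^2 / (1 + ||eta_j||^2) over columns with negative
    reduced cost, where eta_j = B^{-1} N[:, j].
    Returns the column index into N, or None at optimality."""
    y = np.linalg.solve(B.T, c_B)              # simplex multipliers
    dbar = c_N - N.T @ y                       # reduced costs
    candidates = np.flatnonzero(dbar < -1e-12)
    if candidates.size == 0:
        return None                            # current basis is optimal
    eta = np.linalg.solve(B, N[:, candidates])
    gamma = 1.0 + np.sum(eta * eta, axis=0)    # steepest-edge weights
    scores = dbar[candidates] ** 2 / gamma
    return int(candidates[np.argmax(scores)])

# Two candidate columns: Dantzig's rule (most negative reduced cost)
# would pick column 1, but that edge is much longer, so steepest edge
# prefers column 0.
B = np.eye(2)
N = np.array([[1.0, 10.0], [1.0, 10.0]])
print(steepest_edge_entering(B, N, np.zeros(2), np.array([-1.0, -2.0])))  # → 0
```

Normalizing by the edge norm is what makes the rule favor directions with genuinely steep descent rather than merely large reduced costs, at the price of the extra ||B⁻¹aⱼ|| computations the paper's local replicas are designed to keep cheap.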
Symbiosis between Linear Algebra and Optimization
, 1999
Cited by 2 (0 self)
The efficiency and effectiveness of most optimization algorithms hinge on the numerical linear algebra algorithms that they utilize. Effective linear algebra is crucial to their success, and because of this, optimization applications have motivated fundamental advances in numerical linear algebra. This essay will highlight contributions of numerical linear algebra to optimization, as well as some optimization problems encountered within linear algebra that contribute to a symbiotic relationship.
Alternative methods for representing the inverse of linear programming basis matrices, to appear
in Progress in Mathematical Programming 1975-1989, Special Publication, Australian Society of Operational Research, Editor, 1990
A hierarchical control distribution approach for large-scale over-actuated systems
in Proc. American Control Conference, 2007
Cited by 1 (0 self)
Abstract — A general hierarchical methodology for control distribution in highly redundant systems is presented. The new method makes use of distribution functions to approximate the feasible solution set and to keep in check the “curse of dimensionality”. To improve the performance of the distribution-function approach, a hierarchical scheme is proposed that decomposes a large-scale control distribution problem into many small-scale control distribution problems, trading the need for real-time computation against optimality. The main advantage of the proposed hierarchical approach is the decoupling of the many small-scale problems from each other. The convergence and accuracy of the proposed method are demonstrated by numerical studies.