Results 1 – 5 of 5
On the solution of equality constrained quadratic programming problems arising ...
, 1998
Computing a Search Direction for Large-Scale Linearly-Constrained Nonlinear Optimization Calculations
, 1993
Abstract

Cited by 12 (8 self)
We consider the computation of Newton-like search directions that are appropriate when solving large-scale linearly-constrained nonlinear optimization problems. We investigate the use of both direct and iterative methods and consider efficient ways of modifying the Newton equations in order to ensure global convergence of the underlying optimization methods.
1 Parallel Algorithms Team, CERFACS, 42 Ave. G. Coriolis, 31057 Toulouse Cedex, France
2 IAN-CNR, c/o Dipartimento di Matematica, 209, via Abbiategrasso, 27100 Pavia, Italy
3 Department of Mathematics, University of California, 405 Hilgard Avenue, Los Angeles, CA 90024-1555, USA
4 Central Computing Department, Rutherford Appleton Laboratory, Chilton, Oxfordshire, OX11 0QX, England
5 Current reports available by anonymous ftp from the directory "pub/reports" on camelot.cc.rl.ac.uk (internet 130.246.8.61)
Keywords: large-scale problems, unconstrained optimization, linearly constrained optimization, direct methods, iterative...
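The modified-Newton idea described above can be illustrated with a minimal sketch. This is our own toy example, not the report's code: the names H, g, A, the dense KKT solve, and the numeric data are all illustrative assumptions.

```python
import numpy as np

# Hedged sketch: for  minimize f(x)  subject to  A x = b,  a Newton-like
# search direction p solves the KKT system
#   [ H  A^T ] [ p ]   [ -g ]
#   [ A   0  ] [ y ] = [  0 ]
# where H is the Hessian and g the gradient at the current iterate.
def newton_direction(H, g, A):
    n, m = H.shape[0], A.shape[0]
    K = np.block([[H, A.T], [A, np.zeros((m, m))]])
    rhs = np.concatenate([-g, np.zeros(m)])
    sol = np.linalg.solve(K, rhs)   # a direct method; iterative solvers also apply
    return sol[:n]                  # p lies in the constraint null space: A @ p = 0

H = np.array([[4.0, 1.0], [1.0, 3.0]])   # illustrative positive-definite Hessian
g = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])               # one equality constraint
p = newton_direction(H, g, A)            # -> [0.2, -0.2]
```

Modifying H (e.g. adding a multiple of the identity when it is indefinite) is one way such methods safeguard global convergence.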
Separators and Structure Prediction in Sparse Orthogonal Factorization
, 1993
Abstract

Cited by 4 (0 self)
In the factorization A = QR of a matrix A, the orthogonal matrix Q can be represented either explicitly (as a matrix) or implicitly (as a matrix H of Householder vectors). We derive both upper and lower bounds on the number of nonzeros in H and the number of nonzeros in Q, in the case where the graph of AᵀA has "good" separators and A need not be square. We also derive an upper bound on the number of nonzeros in the null-basis part of Q in the case where A is the edge-vertex incidence matrix of a planar graph. The significance of these results is that they both illuminate and amplify a folk theorem of sparse QR factorization, which holds that the matrix H of Householder vectors represents the orthogonal factor of A much more compactly than Q itself. To facilitate discussion of this and related issues, we review several related results which have appeared previously. Keywords: Sparse matrix algorithms, QR factorization, separators, column intersection graph, strong Hall...
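The explicit-versus-implicit distinction can be sketched with a toy Householder QR. This is our own illustration, not the paper's code; the paper's sparsity bounds are only hinted at here by comparing nonzero counts on a banded test matrix.

```python
import numpy as np

# V stores the Householder vectors column-wise (the implicit H of the
# abstract); the "folk theorem" says V is usually much sparser than the
# explicit matrix Q.
def householder_qr(A):
    A = A.astype(float).copy()
    m, n = A.shape
    V = np.zeros((m, n))
    for k in range(n):
        x = A[k:, k]
        v = x.copy()
        s = np.sign(x[0]) if x[0] != 0 else 1.0
        v[0] += s * np.linalg.norm(x)
        nv = np.linalg.norm(v)
        if nv > 0:
            v /= nv
        V[k:, k] = v
        A[k:, k:] -= 2.0 * np.outer(v, v @ A[k:, k:])  # apply I - 2 v v^T
    return V, np.triu(A[:n, :])

def form_Q(V):
    m, n = V.shape
    Q = np.eye(m)
    for k in range(n - 1, -1, -1):       # accumulate H_0 H_1 ... H_{n-1}
        v = V[:, k]
        Q -= 2.0 * np.outer(v, v @ Q)
    return Q[:, :n]                      # economy-size Q

rng = np.random.default_rng(0)
m, n = 10, 4
A = np.zeros((m, n))
for j in range(n):                       # banded test matrix: 3 nonzeros per column
    A[j:j + 3, j] = rng.standard_normal(3)

V, R = householder_qr(A)
Q = form_Q(V)
nnz = lambda M: int(np.count_nonzero(np.abs(M) > 1e-12))
# nnz(V) stays confined to the band, while the explicit Q typically fills in more.
```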
Combinatorial Algorithms for Computing Column Space Bases That Have Sparse Inverses
 ETNA
Abstract

Cited by 3 (0 self)
Abstract. This paper presents a new combinatorial approach towards constructing a sparse, implicit basis for the null space of a sparse, underdetermined matrix. Our approach is to compute a column space basis of the matrix that has a sparse inverse, which can then be used to represent a null space basis in implicit form. We investigate three different algorithms for computing column space bases: two greedy algorithms implemented using graph matchings, and a third, which employs a divide-and-conquer strategy implemented with hypergraph partitioning followed by a matching. Our results show that for many matrices from linear programming, structural analysis, and circuit simulation, it is possible to compute column space bases having sparse inverses, contrary to conventional wisdom. The hypergraph partitioning method yields sparser basis inverses and has low computational time requirements, relative to the greedy approaches. We also discuss the complexity of selecting a column space basis when it is known that such a basis exists in block diagonal form with a given small block size. Key words. sparse column space basis, sparse null space basis, block angular matrix, block diagonal matrix, matching, hypergraph partitioning, inverse of a basis AMS subject classifications. 65F50, 68R10, 90C20
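The implicit null-space construction behind this abstract can be shown on toy data. This is our own sketch, not the paper's combinatorial algorithm: we simply take the leading columns as the basis, whereas the paper's point is to *choose* columns whose basis has a sparse inverse.

```python
import numpy as np

# If the columns of an underdetermined A are split as A = [B | N] with
# B square and invertible, then Z = [ -B^{-1} N ; I ] is a null-space
# basis; a sparse B^{-1} keeps this implicit representation sparse.
A = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 3.0]])    # 2 x 4, full row rank
m, n = A.shape
B, N = A[:, :m], A[:, m:]               # leading columns happen to form a basis here
Z = np.vstack([-np.linalg.solve(B, N), np.eye(n - m)])
# A @ Z is the zero matrix, so the columns of Z span the null space of A.
```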
RANDOM SEARCH ALGORITHMS FOR THE SPARSE NULL VECTOR PROBLEM
, 804
Abstract
Abstract. We consider the following problem: Given a matrix A, find minimal subsets of columns of A with cardinality no larger than a given bound that are linearly dependent or nearly so. This problem arises in various forms in optimization, electrical engineering, and statistics. In its full generality, the problem is known to be NP-complete. We present a Monte Carlo method that finds such subsets with high confidence. We also give a deterministic method that is capable of proving that no subsets of linearly dependent columns up to a certain cardinality exist. The performance of both methods is analyzed and illustrated with numerical experiments.
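A random-search approach in the spirit of this abstract can be sketched as follows. This is our own illustration, not the authors' algorithm: the sampling scheme, tolerance, and test matrix are all assumptions.

```python
import numpy as np

# Hedged Monte Carlo sketch: repeatedly sample column subsets of bounded
# cardinality and test for (near) linear dependence via the smallest
# singular value of the corresponding submatrix.
def find_dependent_subset(A, max_card, trials=2000, tol=1e-10, seed=0):
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    best = None
    for _ in range(trials):
        k = int(rng.integers(2, max_card + 1))
        cols = rng.choice(n, size=k, replace=False)
        smin = np.linalg.svd(A[:, cols], compute_uv=False)[-1]
        if smin < tol and (best is None or k < len(best)):
            best = sorted(int(c) for c in cols)
    return best                          # None if no subset was found

A = np.array([[1.0, 0.0, 1.0, 2.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 1.0, 2.0, 2.0]])    # col 2 = col 0 + col 1; col 3 = 2 * col 0
deps = find_dependent_subset(A, max_card=3)
```

On a small example like this, exhaustive sampling effectively certifies the answer; the method's value is on large matrices where enumeration is infeasible.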