Results 1–10 of 164
Iterative Solution of Linear Systems
 Acta Numerica
, 1992
Abstract

Cited by 100 (8 self)
this paper is as follows. In Section 2, we present some background material on general Krylov subspace methods, of which CG-type algorithms are a special case. We recall the outstanding properties of CG and discuss the issue of optimal extensions of CG to non-Hermitian matrices. We also review GMRES and related methods, as well as CG-like algorithms for the special case of Hermitian indefinite linear systems. Finally, we briefly discuss the basic idea of preconditioning. In Section 3, we turn to Lanczos-based iterative methods for general non-Hermitian linear systems. First, we consider the nonsymmetric Lanczos process, with particular emphasis on the possible breakdowns and potential instabilities in the classical algorithm. Then we describe recent advances in understanding these problems and overcoming them by using look-ahead techniques. Moreover, we describe the quasi-minimal residual algorithm (QMR) proposed by Freund and Nachtigal (1990), which uses the look-ahead Lanczos process to obtain quasi-optimal approximate solutions. Next, a survey of transpose-free Lanczos-based methods is given. We conclude this section with comments on other related work and some historical remarks. In Section 4, we elaborate on CGNR and CGNE and we point out situations where these approaches are optimal. The general class of Krylov subspace methods also contains parameter-dependent algorithms that, unlike CG-type schemes, require explicit information on the spectrum of the coefficient matrix. In Section 5, we discuss recent insights in obtaining appropriate spectral information for parameter-dependent Krylov subspace methods. After that, ...
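The CG iteration whose properties the abstract highlights can be sketched in a few lines. This is a generic textbook sketch, not code from the paper; the small test system `A`, `b` is invented for illustration, and `A` is assumed symmetric positive definite:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    """Plain CG for a symmetric positive definite matrix A (textbook form)."""
    x = np.zeros_like(b)
    r = b - A @ x              # initial residual
    p = r.copy()               # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # step length minimizing the A-norm error
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # new A-conjugate direction
        rs_old = rs_new
    return x

# invented 2x2 SPD test system; CG converges in at most n steps in exact arithmetic
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

Extensions such as GMRES and QMR surveyed in the paper replace this short recurrence with ones suited to non-Hermitian matrices.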
Latent Semantic Indexing (LSI) and TREC2
 The Second Text REtrieval Conference (TREC2
, 1994
Abstract

Cited by 97 (3 self)
this paper. The "ltc" weights were computed on this matrix.
SVDPACKC (Version 1.0) User's Guide
, 1993
Abstract

Cited by 63 (4 self)
SVDPACKC comprises four numerical (iterative) methods for computing the singular value decomposition (SVD) of large sparse matrices using ANSI C. This software package implements Lanczos- and subspace-iteration-based methods for determining several of the largest singular triplets (singular values and corresponding left and right singular vectors) for large sparse matrices. The package has been ported to a variety of machines ranging from supercomputers to workstations: CRAY Y-MP, IBM RS/6000-550, DEC 5000/100, HP 9000/750, SPARCstation 2, and Macintosh II/fx. This document (i) explains each algorithm in some detail, (ii) explains the input parameters for each program, (iii) explains how to compile/execute each program, and (iv) illustrates the performance of each method when we compute lower-rank approximations to sparse term-document matrices from information retrieval applications. A user-friendly software interface to the package for UNIX-based systems and the Macintosh II/fx is als...
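As an illustration of the task SVDPACKC addresses — a few of the largest singular triplets of a sparse term-document matrix — the same computation can be sketched with SciPy's iterative `svds` solver (a Lanczos-type method, not SVDPACKC itself). The toy matrix below is invented for the example:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import svds

# invented toy term-document matrix (rows = terms, columns = documents)
A = csr_matrix(np.array([
    [1.0, 0.0, 1.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
    [1.0, 0.0, 0.0, 1.0],
]))

# k largest singular triplets via an iterative solver; ordering of the
# returned values is not guaranteed, so sort explicitly (descending)
k = 2
U, s, Vt = svds(A, k=k)
order = np.argsort(s)[::-1]
s, U, Vt = s[order], U[:, order], Vt[order, :]

# the resulting rank-k matrix is the best rank-k approximation to A
A_k = U @ np.diag(s) @ Vt
```

The full SVDPACKC package offers four such iterative methods and handles matrices far too large for dense SVD routines.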
On the convergence of reflective Newton methods for large-scale nonlinear minimization subject to bounds
, 1992
Abstract

Cited by 60 (4 self)
We consider a new algorithm, a reflective Newton method, for the problem of minimizing a smooth nonlinear function of many variables, subject to upper and/or lower bounds on some of the variables. This approach generates strictly feasible iterates by following piecewise linear paths ("reflection" paths) to generate improved iterates. The reflective Newton approach does not require identification of an "activity set". In this report we establish that the reflective Newton approach is globally and quadratically convergent. Moreover, we develop a specific example of this general reflective path approach suitable for large-scale and sparse problems. 1 Research partially supported by the Applied Mathematical Sciences Research Program (KC-04-02) of the Office of Energy Research of the U.S. Department of Energy under grant DE-FG02-86ER25013.A000, and in part by NSF, AFOSR, and ONR through grant DMS-8920550, and by the Cornell Theory Center, which receives major funding from the National Sci...
An Implicitly Restarted Lanczos Method for Large Symmetric...
 ETNA
, 1994
Abstract

Cited by 54 (13 self)
The Lanczos process is a well-known technique for computing a few, say k, eigenvalues and associated eigenvectors of a large symmetric n × n matrix. However, loss of orthogonality of the computed Krylov subspace basis can reduce the accuracy of the computed approximate eigenvalues. In the implicitly restarted Lanczos method studied in the present paper, this problem is addressed by fixing the number of steps in the Lanczos process at a prescribed value, k + p, where p typically is not much larger, and may be smaller, than k. Orthogonality of the k + p basis vectors of the Krylov subspace is secured by reorthogonalizing these vectors when necessary. The implicitly restarted Lanczos method exploits the fact that the residual vector obtained by the Lanczos process is a function of the initial Lanczos vector. The method updates the initial Lanczos vector through an iterative scheme. The purpose of the iterative scheme is to determine an initial vector such that the associated residual vector is tiny...
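As a hedged illustration of this idea in practice: SciPy's `eigsh` wraps ARPACK, which implements an implicitly restarted Lanczos iteration for symmetric matrices, and its `ncv` parameter plays roughly the role of the k + p basis size discussed in the abstract. The 1-D Laplacian test matrix below is invented for the example (its eigenvalues are known in closed form, which makes the result easy to check):

```python
import numpy as np
from scipy.sparse.linalg import eigsh

# invented test matrix: 1-D discrete Laplacian, symmetric tridiagonal
n = 100
A = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1))

# k largest eigenpairs; ncv bounds the restarted Lanczos basis (roughly k + p)
k, ncv = 4, 10
vals, vecs = eigsh(A, k=k, ncv=ncv, which='LA')

# analytic eigenvalues of this Laplacian: 2 - 2 cos(j*pi/(n+1)), j = 1..n
exact = 2.0 - 2.0 * np.cos(np.pi * np.arange(1, n + 1) / (n + 1))
```

Keeping the basis at a fixed small size is what makes the method practical for very large n.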
Reduced-Order Modeling Techniques Based on Krylov Subspaces and Their Use in Circuit Simulation
 Applied and Computational Control, Signals, and Circuits
, 1998
Abstract

Cited by 53 (10 self)
In recent years, reduced-order modeling techniques based on Krylov-subspace iterations, especially the Lanczos algorithm and the Arnoldi process, have become popular tools for tackling the large-scale time-invariant linear dynamical systems that arise in the simulation of electronic circuits. This paper reviews the main ideas of reduced-order modeling techniques based on Krylov subspaces and describes the use of reduced-order modeling in circuit simulation. 1 Introduction Krylov-subspace methods, most notably the Lanczos algorithm [81, 82] and the Arnoldi process [5], have long been recognized as powerful tools for large-scale matrix computations. Matrices that occur in large-scale computations usually have some special structure that allows one to compute matrix-vector products with such a matrix (or its transpose) much more efficiently than for a dense, unstructured matrix. The most common structure is sparsity, i.e., only a few of the matrix entries are nonzero. Computing a matrix-vector pr...
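The Arnoldi process named in the abstract builds an orthonormal Krylov basis V and a small projected matrix H = Vᵀ A V, which is the core reduction step in Krylov-based reduced-order modeling. A minimal textbook sketch (not the paper's algorithm); the random test matrix is invented for illustration:

```python
import numpy as np

def arnoldi(A, v, m):
    """Arnoldi: orthonormal basis V of span{v, Av, ..., A^(m-1) v}
    and the projected upper Hessenberg matrix H = V^T A V."""
    n = len(v)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):             # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:            # breakdown: invariant subspace found
            return V[:, :j + 1], H[:j + 1, :j + 1]
        V[:, j + 1] = w / H[j + 1, j]
    return V[:, :m], H[:m, :m]

# reduce an invented 200x200 system matrix to a 10x10 projected model
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200)) / np.sqrt(200)
b = rng.standard_normal(200)
V, H = arnoldi(A, b, 10)
```

In circuit simulation, the small H serves as a reduced-order model whose transfer function matches leading moments of the original system.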
Automating the Assignment of Submitted Manuscripts to Reviewers
 In Research and Development in Information Retrieval
, 1992
Abstract

Cited by 53 (2 self)
The 117 manuscripts submitted for the Hypertext'91 conference were assigned to members of the review committee, using a variety of automated methods based on information retrieval principles and Latent Semantic Indexing. Fifteen reviewers provided exhaustive ratings for the submitted abstracts, indicating how well each abstract matched their interests. The automated methods do a fairly good job of assigning relevant papers for review, but they are still somewhat poorer than assignments made manually by human experts and substantially poorer than an assignment perfectly matching the reviewers' own ranking of the papers. A new automated assignment method called "n of 2n" achieves better performance than human experts by sending reviewers more papers than they actually have to review and then allowing them to choose part of their review load themselves. Keywords: Conferences, Program Committees, Reviewers, Referees, Manuscripts, Papers, Assignment, Matching, Interests, Latent Semantic Ind...
Latent Semantic Indexing (LSI): TREC-3 Report
 Overview of the Third Text REtrieval Conference
, 1995
Abstract

Cited by 52 (0 self)
This paper reports on recent developments of the Latent Semantic Indexing (LSI) retrieval method for TREC-3. LSI uses a reduced-dimension vector space to represent words and documents. An important aspect of this representation is that the association between terms is automatically captured, explicitly represented, and used to improve retrieval. We used LSI for both TREC-3 routing and ad hoc tasks. For the routing tasks an LSI space was constructed using the training documents. We compared profiles constructed using just the topic words (no training) with profiles constructed using the average of relevant documents (no use of the topic words). Not surprisingly, the centroid of the relevant documents was 30% better than the topic words. This simple feedback method was quite good compared to the routing performance of other systems. Various combinations of information from the topic words and relevant documents provide small additional improvements in performance. For the ad hoc task we c...
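The reduced-dimension vector space that LSI uses can be sketched with a truncated SVD: documents become rows of V_k, and a query is folded into the same space via q_hat = q^T U_k inv(Sigma_k). The tiny term-document matrix and query below are invented for illustration, not data from the TREC experiments:

```python
import numpy as np

# invented toy term-document matrix (rows = terms, columns = documents)
A = np.array([
    [2.0, 0.0, 1.0],
    [1.0, 1.0, 0.0],
    [0.0, 2.0, 1.0],
    [1.0, 0.0, 2.0],
])

# truncated SVD gives a k-dimensional "latent" space
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
Uk, sk, Vk = U[:, :k], s[:k], Vt[:k, :].T   # rows of Vk = documents in latent space

# fold a query (term-frequency vector) into the latent space
q = np.array([1.0, 0.0, 0.0, 1.0])
q_hat = q @ Uk / sk

# rank documents by cosine similarity in the reduced space
def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

scores = np.array([cosine(q_hat, d) for d in Vk])
ranking = np.argsort(scores)[::-1]
```

Because related terms load onto the same latent dimensions, a document can score well even when it shares no literal terms with the query.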
LSI meets TREC: A Status Report
 In: D. Harman (Ed.), The First Text REtrieval Conference (TREC-1), National Institute of Standards and Technology Special Publication
, 1993
Abstract

Cited by 49 (1 self)
this article as a query, no relevant articles about virtual reality were returned. Now that we have a larger number of hopefully more accurate relevance judgements, we will repeat this basic comparison. We will then use these two baseline runs to explore: a) combining the relevant documents and the original topic; b) selecting only some relevant documents and/or discriminating terms; and c) representing the query vector as several points of interest rather than a single average.