
## Using Linear Algebra for Intelligent Information Retrieval (1995)

Venue: SIAM REVIEW

Citations: 670 (18 self)

### Citations

3704 | Latent semantic analysis
- Dumais, T
- 2004
Citation Context: ...literally match terms in irrelevant documents. A better approach would allow users to retrieve information on the basis of a conceptual topic or meaning of a document. Latent Semantic Indexing (LSI) [4] tries to overcome the problems of lexical matching by using statistically derived conceptual indices instead of individual words for retrieval. LSI assumes that there is some underlying or latent str...
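The excerpt above summarizes the core LSI idea: replace literal term matching with matching in a low-rank "conceptual" space derived statistically from a term-document matrix. A minimal NumPy sketch of that pipeline (the toy vocabulary, documents, and query are invented for illustration; the fold-in formula is the standard LSI one, not quoted from the cited text):

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = documents.
# Entry A[i, j] counts occurrences of term i in document j (invented data).
A = np.array([
    [1, 1, 0, 0],   # "matrix"
    [1, 0, 0, 0],   # "svd"
    [0, 1, 1, 0],   # "retrieval"
    [0, 0, 1, 1],   # "query"
    [0, 0, 0, 1],   # "filter"
], dtype=float)

# Truncated SVD: keep k singular triplets as the "conceptual" indices.
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k, :]

# Documents live in the k-dimensional space as columns of diag(sk) @ Vtk.
docs_k = np.diag(sk) @ Vtk

# A query is folded into the same space: q_k = q^T U_k diag(1/sigma_i).
q = np.array([1, 1, 0, 0, 0], dtype=float)      # query: "matrix svd"
q_k = q @ Uk @ np.diag(1.0 / sk)

# Rank documents by cosine similarity in the reduced space.
def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

scores = [cosine(q_k, docs_k[:, j]) for j in range(docs_k.shape[1])]
ranking = np.argsort(scores)[::-1]              # best-matching document first
```

Because matching happens in the reduced space, a document can score well against a query even when they share few literal terms, which is the lexical-matching problem the excerpt describes.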

744 | Improving retrieval performance by relevance feedback
- Salton, Buckley
- 1990
Citation Context: ...ecify their information needs adequately, especially on the first try. In interactive retrieval situations, it is possible to take advantage of user feedback about relevant and non-relevant documents [25]. Systems can use information about which documents are relevant in many ways. Typically the weight given to terms occurring in relevant documents is increased and the weight of terms occurring in non...
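The feedback scheme described in this excerpt (boost the weight of terms from relevant documents, demote terms from non-relevant ones) is commonly written as Rocchio's formula, one of the approaches surveyed by Salton and Buckley. A small illustrative sketch (the function name and the alpha, beta, gamma defaults are conventional choices, not values from the cited paper):

```python
import numpy as np

def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Rocchio relevance feedback: move the query vector toward the
    centroid of relevant documents and away from non-relevant ones."""
    query = np.asarray(query, dtype=float)
    rel = np.mean(relevant, axis=0) if len(relevant) else 0.0
    nonrel = np.mean(nonrelevant, axis=0) if len(nonrelevant) else 0.0
    return alpha * query + beta * rel - gamma * nonrel

# Terms present in the relevant document gain weight; terms present only
# in the non-relevant document lose weight.
q_new = rocchio([1, 0], relevant=[[0, 2]], nonrelevant=[[2, 0]])
```

With the defaults above, `q_new` is `1.0*[1,0] + 0.75*[0,2] - 0.15*[2,0] = [0.7, 1.5]`: the second term's weight rises and the first falls, exactly the reweighting the excerpt describes.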

430 | Information Filtering and Information Retrieval: Two Sides of the Same Coin
- Belkin, Croft
- 1992
Citation Context: ...t these dimensions are, in fact, capturing a major portion of the meaningful structure. 5.3. Information Filtering. Information filtering is a problem that is closely related to information retrieval [1]. In information filtering applications, a user has a relatively stable long-term interest or profile, and new documents are constantly received and matched against this standing interest. Selective d...

305 | Automatic Information Organization and Retrieval
- Salton
- 1968
Citation Context: ...a summary measure of performance. Results were obtained for LSI and compared against published or computed results for other retrieval techniques, notably the standard keyword vector method in SMART [24]. For several information science test collections, the average precision using LSI ranged from comparable to 30% better than that obtained using standard keyword vector methods. See [4, 12, 6] for...

283 | Improving the retrieval of information from external sources, Behavior Research Methods
- Dumais
- 1991
Citation Context: ...es the frequency with which term i occurs in document j. Since every word does not normally appear in each document, the matrix A is usually sparse. In practice, local and global weightings are applied [6] to increase or decrease the importance of terms within or among documents. Specifically, we can write a_ij = L(i,j) × G(i), (5) where L(i,j) is the local weighting for term i i...
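The weighting form a_ij = L(i,j) × G(i) quoted in this excerpt is commonly instantiated as log-entropy weighting in the LSI literature. A sketch of that particular choice (one widely used instantiation, shown for illustration; the excerpt itself does not fix L and G):

```python
import numpy as np

def log_entropy_weight(F):
    """Apply a_ij = L(i,j) * G(i) with local log weighting
    L(i,j) = log(1 + f_ij) and global entropy weighting
    G(i) = 1 + sum_j p_ij * log(p_ij) / log(n), where p_ij = f_ij / gf_i.
    F is a raw term-by-document frequency matrix (rows = terms)."""
    F = np.asarray(F, dtype=float)
    n = F.shape[1]                            # number of documents
    gf = F.sum(axis=1, keepdims=True)         # global frequency of each term
    p = np.where(F > 0, F / np.maximum(gf, 1e-12), 0.0)
    # Entropy term sum_j p log p, with 0*log(0) treated as 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)
    G = 1.0 + plogp.sum(axis=1) / np.log(n)   # global weight per term
    L = np.log1p(F)                           # local weight per cell
    return L * G[:, None]                     # a_ij = L(i,j) * G(i)
```

The entropy weight does what the excerpt asks of G(i): a term spread evenly over every document gets weight near 0 (uninformative), while a term concentrated in one document gets weight 1.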

276 | Personalized information delivery: An analysis of information filtering methods
- Foltz, Dumais
- 1992
Citation Context: ...roximation to A for any unitarily invariant norm [21]. Hence, min_{rank(B)=k} ||A − B||_2 = ||A − A_k||_2 = σ_{k+1}. (3) 2.1. Latent Semantic Indexing. In order to implement Latent Semantic Indexing [4, 11] a matrix of terms by documents must be constructed. The elements of the term-document matrix are the occurrences of each word in a particular document, i.e., A = [a_ij], (4) where a_ij denotes the f...

180 | Dimensions of meaning
- Schütze
- 1992
Citation Context: ...finding near neighbors in high-dimension spaces). 5.7. Related Work. A number of other researchers are using related linear algebra methods for information retrieval and classification work. Schutze [26] and Gallant [13] have used SVD and related dimension reduction ideas for word sense disambiguation and information retrieval work. Hull [16] and Yang and Chute [28] have used LSI/SVD as the first ste...

136 | Information retrieval using a singular value decomposition model of latent semantic structure
- Furnas, Deerwester, et al.
- 1988
Citation Context: ...in SMART [24]. For several information science test collections, the average precision using LSI ranged from comparable to 30% better than that obtained using standard keyword vector methods. See [4, 12, 6] for details of these evaluations. The LSI method performs best relative to standard vector methods when the queries and relevant documents do not share many words, and at high levels of recall. Term ...

117 | Distribution of mathematical software via electronic mail
- Dongarra, Grosse
- 1987
Citation Context: ...ct of interest, but smaller, more topically coherent units of text (e.g., paragraphs, sections) could be represented as well. For example, LSI has been incorporated as a fuzzy search option in NETLIB [5] for retrieving algorithms, code descriptions, and short articles from the NA-Digest electronic newsletter. Regardless of how the original descriptor-object matrix is derived, a reduced-dimension appr...

111 | Large scale singular value computations
- Berry
- 1992
Citation Context: ...2. Background. The singular value decomposition is commonly used in the solution of unconstrained linear least squares problems, matrix rank estimation, and canonical correlation analysis [2]. Given an m × n matrix A, where without loss of generality m ≥ n and rank(A) = r, the singular value decomposition of A, denoted by SVD(A), is defined as A = UΣV^T, (1) where U^T U = V^T V = ...
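The SVD properties quoted in this excerpt (A = UΣV^T with orthonormal columns in U and V, and nonincreasing singular values) can be verified numerically. A quick NumPy sanity check on a random matrix (the data is arbitrary; only the identities matter):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 4                                 # m >= n, as assumed in the excerpt
A = rng.standard_normal((m, n))

# Thin SVD: U is m x n, s holds sigma_1 >= ... >= sigma_n, Vt is V^T (n x n).
U, s, Vt = np.linalg.svd(A, full_matrices=False)

assert np.allclose(U.T @ U, np.eye(n))      # U^T U = I_n
assert np.allclose(Vt @ Vt.T, np.eye(n))    # V^T V = I_n
assert np.all(s[:-1] >= s[1:])              # singular values nonincreasing
assert np.allclose(U @ np.diag(s) @ Vt, A)  # A = U Sigma V^T
```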

98 | Improving text retrieval for the routing problem using latent semantic indexing
- Hull
- 1994
Citation Context: ...for information retrieval and classification work. Schutze [26] and Gallant [13] have used SVD and related dimension reduction ideas for word sense disambiguation and information retrieval work. Hull [16] and Yang and Chute [28] have used LSI/SVD as the first step in conjunction with statistical classification (e.g. discriminant analysis). Using the LSI-derived dimensions effectively reduces the numbe...

90 | Fully Automatic Cross-language Document Retrieval Using Latent Semantic Indexing
- Landauer, Littman
- 1990
Citation Context: ...several languages) can match documents in any language. What is required for cross-language applications is a common space in which words from many languages are represented. Landauer and Littman in [20] described one method for creating such an LSI space. The original term-document matrix is formed using a collection of abstracts that have versions in more than one language (French and English, in t...

73 | Automating the assignment of submitted manuscripts to reviewers
- Dumais, Nielsen
- 1992
Citation Context: ...relevant to users' queries. A query was matched to the nearest documents and project descriptions, and the author's organization was returned as the most relevant internal group. In another application [9], LSI was used to automate the assignment of reviewers to submitted conference papers. Several hundred reviewers were described by means of texts they had written, and this formed the basis of the LSI...

57 | Using latent semantic indexing for information filtering
- Foltz, W, et al.
- 1990
Citation Context: ...d if it is similar enough to the interest vector it is recommended to the user. Learning methods like relevance feedback can be used to improve the representation of interest vectors over time. Foltz [10] compared LSI and keyword vector methods for filtering Netnews articles, and found 12%-23% advantages for LSI. Dumais and Foltz in [11] compared several different methods for representing users inte...

42 | Neural networks for full-scale protein sequence classification: sequence encoding with singular value decomposition
- Wu, Fung, et al.
- 1995
Citation Context: ...ep in conjunction with statistical classification (e.g. discriminant analysis). Using the LSI-derived dimensions effectively reduces the number of predictor variables for classification. Wu et al. in [27] also used LSI/SVD to reduce the training set dimension for a neural network protein classification system used in human genome research. ...

33 | Information management tools for updating an SVD-encoded indexing scheme
- O’Brien
- 1994
Citation Context: ...runcated SVD of this matrix, creating the LSI database of singular values and vectors for retrieval, matching user queries to documents, and adding new terms or documents to an existing LSI database [4, 23]. The bulk of LSI processing time is spent in computing the truncated SVD of the large sparse term-by-document matrices. Section 2 is a review of basic concepts needed to understand LSI. Section 3 use...
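The updating problem this excerpt refers to (adding new terms or documents to an existing LSI database without redoing the full SVD) is often handled by "folding in". A rough NumPy sketch of folding in a new document (random stand-in data; the fold-in formula is the standard LSI one, not a claim about O'Brien's updating method, which modifies the SVD itself):

```python
import numpy as np

# An existing LSI database: the truncated SVD of the term-document matrix.
rng = np.random.default_rng(1)
A = rng.random((8, 5))                  # 8 terms, 5 documents (stand-in data)
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 3
Uk, sk = U[:, :k], s[:k]

# "Folding in" projects a new document vector d (term frequencies over the
# same vocabulary) into the existing k-space without recomputing the SVD:
#     d_k = d^T U_k diag(1/sigma_1, ..., 1/sigma_k)
# Cheap, but the basis (and so the space) is not updated by the new text.
d = rng.random(8)
d_k = d @ Uk @ np.diag(1.0 / sk)

# Sanity check: folding in an existing column of A reproduces that
# document's coordinates (the corresponding row of V, truncated to k).
assert np.allclose(A[:, 0] @ Uk @ np.diag(1.0 / sk), Vt[:k, 0])
```

SVD-updating methods like the one cited exist precisely because folding in many documents degrades the representation: the singular vectors stop reflecting the current collection.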

28 | A practical approach for representing contexts and for performing word sense disambiguation using neural networks
- GALLANT
- 1991
Citation Context: ...ghbors in high-dimension spaces). 5.7. Related Work. A number of other researchers are using related linear algebra methods for information retrieval and classification work. Schutze [26] and Gallant [13] have used SVD and related dimension reduction ideas for word sense disambiguation and information retrieval work. Hull [16] and Yang and Chute [28] have used LSI/SVD as the first step in conjunction ...

16 | An application of least squares fit mapping to text information retrieval
- YANG, CHUTE
- 1993
Citation Context: ...l and classification work. Schutze [26] and Gallant [13] have used SVD and related dimension reduction ideas for word sense disambiguation and information retrieval work. Hull [16] and Yang and Chute [28] have used LSI/SVD as the first step in conjunction with statistical classification (e.g. discriminant analysis). Using the LSI-derived dimensions effectively reduces the number of predictor variables...

15 | Latent semantic analysis and the measurement of knowledge
- Landauer, Dumais
- 1994
Citation Context: ...ost as good results for retrieving English abstracts and Japanese Kanji ideographs, and for multilingual translations (English and Greek) of the Bible [29]. Modeling Human Memory. Landauer and Dumais [19] have recently used LSI spaces to model some of the associative relationships observed in human memory. They were interested in term-term similarities. LSI is often described intuitively as a method f...

10 | Cross-Language Information Retrieval Using Latent Semantic Indexing
- Young
- 1994
Citation Context: ...have been used so that a larger or smaller set of documents would be returned. The cosine is merely used to rank-order documents and its explicit value is not always an adequate measure of relevance [23, 29]. 3.2. Comparison with Lexical Matching. In this example, LSI has been applied using two factors, i.e. A_2 is used to approximate the original 16 × 17 term-document matrix. Using a cosine threshold...

9 | Handbook for Automatic Computation II, Linear Algebra
- GOLUB, REINSCH
- 1971
Citation Context: ...ith r = rank(A) ≤ p = min(m, n) and define A_k = Σ_{i=1}^{k} u_i σ_i v_i^T, (2) then min_{rank(B)=k} ||A − B||_F^2 = ||A − A_k||_F^2 = σ_{k+1}^2 + ··· + σ_p^2. Proof. See [15]. In other words, A_k, which is constructed from the k largest singular triplets of A, is the closest rank-k matrix to A [14]. In fact, A_k is the best approximation to A for any unitarily invariant no...
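The approximation theorem excerpted here (the Eckart-Young result) can be checked numerically: the truncation A_k built from the k largest singular triplets attains a Frobenius error of σ_{k+1}² + ··· + σ_p² (squared) and a 2-norm error of σ_{k+1}. A small NumPy verification (random data, just illustrating the identities):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((7, 5))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2
Ak = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]   # best rank-k approximation

# Squared Frobenius error equals the sum of squared discarded singular
# values: sigma_{k+1}^2 + ... + sigma_p^2 (the result quoted as Eq. (2)).
fro_err_sq = np.linalg.norm(A - Ak, "fro") ** 2
assert np.isclose(fro_err_sq, np.sum(s[k:] ** 2))

# 2-norm error equals sigma_{k+1} (the companion result, Eq. (3)).
assert np.isclose(np.linalg.norm(A - Ak, 2), s[k])
```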

5 | SVDPACKC: Version 1.0 User's Guide
- Berry, et al.
- 1993
Citation Context: ...gular values and singular vectors of the original term-document matrix A as an alternative to recomputing the SVD of Ã in Equation (9). In general, the cost of computing the SVD of a sparse matrix [3] can be expressed as I × cost(G^T Gx) + trp × cost(Gx), where I is the number of iterations required by a Lanczos-type procedure [2] to approximate the eigensystem of G^T G and t...
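The cost model in this excerpt reflects how truncated SVDs of large sparse matrices are actually computed: a Lanczos-type iteration that only needs matrix-vector products with G and G^T. In Python, for example, `scipy.sparse.linalg.svds` exposes such a routine (ARPACK's implicitly restarted Lanczos); a small usage sketch with invented dimensions and density:

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import svds

# A sparse "term-document" matrix (random here, just to exercise the routine).
A = sparse_random(1000, 300, density=0.01, random_state=0, format="csr")

# svds computes only the k largest singular triplets via Lanczos-type
# iterations driven by products with A and A^T, never densifying A.
k = 6
U, s, Vt = svds(A, k=k)

# svds returns singular values in ascending order; reorder to descending.
order = np.argsort(s)[::-1]
s, U, Vt = s[order], U[:, order], Vt[order, :]
```

This is why the cost is dominated by the number of iterations times the cost of a sparse matrix-vector product, as the excerpt's formula states.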

5 | The relevance density method for multi-topic queries in information retrieval
- Kane-Esrig, Streeter, et al.
- 1991
Citation Context: ...es can be either terms (as in most information retrieval applications), documents or combinations of the two (as in relevance feedback). Queries can even be represented as multiple points of interest [17]. Similarly, the objects returned to the user are typically documents, but there is no reason that similar terms could not be returned. Returning nearby terms is useful for some applications like onli...

5 | Symmetric gauge functions and unitarily invariant norms
- MIRSKY
- 1960
Citation Context: ...n other words, A_k, which is constructed from the k largest singular triplets of A, is the closest rank-k matrix to A [14]. In fact, A_k is the best approximation to A for any unitarily invariant norm [21]. Hence, min_{rank(B)=k} ||A − B||_2 = ||A − A_k||_2 = σ_{k+1}. (3) 2.1. Latent Semantic Indexing. In order to implement Latent Semantic Indexing [4, 11] a matrix of terms by documents must be constr...

4 | A comparison of some novel and traditional lexical distance metrics for spelling correction
- KUKICH
- 1990
Citation Context: ...its standard recognizer mode. Even though the error rates were 8.8% at the word level, information retrieval performance using LSI was not disrupted (compared with the same uncorrupted texts). Kukich [18] used LSI for a related problem, spelling correction. In this application, the rows were unigrams and bigrams and the columns were correctly spelled words. An input word (correctly or incorrectly spel...

4 | Information retrieval of imperfectly recognized handwriting, Behavior and Information Technology
- Nielsen, Phillips, et al.
- 1994
Citation Context: ...documents which contained a correctly spelled version of Dumais, then Dumais will probably be near Duniais in the k-dimensional space determined by A_k (see Equation 2 or Figure 1). Nielsen et al. in [22] used LSI to index a small collection of abstracts input by a commercially available pen machine in its standard recognizer mode. Even though the error rates were 8.8% at the word level, information r...

3 | LSI meets TREC: A status report, in The First Text REtrieval Conference
- Dumais
- 1993
Citation Context: ...tching. TREC. Recently, LSI has been used for both information filtering and information retrieval in TREC (Text REtrieval Conference), a large-scale retrieval conference sponsored by NIST [7, 8]. The TREC collection contains more than 1,000,000 documents (representing more than 3 gigabytes of ASCII text), 200 queries, and relevance judgements pooled from the return sets of more than 30 sys...

2 | Latent Semantic Indexing (LSI) and TREC-2, in The Second Text REtrieval Conference
- Dumais
- 1994
Citation Context: ...tching. TREC. Recently, LSI has been used for both information filtering and information retrieval in TREC (Text REtrieval Conference), a large-scale retrieval conference sponsored by NIST [7, 8]. The TREC collection contains more than 1,000,000 documents (representing more than 3 gigabytes of ASCII text), 200 queries, and relevance judgements pooled from the return sets of more than 30 sys...
