Snow, R., Jurafsky, D., Ng, A.Y.: Learning syntactic patterns for automatic hypernym discovery. (2004)
Venue: Advances in Neural Information Processing Systems
Citations: 223 (6 self)
Citations
3742 | WordNet: An Electronic Lexical Database
- Fellbaum
- 1998
Citation Context ...ency path” features extracted from parse trees, we introduce a general-purpose formalization and generalization of these patterns. Given a training set of text containing known hypernym pairs, our algorithm automatically extracts useful dependency paths and applies them to new corpora to identify novel pairs. On our evaluation task (determining whether two nouns in a news article participate in a hypernym relationship), our automatically extracted database of hypernyms attains both higher precision and higher recall than WordNet. 1 Introduction Semantic taxonomies and thesauri such as WordNet [5] are a key source of knowledge for natural language processing applications, and provide structured information about semantic relations between words. Building such taxonomies, however, is an extremely slow and labor-intensive process. Further, semantic taxonomies are invariably limited in scope and domain, and the high cost of extending or customizing them for an application has often limited their usefulness. Consequently, there has been significant recent interest in finding methods for automatically learning taxonomic relations and constructing semantic hierarchies. [1, 2, 3, 4, 6, 8, 9, ... |
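As a loose illustration of the recipe summarized above (collect the dependency paths that connect known hypernym pairs in parsed text, then reuse them to propose new pairs), the hedged sketch below simply ranks paths by how often they link known pairs. This is a simplification: the paper actually trains a classifier over the full space of path features (see the context under Rennie, Shih, et al. below). `parsed_corpus`, `sentence.nouns`, `sentence.tree`, and `shortest_dep_path` are hypothetical placeholders, not APIs from the paper.

```python
# Simplified sketch only: rank dependency paths by how often they connect
# known hypernym pairs, then apply the top paths to new text.
# All inputs (parsed_corpus, sentence.nouns, sentence.tree, shortest_dep_path)
# are hypothetical placeholders for a dependency-parsed corpus.
from collections import Counter
from itertools import combinations

def mine_paths(parsed_corpus, known_hypernyms, shortest_dep_path, top_k=50):
    """Count how often each path is the shortest link between a known hypernym pair."""
    path_counts = Counter()
    for sentence in parsed_corpus:
        for n1, n2 in combinations(sentence.nouns, 2):
            path = shortest_dep_path(sentence.tree, n1, n2)
            if path is not None and (n1, n2) in known_hypernyms:
                path_counts[path] += 1
    return [path for path, _ in path_counts.most_common(top_k)]

def propose_pairs(parsed_corpus, useful_paths, shortest_dep_path):
    """Apply the mined paths to unseen text to suggest candidate hypernym pairs."""
    useful = set(useful_paths)
    candidates = set()
    for sentence in parsed_corpus:
        for n1, n2 in combinations(sentence.nouns, 2):
            if shortest_dep_path(sentence.tree, n1, n2) in useful:
                candidates.add((n1, n2))
    return candidates
```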
1261 | Automatic acquisition of hyponyms from large text corpora.
- Hearst
- 1992
Citation Context ...ri such as WordNet [5] are a key source of knowledge for natural language processing applications, and provide structured information about semantic relations between words. Building such taxonomies, however, is an extremely slow and labor-intensive process. Further, semantic taxonomies are invariably limited in scope and domain, and the high cost of extending or customizing them for an application has often limited their usefulness. Consequently, there has been significant recent interest in finding methods for automatically learning taxonomic relations and constructing semantic hierarchies. [1, 2, 3, 4, 6, 8, 9, 13, 15, 17, 18, 19, 20, 21] In this paper, we build an automatic classifier for the hypernym/hyponym relation. A noun X is a hyponym of a noun Y if X is a subtype or instance of Y. Thus “Shakespeare” is a hyponym of “author” (and conversely “author” is a hypernym of “Shakespeare”), “dog” is a hyponym of “canine”, “desk” is a hyponym of “furniture”, and so on. Much of the previous work on automatic semantic classification of words has been based on a key insight first articulated by Hearst [8], that the presence of certain “lexico-syntactic patterns” can indicate a particular semantic relationship between two nouns. Hear... |
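The context above defines the hypernym/hyponym relation with examples such as “dog”/“canine” and “desk”/“furniture”. As an illustration only, the hedged sketch below checks that relation in WordNet through NLTK (the paper itself queried WordNet 2.0 via the Perl WordNet::QueryData module, so NLTK is a convenience stand-in, not the authors' tooling, and it ships a later WordNet release).

```python
# Hedged illustration: test the hypernym relation in WordNet via NLTK
# (requires the NLTK WordNet data; not the interface used in the paper).
from nltk.corpus import wordnet as wn

def is_hypernym(hyper: str, hypo: str) -> bool:
    """True if some sense of `hyper` is among the hypernym ancestors
    (including instance hypernyms) of some sense of `hypo`."""
    hyper_synsets = set(wn.synsets(hyper, pos=wn.NOUN))
    for sense in wn.synsets(hypo, pos=wn.NOUN):
        ancestors = set(sense.closure(lambda s: s.hypernyms() + s.instance_hypernyms()))
        if hyper_synsets & ancestors:
            return True
    return False

print(is_hypernym("canine", "dog"))       # expected True: "dog" is a hyponym of "canine"
print(is_hypernym("furniture", "desk"))   # expected True
print(is_hypernym("dog", "canine"))       # expected False: the relation is not symmetric
```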
629 | Distributional clustering of English words
- Pereira, Tishby, et al.
- 1993
Citation Context ...[figure residue; recoverable caption: Figure 5: Hypernym classifiers on hand-labeled test set (legend: Hearst Patterns, And/Or, Other Pattern)] ...using coordinate information will increase the recall of our hypernym classifier: if we are confident that two nouns ni, nj are coordinate terms, and that nj is a hyponym of nk, we may then infer with higher probability that ni is similarly a hyponym of nk, despite never having encountered the pair (ni, nk) within a single sentence. 6.1 Coordinate Term Classification. Prior work for identifying coordinate terms includes automatic word sense clustering methods based on distributional similarity (e.g., [12, 14]) or on pattern-based techniques, specifically using the coordination pattern “X, Y, and Z” (e.g., [2]). We construct both types of classifier. First we construct a vector-space model similar to [12] using single MINIPAR dependency links as our distributional features. We use the normalized similarity score from this model for coordinate term classification. We evaluate this classifier on our hand-labeled test set, where of 5,387 total pairs, 131 are labeled as “coordinate”. For purposes of comparison we construct a series of classifiers from WordNet, which make the binary decision of determin...
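The context above builds a vector-space model over single MINIPAR dependency links and uses a normalized similarity score to decide whether two nouns are coordinate terms; a confident coordinate(ni, nj) judgment is then combined with an existing hyponym(nj, nk) link to raise the probability of hyponym(ni, nk). The sketch below uses plain cosine similarity over sparse dependency-link count vectors; the feature strings, the toy counts, and the 0.2 threshold are illustrative assumptions rather than the paper's exact features, normalization, or decision rule.

```python
# Sketch of a distributional coordinate-term scorer. Each noun is assumed to
# come with a Counter of {dependency-link feature: count}; cosine similarity
# and the 0.2 threshold are illustrative, not the paper's exact model.
import math
from collections import Counter

def cosine(u: Counter, v: Counter) -> float:
    """Normalized similarity between two sparse count vectors."""
    dot = sum(u[f] * v[f] for f in u.keys() & v.keys())
    norm = math.sqrt(sum(c * c for c in u.values())) * math.sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

def are_coordinate_terms(feats_i: Counter, feats_j: Counter, threshold: float = 0.2) -> bool:
    """Call two nouns coordinate terms if their dependency-link profiles are similar enough."""
    return cosine(feats_i, feats_j) >= threshold

# Hypothetical per-noun feature counts (MINIPAR-style dependency links):
dog = Counter({"N:subj:V/bark": 12, "N:obj:V/feed": 7, "A:mod:N/big": 5})
cat = Counter({"N:subj:V/meow": 9, "N:obj:V/feed": 6, "A:mod:N/big": 4})
print(are_coordinate_terms(dog, cat))
```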
342 | Learning surface text patterns for a question answering system.
- Ravichandran, Hovy
- 2002
Citation Context ... (context identical to the introduction excerpt quoted above under Hearst (1992); this reference is cited there among prior work on automatically learning taxonomic relations and constructing semantic hierarchies.)
309 | Discovery of Inference Rules for Question Answering.
- Lin, Pantel
- 2001
Citation Context ... 6 shows how this classifier can be improved by adding a new source of knowledge, coordinate terms. 2 Representing lexico-syntactic patterns with dependency paths The first goal of our work is to automatically identify lexico-syntactic patterns indicative of hypernymy. In order to do this, we need a representation space for expressing these patterns. We propose the use of dependency paths as a general-purpose formalization of the space of lexico-syntactic patterns. Dependency paths have been used successfully in the past to represent lexico-syntactic relations suitable for semantic processing [11]. A dependency parser produces a dependency tree that represents the syntactic relations between words by a list of edge tuples of the form: (word1,CATEGORY1:RELATION:CATEGORY2, word2). In this formulation each word is the stemmed form of the word or multi-word phrase (so that “authors” becomes “author”), and corresponds to a specific node in the dependency tree; each category is the part of speech label of the corresponding word (e.g., N for noun or PREP for preposition); and the relation is the directed syntactic relationship exhibited between word1 and word2 (e.g., OBJ for object, MOD for m... |
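The context above fixes the edge-tuple representation (word1, CATEGORY1:RELATION:CATEGORY2, word2) of a dependency parse, and the surrounding text defines the pattern space as shortest paths of at most four links between nouns. The hedged sketch below is one way to realize that with networkx; the toy edges loosely mimic the “such authors as Herrick and Shakespeare” example but are not actual MINIPAR output, and the slash-joined path-key format is an assumption of this sketch.

```python
# Sketch: store a dependency parse as labeled (word1, "CAT1:REL:CAT2", word2)
# edges and extract the shortest path between two nouns as a feature key.
# Edges below are illustrative, not real MINIPAR output.
import networkx as nx

edges = [
    ("author", "N:pre:PreDet", "such"),
    ("author", "N:mod:Prep", "as"),
    ("as", "Prep:pcomp-n:N", "Herrick"),
    ("as", "Prep:pcomp-n:N", "Shakespeare"),
]

def dependency_path(edges, noun_a, noun_b, max_links=4):
    """Shortest labeled path between two nouns, or None if absent or longer
    than max_links (the four-link cutoff mentioned in the text)."""
    graph = nx.Graph()
    for w1, label, w2 in edges:
        graph.add_edge(w1, w2, label=label)
    try:
        nodes = nx.shortest_path(graph, noun_a, noun_b)
    except (nx.NetworkXNoPath, nx.NodeNotFound):
        return None
    if len(nodes) - 1 > max_links:
        return None
    return "/".join(graph[a][b]["label"] for a, b in zip(nodes, nodes[1:]))

print(dependency_path(edges, "author", "Shakespeare"))  # "N:mod:Prep/Prep:pcomp-n:N"
```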
157 | Tackling the Poor Assumptions of Naive Bayes Text Classifiers.
- Rennie, Shih, et al.
- 2003
Citation Context ... vector for each such noun pair. Each entry of the 69,592-dimension vector represents a particular dependency path, and contains the total number of times that that path was the shortest path connecting that noun pair in some dependency tree in our corpus. We thus define as our task the binary classification of a noun pair as a hypernym pair based on its feature vector of dependency paths. We use the WordNet-labeled Known Hypernym / Known Non-Hypernym training set defined in Section 3. We train a number of classifiers on this data set, including multinomial Naive Bayes, complement Naive Bayes [16], and logistic regression. We perform model selection using 10-fold cross validation on this training set, evaluating each model based on its maximum F-Score averaged across all folds. The summary of average maximum F-scores is presented in Table 3, and the precision/recall plot of our best models is presented in Figure 3. For comparison, we evaluate two simple classifiers based on past work using only a handful of hand-engineered features; the first simply detects the presence of at least one of Hearst’s patterns, arguably the previous best classifier consisting only of lexico-syntactic patte... |
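The context above casts hypernym detection as binary classification of noun pairs from 69,592-dimensional sparse vectors of dependency-path counts, comparing multinomial Naive Bayes, complement Naive Bayes [16], and logistic regression, with model selection by maximum F-score under 10-fold cross-validation. The hedged scikit-learn sketch below mirrors that comparison; the random sparse matrix and labels are placeholders for the real path-count features, and for brevity it pools cross-validated probabilities before taking the maximum F-score rather than averaging per-fold maxima as the paper does.

```python
# Hedged sketch of the classifier comparison described above. X stands in for
# the sparse (pairs x dependency-path) count matrix and y for WordNet-derived
# hypernym / non-hypernym labels; here both are random placeholders.
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.naive_bayes import MultinomialNB, ComplementNB
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import precision_recall_curve

rng = np.random.default_rng(0)
X = sparse_random(2000, 500, density=0.01, format="csr", random_state=0)  # placeholder path counts
y = rng.integers(0, 2, size=2000)                                         # placeholder labels

def max_f_score(model, X, y):
    """Maximum F1 over the precision/recall curve of pooled 10-fold CV scores."""
    scores = cross_val_predict(model, X, y, cv=StratifiedKFold(10),
                               method="predict_proba")[:, 1]
    precision, recall, _ = precision_recall_curve(y, scores)
    f1 = 2 * precision * recall / np.maximum(precision + recall, 1e-12)
    return f1.max()

for name, model in [("multinomial NB", MultinomialNB()),
                    ("complement NB", ComplementNB()),
                    ("logistic regression", LogisticRegression(max_iter=1000))]:
    print(name, round(max_f_score(model, X, y), 3))
```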
136 | A corpus-based approach for building semantic lexicons
- Riloff, Shepherd
- 1997
Citation Context ... (context identical to the introduction excerpt quoted above under Hearst (1992); this reference is cited there among prior work on automatically learning taxonomic relations and constructing semantic hierarchies.)
82 | Automatically Labeling Semantic Classes.
- Pantel, Ravichandran
- 2004
Citation Context ... (context identical to the introduction excerpt quoted above under Hearst (1992); this reference is cited there among prior work on automatically learning taxonomic relations and constructing semantic hierarchies.)
77 | Learning Semantic Constraints for the Automatic Discovery of Part-Whole Relations.
- Girju, Badulescu, et al.
- 2003
Citation Context ... (context identical to the introduction excerpt quoted above under Hearst (1992); this reference is cited there among prior work on automatically learning taxonomic relations and constructing semantic hierarchies.)
71 | Combining independent modules to solve multiple-choice synonym and analogy problems.
- Turney, Littman, et al.
- 2003
Citation Context ... (context identical to the introduction excerpt quoted above under Hearst (1992); this reference is cited there among prior work on automatically learning taxonomic relations and constructing semantic hierarchies.)
68 | Customizing a lexicon to better suit a computational task.
- Hearst, Schutze
- 1993
Citation Context ... (context identical to the introduction excerpt quoted above under Hearst (1992); this reference is cited there among prior work on automatically learning taxonomic relations and constructing semantic hierarchies.)
48 | Unsupervised methods for developing taxonomies by combining syntactic and statistical information.
- Widdows
- 2003
Citation Context ... (context identical to the introduction excerpt quoted above under Hearst (1992); this reference is cited there among prior work on automatically learning taxonomic relations and constructing semantic hierarchies.)
47 | Using LSA and Noun Coordination Information to Improve the Precision and Recall of Automatic Hyponymy Extraction.
- Cederberg, Widdows
- 2003
Citation Context ... (context identical to the introduction excerpt quoted above under Hearst (1992); this reference is cited there among prior work on automatically learning taxonomic relations and constructing semantic hierarchies.)
43 | Automatic Acquisition of a Hypernym-Labeled Noun Hierarchy from Text.
- Caraballo
- 2001
Citation Context ... (context identical to the introduction excerpt quoted above under Hearst (1992); this reference is cited there among prior work on automatically learning taxonomic relations and constructing semantic hierarchies.)
43 | Supersense Tagging of Unknown Nouns in WordNet.
- Ciaramita, Johnson
- 2003
Citation Context ... (context identical to the introduction excerpt quoted above under Hearst (1992); this reference is cited there among prior work on automatically learning taxonomic relations and constructing semantic hierarchies.)
40 | The DARPA TIPSTER project.
- Harman
- 1992
Citation Context ... ratio of hypernym-to-non-hypernym pairs observed in our hand-labeled test set (discussed below). We evaluated our binary classifiers in two ways. For both sets of evaluations, our classifier was given a pair of nouns from an unseen sentence and had to make a hypernym vs. non-hypernym decision. In the first style of evaluation, we compared the performance of our classifiers against the Known Hypernym versus Known Non-Hypernym labels assigned by ... [Footnote 1] The corpus contains articles from the Associated Press, Wall Street Journal, and Los Angeles Times, drawn from the TIPSTER 1, 2, 3, and TREC 5 corpora [7]. Our most recent experiments (presented in Section 6) include articles from Wikipedia (a popular web encyclopedia), extracted with the help of Tero Karvinen’s Tero-dump software. [Footnote 2] We access WordNet 2.0 via Jason Rennie’s WordNet::QueryData interface. [Footnote 3] A noun sense is determined to be “frequently-used” if it occurs at least once in the sense-tagged Brown Corpus Semantic Concordance files (as reported in the cntlist file distributed as part of WordNet 2.0). This determination is made so as to reduce the number of false hypernym/hyponym classifications due to highly polysemous nouns (nouns which...
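Footnote 3 above keeps only “frequently-used” noun senses, i.e., senses with at least one occurrence in the sense-tagged Brown Corpus Semantic Concordance counts (WordNet's cntlist file). The hedged sketch below expresses that filter through NLTK, whose Lemma.count() exposes the same cntlist-derived counts; the paper accessed WordNet 2.0 via WordNet::QueryData, and NLTK bundles a later WordNet release, so exact counts may differ.

```python
# Hedged sketch of the "frequently-used sense" filter from footnote 3, using
# NLTK's WordNet interface (not the WordNet::QueryData module used in the paper).
# Lemma.count() reports the cntlist sense-tagged-corpus counts.
from nltk.corpus import wordnet as wn

def frequently_used_senses(noun: str):
    """Noun synsets of `noun` whose sense count for this word form is at least 1."""
    keep = []
    for synset in wn.synsets(noun, pos=wn.NOUN):
        counts = [lemma.count() for lemma in synset.lemmas()
                  if lemma.name().lower() == noun.lower()]
        if any(c >= 1 for c in counts):
            keep.append(synset)
    return keep

print(frequently_used_senses("dog"))  # typically retains the common animal sense(s) only
```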
36 | Clustering by Committee.
- Pantel
- 2003
Citation Context ... (context identical to the coordinate-term excerpt quoted above under Pereira, Tishby, et al. (1993); this reference is cited there among the distributional word-sense clustering methods used for coordinate term classification.)
28 | Dependency-based Evaluation of MINIPAR. Workshop on the Evaluation of Parsing Systems.
- Lin
- 1998
Citation Context ...bel of the corresponding word (e.g., N for noun or PREP for preposition); and the relation is the directed syntactic relationship exhibited between word1 and word2 (e.g., OBJ for object, MOD for modifier, or CONJ for conjunction), and corresponds to a specific link in the tree. We may then define our space of lexico-syntactic patterns to be all shortest paths of four links or less between any two nouns in a dependency tree. Figure 1 shows the partial dependency tree for the sentence fragment “...such authors as Herrick and Shakespeare” generated by the broad-coverage dependency parser MINIPAR [10].
[Figure 1: MINIPAR dependency tree example with transform; recoverable edge labels around “authors”: such (N:pre:PreDet), as (N:mod:Prep), Herrick (Prep:pcomp-n:N), Shakespeare (Prep:pcomp-n:N), and (N:punc:U), (N:conj:N)]
[Table 1: Dependency path repre... (caption truncated); rows:]
NPX and other NPY : (and,U:PUNC:N), -N:CONJ:N, (other,A:MOD:N)
NPX or other NPY : (or,U:PUNC:N), -N:CONJ:N, (other,A:MOD:N)
NPY such as NPX : N:PCOMP-N:PREP, such as, such as, PREP:MOD:N
Such NPY as NPX : N:PCOMP-N:PREP, as, as, PREP:MOD:N, (such,PREDET:PRE:N)
NPY including NPX : N:OBJ:V, include, include, V:I:C, dummy node, dummy node, C:REL:N
NPY , especially NPX : -N:APPO:N, (especially,A:APPO-MOD:N)
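Table 1 above pairs a handful of Hearst-style surface patterns with their dependency-path representations. One way such a table can be used is as a lookup implementing the hand-engineered baseline mentioned in the classifier context above (flag a noun pair whenever its extracted path matches one of Hearst's patterns). In the hedged sketch below the path keys are paraphrased, slash-joined encodings of the table rows, not verbatim MINIPAR path strings.

```python
# Sketch of a Hearst-pattern lookup baseline. Keys paraphrase Table 1's
# dependency-path representations (illustrative encodings, not exact MINIPAR
# path strings); values are the corresponding surface patterns.
HEARST_PATHS = {
    "(and,U:PUNC:N)/-N:CONJ:N/(other,A:MOD:N)":           "NPX and other NPY",
    "(or,U:PUNC:N)/-N:CONJ:N/(other,A:MOD:N)":            "NPX or other NPY",
    "N:PCOMP-N:PREP/such as/PREP:MOD:N":                  "NPY such as NPX",
    "N:PCOMP-N:PREP/as/PREP:MOD:N/(such,PREDET:PRE:N)":   "Such NPY as NPX",
    "N:OBJ:V/include/V:I:C/C:REL:N":                      "NPY including NPX",
    "-N:APPO:N/(especially,A:APPO-MOD:N)":                "NPY , especially NPX",
}

def hearst_baseline(path_key: str):
    """Return the matched Hearst pattern name, or None if the path is not one of them."""
    return HEARST_PATHS.get(path_key)

print(hearst_baseline("N:PCOMP-N:PREP/such as/PREP:MOD:N"))  # -> "NPY such as NPX"
```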
14 | Hierarchical Semantic Classification: Word Sense Disambiguation with World Knowledge.
- Ciaramita, Hofmann, et al.
- 2003
Citation Context ... (context identical to the introduction excerpt quoted above under Hearst (1992); this reference is cited there among prior work on automatically learning taxonomic relations and constructing semantic hierarchies.)
3 | Noun-phrase co-occurrence statistics for semi-automatic semantic lexicon construction
- Roark, Charniak
- 1998
Citation Context ... (context identical to the introduction excerpt quoted above under Hearst (1992); this reference is cited there among prior work on automatically learning taxonomic relations and constructing semantic hierarchies.)
2 | Semantic classification of unknown words in Chinese
- Tseng
- 2003
Citation Context ... (context identical to the introduction excerpt quoted above under Hearst (1992); this reference is cited there among prior work on automatically learning taxonomic relations and constructing semantic hierarchies.)