Results 1–10 of 12
Without a ‘doubt’? Unsupervised discovery of downward-entailing operators
 In Proceedings of NAACL HLT
, 2009
Abstract

Cited by 13 (2 self)
An important part of textual inference is making deductions involving monotonicity, that is, determining whether a given assertion entails restrictions or relaxations of that assertion. For instance, the statement ‘We know the epidemic spread quickly’ does not entail ‘We know the epidemic spread quickly via fleas’, but ‘We doubt the epidemic spread quickly’ entails ‘We doubt the epidemic spread quickly via fleas’. Here, we present the first algorithm for the challenging lexical-semantics problem of learning linguistic constructions that, like ‘doubt’, are downward entailing (DE). Our algorithm is unsupervised, resource-lean, and effective, accurately recovering many DE operators that are missing from the hand-constructed lists that textual-inference systems currently use.
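The entailment flip this abstract describes can be made concrete with a toy sketch. The mini-lexicon and the helper function below are hypothetical illustrations of the DE pattern, not the paper's discovery algorithm:

```python
# Toy illustration of downward entailment; the lexicon and helper are
# hypothetical, not the paper's unsupervised algorithm.

# True = downward entailing (DE), False = upward entailing.
OPERATORS = {"doubt": True, "deny": True, "know": False, "believe": False}

def entails_restriction(operator: str) -> bool:
    """Does 'OP <S>' entail 'OP <S restricted>'?

    Under a DE operator like 'doubt', the entailment from the general
    statement to the restricted one goes through; under an upward
    operator like 'know', it does not.
    """
    return OPERATORS[operator]

# 'We doubt the epidemic spread quickly' entails
# 'We doubt the epidemic spread quickly via fleas':
assert entails_restriction("doubt")
# 'We know the epidemic spread quickly' does not entail the restricted version:
assert not entails_restriction("know")
```

The paper's contribution is learning which entries of such a lexicon should be marked DE, without supervision.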
Monotonic Reasoning from a Proof-Theoretic Perspective
, 1999
Abstract

Cited by 7 (3 self)
The article presents the first results we have obtained studying natural reasoning from a proof-theoretic perspective. In particular, we focus our attention on monotonicity reasoning: inferences are made using structurally parsed sentences on which monotonic positions are displayed. The monotonicity markers are propagated through the proofs via the combined structural and logical rules for the unary operators of Multimodal Categorial Grammar (MMCG). We have chosen to work with such an expressive ‘grammar logic’ in order to avoid both the use of extra-logical marking devices as in [SV91] and an overly complex lexicon [Dow94].
Studies on Polarity Sensitivity
, 1996
Abstract

Cited by 7 (1 self)
The aim of this thesis is to investigate the linguistic phenomenon of polarity sensitivity. It is motivated by the belief that the complexity of the phenomenon requires a more articulated analysis than the standard one based on licensing conditions. Traditionally, the term ‘polarity sensitive’ is used to identify items whose distribution is affected by the positivity or negativity of the context of occurrence. The notion of negative context covers more than environments containing overt negation or negative quantifiers. Elements that induce a negative context are potential licensers for negative polarity sensitive items. The phenomenon of polarity sensitivity has been approached from a variety of perspectives in the literature. The cluster of data associated with it raises important semantic and syntactic questions. There is little agreement on the definition of the pertinent notion of negativity. Sensitive elements show meaning variations when taken in isolation or in con...
The Categorial Fine-Structure of Natural Language
, 2003
Abstract

Cited by 3 (1 self)
Categorial grammar analyzes linguistic syntax and semantics in terms of type theory and lambda calculus. A major attraction of this approach is its unifying power, as its basic function/argument structures occur across the foundations of mathematics, language and computation. This paper considers, in a light example-based manner, where this elegant logical paradigm stands when confronted with the wear and tear of reality. Starting from a brief history of the Lambek tradition since the 1980s, we discuss three main issues: (a) the fit of the lambda calculus engine to characteristic semantic structures in natural language, (b) the coexistence of the original type-theoretic and more recent modal interpretations of categorial logics, and (c) the place of categorial grammars in the complex total architecture of natural language, which involves, amongst others, mixtures of interpretation and inference.
Generalized Syllogistic Inference System based on Inclusion and Exclusion Relations (Extended Abstract)
Abstract

Cited by 2 (1 self)
Entailment relations are of central importance in the enterprise of natural language semantics. In modern logic, entailment relations are characterized from two viewpoints: the model-theoretic and the proof-theoretic. By contrast, most approaches to formalizing entailment relations in natural languages have been based solely on model-theoretic conceptions. Thus, ...
Surface Structure Constraints on Negative Polarity and Word Order in Hindi and English
 In Logic, Language and Information (ESSLLI ’99)
, 1999
Abstract

Cited by 1 (0 self)
Hindi negative polarity items (NPIs), unlike English NPIs, can occur in subject position, outside the scope of negation at s-structure. I present an analysis for this phenomenon set in categorial grammar, where licensing occurs at s-structure in both English and Hindi. This treatment has the following advantages: it is monostratal and strictly lexical; NPI licensing is driven by monotonicity marking on licensors and NPIs rather than by c-command, which extends the empirical coverage of our analysis to NPI licensors that contain no overt negation; and the correct semantics is built up compositionally due to the Curry-Howard correspondence.
Unsupervised Learning of Semantic Relation Composition
Abstract

Cited by 1 (1 self)
This paper presents an unsupervised method for deriving inference axioms by composing semantic relations. The method is independent of any particular relation inventory. It relies on describing semantic relations using primitives and manipulating these primitives according to an algebra. The method was tested using a set of eight semantic relations, yielding 78 inference axioms, which were evaluated over PropBank.
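The idea of turning relation compositions into inference axioms can be sketched as follows. The relation names and the composition table here are illustrative assumptions, not the primitives or axioms from the paper:

```python
# Toy sketch of relation composition; the relations and the rule table
# are hypothetical, not the paper's primitive-based algebra.

# Axioms of the form R1 o R2 -> R3:
# if x R1 y and y R2 z, then infer x R3 z.
COMPOSE = {
    ("PART_OF", "LOCATED_IN"): "LOCATED_IN",
    ("PART_OF", "PART_OF"): "PART_OF",
}

def infer(fact1, fact2):
    """Compose (x, R1, y) with (y, R2, z) into (x, R3, z), if an axiom applies."""
    x, r1, y1 = fact1
    y2, r2, z = fact2
    if y1 != y2:
        return None  # the facts do not chain
    r3 = COMPOSE.get((r1, r2))
    return (x, r3, z) if r3 else None

assert infer(("engine", "PART_OF", "car"),
             ("car", "LOCATED_IN", "garage")) == ("engine", "LOCATED_IN", "garage")
```

In the paper, such R1 o R2 -> R3 entries are not hand-written but derived by manipulating the primitive descriptions of R1 and R2.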
Reasoning With Quantifiers
 Cognition
, 2003
Abstract
In the semantics of natural language, quantification may have received more attention than any other subject, and one of the main topics in psychological studies on deductive reasoning is syllogistic inference, which is just a restricted form of reasoning with quantifiers. But thus far the semantic and psychological enterprises have remained disconnected. This paper aims to show how our understanding of syllogistic reasoning may benefit from semantic research on quantification. I present a very simple logic that pivots on the monotonicity properties of quantified statements, properties that are known to be crucial not only to quantification but to a much wider range of semantic phenomena. This logic is shown to account for the experimental evidence available in the literature as well as for the data from a new experiment with cardinal quantifiers ("at least n" and "at most n"), which cannot be explained by any other theory of syllogistic reasoning. © 2002 Elsevier Science B.V. All rights reserved.
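The monotonicity properties of the two cardinal quantifiers can be checked on small finite models. The set-based reading of "at least n" and "at most n" below is the standard one, but the code itself is only an illustrative sketch, not taken from the paper:

```python
# Finite-model check of quantifier monotonicity (illustrative sketch).

def at_least(n, A, B):
    # "At least n A are B" over finite sets A and B.
    return len(A & B) >= n

def at_most(n, A, B):
    # "At most n A are B" over finite sets A and B.
    return len(A & B) <= n

A = {1, 2, 3, 4}
B = {1, 2}          # a predicate
B_big = {1, 2, 3}   # a superset of B

# "At least n" is upward monotone in B: truth survives enlarging B.
assert at_least(2, A, B) and at_least(2, A, B_big)
# "At most n" is downward monotone in B: truth survives shrinking B.
assert at_most(3, A, B_big) and at_most(3, A, B)
```

These opposite monotonicity directions are exactly what lets a monotonicity-based logic predict different inference patterns for the two quantifiers.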
Natural Language Reasoning: A Proof-Theoretic Perspective
Abstract
Project summary, background assumptions. An interesting area of research which brings together linguists and logicians is the investigation of natural language as a vehicle for reasoning. Work in this area provides clear evidence for the strong links that exist between syntax and semantics in natural language. To draw inferences in natural reasoning, in fact, both syntactic and semantic information are needed. I would like to study the interplay between these two aspects, and investigate the laws involved in the derivation of inferences in natural language from a logical/mathematical point of view. I will start from the assumption that to understand natural reasoning it is not necessary to use the intermediate step of a ‘formal’ language into which linguistic expressions are first translated. Human beings reason without using this artifact. Since I am interested in formulating mechanisms of reasoning that model human cognitive activities, they should be able to work on s...