Results 1–10 of 21
Equations and rewrite rules: a survey
 In Formal Language Theory: Perspectives and Open Problems, 1980
Compositionality as an empirical problem
 In Chris Barker and Pauline Jacobson (eds.), Direct Compositionality, 2007
Abstract

Cited by 26 (0 self)
Gottlob Frege (1892) is credited with the so-called “principle of compositionality”, also called “Frege’s Principle”, which one often hears expressed this way: Frege’s Principle (so-called): “The meaning of a sentence is a function of the meanings of the words in it and the way they are combined syntactically.” (Exactly how Frege himself understood “Frege’s Principle” is not our concern here; rather, it is the understanding that this slogan has acquired in contemporary linguistics that we want to pursue, and this has little further to do with Frege.) But why should linguists care what compositionality is or whether natural languages “are compositional” or not? 2.1.1 An “Empirical Issue”? Often we hear that “compositionality is an empirical issue” (meaning the question whether natural language is compositional or not), usually asserted as a preface to expressing skepticism about a “yes” answer. In the most general sense of Frege’s Principle, however, the fact that natural languages are compositional is beyond any serious doubt. Consider that:
Extending the coverage of a CCG system
 Journal of Language and Computation, 2004
Abstract

Cited by 17 (8 self)
We demonstrate ways to enhance the coverage of a symbolic NLP system through data-intensive and machine learning techniques, while preserving the advantages of using a principled symbolic grammar formalism. We automatically acquire a large syntactic CCG lexicon from the Penn Treebank and combine it with semantic and morphological information from another hand-built lexicon using decision tree and maximum entropy classifiers. We also integrate statistical preprocessing methods in our system.
A Head-to-Head Comparison of de Bruijn Indices and Names
 In Proc. Int. Workshop on Logical Frameworks and Metalanguages: Theory and Practice, 2006
Abstract

Cited by 16 (1 self)
Debates about the pros and cons of various techniques for formalising lambda-calculi often rely on subjective arguments, such as “de Bruijn indices are hard to read for humans” or “nominal approaches come close to the style of reasoning employed in informal proofs”. In this paper we compare four formalisations based on de Bruijn indices and on names from the nominal logic work, thus providing some hard facts about the pros and cons of these two formalisation techniques. We conclude that the relative merits of the different approaches, as usual, depend on what task one has at hand and which goals one pursues with a formalisation.
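The two representations being compared can be illustrated with a minimal sketch (not taken from the paper; all names here are illustrative): in a named representation binders carry explicit variable names, while in de Bruijn form a variable is the number of binders between its occurrence and the lambda that binds it.

```python
from dataclasses import dataclass

# --- Named representation: binders carry explicit variable names. ---
@dataclass(frozen=True)
class NVar:
    name: str

@dataclass(frozen=True)
class NLam:
    param: str
    body: object

@dataclass(frozen=True)
class NApp:
    fn: object
    arg: object

# --- de Bruijn representation: a variable is the count of binders ---
# --- between its occurrence and the lambda that binds it.         ---
@dataclass(frozen=True)
class Var:
    index: int

@dataclass(frozen=True)
class Lam:
    body: object

@dataclass(frozen=True)
class App:
    fn: object
    arg: object

def to_db(term, ctx=()):
    """Convert a named term to de Bruijn form, threading the binder context."""
    if isinstance(term, NVar):
        return Var(ctx.index(term.name))          # distance to the binder
    if isinstance(term, NLam):
        return Lam(to_db(term.body, (term.param,) + ctx))
    return App(to_db(term.fn, ctx), to_db(term.arg, ctx))

# \x. \y. x  becomes  Lam(body=Lam(body=Var(index=1)))
print(to_db(NLam("x", NLam("y", NVar("x")))))
```

Alpha-equivalent named terms (e.g. λx.x and λy.y) map to the same de Bruijn term, which is the main formal advantage of indices; the cost is readability, exactly the trade-off the paper quantifies.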
A Semantics for Static Type Inference
 Information and Computation, 1993
Abstract

Cited by 11 (0 self)
Curry's system for F-deducibility is the basis for static type inference algorithms for programming languages such as ML. If a natural "preservation of types by conversion" rule is added to Curry's system, it becomes undecidable, but complete relative to a variety of model classes. We show completeness for Curry's system itself, relative to an extended notion of model that validates reduction but not conversion.
Providing Robustness for a CCG System
 In Proceedings of the Workshop on Linguistic Theory and Grammar Implementation, ESSLLI, 2000
Abstract

Cited by 9 (3 self)
We demonstrate ways to preserve the advantages of using a symbolic grammar formalism as the basis of an NLP system while enhancing its robustness. We automatically acquire a CCG lexicon, combine it with semantic and morphological information from another hand-built, underspecified lexicon, and integrate it with statistical preprocessing methods.
Soundness and Principal Contexts for a Shallow Polymorphic Type System based on Classical Logic
Abstract

Cited by 1 (0 self)
In this paper we investigate how to adapt the well-known notion of ML-style polymorphism (shallow polymorphism) to a term calculus based on a Curry-Howard correspondence with classical sequent calculus, namely the X i-calculus. We show that the intuitive approach is unsound, and pinpoint the precise nature of the problem. We define a suitably refined type system, and prove its soundness. We then define a notion of principal contexts for the type system, and provide an algorithm to compute these, which is proved to be sound and complete with respect to the type system. In the process, we formalise and prove correctness of generic unification, which generalises Robinson’s unification to shallow-polymorphic types. Key words: Curry-Howard, classical logic, generic unification, principal types, cut elimination
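For reference, the Robinson-style first-order unification that the paper generalises can be sketched as follows over simple types. This is a generic illustration, not the paper's algorithm; the names (`TVar`, `Arrow`, `unify`) are invented for the sketch.

```python
class TVar:
    """A type variable."""
    def __init__(self, name):
        self.name = name

class Arrow:
    """A function type dom -> cod."""
    def __init__(self, dom, cod):
        self.dom, self.cod = dom, cod

def walk(t, subst):
    """Chase substitution bindings for a type variable."""
    while isinstance(t, TVar) and t.name in subst:
        t = subst[t.name]
    return t

def occurs(v, t, subst):
    """Occurs check: does variable v appear in type t under subst?"""
    t = walk(t, subst)
    if isinstance(t, TVar):
        return t.name == v.name
    return occurs(v, t.dom, subst) or occurs(v, t.cod, subst)

def unify(a, b, subst=None):
    """Return a most general unifier of a and b, or None if none exists."""
    subst = dict(subst or {})
    a, b = walk(a, subst), walk(b, subst)
    if isinstance(a, TVar) and isinstance(b, TVar) and a.name == b.name:
        return subst
    if isinstance(a, TVar):
        if occurs(a, b, subst):     # occurs check rules out infinite types
            return None
        subst[a.name] = b
        return subst
    if isinstance(b, TVar):
        return unify(b, a, subst)
    s = unify(a.dom, b.dom, subst)  # both are arrows: unify componentwise
    return None if s is None else unify(a.cod, b.cod, s)

# unify  a -> b  with  (c -> c) -> c :  binds a := c -> c  and  b := c
mgu = unify(Arrow(TVar("a"), TVar("b")),
            Arrow(Arrow(TVar("c"), TVar("c")), TVar("c")))
```

The paper's contribution lies in extending such unification so that it respects the variable-generalisation discipline of shallow polymorphism, which plain Robinson unification knows nothing about.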
Basic Relevant Theories for Combinators at Levels I and II
Koushik Pal, 2005
Abstract
The system B+ is the minimal positive relevant logic. B+ is trivially extended to B+T on adding a greatest truth (Church constant) T. If we leave ∨ out of the formation apparatus, we get the fragment B∧T. It is known that the set of all B∧T theories provides a good model for the combinators CL at Level I, which is the theory level. Restoring ∨ to get back B+T was not previously fruitful at Level I, because the set of all B+T theories is not a model of CL. It was to be expected from semantic completeness arguments for relevant logics that basic combinator laws would hold when restricted to prime B+T theories. Overcoming some previous difficulties, we show that this is the case, at Level I. But this does not form a model for CL. This paper also looks for corresponding results at Level II, where we deal with sets of theories that we call propositions. We adapt work by Ghilezan to note that at Level II also there is a model of CL in B∧T propositions. However, the corresponding result for B+T propositions extends smoothly to Level II only in part. Specifically, only
Combinatory Logics for Lambda Calculi with Patterns
Abstract
We propose a combinatory logic system (CL) [16, 6–9] for a λ-calculus with patterns, obtaining a consistent extension of classical CL. Our goal is to find an appropriate bridge between the two formalisms, and take advantage of some of the positive aspects of each. To our knowledge, this is the first formulation of such a calculus. We use as a starting point the λP calculus [13, 17], with the following syntax for its set of terms: M, N, P ::= x | (MN) | (λP.M), where x ranges over a given denumerably infinite set of variables X, and the following reduction rule: (λP.M)Pσ →βP Mσ. The Rigid Pattern Condition (RPC) and its more syntactic variant RPC+ have been defined by the authors of λP in order to ensure confluence. According to RPC+, all patterns must be linear, classical λ-terms with no active variables, and also normal forms. As in classical CL, our system CLP eliminates abstractions and bound variables, while allowing functions to impose restrictions over their arguments through pattern matching, in the same spirit as in λP.
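The →βP rule quoted in the abstract fires only when the argument matches the pattern, yielding a substitution σ that is then applied to the body. A toy sketch of that idea (assumed for illustration, not the paper's CLP system; patterns here are just variables, constants, and tuples, linear as RPC requires):

```python
def match(pattern, term, subst=None):
    """Match term against pattern; return the substitution sigma, or None."""
    subst = subst if subst is not None else {}
    if isinstance(pattern, str):               # pattern variable
        subst[pattern] = term                  # linearity: bound exactly once
        return subst
    if isinstance(pattern, tuple) and isinstance(term, tuple) \
            and len(pattern) == len(term):
        for p, t in zip(pattern, term):        # match componentwise
            if match(p, t, subst) is None:
                return None
        return subst
    return subst if pattern == term else None  # constants must agree

def subst_term(term, sigma):
    """Apply a substitution to a term (variables are strings)."""
    if isinstance(term, str):
        return sigma.get(term, term)
    if isinstance(term, tuple):
        return tuple(subst_term(t, sigma) for t in term)
    return term

def beta_p(pattern, body, arg):
    """One beta-P step: (lambda P. M) N reduces to M sigma when N matches P."""
    sigma = match(pattern, arg)
    return None if sigma is None else subst_term(body, sigma)

# (lambda (x, y). (y, x)) applied to (1, 2) reduces to (2, 1)
print(beta_p(("x", "y"), ("y", "x"), (1, 2)))   # prints (2, 1)
```

When the argument fails to match the pattern, `beta_p` returns `None`, mirroring the fact that →βP simply does not apply to that redex.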
Wide-Coverage CCG . . .
, 2010
Abstract
This dissertation presents the development of a wide-coverage semantic parser capable of handling quantifier scope ambiguities in a novel way. In contrast with traditional approaches that deliver an underspecified representation and focus on enumerating the possible readings “offline” after the end of the syntactic analysis, our parser handles the ambiguities during the derivation using a semantic device known as a generalized Skolem term. This approach combines most of the benefits of the existing methods and provides solutions to their deficiencies in a natural way. Furthermore, this takes place in the context of the grammar itself, without resorting to ad hoc complex mechanisms. As a grammar formalism for this work we use Combinatory Categorial Grammar (CCG), exploiting its lexicalized nature and the surface-compositional semantics that it provides. The logical forms are represented in first-order logic, using λ-calculus as a “glue” language, in the tradition of Montague. We base our parser on the OpenCCG framework, and we augment it by applying a well-established supertagger and by developing a head-driven probabilistic model. Our model is trained on CCGbank, a CCG version of the Penn Treebank. For the semantic component we develop a Java library