Results 1–10 of 12
Equations and rewrite rules: a survey
 In Formal Language Theory: Perspectives and Open Problems
, 1980
Extending the coverage of a CCG system
 Journal of Language and Computation
, 2004
Abstract

Cited by 15 (8 self)
ABSTRACT: We demonstrate ways to enhance the coverage of a symbolic NLP system through data-intensive and machine learning techniques, while preserving the advantages of using a principled symbolic grammar formalism. We automatically acquire a large syntactic CCG lexicon from the Penn Treebank and combine it with semantic and morphological information from another hand-built lexicon using decision tree and maximum entropy classifiers. We also integrate statistical preprocessing methods in our system.
Compositionality as an empirical problem
 In Chris Barker and Pauline Jacobson (eds.) Direct Compositionality
, 2007
Abstract

Cited by 14 (0 self)
Gottlob Frege (1892) is credited with the so-called “principle of compositionality”, also called “Frege’s Principle”, which one often hears expressed this way: Frege’s Principle (so-called) “The meaning of a sentence is a function of the meanings of the words in it and the way they are combined syntactically.” (Exactly how Frege himself understood “Frege’s Principle” is not our concern here; 1 rather, it is the understanding that this slogan has acquired in contemporary linguistics that we want to pursue, and this has little further to do with Frege.) But why should linguists care what compositionality is or whether natural languages “are compositional” or not? 2.1.1 An “Empirical Issue”? Often we hear that “compositionality is an empirical issue” (meaning the question whether natural language is compositional or not)—usually asserted as a preface to expressing skepticism about a “yes” answer. In the most general sense of Frege’s Principle, however, the fact that natural languages are compositional is beyond any serious doubt. Consider that:
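As an illustration of the principle itself (not of this paper's argument), a toy compositional semantics can be sketched: each word has a lexical meaning, and the meaning of a phrase is computed only from the meanings of its parts and how they combine. All names and the tiny fragment here are hypothetical, purely for illustration.

```python
# A toy illustration of compositionality (hypothetical mini-fragment):
# each word denotes a value or a function, and the meaning of a phrase
# is obtained purely by combining the meanings of its parts.

LEXICON = {
    "Ann":    "ann",                                  # an individual
    "sleeps": lambda x: f"sleeps({x})",               # one-place predicate
    "sees":   lambda y: lambda x: f"sees({x},{y})",   # two-place predicate
}

def meaning(tree):
    """Meaning of a binary tree: a word looks up its lexical meaning;
    a branching node applies one daughter's meaning to the other's."""
    if isinstance(tree, str):
        return LEXICON[tree]
    left, right = map(meaning, tree)
    return left(right) if callable(left) else right(left)

print(meaning(("Ann", "sleeps")))          # sleeps(ann)
print(meaning(("Ann", ("sees", "Ann"))))   # sees(ann,ann)
```

The point of the sketch is that `meaning` never inspects anything but the parts and their arrangement, which is exactly the content of the slogan.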
A Head-to-Head Comparison of de Bruijn Indices and Names
 In Proc. Int. Workshop on Logical Frameworks and Meta-Languages: Theory and Practice
, 2006
Abstract

Cited by 12 (1 self)
Often debates about the pros and cons of various techniques for formalising lambda-calculi rely on subjective arguments, such as “de Bruijn indices are hard to read for humans” or “nominal approaches come close to the style of reasoning employed in informal proofs”. In this paper we will compare four formalisations based on de Bruijn indices and on names from the nominal logic work, thus providing some hard facts about the pros and cons of these two formalisation techniques. We conclude that the relative merits of the different approaches, as usual, depend on what task one has at hand and which goals one pursues with a formalisation.
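To make the contrast concrete (this sketch is ours, not taken from the paper), here is the standard translation from named lambda terms to de Bruijn form, where each variable occurrence becomes the number of binders between it and its own binder:

```python
# Minimal named-term -> de Bruijn conversion (closed terms assumed):
# a variable's index is its 0-based distance to the binder that bound it.

def to_debruijn(term, env=()):
    """term: 'x' | ('lam', 'x', body) | ('app', f, a)  ->  de Bruijn form."""
    if isinstance(term, str):                       # variable occurrence:
        return ('var', env.index(term))             # distance to its binder
    if term[0] == 'lam':                            # binder: push the name,
        return ('lam', to_debruijn(term[2], (term[1],) + env))  # drop it
    return ('app', to_debruijn(term[1], env), to_debruijn(term[2], env))

# \x. \y. x y   becomes   \. \. 1 0
k = ('lam', 'x', ('lam', 'y', ('app', 'x', 'y')))
print(to_debruijn(k))   # ('lam', ('lam', ('app', ('var', 1), ('var', 0))))
```

The readability complaint in the abstract is visible even here: `1 0` carries no hint of which binder is which, while the named form does.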
A Semantics for Static Type Inference
 Information and Computation
, 1993
Abstract

Cited by 9 (0 self)
Curry's system for F-deducibility is the basis for static type inference algorithms for programming languages such as ML. If a natural "preservation of types by conversion" rule is added to Curry's system, it becomes undecidable, but complete relative to a variety of model classes. We show completeness for Curry's system itself, relative to an extended notion of model that validates reduction but not conversion.
Providing Robustness for a CCG System
 In Proceedings of the Workshop on Linguistic Theory and Grammar Implementation, ESSLLI
, 2000
Abstract

Cited by 9 (3 self)
We demonstrate ways to preserve the advantages of using a symbolic grammar formalism as the basis of an NLP system while enhancing its robustness. We automatically acquire a CCG lexicon, combine it with semantic and morphological information from another hand-built, underspecified lexicon, and integrate it with statistical preprocessing methods.
Soundness and Principal Contexts for a Shallow Polymorphic Type System based on Classical Logic
Abstract

Cited by 1 (0 self)
In this paper we investigate how to adapt the well-known notion of ML-style polymorphism (shallow polymorphism) to a term calculus based on a Curry-Howard correspondence with classical sequent calculus, namely, the X i-calculus. We show that the intuitive approach is unsound, and pinpoint the precise nature of the problem. We define a suitably refined type system, and prove its soundness. We then define a notion of principal contexts for the type system, and provide an algorithm to compute these, which is proved to be sound and complete with respect to the type system. In the process, we formalise and prove the correctness of generic unification, which generalises Robinson’s unification to shallow-polymorphic types. Key words: Curry-Howard, classical logic, generic unification, principal types, cut elimination
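For reference, the classical algorithm that this paper's "generic unification" generalises is Robinson-style first-order unification. The sketch below uses our own representation (string variables, tuple-encoded type constructors), not the paper's; it is background, not the paper's algorithm:

```python
# Robinson-style first-order unification over simple type terms.
# Variables are strings; constructed types are tuples ('ctor', arg1, ...).

def walk(t, s):
    """Chase a variable through the substitution s."""
    while isinstance(t, str) and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    """Occurs check: does variable v appear in term t under s?"""
    t = walk(t, s)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, s) for a in t[1:])

def unify(t1, t2, s=None):
    """Return a most general unifier of t1 and t2, or None on failure."""
    s = dict(s or {})
    t1, t2 = walk(t1, s), walk(t2, s)
    if t1 == t2:
        return s
    if isinstance(t1, str):
        return None if occurs(t1, t2, s) else {**s, t1: t2}
    if isinstance(t2, str):
        return unify(t2, t1, s)
    if t1[0] != t2[0] or len(t1) != len(t2):        # constructor clash
        return None
    for a, b in zip(t1[1:], t2[1:]):                # decompose arguments
        s = unify(a, b, s)
        if s is None:
            return None
    return s

# unify  a -> b   with   int -> (int -> int)
print(unify(('->', 'a', 'b'), ('->', ('int',), ('->', ('int',), ('int',)))))
```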
A Functional Formulation of First-Order Logic "with Infinity" without Bound Variables
Abstract
We present a system of combinatory logic EFT (for "external function theory") equivalent to first-order logic with equality with the additional axioms "there are at least n objects" for each concrete n. This work was inspired by the system of Tarski and Givant, based on relation algebras, which they show to be able to interpret first-order theories while avoiding the use of bound variables. The Tarski-Givant system, like other systems which have been proposed to demonstrate that first-order logic can be done without bound variables (Quine's predicate functor logic, for example), would be quite difficult to use in practice. We believe that EFT is unlike the earlier proposals in that (with extensions) it is a practical medium for mathematical reasoning. We have written software which implements reasoning in EFT, and our experience with it so far supports this claim (the assistance of the software in carrying out certain tedious forms of reasoning automatically is particularly helpful).
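EFT itself is not reproduced here; as background on eliminating bound variables, the classical device is Schönfinkel/Curry bracket abstraction, which rewrites λx.M into a pure S/K combinator term. A minimal sketch of that standard translation (not the EFT system):

```python
# Classical S/K bracket abstraction: compute [x]t, a combinator term
# such that ([x]t) x reduces to t, with no bound variables anywhere.
# Terms: strings (atoms) or 2-tuples (applications).

def free_in(x, t):
    """Does atom x occur in term t?"""
    return t == x or (isinstance(t, tuple) and any(free_in(x, a) for a in t))

def abstract(x, t):
    if t == x:
        return (('S', 'K'), 'K')        # I = S K K
    if not free_in(x, t):
        return ('K', t)                 # [x]t = K t  when x not free in t
    f, a = t                            # application case:
    return (('S', abstract(x, f)), abstract(x, a))   # [x](f a) = S [x]f [x]a

# \x. f x   becomes   S (K f) (S K K)
print(abstract('x', ('f', 'x')))   # (('S', ('K', 'f')), (('S', 'K'), 'K'))
```

The rapid growth of such translations is one reason the abstract calls the earlier variable-free systems "quite difficult to use in practice".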
Combinatory Logics for Lambda Calculi with Patterns
Abstract
We propose a combinatory logic system (CL) [16, 6–9] for a λ-calculus with patterns, obtaining a consistent extension of classical CL. Our goal is to find an appropriate bridge between the two formalisms, and take advantage of some of the positive aspects of each. To our knowledge, this is the first formulation of such a calculus. We use as a starting point the λP-calculus [13, 17], with the following syntax for its set of terms: M, N, P ::= x | (M N) | (λP.M), where x ranges over a given denumerably infinite set of variables X, and the following reduction rule: (λP.M) Pσ →βP Mσ, where σ is a substitution instance of the pattern P. The Rigid Pattern Condition (RPC) and its more syntactic variant RPC+ have been defined by the authors of λP in order to ensure confluence. According to RPC+, all patterns must be linear, classical λ-terms with no active variables, and also normal forms. As in classical CL, our system CLP eliminates abstractions and bound variables, while allowing functions to impose restrictions over their arguments through pattern matching, in the same spirit as in λP.
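The βP step above can be sketched operationally (our own encoding, not the paper's CLP): matching a linear pattern P against the argument produces a substitution σ, which is then applied to the body M.

```python
# Sketch of pattern-matching beta reduction: match a linear pattern
# against an argument to obtain a substitution, then apply it to the body.
# Lowercase strings are pattern variables; other atoms are constants.

def match(pattern, arg, sigma=None):
    """Match a linear pattern against arg; return the substitution or None."""
    sigma = dict(sigma or {})
    if isinstance(pattern, str) and pattern.islower():   # pattern variable
        sigma[pattern] = arg
        return sigma
    if isinstance(pattern, tuple) and isinstance(arg, tuple) \
            and len(pattern) == len(arg):                # application node
        for p, a in zip(pattern, arg):
            sigma = match(p, a, sigma)
            if sigma is None:
                return None
        return sigma
    return sigma if pattern == arg else None             # constant head

def subst(term, sigma):
    """Apply the substitution sigma to a term."""
    if isinstance(term, str):
        return sigma.get(term, term)
    return tuple(subst(t, sigma) for t in term)

# (\ (Pair x y). x) (Pair 1 2)  reduces to  1
sigma = match(('Pair', 'x', 'y'), ('Pair', 1, 2))
print(subst('x', sigma))   # 1
```

Linearity (each pattern variable occurring once) is what lets `match` bind each variable without a consistency check, mirroring the RPC+ requirement quoted above.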
Basic Relevant Theories for Combinators at Levels I and II
 Koushik Pal
, 2005
Abstract
Abstract: The system B+ is the minimal positive relevant logic. B+ is trivially extended to B+T on adding a greatest truth (Church constant) T. If we leave ∨ out of the formation apparatus, we get the fragment B∧T. It is known that the set of all B∧T theories provides a good model for the combinators CL at Level I, which is the theory level. Restoring ∨ to get back B+T was not previously fruitful at Level I, because the set of all B+T theories is not a model of CL. It was to be expected from semantic completeness arguments for relevant logics that basic combinator laws would hold when restricted to prime B+T theories. Overcoming some previous difficulties, we show that this is the case, at Level I. But this does not form a model for CL. This paper also looks for corresponding results at Level II, where we deal with sets of theories that we call propositions. We adapt work by Ghilezan to note that at Level II also there is a model of CL in B∧T propositions. However, the corresponding result for B+T propositions extends smoothly to Level II only in part. Specifically, only ...