Results 1–10 of 92
A rational analysis of the selection task as optimal data selection
 Psychological Review, 1994
Cited by 160 (8 self)
Abstract:
Human reasoning in hypothesis-testing tasks like Wason's (1966, 1968) selection task has been depicted as prone to systematic biases. However, performance on this task has been assessed against a now outmoded falsificationist philosophy of science. Therefore, the experimental data is reassessed in the light of a Bayesian model of optimal data selection in inductive hypothesis testing. The model provides a rational analysis (Anderson, 1990) of the selection task that fits well with people's performance on both abstract and thematic versions of the task. The model suggests that reasoning in these tasks may be rational rather than subject to systematic bias. Over the past 30 years, results in the psychology of reasoning have raised doubts about human rationality. The assumption of human rationality has a long history. Aristotle took the capacity for rational thought to be the defining characteristic of human beings, the capacity that separated us from the animals. Descartes regarded the ability to use language and to reason as the hallmarks of the mental that separated it from the merely physical. Many contemporary philosophers of mind also appeal to a basic principle of rationality in accounting for everyday, folk psychological explanation whereby we explain each other's behavior in terms of our beliefs and desires (Cherniak, 1986; Cohen, 1981; Davidson, 1984; Dennett, 1987; but see Stich, 1990). These philosophers, both ancient and modern, share a common view of rationality: To be rational is to reason according to rules (Brown, 1989). Logic and mathematics provide the normative rules that tell us how we should reason. Rationality therefore seems to demand that the human cognitive system embodies the rules of logic and mathematics. However, results in the psychology of reasoning appear to show that people do not reason according to these rules. In both deductive (Evans, 1982, 1989;
Conditionals: A theory of meaning, pragmatics, and inference
 Psychological Review, 2002
Cited by 73 (26 self)
Abstract:
The authors outline a theory of conditionals of the form If A then C and If A then possibly C. The 2 sorts of conditional have separate core meanings that refer to sets of possibilities. Knowledge, pragmatics, and semantics can modulate these meanings. Modulation can add information about temporal and other relations between antecedent and consequent. It can also prevent the construction of possibilities to yield 10 distinct sets of possibilities to which conditionals can refer. The mental representation of a conditional normally makes explicit only the possibilities in which its antecedent is true, yielding other possibilities implicitly. Reasoners tend to focus on the explicit possibilities. The theory predicts the major phenomena of understanding and reasoning with conditionals. You reason about conditional relations because much of your knowledge is conditional. If you get caught speeding, then you pay a fine. If you have an operation, then you need time to recuperate. If you have money in the bank, then you can cash a check. Conditional reasoning is a central part of thinking, yet people do not always reason correctly. The lawyer Jan Schlictmann in a celebrated trial (see Harr, 1995, pp. 361–362) elicited the following information from an expert witness about the source of a chemical pollutant trichloroethylene (TCE):
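The sets of possibilities the theory refers to can be enumerated directly. A minimal sketch (my own illustration, not the authors' notation): the basic conditional "If A then C" is consistent with three truth-value possibilities, and on the mental-model account only the one with a true antecedent is represented explicitly.

```python
from itertools import product

# All truth-value pairs (A, C) consistent with the basic conditional
# "If A then C" (i.e., excluding A true with C false).
possibilities = [(a, c) for a, c in product([True, False], repeat=2)
                 if (not a) or c]

# Per the theory, only possibilities with a true antecedent are
# represented explicitly; the rest are held implicitly.
explicit = [(a, c) for (a, c) in possibilities if a]
implicit = [(a, c) for (a, c) in possibilities if not a]

print(possibilities)  # [(True, True), (False, True), (False, False)]
print(explicit)       # [(True, True)]
```

The prediction that reasoners focus on the explicit possibilities then amounts to reasoning as if `explicit` were the whole set.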
Rational explanation of the selection task
 Psychological Review, 1996
Cited by 46 (4 self)
Abstract:
M. Oaksford and N. Chater (O&C; 1994) presented the first quantitative model of P. C. Wason's (1966, 1968) selection task in which performance is rational. J. St. B. T. Evans and D. E. Over (1996) reply that O&C's account is normatively incorrect and cannot model K. N. Kirby's (1994b) or P. Pollard and J. St. B. T. Evans's (1983) data. It is argued that an equivalent measure satisfies their normative concerns and that a modification of O&C's model accounts for their empirical concerns. D. Laming (1996) argues that O&C made unjustifiable psychological assumptions and that a "correct" Bayesian analysis agrees with logic. It is argued that O&C's model makes normative and psychological sense and that Laming's analysis is not Bayesian. A. Almor and S. A. Sloman (1996) argue that O&C cannot explain their data. It is argued that Almor and Sloman's data do not bear on O&C's model because they alter the nature of the task. It is concluded that O&C's model remains the most compelling and comprehensive account of the selection task. Research on Wason's (1966, 1968) selection task questions human rationality because performance is not "logically correct." Recently, Oaksford and Chater (O&C; 1994) provided a rational analysis (Anderson, 1990, 1991) of the selection task that appeared to vindicate human rationality. O&C argued that the selection task is an inductive, rather than a deductive, reasoning task: Participants must assess the truth or falsity of a general rule from specific instances. In particular, participants face a problem of optimal data selection (Lindley, 1956): They must decide which of four cards (p, not-p, q, or not-q) is likely to provide the most useful data to test a conditional rule, if p then q. The "logical" solution is to select the p and the not-q cards. O&C argued that this solution presupposes falsificationism (Popper, 1959), which argues that only data that can disconfirm, not confirm, hypotheses are of interest.
In contrast, O&C's rational analysis uses a Bayesian approach to inductive ...
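The optimal-data-selection idea can be sketched numerically: compare a dependence hypothesis (the rule holds) against an independence hypothesis, and score each card by the expected reduction in uncertainty from turning it. The marginals `a` and `b` below are illustrative values I chose under the rarity assumption, not the paper's fitted parameters.

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two hypotheses about "if p then q": dependence (MD) holds, or the
# antecedent and consequent are independent (MI).
# a = P(p), b = P(q); b > a keeps MD's conditionals well defined.
a, b = 0.1, 0.2
prior = {"MD": 0.5, "MI": 0.5}

def p_hidden(card, hyp):
    """Probability that turning `card` reveals the informative face."""
    if card == "p":       # hidden face shows q
        return 1.0 if hyp == "MD" else b
    if card == "not-p":   # hidden face shows q
        return (b - a) / (1 - a) if hyp == "MD" else b
    if card == "q":       # hidden face shows p
        return a / b if hyp == "MD" else a
    if card == "not-q":   # hidden face shows p
        return 0.0 if hyp == "MD" else a

def info_gain(card):
    """Expected drop in uncertainty about MD vs. MI from turning card."""
    gain = entropy(prior.values())
    for hit in (True, False):
        like = {h: p_hidden(card, h) if hit else 1 - p_hidden(card, h)
                for h in prior}
        p_outcome = sum(prior[h] * like[h] for h in prior)
        if p_outcome > 0:
            post = [prior[h] * like[h] / p_outcome for h in prior]
            gain -= p_outcome * entropy(post)
    return gain

gains = {c: info_gain(c) for c in ("p", "not-p", "q", "not-q")}
print(gains)
```

With rare p and q, the expected gains order p > q > not-q > not-p, which is the qualitative pattern the model uses to explain observed selection frequencies.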
Quasiquotation in Lisp
Cited by 41 (0 self)
Abstract:
Quasiquotation is the technology commonly used in Lisp to write program-generating programs. In this paper I will review the history and development of this technology, and explain why it works so well in practice.
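The idea can be mimicked outside Lisp. In this toy Python analogue (the representation and names are mine), nested lists stand for S-expressions, and a marked unquote form splices in a value from an environment, the way `` `(+ 1 ,x) `` splices in the value of `x`.

```python
def qq(form, env):
    """Tiny quasiquote analogue: nested lists are code templates;
    ('unquote', name) is replaced by env[name]; all else is literal."""
    if isinstance(form, tuple) and form and form[0] == 'unquote':
        return env[form[1]]
    if isinstance(form, list):
        return [qq(item, env) for item in form]
    return form

# Lisp's `(+ 1 ,x) with x bound to 41 corresponds to:
generated = qq(['+', 1, ('unquote', 'x')], {'x': 41})
print(generated)  # ['+', 1, 41]
```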
Natural Language Processing Using a Propositional Semantic Network with Structured Variables
 Minds and Machines, 1993
Cited by 29 (13 self)
Abstract:
We describe a knowledge representation and inference formalism, based on an intensional propositional semantic network, in which variables are structured terms consisting of quantifier, type, and other information. This has three important consequences for natural language processing. First, this leads to an extended, more "natural" formalism whose use and representations are consistent with the use of variables in natural language in two ways: the structure of representations mirrors the structure of the language and allows reuse phenomena such as pronouns and ellipsis. Second, the formalism allows the specification of description subsumption as a partial ordering on related concepts (variable nodes in a semantic network) that relates more general concepts to more specific instances of that concept, as is done in language. Finally, this structured variable representation simplifies the resolution of some representational difficulties with certain classes of natural language sentences...
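The subsumption ordering the authors describe can be pictured with a toy type hierarchy (the hierarchy and function names below are my own illustration, not the paper's formalism): a variable typed at a more general concept subsumes one typed at a more specific instance.

```python
# Hypothetical is-a hierarchy: each concept maps to its more general parent.
ISA = {"dog": "mammal", "cat": "mammal", "mammal": "animal"}

def subsumes(general, specific):
    """True if `general` is `specific` itself or one of its ancestors."""
    while specific is not None:
        if specific == general:
            return True
        specific = ISA.get(specific)
    return False

print(subsumes("animal", "dog"))  # True: animal subsumes dog
print(subsumes("dog", "animal"))  # False: the ordering is one-way (partial)
```

The one-way test is what makes this a partial order rather than mere similarity: generality flows from ancestor to descendant only.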
Simplification: A general constraint propagation technique for propositional and modal tableaux
, 1998
Cited by 24 (2 self)
Abstract:
Tableau and sequent calculi are the basis for most popular interactive theorem provers for formal verification. Yet, when it comes to automatic proof search, tableaux are often slower than Davis-Putnam, SAT procedures or other techniques. This is partly due to the absence of a bivalence principle (viz. the cut rule) but there is another source of inefficiency: the lack of constraint propagation mechanisms. This paper proposes an innovation in this direction: the rule of simplification, which plays for tableaux the role of subsumption for resolution and of unit for the Davis-Putnam procedure. The simplicity and generality of simplification make possible its extension in a uniform way from propositional logic to a wide range of modal logics. This technique gives a unifying view of a number of tableaux-like calculi such as DPLL, KE, HARP, hypertableaux, BCP, KSAT. We show its practical impact with experimental results for random 3-SAT and the industrial IFIP benchmarks for hardware ve...
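In propositional terms, simplification behaves like unit propagation: once a subformula is known true or false, substitute that value and boolean-simplify the rest of the formula. A minimal sketch (the tuple encoding is mine, not the paper's calculus):

```python
# Formulas: an atom is a str; compounds are ('not', f), ('and', f, g), ('or', f, g).
def simplify(f, atom, value):
    """Substitute a known truth value for `atom` and boolean-simplify."""
    if isinstance(f, str):
        return value if f == atom else f
    if f[0] == 'not':
        s = simplify(f[1], atom, value)
        return (not s) if isinstance(s, bool) else ('not', s)
    l, r = simplify(f[1], atom, value), simplify(f[2], atom, value)
    if f[0] == 'and':
        if l is False or r is False:
            return False
        if l is True:
            return r
        return l if r is True else ('and', l, r)
    # f[0] == 'or'
    if l is True or r is True:
        return True
    if l is False:
        return r
    return l if r is False else ('or', l, r)

# (p or q) and (not p or r), with p known true, collapses to just r:
print(simplify(('and', ('or', 'p', 'q'), ('or', ('not', 'p'), 'r')),
               'p', True))  # 'r'
```

The point of the rule is visible here: one substitution shrinks both branches of the conjunction at once, instead of leaving the pruning to later case splits.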
What is an Inference Rule?
 Journal of Symbolic Logic, 1992
Cited by 19 (2 self)
Abstract:
What is an inference rule? This question does not have a unique answer. One usually finds two distinct standard answers in the literature: validity inference (φ ⊢_v ψ if, for every substitution σ, the validity of σ[φ] entails the validity of σ[ψ]), and truth inference (φ ⊢_t ψ if, for every substitution σ, the truth of σ[φ] entails the truth of σ[ψ]). In this paper we introduce a general semantic framework that allows us to investigate the notion of inference more carefully. Validity inference and truth inference are in some sense the extremal points in our framework. We investigate the relationship between various types of inference in our general framework, and consider the complexity of deciding if an inference rule is sound, in the context of a number of logics of interest: classical propositional logic, a nonstandard propositional logic, various propositional modal logics, and first-order logic.
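Restated in standard notation (my reconstruction of symbols the scrape dropped; φ, ψ range over formulas, σ over substitutions, M over models):

```latex
\varphi \vdash_{v} \psi \iff \text{for every substitution } \sigma:\ \models \sigma[\varphi] \text{ implies } \models \sigma[\psi]
\varphi \vdash_{t} \psi \iff \text{for every substitution } \sigma \text{ and model } M:\ M \models \sigma[\varphi] \text{ implies } M \models \sigma[\psi]
```

A standard example separating the two (not taken from this abstract): the necessitation rule φ / □φ of normal modal logics is sound as a validity inference but not as a truth inference, since φ may hold at a world without holding necessarily.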
Higher Order Logic
 In Handbook of Logic in Artificial Intelligence and Logic Programming, 1994
Cited by 18 (0 self)
Abstract:
Contents:
1 Introduction
2 The expressive power of second order logic
2.1 The language of second order logic
2.2 Expressing size
2.3 Defining data types
2.4 Describing processes
2.5 Expressing convergence using second order validity
2.6 Truth definitions: the analytical hierarchy
2.7 Inductive definitions
3 Canonical semantics of higher order logic
3.1 Tarskian semantics of second order logic
3.2 Function and re...
A satisfiability tester for nonclausal propositional calculus
 Information and Computation, 1988
Cited by 15 (1 self)
Abstract:
An algorithm for satisfiability testing in the propositional calculus with a worst-case running time that grows at a rate less than 2^((.25+ε)L) is described, where L can be either the length of the input expression or the number of occurrences of literals (i.e., leaves) in it. This represents a new upper bound on the complexity of nonclausal satisfiability testing. The performance is achieved by using lemmas concerning assignments and pruning that preserve satisfiability, together with choosing a "good" variable upon which to recur. For expressions in conjunctive normal form, it is shown that an upper bound is 2^(.128L).
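The overall recipe — simplify under partial assignments and recur on a well-chosen variable — can be sketched for non-clausal formulas encoded as nested tuples. This is a naive illustration of the strategy, not the paper's algorithm, and it does not achieve the stated bound; the frequency heuristic stands in for the paper's "good" variable choice.

```python
from collections import Counter

# Formulas: atom (str), ('not', f), ('and', f, g), ('or', f, g).
def assign(f, var, value):
    """Fix `var` to `value` and boolean-simplify the result."""
    if isinstance(f, str):
        return value if f == var else f
    if f[0] == 'not':
        s = assign(f[1], var, value)
        return (not s) if isinstance(s, bool) else ('not', s)
    l, r = assign(f[1], var, value), assign(f[2], var, value)
    if f[0] == 'and':
        if l is False or r is False:
            return False
        if l is True:
            return r
        return l if r is True else ('and', l, r)
    # f[0] == 'or'
    if l is True or r is True:
        return True
    if l is False:
        return r
    return l if r is False else ('or', l, r)

def count_atoms(f, counts):
    if isinstance(f, str):
        counts[f] += 1
    elif isinstance(f, tuple):
        for sub in f[1:]:
            count_atoms(sub, counts)

def satisfiable(f):
    if isinstance(f, bool):
        return f
    counts = Counter()
    count_atoms(f, counts)
    var = counts.most_common(1)[0][0]  # crude "good variable" heuristic
    return satisfiable(assign(f, var, True)) or satisfiable(assign(f, var, False))

print(satisfiable(('and', 'p', ('not', 'p'))))               # False
print(satisfiable(('and', ('or', 'p', 'q'), ('not', 'p'))))  # True
```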
Conditionals and consequences
 Journal of Applied Logic, 2007
Cited by 13 (11 self)
Abstract:
We examine the notion of conditionals and the role of conditionals in inductive logics and arguments. We identify three mistakes commonly made in the study of, or motivation for, nonclassical logics. A nonmonotonic consequence relation based on evidential probability is formulated. With respect to this acceptance relation some rules of inference of System P are unsound, and we propose refinements that hold in our framework.

1 Three mistakes

Pure Mathematics is the class of all propositions of the form ‘p implies q’... And logical constants are all notions definable in terms of the following: Implication, the relation of a term to a class of which it is a member... [45, p. 3].

Thus begins the precursor of Principia Mathematica, Russell's Principles of Mathematics, and thus begins the sad and confusing twentieth-century tale of implication.