Results 1 – 5 of 5
Randomness and Coincidences: Reconciling Intuition and Probability Theory, 2001
Cited by 12 (3 self)
We argue that the apparent inconsistency between people's intuitions about chance and the normative predictions of probability theory, as expressed in judgments about randomness and coincidences, can be resolved by focussing on the evidence observations provide about the processes that generated them, rather than their likelihood. This argument is supported by probabilistic modeling of sequence and number production, together with two experiments that examine people's judgments about coincidences.
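The abstract's key distinction — evidence about the generating process rather than raw likelihood — can be illustrated with a minimal Bayesian model-comparison sketch. This is not code from the paper; the toy models (a fair random process versus a hypothetical "repeat one outcome" process) are assumptions chosen for illustration:

```python
def log_bayes_factor(seq):
    """Log2 Bayes factor comparing a 'repeat one outcome' process
    against a fair random process for a binary H/T sequence."""
    n = len(seq)
    # Under the random process every length-n sequence is equally likely:
    # P(seq | random) = (1/2)^n
    log_p_random = -n
    # Under the repeating process: 1/2 for which symbol repeats,
    # then probability 1 if the sequence is constant, else 0
    log_p_repeat = -1.0 if len(set(seq)) == 1 else float("-inf")
    return log_p_repeat - log_p_random

# "HHHH" and "HTTH" have identical likelihood under the random process,
# but only the former is evidence for a non-random generator.
print(log_bayes_factor("HHHH"))  # 3.0 (favors the repeating process)
print(log_bayes_factor("HTTH"))  # -inf (rules it out)
```

The point of the sketch is that judging "HHHH" as non-random is not a likelihood error: both strings are equally probable under chance, but they provide very different evidence about which process generated them.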
Counting Distinctions: On the Conceptual Foundations of Shannon’s Information Theory, 2009
Probability Revision, the Uniformity Rule, and the Chan–Darwiche Metric
The author has proposed a rule of probability revision dictating that identical learning be reflected in identical ratios of new to old odds. Following this rule ensures that the final result of a sequence of probability revisions is undisturbed by an alteration in the temporal order of the learning prompting these revisions. There is also a close connection between this rule and an intriguing metric on probability measures introduced by Chan and Darwiche.
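The order-invariance claimed in the abstract follows directly when each revision is encoded as a multiplicative odds ratio: multiplication commutes. A minimal sketch, with hypothetical prior odds and learning episodes (the numbers are illustrative assumptions, not from the paper):

```python
def revise(odds, ratio):
    """Apply one probability revision, expressed as a ratio of new to old odds."""
    return odds * ratio

prior_odds = 1.0     # hypothetical prior odds for some hypothesis
r1, r2 = 3.0, 0.5    # hypothetical learning episodes, fixed as odds ratios

# Apply the two revisions in both temporal orders
forward = revise(revise(prior_odds, r1), r2)
backward = revise(revise(prior_odds, r2), r1)

assert forward == backward  # final odds are independent of temporal order
print(forward)  # 1.5
```

Because each episode contributes a fixed factor to the odds, the final odds are the product of the prior odds with all the factors, in any order — which is the commutativity property the rule is designed to guarantee.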
Probability Primer
Probabilistic models aim to explain human cognition by appealing to the principles of probability theory and statistics, which dictate how an agent should act rationally in situations that involve uncertainty. While probability theory was originally developed as a means of analyzing games of chance, it was quickly realized that probabilities could be used to analyze rational actions in a wide range of contexts (e.g., Bayes, 1763/1958; Laplace, 1795/1951). Probabilistic models have come to be used in many disciplines, and are currently the method of choice for an enormous range of applications, including artificial systems for medical inference, bioinformatics, and computer vision. Applying probabilistic models to human cognition thus provides the opportunity to draw upon work in computer science, engineering, mathematics, and statistics, often producing quite surprising connections. There are two challenges involved in developing probabilistic models of cognition. The first challenge is specifying a suitable model. This requires considering the computational problem faced by an agent, the knowledge available to that agent, and the appropriate way to represent that knowledge. The second challenge is evaluating model