Results 1–10 of 31
Seeing versus doing: Two modes of accessing causal knowledge
 Journal of Experimental Psychology: Learning, Memory, and Cognition
, 2005
Abstract

Cited by 25 (7 self)
The ability to derive predictions for the outcomes of potential actions from observational data is one of the hallmarks of true causal reasoning. We present four learning experiments with deterministic and probabilistic data showing that people indeed make different predictions from causal models, whose parameters were learned in a purely observational learning phase, depending on whether learners believe that an event within the model has been merely observed (“seeing”) or was actively manipulated (“doing”). The predictions reflect sensitivity both to the structure of the causal models and to the size of their parameters. This competency is remarkable because the predictions for potential interventions were very different from the patterns that had actually been observed. Whereas associative and probabilistic theories fail, recent developments of causal Bayes net theories provide tools for modeling this competency.

Causal knowledge underlies our ability to predict future events, to explain the occurrence of present events, and to achieve goals by means of actions. Thus, causal knowledge belongs to one of our most central cognitive competencies. However, the nature of causal knowledge has been debated. A number of philosophers and
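The seeing/doing distinction in the abstract above can be made concrete with a toy causal Bayes net. The sketch below is illustrative only: the two-variable model X → Y and all parameter values are assumptions chosen for exposition, not taken from the paper's experiments.

```python
# Toy causal Bayes net X -> Y, contrasting "seeing" (conditioning on an
# observation) with "doing" (intervening on a variable). All numbers
# here are illustrative assumptions, not the paper's parameters.

p_x = 0.5                          # prior P(X=1)
p_y_given_x = {1: 0.9, 0: 0.1}     # P(Y=1 | X=x)

# "Seeing": observing Y=1 is diagnostic evidence about its cause X,
# updated via Bayes' rule.
joint_x1 = p_x * p_y_given_x[1]          # P(X=1, Y=1)
joint_x0 = (1 - p_x) * p_y_given_x[0]    # P(X=0, Y=1)
p_x_seeing = joint_x1 / (joint_x1 + joint_x0)   # P(X=1 | Y=1) ≈ 0.9

# "Doing": setting Y=1 by external intervention cuts the X -> Y arrow,
# so Y carries no information about X and X keeps its prior.
p_x_doing = p_x                                 # P(X=1 | do(Y=1)) = 0.5

print(f"seeing: {p_x_seeing:.2f}, doing: {p_x_doing:.2f}")
```

The divergence between the two numbers is exactly the effect the experiments probe: learners who merely fit observed correlations should give the same answer in both cases, whereas causal-model learners should not.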
The Exploitation of Regularities in the Environment by the Brain
 Behavioral and Brain Sciences
Abstract

Cited by 23 (1 self)
Statistical regularities of the environment are important for learning, memory, intelligence, inductive inference, and in fact for any area of cognitive science where an information-processing brain promotes survival by exploiting them. This has been recognised by many of those interested in cognitive function, starting with Helmholtz, Mach and Pearson, and continuing through Craik, Tolman, Attneave, and Brunswik. In the current era many of us have begun to show how neural mechanisms exploit the regular statistical properties of natural images. Shepard proposed that the apparent trajectory of an object when seen successively at two positions results from internalising the rules of kinematic geometry, and although kinematic geometry is not statistical in nature, this is clearly a related idea. Here it is argued that Shepard's term, "internalisation", is insufficient because it is also necessary to derive an advantage from the process. Having mechanisms selectively sensitive to the spatiotemporal patterns of excitation commonly experienced when viewing moving objects would facilitate the detection, interpolation, and extrapolation of such motions, and might explain the twisting motions that are experienced. Although Shepard's explanation in terms of Chasles' rule seems doubtful, his theory and experiments illustrate that local twisting motions are needed for the analysis of moving objects and provoke thoughts about how they might be detected.
When did Bayesian inference become “Bayesian”?
 BAYESIAN ANALYSIS
, 2006
Abstract

Cited by 12 (1 self)
While Bayes’ theorem has a 250-year history, and the method of inverse probability that flowed from it dominated statistical thinking into the twentieth century, the adjective “Bayesian” was not part of the statistical lexicon until relatively recently. This paper provides an overview of key Bayesian developments, beginning with Bayes’ posthumously published 1763 paper and continuing up through approximately 1970, including the period of time when “Bayesian” emerged as the label of choice for those who advocated Bayesian methods.
The New Challenge: From a Century of Statistics to an Age of Causation
 COMPUTING SCIENCE AND STATISTICS
, 1997
Abstract

Cited by 11 (1 self)
Some of the main users of statistical methods (economists, social scientists, and epidemiologists) are discovering that their fields rest not on statistical but on causal foundations. The blurring of these foundations over the years follows from the lack of mathematical notation capable of distinguishing causal from equational relationships. By providing formal and natural explication of such relations, graphical methods have the potential to revolutionize how statistics is used in knowledge-rich applications. Statisticians, in response, are beginning to realize that causality is not a metaphysical dead-end but a meaningful concept with clear mathematical underpinning. The paper surveys these developments and outlines future challenges.
Models and statistical inference: The controversy between Fisher and Neyman–Pearson
 British Journal for the Philosophy of Science
, 2006
Abstract

Cited by 4 (0 self)
The main thesis of the paper is that in the case of modern statistics, the differences between the various concepts of models were the key to its formative controversies. The mathematical theory of statistical inference was mainly developed by Ronald A. Fisher, Jerzy Neyman, and Egon S. Pearson. Fisher on the one side and Neyman–Pearson on the other were often involved in polemical controversy. The common view is that Neyman and Pearson made Fisher’s account more stringent mathematically. It is argued, however, that there is a profound theoretical basis for the controversy: both sides held conflicting views about the role of mathematical modelling. At the end, the influential programme of Exploratory Data Analysis is considered to be advocating another, more instrumental conception of models.
Probabilistic Models of Early Vision
, 2002
Abstract

Cited by 2 (0 self)
How do our brains transform patterns of light striking the retina into useful knowledge about objects and events of the external world? Thanks to intense research into the mechanisms of vision, much is now known about this process. However, we do not yet have anything close to a complete picture, and many questions remain unanswered. In addition to its clinical relevance and purely academic significance, research on vision is important because a thorough understanding of biological vision would probably help solve many major problems in computer vision.
The Logic of Counterfactuals in Causal Inference (Discussion of "Causal Inference without Counterfactuals" by A. P. Dawid)
 JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
, 2000
A Paradigm-Based Solution to the Riddle of Induction
, 1998
Abstract

Cited by 1 (1 self)
This paper appeared in Synthese 117, pp. 419–484, 1998
Bayesian and Non-Bayesian Approaches to Scientific Modeling and Inference in Economics and Econometrics
Abstract

Cited by 1 (0 self)
After brief remarks on the history of modeling and inference techniques in economics and econometrics, attention is focused on the emergence of economic science in the 20th century. First, the broad objectives of science and the Pearson–Jeffreys' "unity of science" principle will be reviewed. Second, key Bayesian and non-Bayesian practical scientific inference and decision methods will be compared using applied examples from economics, econometrics and business. Third, issues and controversies on how to model the behavior of economic units and systems will be reviewed and the structural econometric modeling, time series analysis (SEMTSA) approach will be described and illustrated using a macroeconomic modeling and forecasting problem involving analyses of data for 18 industrialized countries over the years since the 1950s. Point and turning point forecasting results and their implications for macroeconomic modeling of economies will be summarized. Last, a few remarks will be made ...