The adaptive nature of human categorization
 Psychological Review, 1991
Abstract

Cited by 211 (2 self)
A rational model of human categorization behavior is presented that assumes that categorization reflects the derivation of optimal estimates of the probability of unseen features of objects. A Bayesian analysis is performed of what optimal estimations would be if categories formed a disjoint partitioning of the object space and if features were independently displayed within a category. This Bayesian analysis is placed within an incremental categorization algorithm. The resulting rational model accounts for effects of central tendency of categories, effects of specific instances, learning of linearly nonseparable categories, effects of category labels, extraction of basic-level categories, base-rate effects, probability matching in categorization, and trial-by-trial learning functions. Although the rational model considers just one level of categorization, it is shown how predictions can be enhanced by considering higher and lower levels. Considering prediction at the lower, individual level allows integration of this rational analysis of categorization with the earlier rational analysis of memory (Anderson & Milson, 1989). Anderson (1990) presented a rational analysis of human cognition. The term rational derives from similar "rational-man" analyses in economics. Rational analyses in other fields are sometimes called adaptationist analyses. Basically, they are efforts to explain the behavior in some domain on the assumption that the behavior is optimized with respect to some criteria of adaptive importance. This article begins with a general characterization of how one develops a rational theory of a particular cognitive phenomenon. Then I present the basic theory of categorization developed in Anderson (1990) and review the applications from that book. Since the writing of the book, the theory has been greatly extended and applied to many new phenomena. Most of this article describes these new developments and applications.
A Rational Analysis
Several theorists have promoted the idea that psychologists might understand human behavior by assuming it is adapted to the environment (e.g., Brunswik, 1956; Campbell, 1974; Gib
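The abstract's core prediction rule can be sketched in a few lines: the probability of an unseen feature is averaged over candidate categories, each weighted by its posterior given the observed features, assuming disjoint categories and feature independence within a category. This is an illustrative sketch with hypothetical data and simple Laplace smoothing, not Anderson's actual coupling-probability scheme.

```python
def predict_unseen_feature(objects, labels, observed, query):
    """objects: list of dicts mapping feature -> value; labels: category per object.
    observed: known features of the new object; query: (feature, value) to predict.
    Returns P(query | observed) = sum_k P(k | observed) * P(query | k)."""
    cats = sorted(set(labels))
    post = {}
    for k in cats:
        members = [o for o, l in zip(objects, labels) if l == k]
        prior = len(members) / len(objects)           # P(k) from category sizes
        like = 1.0
        for f, v in observed.items():                 # independence within category
            hits = sum(1 for m in members if m.get(f) == v)
            like *= (hits + 1) / (len(members) + 2)   # Laplace-smoothed P(f=v | k)
        post[k] = prior * like                        # unnormalised P(k | observed)
    z = sum(post.values())
    qf, qv = query
    prob = 0.0
    for k in cats:
        members = [o for o, l in zip(objects, labels) if l == k]
        hits = sum(1 for m in members if m.get(qf) == qv)
        prob += (post[k] / z) * (hits + 1) / (len(members) + 2)
    return prob
```

With two winged "bird" exemplars and two wingless "dog" exemplars, observing wings pushes the predicted probability of flying toward the bird category's rate.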
Multilevel linear modelling for FMRI group analysis using Bayesian inference
 NeuroImage, 2004
Abstract

Cited by 33 (6 self)
Functional magnetic resonance imaging studies often involve the acquisition of data from multiple sessions and/or multiple subjects. A hierarchical approach can be taken to modelling such data, with a general linear model (GLM) at each level of the hierarchy introducing different random-effects variance components. Inferring on these models is non-trivial, with frequentist solutions being unavailable. A solution is to use a Bayesian framework. One important ingredient in this is the choice of prior on the variance components and top-level regression parameters. Due to the typically small numbers of sessions or subjects in neuroimaging, the choice of prior is critical. To alleviate this problem, we introduce to neuroimaging modelling the approach of reference priors, which drives the choice of prior such that it is non-informative in an information-theoretic sense. We propose two inference techniques at the top level for multilevel hierarchies (a fast approach and a slower, more accurate approach). We also demonstrate that we can infer on the top level of multilevel hierarchies by inferring on the levels of the hierarchy separately and passing summary statistics of a non-central multivariate t distribution between them.
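The top-level combination step the abstract describes can be illustrated with a precision-weighted estimator: each lower-level (session or subject) effect estimate contributes with weight inversely proportional to its within-level variance plus the between-level variance. This is a generic mixed-effects sketch under those assumptions, not the paper's actual reference-prior inference.

```python
def group_level(betas, variances, tau2):
    """Precision-weighted top-level estimate for a two-level hierarchy.
    betas: per-subject effect estimates; variances: their within-subject
    variances; tau2: between-subject (random-effects) variance.
    Returns the group mean and its variance."""
    weights = [1.0 / (v + tau2) for v in variances]          # precision of each subject
    mean = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    var = 1.0 / sum(weights)                                 # variance of the group mean
    return mean, var
```

Note how a larger between-subject variance `tau2` inflates the uncertainty of the group estimate, which is exactly why the choice of prior on variance components matters with few subjects.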
Classification in Very High Dimensional Problems with Handfuls of Examples
Abstract

Cited by 5 (0 self)
Modern classification techniques perform well when the number of training examples exceeds the number of features. If, however, the number of features greatly exceeds the number of training examples, then these same techniques can fail. To address this problem, we present a hierarchical Bayesian framework that shares information between features by modeling similarities between their parameters. We believe this approach is applicable to many sparse, high-dimensional problems and especially relevant to those with both spatial and temporal components. One such problem is fMRI time series, and we present a case study that shows how we can successfully classify in this domain with 80,000 original features and only 2 training examples per class.
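The information-sharing idea can be sketched as hierarchical shrinkage: each feature's parameter gets a Gaussian prior centered on a pooled value, so features with little data are pulled toward what similar features suggest. This is a minimal empirical-Bayes-style sketch of parameter sharing, not the authors' actual similarity model.

```python
def shrink_feature_means(means, n_per_feature, sigma2, tau2):
    """Posterior mean of each feature parameter under a shared Gaussian prior.
    means: per-feature sample means; n_per_feature: observations per feature;
    sigma2: observation noise variance; tau2: prior (between-feature) variance.
    Each posterior mean is a precision-weighted blend of the feature's own
    data and the pooled mean across features."""
    pooled = sum(means) / len(means)
    out = []
    for m, n in zip(means, n_per_feature):
        prec_data = n / sigma2          # precision contributed by this feature's data
        prec_prior = 1.0 / tau2         # precision contributed by the shared prior
        out.append((prec_data * m + prec_prior * pooled) / (prec_data + prec_prior))
    return out
```

A feature observed 100 times barely moves, while a feature observed once is pulled strongly toward the pooled mean; with only 2 examples per class, nearly all features are in the second regime.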
Bayesian color-correction method for non-colorimetric digital image sensors
 Paper presented at the 12th IS&T/SID Color Imaging Conference, 2004
Reliability of Computational Science
2006
Abstract

Cited by 2 (0 self)
Today’s computers allow us to simulate large, complex physical problems. Many times the mathematical models describing such problems are based on a relatively small amount of available information, such as experimental measurements. The question arises whether the computed data could be used as the basis for decisions in critical engineering, economic, and medical applications. A representative list of engineering accidents that occurred in past years, and their causes, illustrates the question. The paper describes a general framework for Verification and Validation (V&V) which deals with this question. The framework is then applied to an illustrative engineering problem, in which the basis for decision is a specific quantity of interest, namely the probability that the quantity does not exceed a given value. The V&V framework is applied and explained in detail. The result of the analysis is the computation of the failure probability as well as a quantification of the confidence in the computation, depending on the amount of available experimental data.
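The quantity of interest described above, a failure probability plus a statement of confidence that depends on the amount of data, can be illustrated with a simple Monte Carlo estimate and a normal-approximation interval. This is a generic sketch of the idea, not the paper's V&V framework; the function name and interval choice are illustrative assumptions.

```python
def failure_probability(samples, threshold):
    """Estimate P(Q > threshold) from simulated or measured outputs of the
    quantity of interest Q, with a rough 95% normal-approximation interval.
    The interval widens as the sample size shrinks, mirroring the point that
    confidence in the computed probability depends on available data."""
    n = len(samples)
    p = sum(1 for q in samples if q > threshold) / n
    half = 1.96 * (p * (1 - p) / n) ** 0.5       # half-width of the 95% interval
    return p, max(0.0, p - half), min(1.0, p + half)
```

With ten samples the interval around a 20% failure estimate spans most of [0, 0.45]; with a thousand samples at the same estimate it is an order of magnitude tighter.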
Multiview Extensive Partition Operators for Semantic Video Object Extraction
2001
Abstract

Cited by 1 (0 self)
Occlusion/disocclusion is one of the fundamental problems for semantic video object (SVO) extraction, where pixel-wise accuracy is required. This issue is critical because the degradation in tracking due to object occlusion/disocclusion significantly increases the amount of user interaction required in offline video editing applications. In this paper, we present an approach based on the application of an extensive operator on a lattice of partitions, which exploits information from various views of the scene, based on a probabilistic formulation. Our multi-view operator builds on the regional application of the Maximum a Posteriori principle, by integrating a single-view region classification stage with a multi-view stage that improves classification for those disoccluded regions labeled as uncertain. Results on several real sequences show that our approach improves SVO tracking compared to the single-view case and that, as a result, it increases the quality of the extracted SVOs and reduces the total amount of user interaction.
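The regional MAP rule with an "uncertain" outcome can be sketched as follows: pick the label maximising prior times likelihood for a region, but defer the decision when the posterior margin between the two best labels is small, as those are the regions the multi-view stage would revisit. The labels, margin threshold, and per-region inputs here are illustrative assumptions, not the paper's formulation.

```python
def classify_region(likelihoods, priors, margin=0.2):
    """Regional MAP sketch. likelihoods/priors: dicts mapping label -> value
    for one region. Returns the MAP label, or 'uncertain' when the normalised
    posterior gap between the best and second-best labels is below margin."""
    post = {k: priors[k] * likelihoods[k] for k in priors}   # unnormalised posterior
    z = sum(post.values())
    ranked = sorted(post, key=post.get, reverse=True)
    best, second = ranked[0], ranked[1]
    if (post[best] - post[second]) / z < margin:             # ambiguous region
        return "uncertain"
    return best
```

A clearly object-like region is labeled directly, while a region whose evidence is nearly balanced is deferred rather than committed to a possibly wrong label.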
A tutorial introduction to Bayesian models of cognitive development
Abstract

Cited by 1 (0 self)
We present an introduction to Bayesian inference as it is used in probabilistic models of cognitive development. Our goal is to provide an intuitive and accessible guide to the what, the how, and the why of the Bayesian approach: what sorts of problems and data the framework is most relevant for, and how and why it may be useful for developmentalists. We emphasize a qualitative understanding of Bayesian inference, but also include information about additional resources for those interested in the cognitive science applications, mathematical foundations, or machine learning details in more depth. In addition, we discuss some important interpretation issues that often arise when evaluating Bayesian models in cognitive science.
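The kind of qualitative Bayesian updating such tutorials walk through can be made concrete with the standard conjugate Beta-Binomial example: a learner's Beta prior over a success probability is updated by counts into a Beta posterior. This is a textbook illustration, not an example taken from the paper.

```python
def update_beta(alpha, beta, successes, failures):
    """Conjugate Beta-Binomial update: a Beta(alpha, beta) prior over a
    success probability, combined with observed counts, yields the posterior
    Beta(alpha + successes, beta + failures)."""
    return alpha + successes, beta + failures

def posterior_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)
```

Starting from a uniform Beta(1, 1) prior, observing 7 successes and 3 failures moves the learner's best guess from 1/2 to 2/3, and each further observation shifts it less, exactly the gradual belief revision these models are used to capture.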
Approximation of the Likelihood Function in Bayesian Technique for the Solution of Inverse Problems
 Proceedings of the International Symposium on Inverse Problems, Design and Optimization (IPDO2007)
Abstract

Cited by 1 (1 self)
This work deals with the use of radial basis functions for the interpolation of the likelihood function in a parameter estimation problem solved with the Bayesian technique. The proposed interpolation of the likelihood function is applied to a test case involving the estimation of parameters in the dispersion of a tracer in saturated soils. The use of the interpolated likelihood function significantly reduces the computational cost associated with applying Markov chain Monte Carlo methods to the solution of the present inverse problem.
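The surrogate idea is simple to sketch: evaluate the expensive log-likelihood at a handful of parameter points, fit Gaussian radial basis weights through those values, and let the MCMC sampler query the cheap interpolant instead. The one-dimensional setup, kernel width, and function names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_rbf(points, values, eps=1.0):
    """Fit Gaussian RBF weights so the surrogate interpolates the (expensive)
    log-likelihood exactly at the sampled parameter points."""
    pts = np.asarray(points, dtype=float)
    dist = np.abs(pts[:, None] - pts[None, :])     # pairwise distances (1-D params)
    A = np.exp(-(eps * dist) ** 2)                 # Gaussian kernel matrix
    weights = np.linalg.solve(A, np.asarray(values, dtype=float))
    return pts, weights

def rbf_eval(model, x, eps=1.0):
    """Cheap surrogate log-likelihood at a new parameter value x."""
    pts, weights = model
    return float(np.exp(-(eps * np.abs(x - pts)) ** 2) @ weights)
```

Inside a Metropolis loop, `rbf_eval` replaces each call to the forward model, which is where the reported cost reduction comes from; the surrogate reproduces the training values exactly and interpolates smoothly between them.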
Chalk and “Upper Cretaceous” Deposits
www.answersingenesis.org/contents/379/arj/v2/Chalk_Part_of_Flood.pdf
Abstract
Thick chalk deposits exist in several parts of the world, including Europe, Australia and the USA. The bulk of this chalk is considered to belong to what is referred to as the “Upper Cretaceous” period. Geologists working within a framework of uniformitarianism (or actualism) claim that these deposits result from millions of years of accumulation of coccoliths. If we are to take the new understanding of the age of the earth from RATE studies seriously, then it is necessary to explain the chalk by mechanisms which do not involve such long timescales. Snelling (1994) attempted to explain the chalk deposits within a timescale of a few days, so that chalk could be considered as part of the visible evidence for the Noachian Flood. Tyler (1996) then tried to show that the model proposed by Snelling was not tenable, and described how chalk had to be interpreted as a post-Flood deposit, but within a short timescale. This document shows two things. First, that certain features of the “Upper Cretaceous” period correspond closely with the biblical account of the Noachian Flood around day 150. Second, that uniformitarian explanations for “chalk” are inadequate to explain its deposition, reworking and geomorphology, and that only by considering the rapid events in the middle of the Noachian Flood can its deposition and characteristics be explained. En passant we make two additional discoveries, viz. (i) that the concept of the geological column is not robust over small distances, and (ii) that there is independent support for the RATE studies that show that the earth is young. A consequence of this geoscientific study is that geology is a powerful visible witness to the testimony of the Bible, and such facts should therefore be used in evangelism. Specifically, the real fossil record, rather than the constructed geological column, disproves evolution.
The geoscience also shows that active promotion of what was commonly known as the European Recolonization Model (or its variants, where the bulk of the strata are judged to be “post-Flood”) to explain geology was ill-founded.