## Distinctive features, categorical perception, and probability learning: some applications of a neural model (1977)

Venue: Psychological Review

Citations: 124 (1 self)

### BibTeX

@ARTICLE{Anderson77distinctivefeatures,

author = {James A. Anderson and Jack W. Silverstein and Stephen A. Ritz and Randall S. Jones},

title = {Distinctive features, categorical perception, and probability learning: some applications of a neural model},

journal = {Psychological Review},

year = {1977},

volume = {84},

pages = {413--451}

}


### Abstract

A previously proposed model for memory based on neurophysiological considerations is reviewed. We assume that (a) nervous system activity is usefully represented as the set of simultaneous individual neuron activities in a group of neurons; (b) different memory traces make use of the same synapses; and (c) synapses associate two patterns of neural activity by incrementing synaptic connectivity proportionally to the product of pre- and postsynaptic activity, forming a matrix of synaptic connectivities. We extend this model by (a) introducing positive feedback of a set of neurons onto itself and (b) allowing the individual neurons to saturate. A hybrid model, partly analog and partly binary, arises. The system has certain characteristics reminiscent of analysis by distinctive features. Next, we apply the model to "categorical perception." Finally, we discuss probability learning. The model can predict overshooting, recency data, and probabilities occurring in systems with more than two events with reasonably good accuracy.

> In the beginner's mind there are many possibilities, but in the expert's there are few. —Shunryu Suzuki, 1970
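The mechanism the abstract describes — Hebbian outer-product learning in a shared connectivity matrix, plus positive feedback with saturating neurons — can be sketched in a few lines of NumPy. This is a minimal illustration under stated assumptions, not the paper's implementation: the stored patterns, feedback gain `alpha`, and iteration count are arbitrary choices made here for the demo.

```python
import numpy as np

# Two orthogonal +/-1 activity patterns stored in one shared connectivity
# matrix W by Hebbian increments: W grows by the outer product of pre- and
# postsynaptic activity (here, each pattern associated with itself).
f1 = np.array([1., -1., 1., 1., -1., 1., -1., -1.])
f2 = np.array([1., 1., -1., 1., -1., -1., 1., -1.])
n = len(f1)
W = (np.outer(f1, f1) + np.outer(f2, f2)) / n

def bsb_step(x, W, alpha=0.5):
    # Positive feedback of the neuron group onto itself, followed by
    # saturation: each neuron's activity is limited to [-1, +1].
    return np.clip(x + alpha * (W @ x), -1.0, 1.0)

# Start from a weak, slightly distorted version of f1 and iterate.
x = 0.5 * f1 + np.array([0.2, -0.1, 0.3, 0.0, 0.1, -0.2, 0.1, 0.2])
for _ in range(20):
    x = bsb_step(x, W)

print(np.array_equal(x, f1))  # -> True: the state settles on the stored corner
```

Feedback amplifies the component of the state along the stored pattern until every neuron saturates, so the trajectory is driven into a corner of the hypercube of allowed activities — the "partly analog, partly binary" behavior the abstract refers to. The saturated corner is then a stable fixed point of the dynamics.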