## Learning continuous probability distributions with symmetric diffusion networks (1993)

Venue: Cognitive Science

Citations: 32 (7 self)

### BibTeX

@ARTICLE{Movellan93learningcontinuous,

author = {Javier R. Movellan and James L. McClelland},

title = {Learning continuous probability distributions with symmetric diffusion networks},

journal = {Cognitive Science},

year = {1993},

volume = {17},

pages = {463--496}

}

### Abstract

In this article we present symmetric diffusion networks, a family of networks that instantiate the principles of continuous, stochastic, adaptive, and interactive propagation of information. Using methods of Markovian diffusion theory, we formalize the activation dynamics of these networks and then show that they can be trained to reproduce entire multivariate probability distributions on their outputs using the contrastive Hebbian learning rule (CHL). We show that CHL performs gradient descent on an error function that captures differences between desired and obtained continuous multivariate probability distributions. This allows the learning algorithm to go beyond expected values of output units and to approximate complete probability distributions on continuous multivariate activation spaces. We argue that learning continuous distributions is an important task underlying a variety of real-life situations that were beyond the scope of previous connectionist networks. Deterministic networks, like back propagation, cannot learn this task because they are limited to learning average values of independent output units. Previous stochastic connectionist networks could learn probability distributions, but they were limited to discrete variables. Simulations show that symmetric diffusion networks can be trained with the CHL rule to approximate discrete and continuous probability distributions of various types.
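The two ingredients the abstract names — stochastic (diffusion) activation dynamics and the contrastive Hebbian learning rule — can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the paper's actual formulation: a toy 3-unit network with tanh activation, Euler–Maruyama integration of a hypothetical drift `W·f(a) − a` plus Gaussian noise, and the standard CHL update `ΔW = η(⟨s sᵀ⟩_clamped − ⟨s sᵀ⟩_free)`, where the clamped phase holds the output unit at training values and the free phase lets the network run unconstrained.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units, n_steps, dt, sigma, eta = 3, 500, 0.01, 0.5, 0.1

# Hypothetical symmetric weight matrix with no self-connections.
W = rng.normal(scale=0.1, size=(n_units, n_units))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

def sample_activations(W, n_samples, clamp=None):
    """Euler-Maruyama simulation of the assumed diffusion
    da = (W @ tanh(a) - a) dt + sigma dB, returning final tanh(a)
    for n_samples independent runs. If `clamp` is given, the last
    unit is held so its activation equals the training value."""
    a = rng.normal(size=(n_samples, n_units))
    for _ in range(n_steps):
        drift = np.tanh(a) @ W - a
        a = a + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=a.shape)
        if clamp is not None:
            a[:, -1] = np.arctanh(clamp)  # clamped phase
    return np.tanh(a)

# Contrastive Hebbian update: eta * (<s s^T>_clamped - <s s^T>_free).
targets = rng.uniform(-0.9, 0.9, size=200)   # hypothetical training outputs
s_plus = sample_activations(W, 200, clamp=targets)
s_minus = sample_activations(W, 200)
dW = eta * (s_plus.T @ s_plus - s_minus.T @ s_minus) / 200
np.fill_diagonal(dW, 0.0)                    # keep zero self-connections
W = W + dW                                   # symmetry is preserved
```

Because both phase statistics are symmetric outer-product averages, the update keeps `W` symmetric, which is the structural property the "symmetric" in the title refers to.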