## Dynamical Properties of Strongly Interacting Markov Chains (2003)

Venue: Neural Networks

Citations: 13 (5 self)

### BibTeX

```bibtex
@ARTICLE{Ay03dynamicalproperties,
  author  = {Nihat Ay and Thomas Wennekers},
  title   = {Dynamical Properties of Strongly Interacting Markov Chains},
  journal = {Neural Networks},
  year    = {2003},
  volume  = {16},
  pages   = {1483--1497}
}
```

### Citations

1421 | On information and sufficiency - Kullback, Leibler - 1951

Citation Context: ...on of the so-called mutual information of two units: Consider N binary units 1, 2, ..., N and a joint probability distribution p on the configuration set {0, 1}^N. Then the Kullback-Leibler divergence [13, 8] of p from the set of factorizable distributions is a natural measure for the “spatial” interdependence of the units: I(p) := inf { D(p ‖ p_1 ⊗ ··· ⊗ p_N) : p_k, 1 ≤ k ≤ N, distributions on {0, 1} }. (1) The Kullba...
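The infimum in Eq. (1) is attained at the product of the marginals, so the multi-information reduces to the sum of the marginal entropies minus the joint entropy. A minimal sketch of this computation (the function name and the dict representation of p are illustrative, not from the paper):

```python
import math

def multi_information(p):
    """Multi-information I(p) = sum_k H(p_k) - H(p) for a joint
    distribution p over {0,1}^N, given as a dict mapping binary
    tuples to probabilities. The infimum over factorizable
    distributions in Eq. (1) is attained at the product of the
    marginals, which justifies this closed form."""
    n = len(next(iter(p)))
    # Joint entropy H(p) in bits.
    h_joint = -sum(q * math.log2(q) for q in p.values() if q > 0)
    # Sum of marginal entropies H(p_k).
    h_marg = 0.0
    for k in range(n):
        pk1 = sum(q for s, q in p.items() if s[k] == 1)
        for q in (pk1, 1 - pk1):
            if q > 0:
                h_marg -= q * math.log2(q)
    return h_marg - h_joint

# Two perfectly correlated bits carry 1 bit of multi-information.
p = {(0, 0): 0.5, (1, 1): 0.5}
print(multi_information(p))  # → 1.0
```

For N = 2 this is exactly the mutual information of the two units; for independent units it vanishes.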

439 | Corticonics. Neural circuits of the cerebral cortex - Abeles - 1991

Citation Context: ...s K^(ν)(1 | ω) and perturb it by a small random number, ξ, equally distributed on [−ε, ε], where ε is the learning rate (ε = 0.025, if not stated otherwise). Perturbed values are clipped to the range [0, 1], and the K^(ν)(0 | ω) are again fixed by the normalization condition. That is, for randomly chosen ν ∈ V and ω ∈ {0, 1}^N we set K^(ν)_{t+1}(1 | ω) = φ(K^(ν)_t(1 | ω) + ξ) (10) and K^(ν)_{t+1}(0 | ω) = 1 − K...
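The perturbation step in Eq. (10) — shift K(1 | ω) for one randomly chosen configuration by a uniform ξ, clip to [0, 1], and renormalize K(0 | ω) — can be sketched as follows (function and variable names are illustrative, not the paper's notation):

```python
import random

def perturb_kernel(k1, eps=0.025, rng=random):
    """One random perturbation step in the spirit of Eq. (10).
    k1 maps each configuration ω (a binary tuple) to K(1 | ω).
    A randomly chosen ω gets K(1 | ω) shifted by ξ ~ U[-eps, eps]
    and clipped to [0, 1] (the role of φ); K(0 | ω) = 1 - K(1 | ω)
    is then fixed by normalization and need not be stored."""
    omega = rng.choice(list(k1.keys()))
    xi = rng.uniform(-eps, eps)
    k1[omega] = min(1.0, max(0.0, k1[omega] + xi))
    return k1

# Example: one unit with two possible input configurations.
k1 = {(0,): 0.5, (1,): 0.5}
perturb_kernel(k1, eps=0.025)
assert all(0.0 <= v <= 1.0 for v in k1.values())
```

Storing only K(1 | ω) keeps the kernel normalized by construction, mirroring how the paper fixes K^(ν)(0 | ω) via the normalization condition.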

316 | Self-organization in a perceptual network - Linsker - 1988

Citation Context: ...pproaches to the understanding of first principles for neural organization and learning, where information theory provides an appropriate framework for the formulation and analysis of such principles [15, 16, 6, 20]. A well known measure that quantifies relations of interacting units is a generalized version of the so-called mutual information of two units: Consider N binary units 1, 2, ..., N and a joint probabil...

199 | From basic network principles to neural architecture: emergence of orientation-selective cells - Linsker - 1986

Citation Context: ...pproaches to the understanding of first principles for neural organization and learning, where information theory provides an appropriate framework for the formulation and analysis of such principles [15, 16, 6, 20]. A well known measure that quantifies relations of interacting units is a generalized version of the so-called mutual information of two units: Consider N binary units 1, 2, ..., N and a joint probabil...

139 | A simple coding procedure enhances a neuron’s information capacity. Zeitschrift für Naturforschung 36 - Laughlin - 1981

118 | A measure for brain complexity: relating functional segregation and integration in the nervous system - Tononi, Sporns, et al. - 1994

107 | Dynamics of neuronal firing correlation: modulation of effective connectivity - Aertsen, Gerstein, et al. - 1989

89 | Synchronous oscillations in neuronal systems: Mechanisms and functions - Gray - 1994

73 | Information geometry on hierarchy of probability distributions - Amari

Citation Context: ... I(p) := inf { D(p ‖ p_1 ⊗ ··· ⊗ p_N) : p_k, 1 ≤ k ≤ N, distributions on {0, 1} }. (1) The Kullback-Leibler divergence represents the basis of many approaches to neural complexity [19, 9] and has been theoretically studied in [4, 5] from the information geometric point of view, where it is referred to as (stochastic) interaction. In order to capture the intrinsically temporal aspects of interaction, I has been extended in [7] to...

59 | The correlation theory of brain function. Internal report, Max-Planck-Institute of Biophysical Chemistry - Malsburg - 1981

32 | Neural coding: Higher-order temporal patterns in the neurostatistics of cell assemblies - Martignon, Deco, et al.

Citation Context: ...ial” interdependence of the units: I(p) := inf { D(p ‖ p_1 ⊗ ··· ⊗ p_N) : p_k, 1 ≤ k ≤ N, distributions on {0, 1} }. (1) The Kullback-Leibler divergence represents the basis of many approaches to neural complexity [19, 9] and has been theoretically studied in [4, 5] from the information geometric point of view, where it is referred to as (stochastic) interaction. In order to capture the intrinsically temporal aspects ...

21 | Cell Assemblies, Associative Memory and Temporal Structure in Brain Signals - Wennekers, Palm - 2000

20 | Detecting higher-order interactions among the spiking events in a group of neurons - Martignon, Hasseln, et al. - 1995

8 | Locality of Global Stochastic Interaction in Directed Acyclic Networks - Ay - 2002

Citation Context: ...pproaches to the understanding of first principles for neural organization and learning, where information theory provides an appropriate framework for the formulation and analysis of such principles [15, 16, 6, 20]. A well known measure that quantifies relations of interacting units is a generalized version of the so-called mutual information of two units: Consider N binary units 1, 2, ..., N and a joint probabil...

5 | An information geometric approach to a theory of pragmatic structuring. The Annals of Probability 30 - Ay - 2002

Citation Context: ... I(p) := inf { D(p ‖ p_1 ⊗ ··· ⊗ p_N) : p_k, 1 ≤ k ≤ N, distributions on {0, 1} }. (1) The Kullback-Leibler divergence represents the basis of many approaches to neural complexity [19, 9] and has been theoretically studied in [4, 5] from the information geometric point of view, where it is referred to as (stochastic) interaction. In order to capture the intrinsically temporal aspects of interaction, I has been extended in [7] to...