Results 1–10 of 33
Global Spontaneous Activity and Local Structured (learned) Delay Activity in Cortex
, 1996
Abstract

Cited by 188 (20 self)
... to any of the stimuli learned have rates which gradually increase with the amplitude of synaptic potentiation. b. When the average LTP increases beyond a critical value, specific local attractors appear abruptly against the background of the global uniform spontaneous attractor. This happens with either gradual or discrete stochastic LTP. 4. The above findings predict that in the process of learning unfamiliar stimuli, there is a stage in which all neurons selective to any of the learned stimuli enhance their spontaneous activity relative to the rest. Then, abruptly, selective delay activity appears. Both facts could be observed in single-unit recordings in delayed match-to-sample experiments. 5. Beyond this critical learning strength the local module has two types of collective activity. It either participates in the global spontaneous activity, or it maintains a stimulus-selective elevated activity distribution. The particular mode of behavior depends on the stimulus: if it is unfa ...
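The critical-potentiation transition described above can be caricatured in a few lines. The sketch below is a hedged illustration (a binary Hopfield-style module with assumed parameters, not the paper's spiking model): below a critical potentiation amplitude J the stimulus-selective state dissolves into background noise, while above it a selective attractor persists.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200
pattern = rng.choice([-1, 1], size=N)            # one learned stimulus

def overlap_after_dynamics(J, steps=50):
    W = (J / N) * np.outer(pattern, pattern)     # potentiated synapses, scaled by J
    np.fill_diagonal(W, 0.0)
    s = np.sign(pattern + 0.8 * rng.standard_normal(N))    # noisy cue
    for _ in range(steps):
        s = np.sign(W @ s + 0.5 * rng.standard_normal(N))  # background noise
    return abs(pattern @ s) / N                  # overlap with the learned stimulus

weak = overlap_after_dynamics(J=0.3)     # below critical potentiation: overlap decays
strong = overlap_after_dynamics(J=2.0)   # above: the selective attractor persists
print(f"overlap: J=0.3 -> {weak:.2f}, J=2.0 -> {strong:.2f}")
```

The abrupt appearance of selective delay activity corresponds to the qualitative change between the two regimes as J crosses its critical value.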
Representation of spatial orientation by the intrinsic dynamics of the head-direction cell ensemble: A theory
J. Neurosci., 1996
Abstract

Cited by 130 (4 self)
The head-direction (HD) cells found in the limbic system in freely moving rats represent the instantaneous head direction of the animal in the horizontal plane regardless of the location of the animal. The internal direction represented by these cells uses both self-motion information for inertially based updating and familiar visual landmarks for calibration. Here, a model of the dynamics of the HD cell ensemble is presented. The stability of a localized static activity profile in the network and a dynamic shift mechanism are explained naturally by synaptic weight distribution components with even and odd symmetry, respectively. Under symmetric weights or symmetric reciprocal connections, a stable activity profile close to the known directional tuning curves will emerge. By adding a slight asymmetry to the weights, the activity profile will shift continuously without ...
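The even/odd weight decomposition is easy to demonstrate in a minimal ring rate model (an illustration with assumed parameters and a saturating rate function, not the paper's exact equations): a cosine (even) kernel sustains a stable activity bump, and adding a sine (odd) component makes the bump travel around the ring.

```python
import numpy as np

N = 100
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)
d = theta[:, None] - theta[None, :]

def simulate(odd_gain, steps=200, dt=0.05):
    # even (cos) component stabilizes a bump; odd (sin) component shifts it
    W = (np.cos(d) + odd_gain * np.sin(d)) / N
    r = np.maximum(np.cos(theta), 0.0)       # initial bump centered at 0 rad
    for _ in range(steps):
        drive = np.maximum(8.0 * (W @ r) - 0.5, 0.0)
        r = r + dt * (-r + np.tanh(drive))   # saturating rate dynamics
    return r

r_even = simulate(odd_gain=0.0)              # bump stays where it formed
r_shift = simulate(odd_gain=0.3)             # bump drifts around the ring
pos_even = theta[np.argmax(r_even)]
pos_shift = theta[np.argmax(r_shift)]
print(f"bump: {pos_even:.2f} rad (even only) -> {pos_shift:.2f} rad (with odd part)")
```

With purely even weights the bump is a static attractor at its initial position; the odd component acts as the shift mechanism driven, in the paper's setting, by vestibular self-motion signals.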
Information Theory of Complex Networks: on evolution and architectural constraints
, 2004
Abstract

Cited by 24 (1 self)
Complex networks are characterized by highly heterogeneous distributions of links, often pervading the presence of key properties such as robustness under node removal. Several correlation measures have been defined in order to characterize the structure of these nets. Here we show that mutual information, noise and joint entropies can be properly defined on a static graph. These measures are computed for a number of real networks and analytically estimated for some simple standard models. It is shown that real networks are clustered in a well-defined domain of the entropy/noise space. By using simulated annealing optimization, it is shown that optimally heterogeneous nets actually cluster around the same narrow domain, suggesting that strong constraints actually operate on the possible universe of complex networks. The evolutionary implications are discussed.
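As a rough sketch of these information measures (assuming the remaining-degree formulation; the Erdős–Rényi graph and all parameters here are illustrative), the entropy, noise, and mutual information can be computed from the joint degree distribution over the ends of edges:

```python
import numpy as np

rng = np.random.default_rng(1)
N, p = 300, 0.02
A = rng.random((N, N)) < p
A = np.triu(A, 1)
A = A | A.T                               # undirected graph, no self-loops

deg = A.sum(1)
kmax = deg.max()
joint = np.zeros((kmax, kmax))            # joint remaining-degree distribution q(k, k')
for i, j in np.argwhere(np.triu(A, 1)):
    joint[deg[i] - 1, deg[j] - 1] += 1
    joint[deg[j] - 1, deg[i] - 1] += 1    # symmetrize over edge directions
joint /= joint.sum()
q = joint.sum(1)                          # remaining-degree marginal q_k

def H(pvec):
    pvec = pvec[pvec > 0]
    return -(pvec * np.log2(pvec)).sum()

entropy = H(q)                            # H(q): diversity at link ends
noise = H(joint.ravel()) - entropy        # conditional entropy H(q'|q)
mutual_info = entropy - noise             # I(q) = H(q) - H(q'|q)
print(f"H={entropy:.2f} bits  noise={noise:.2f}  I={mutual_info:.2f}")
```

For an uncorrelated random graph the mutual information is close to zero (degrees at the two ends of an edge are nearly independent); real and optimized heterogeneous networks occupy a distinct region of the entropy/noise plane.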
Border ownership from intracortical interactions in visual area V2
Neuron, 2005
Abstract

Cited by 12 (1 self)
In contrast, Figure 1F is generally interpreted as a gray ...
A Smoothing Regularizer for Feedforward and Recurrent Neural Networks
, 1996
Abstract

Cited by 12 (1 self)
We derive a smoothing regularizer for dynamic network models by requiring robustness in prediction performance to perturbations of the training data. The regularizer can be viewed as a generalization of the first-order Tikhonov stabilizer to dynamic models. For two-layer networks with recurrent connections described by Y(t) = f(W Y(t − τ) + V X(t)), Z(t) = U Y(t), the training criterion with the regularizer is D = (1/N) Σ_{t=1}^{N} ‖Z(t) − Z(Φ; I(t))‖² + λ ρ_τ²(Φ), where Φ = {U, V, W} is the network parameter set, Z(t) are the targets, I(t) = {X(s); s = 1, 2, ..., t} represents the current and all historical input information, N is the size of the training data set, ρ_τ²(Φ) is the regularizer, and λ is a regularization parameter. The closed-form expression for the regularizer for time-lagged recurrent networks is ρ_τ(Φ) = [γ‖U‖‖V‖ / (1 − γ‖W‖)] [1 − e^{(γ‖W‖−1)/τ}], ...
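Assuming the regularizer has the closed form ρ_τ(Φ) = γ‖U‖‖V‖/(1 − γ‖W‖) · [1 − e^{(γ‖W‖−1)/τ}] (as reconstructed from this abstract; the γ factor and exponent are read off the garbled original, so treat this as illustrative), it can be evaluated directly from the spectral norms of the weight matrices:

```python
import numpy as np

def smoothing_regularizer(U, V, W, tau, gamma=1.0):
    """rho_tau(Phi) = gamma*||U||*||V|| / (1 - gamma*||W||) * [1 - exp((gamma*||W|| - 1)/tau)]."""
    nU, nV, nW = (np.linalg.norm(M, 2) for M in (U, V, W))  # spectral norms
    if gamma * nW >= 1.0:
        raise ValueError("a finite regularizer requires gamma*||W|| < 1")
    return gamma * nU * nV / (1.0 - gamma * nW) * (1.0 - np.exp((gamma * nW - 1.0) / tau))

rng = np.random.default_rng(0)
U = 0.3 * rng.standard_normal((2, 5))   # output weights
V = 0.3 * rng.standard_normal((5, 3))   # input weights
W = 0.1 * rng.standard_normal((5, 5))   # recurrent weights
rho = smoothing_regularizer(U, V, W, tau=1.0)
print(f"rho_tau = {rho:.4f}")
```

The ‖W‖ < 1 condition reflects the requirement that the recurrent dynamics be contractive, so that perturbations of past inputs decay rather than compound.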
Quoy M., “Structure and dynamics of random recurrent neural networks”, submitted
, 2005
A Network of Chaotic Elements for Information Processing
, 1996
Abstract

Cited by 9 (2 self)
A Globally Coupled Map (GCM) model is a network of chaotic elements that are globally coupled with each other. In this paper, first, a modified GCM model called the "Globally Coupled Map using the Symmetric map (SGCM)" is proposed. The SGCM is designed for information-processing applications. The SGCM has attractors called "cluster frozen attractors," each of which is taken to represent information. This paper also describes the following characteristics of the SGCM which are important to information-processing applications: (a) The SGCM falls into one of the cluster frozen attractors over a wide range of parameters. This means that the information representation is stable over parameters; (b) Represented information can be preserved or broken by controlling parameters; (c) The cluster partitioning is restricted, i.e., the representation of information has a limitation. Finally, our techniques for applying the SGCM to information processing are shown, considering these characteris...
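Cluster formation is a generic GCM phenomenon and can be demonstrated with the standard globally coupled logistic map (the SGCM's symmetric map is not reproduced here; the parameters below are chosen in the strongly coupled coherent regime, where all elements collapse into a single cluster — weaker coupling yields multi-cluster frozen states):

```python
import numpy as np

rng = np.random.default_rng(2)
N, eps, a = 50, 0.5, 1.8                  # coupling eps, map parameter a
x = rng.uniform(-1, 1, N)                 # random initial conditions

for _ in range(2000):
    fx = 1.0 - a * x * x                  # chaotic element f(x) = 1 - a x^2
    x = (1 - eps) * fx + eps * fx.mean()  # global (mean-field) coupling

# elements sharing an attractor collapse onto identical trajectories -> clusters
clusters = len(np.unique(np.round(x, 6)))
print(f"{clusters} cluster(s) among {N} elements")
```

Lowering eps moves the system through ordered phases with several frozen clusters (the regime the SGCM exploits for representing information) toward the turbulent phase where all elements stay distinct.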
Dynamic Approximation of Spatiotemporal Receptive Fields in Nonlinear Neural Field Models
Abstract

Cited by 8 (1 self)
This article presents an approximation method to reduce the spatiotemporal behavior of localized activation peaks (also called "bumps") in nonlinear neural field equations to a set of coupled ordinary differential equations (ODEs) for only the amplitudes and tuning widths of these peaks. This enables a simplified analysis of steady-state receptive fields and their stability, as well as spatiotemporal point spread functions and dynamic tuning properties. A lowest-order approximation for peak amplitudes alone shows that much of the well-studied behavior of small neural systems (e.g., the Wilson-Cowan oscillator) should carry over to localized solutions in neural fields. Full spatiotemporal response profiles can further be reconstructed from this low-dimensional approximation. The method is applied to two standard neural field models: a one-layer model with difference-of-Gaussians connectivity kernel and a two-layer excitatory-inhibitory network. Similar models have been previously employed in numerical studies addressing orientation tuning of cortical simple cells. Explicit formulas for tuning properties, instabilities, and oscillation frequencies are given, and exemplary spatiotemporal response functions, reconstructed from the low-dimensional approximation, are compared with full network simulations.
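A minimal version of the one-layer model (difference-of-Gaussians kernel, threshold-linear rates; all parameters are illustrative assumptions, not the article's) shows the steady-state bump whose amplitude and tuning width the ODE reduction tracks:

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 128, endpoint=False)   # periodic feature space
dx = x[1] - x[0]
d = x[:, None] - x[None, :]
d = (d + np.pi) % (2 * np.pi) - np.pi                 # periodic distance
# difference-of-Gaussians kernel: narrow excitation, broad inhibition
w = 1.5 * np.exp(-d**2 / (2 * 0.3**2)) - 1.0 * np.exp(-d**2 / (2 * 1.0**2))

u = np.zeros_like(x)
inp = 0.5 * np.exp(-x**2 / (2 * 0.3**2))              # tuned feedforward input
for _ in range(4000):                                 # relax tau*du/dt = -u + w*f(u) + inp
    u += 0.05 * (-u + (w * dx) @ np.maximum(u, 0.0) + inp)

amp = u.max()                                         # bump amplitude
center = x[np.argmax(u)]
width = (np.maximum(u, 0.0) > amp / 2).sum() * dx     # half-height tuning width
print(f"bump amplitude={amp:.2f} at {center:.2f} rad, half-height width={width:.2f} rad")
```

The article's point is that the two numbers read off at the end (amplitude and width) can be evolved directly by a small ODE system instead of simulating the full field.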
Orientation Tuning Properties of Simple Cells in Area V1 Derived from an Approximate Analysis of Nonlinear Neural Field Models
Abstract

Cited by 6 (2 self)
... this article, we consider field models for orientation tuning in more general settings and approximate solutions of truly nonlinear mutually coupled excitatory and inhibitory fields. We derive analytical expressions for qualitative properties of steady-state solutions (fixed points) of these field equations, taking into account a whole class of nonlinear rate functions that obey a power law above some threshold and that are zero below. These functions can well approximate rate functions of cortical cells in the region of low and intermediate firing rates. The algebraic expressions derived for stationary firing rates and tuning widths can be solved numerically and also analytically in an approximate way. This way, tuning properties of cortical simple cells can be linked to anatomical and physiological data, and the qualitative effects found by Ben-Yishai et al. (1995) and Somers et al. (1995) can be characterized more quantitatively. In addition, further properties of cortical receptive fields can be predicted, especially when rate functions of cortical cells are assumed to be nonlinear
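The rate-function class assumed here (a power law above threshold, zero below) is simple to handle numerically. As a toy illustration (a single self-coupled population with assumed parameters, not the article's coupled fields), a steady-state rate solves the fixed-point equation u = w·f(u) + I:

```python
def f(u, theta=0.0, alpha=2.0):
    """Power-law rate function: zero below threshold theta, (u - theta)^alpha above."""
    return max(u - theta, 0.0) ** alpha

def steady_state(w=0.3, I=0.5, lo=0.0, hi=2.0):
    # bisection on g(u) = u - w*f(u) - I, assuming g(lo) < 0 < g(hi)
    g = lambda u: u - w * f(u) - I
    for _ in range(60):
        mid = (lo + hi) / 2
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

u_star = steady_state()
print(f"steady-state activity: u* = {u_star:.4f}")
```

For alpha = 2 this fixed point is available in closed form (a quadratic), which is exactly the kind of algebraic tractability the article exploits for stationary rates and tuning widths.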
Effects of fast presynaptic noise in attractor neural networks
Neural Comp., 2006
Abstract

Cited by 5 (4 self)
We study both analytically and numerically the effect of presynaptic noise on the transmission of information in attractor neural networks. The noise occurs on a very short time scale compared to that for the neuron dynamics, and it produces short-time synaptic depression. This is inspired by recent neurobiological findings showing that synaptic strength may either increase or decrease on a short time scale depending on presynaptic activity. We thus describe a mechanism by which fast presynaptic noise enhances the neural network's sensitivity to an external stimulus. The reason for this is that, in general, the presynaptic noise induces nonequilibrium behavior and, consequently, the space of fixed points is qualitatively modified in such a way that the system can easily escape from the attractor. As a result, the model shows, in addition to pattern recognition, class identification and categorization, which may be relevant to the understanding of some of the brain's complex tasks.
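A toy version of the mechanism (deterministic activity-dependent depression in a two-pattern Hopfield network; this is an illustration, not the paper's fast stochastic model) shows how weakening the synapses that leave active neurons destabilizes the current attractor, so a weak stimulus can switch the network state:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 400
xi1, xi2 = rng.choice([-1, 1], size=(2, N))      # two stored patterns
W = (np.outer(xi1, xi1) + np.outer(xi2, xi2)) / N
np.fill_diagonal(W, 0.0)

def run(depression, stim=0.7, steps=20):
    s = xi1.copy()                                # start in attractor 1
    for _ in range(steps):
        x = 1.0 - depression * (s + 1) / 2        # efficacy of active presynapses
        s = np.sign(W @ (x * s) + stim * xi2)     # weak stimulus toward pattern 2
    return (s @ xi2) / N                          # overlap with pattern 2

static = run(depression=0.0)       # static synapses: stimulus fails, stays in 1
depressed = run(depression=1.0)    # depression: escapes and follows the stimulus
print(f"overlap with pattern 2: static={static:.2f}, depressed={depressed:.2f}")
```

The depressed network's heightened responsiveness to the stimulus is the sketch-level analogue of the sensitivity enhancement the abstract describes.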