### Helias M: Noise suppression and surplus synchrony by coincidence detection

- PLoS Comput Biol

"... ar ..."

### The perfect integrator driven by Poisson input and its approximation in the diffusion limit

"... In this note we consider the perfect integrator driven by Poisson process input. We derive its equilibrium and response properties and contrast them to the approximations obtained by applying the diffusion approximation. In particular, the probability density in the vicinity of the threshold differs ..."

Abstract

charge causes a depolarization starting from Vr. We consider a population of identical neurons and assume a uniformly distributed membrane voltage between reset and threshold initially. In what follows we apply the formalism outlined in

Helias et al. [2010]. The first and second infinitesimal moment

### POSTER PRESENTATION Neurons hear their echo

"... The functional implications of correlations in cortical networks are still highly debated [1] and theoreticians are intensely searching for a self-consistent solution of the correlation structure in recurrent networks. Feedforward descriptions have been presented as approximations [2] and different ..."

Abstract

The functional implications of correlations in cortical networks are still highly debated [1] and theoreticians are intensely searching for a self-consistent solution of the correlation structure in recurrent networks. Feedforward descriptions have been presented as approximations [2] and different aspects of correlation functions in the asynchronous irregular state have been accurately predicted, such as the zero time lag correlation [3] and its scaling with network size on a coarse time scale [1]. Previous approaches, however, do not explain the differences between the correlation functions for excitatory and inhibitory neurons, nor do they describe their temporal structure, an experimentally observable feature that has important functional consequences for synaptic plasticity [4].

### POSTER PRESENTATION Open Access Calcium current improves coincidence detection of the LIF model

"... Dendritic spikes are known to improve efficacy of synaptic inputs in causing action potentials [1]. The cal-cium spike at distal apical dendrites of layer 5 pyramidal neurons has been observed in-vitro and argued to sup-port the propagation of synaptic inputs from distal tufts to the soma [2]. When ..."

Abstract

Dendritic spikes are known to improve efficacy of synaptic inputs in causing action potentials [1]. The calcium spike at distal apical dendrites of layer 5 pyramidal neurons has been observed in-vitro and argued to support the propagation of synaptic inputs from distal tufts to the soma [2]. When combined with a back-propagating action potential, a smaller distal current is sufficient to trigger a calcium spike [3]. Recently, it has also been shown in-vivo that dendritic spikes contribute to the neuronal activity [4,5]. Calcium spikes have been modeled in multi-compartment point neuron models using first order kinetics [6]. Here we show that calcium spikes, in the regime of large synchronous inputs on top of a background of weakly fluctuating synaptic noise, can be well approximated by a threshold-triggered current of fixed waveform. The exact contribution of the calcium spike to the somatic membrane potential can then be analytically derived. Accurate predictions are only obtained if correlations between the membrane potential and synaptic conductances are taken into account [7]. Comparing neuron models with and without calcium dynamics, we find that the calcium current increases the sensitivity of the neuron's spiking response to sufficiently large coincident input. In numerical simulations carried out with NEST [8], we investigate the effect of the jitter of close-to-synchronous inputs on the probability to elicit a calcium spike. With increased jitter, fewer calcium spikes are elicited and their average amplitude decreases.
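The threshold-triggered approximation described above can be sketched in a few lines: whenever the membrane potential crosses a calcium threshold from below, a current of fixed waveform is added to the input. All parameter values and the alpha-shaped waveform below are illustrative assumptions, not the calibrated model of the abstract.

```python
import numpy as np

# Minimal sketch of a calcium spike as a threshold-triggered current of
# fixed waveform. Parameters and waveform shape are assumptions.

dt = 0.1            # ms, time step
T = 200.0           # ms, simulated time
tau_m = 10.0        # ms, membrane time constant
R = 1.0             # MOhm, membrane resistance
V_rest = -65.0      # mV, resting potential
V_th_ca = -55.0     # mV, threshold that triggers the calcium current

# fixed calcium-current waveform: alpha function peaking at 10 nA (assumed)
tau_ca = 20.0       # ms
t_wave = np.arange(0.0, 100.0, dt)
ca_wave = 10.0 * (t_wave / tau_ca) * np.exp(1.0 - t_wave / tau_ca)  # nA

rng = np.random.default_rng(0)
steps = int(T / dt)
V = np.full(steps, V_rest)
I_ca = np.zeros(steps + len(ca_wave))   # padded so the waveform always fits
n_ca = 0                                # number of triggered calcium events

for i in range(1, steps):
    t = i * dt
    I_noise = 0.3 * rng.standard_normal()          # weak background noise, nA
    I_syn = 20.0 if 50.0 <= t < 60.0 else 0.0      # large coincident input, nA
    dV = (-(V[i - 1] - V_rest) + R * (I_syn + I_noise + I_ca[i - 1])) / tau_m
    V[i] = V[i - 1] + dt * dV
    # an upward threshold crossing triggers the fixed waveform
    if V[i] >= V_th_ca and V[i - 1] < V_th_ca:
        I_ca[i:i + len(ca_wave)] += ca_wave
        n_ca += 1

print(f"calcium events: {n_ca}, peak depolarization: {V.max():.1f} mV")
```

Jittering the onset times of the synchronous input pulse and repeating the run would reproduce, in miniature, the experiment described in the abstract.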

### Open Access

"... The pooled spike trains of populations of neurons are typically modeled as Poisson processes [2]. It is known, though, that the superposition of point processes is a Poisson process if and only if all components are Poisson processes [3]. However, neocortical neurons spike more regularly [1]. Partly ..."

Abstract

The pooled spike trains of populations of neurons are typically modeled as Poisson processes [2]. It is known, though, that the superposition of point processes is a Poisson process if and only if all components are Poisson processes [3]. However, neocortical neurons spike more regularly [1]. This is partly because they often have a refractory period, but also because the membrane potential is hyperpolarized after each spike, as illustrated in Figure 1A. Here we analyze neuronal spike trains recorded intracellularly in vivo from rat somatosensory cortex. We match them with a Poisson process with dead-time [4], which is the simplest model of neuronal activity that incorporates refractory effects. The dead-time here models the effective refractoriness of the neuron, which can be larger than the refractory period due
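The regularizing effect of a dead-time is easy to check numerically: inter-spike intervals of a Poisson process with dead-time follow a shifted exponential, so the coefficient of variation (CV) drops below 1. The rate and dead-time values below are illustrative, not fitted to the recordings in the abstract.

```python
import numpy as np

# Sketch of a Poisson process with dead-time: after each spike the neuron
# is silent for a fixed dead-time d, then fires after an exponential
# waiting time, giving shifted-exponential inter-spike intervals (ISIs).

rng = np.random.default_rng(1)
rate_free = 20.0    # Hz, rate of the underlying Poisson process (assumed)
d = 0.020           # s, dead-time, i.e. effective refractoriness (assumed)

n = 100_000
isi = d + rng.exponential(1.0 / rate_free, size=n)  # shifted-exponential ISIs

cv = isi.std() / isi.mean()
cv_theory = 1.0 / (1.0 + rate_free * d)  # CV of the dead-time process
print(f"CV simulated: {cv:.3f}, theory: {cv_theory:.3f}")
```

A CV of 1 would indicate Poisson-like irregularity; the dead-time pushes it below 1, i.e. toward the more regular firing reported for neocortical neurons [1].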

### ORAL PRESENTATION Open Access Decorrelation of low-frequency neural activity by inhibitory feedback

### POSTER PRESENTATION Open Access Identifying and exploiting the anatomical origin of population rate oscillations in multi-layered spiking networks

### POSTER PRESENTATION Open Access Influence of different types of downscaling on a cortical microcircuit model

"... Neural network models are routinely downscaled in terms of numbers of neurons or synapses because of a lack of computational resources or the limited capacity of a given neuromorphic hardware. Unfortunately the downscaling is often performed without explicit mention of the limita-tions this entails ..."

Abstract

Neural network models are routinely downscaled in terms of numbers of neurons or synapses because of a lack of computational resources or the limited capacity of given neuromorphic hardware. Unfortunately, the downscaling is often performed without explicit mention of the limitations this entails [1]. This is relevant since downscaling can substantially affect the dynamics. For instance, reducing the number of neurons N while preserving in-degrees K increases shared inputs and hence correlations. Theoretical results on scaling are derived using simplifying assumptions. Therefore we use simulations to systematically investigate the effects of downscaling on the dynamics of a layered microcircuit model of early sensory cortex [2]. The model consists of eight excitatory and inhibitory populations of leaky integrate-and-fire
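The shared-input argument above can be illustrated directly: if two neurons each draw K presynaptic partners at random from N sources, the expected shared-input fraction is K/N, so shrinking N at fixed in-degree K raises it. The numbers below are illustrative, not those of the microcircuit model [2].

```python
import numpy as np

# Sketch of the shared-input effect of downscaling: estimate the fraction
# of presynaptic partners two neurons have in common, for fixed in-degree
# K and decreasing network size N. Expected fraction is K / N.

rng = np.random.default_rng(2)
K = 1000
results = {}

for N in (80_000, 10_000, 2_000):
    fractions = []
    for _ in range(200):
        a = rng.choice(N, size=K, replace=False)   # inputs of neuron A
        b = rng.choice(N, size=K, replace=False)   # inputs of neuron B
        fractions.append(len(np.intersect1d(a, b)) / K)
    results[N] = float(np.mean(fractions))
    print(f"N={N:6d}: shared-input fraction {results[N]:.3f} "
          f"(expected {K / N:.3f})")
```

Going from N = 80,000 to N = 2,000 at fixed K thus raises the shared-input fraction by an order of magnitude, which is the correlation-inflating effect the abstract warns about.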

### POSTER PRESENTATION Open Access Recurrence and external sources differentially shape network correlations

"... The presence of correlated neuronal activity as such is not surprising, but rather a natural consequence of network connectivity and dynamics, in particular direct synaptic connections and shared local and non-local presynaptic sources [1]. The intriguing feature, however, is the modu-lation of corr ..."

Abstract

The presence of correlated neuronal activity as such is not surprising, but rather a natural consequence of network connectivity and dynamics, in particular direct synaptic connections and shared local and non-local presynaptic sources [1]. The intriguing feature, however, is the modulation of correlations in relation to behavior [2]. These task-dependent changes may indicate that correlated spike timing is used for the storage, transmission, and processing of information. Moreover, correlated synaptic activation strongly influences the power and the spatial reach of the local field potential [3], a commonly recorded signal in experimental neuroscience. A theoretical understanding of correlations requires the representation of (1) the recurrent connectivity and (2) external and internal sources of temporally varying or fluctuating

### From The Twenty Third Annual Computational Neuroscience Meeting: CNS*2014

"... The theory describing correlated activity emerging in recurrent networks relies on the single neuron response to a modulation of its input, i.e. the transfer function. For the leaky integrate-and-fire neuron model exposed to unfiltered synaptic noise the transfer function can be derived analytically ..."

Abstract

The theory describing correlated activity emerging in recurrent networks relies on the single neuron response to a modulation of its input, i.e. the transfer function. For the leaky integrate-and-fire neuron model exposed to unfiltered synaptic noise the transfer function can be derived analytically [1,2]. In this context the effect of synaptic filtering on the response properties has also been studied intensively at the beginning of the last decade [3,4]. Analytical results were derived in the low as well as in the high frequency limit. The main finding is that the linear response amplitude of model neurons exposed to filtered synaptic noise does not decay to zero in the high frequency limit. A numerical method has also been developed to study the influence of synaptic noise on the response properties [5]. Here we first revisit the transfer function for neuron models without synaptic filtering and simplify the derivation exploiting analogies between the one dimensional Fokker-Planck equation and the quantum harmonic oscillator. We treat the problem of synaptic filtering with short time constants by reducing the corresponding two dimensional Fokker-Planck equation to one dimension with effective boundary conditions [6]. To this end we use the static and dynamic boundary conditions derived earlier by a perturbative treatment of the arising boundary layer problem [4]. Finally we compare the analytical results to direct simulations (Fig. 1) and observe that the approximations are valid up to frequencies in the gamma range (60-80 Hz). Deviations are explained by the nature of the approximations.
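For reference, the one-dimensional Fokker-Planck equation mentioned above, for the membrane-potential density P(V,t) of a leaky integrate-and-fire neuron with membrane time constant τ, mean input μ, and noise amplitude σ, can be written in a standard form (the notation here is ours, not necessarily that of [1,2]):

```latex
\tau \,\partial_t P(V,t)
  \;=\; \partial_V\!\big[\,(V-\mu)\,P(V,t)\,\big]
  \;+\; \frac{\sigma^2}{2}\,\partial_V^2 P(V,t),
\qquad P(V_\theta,t) = 0 ,
```

with an absorbing boundary at the threshold $V_\theta$; the probability flux through $V_\theta$ gives the instantaneous firing rate and is reinserted at the reset potential $V_r$. The oscillator analogy and the effective boundary conditions for filtered noise build on this one-dimensional problem.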