Results 1-3 of 3
Non Linear Neurons in the Low Noise Limit: A Factorial Code Maximizes Information Transfer
, 1994
Abstract

Cited by 141 (18 self)
We investigate the consequences of maximizing information transfer in a simple neural network (one input layer, one output layer), focusing on the case of non-linear transfer functions. We assume that both receptive fields (synaptic efficacies) and transfer functions can be adapted to the environment. The main result is that, for bounded and invertible transfer functions, in the case of a vanishing additive output noise and no input noise, maximization of information (Linsker's infomax principle) leads to a factorial code, hence to the same solution as required by the redundancy reduction principle of Barlow. We also show that this result is valid for linear, and more generally unbounded, transfer functions, provided optimization is performed under an additive constraint, that is, one which can be written as a sum of terms, each specific to one output neuron. Finally we study the effect of a non-zero input noise. We find that, at first order in the input noise, assumed to be small ...
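The single-neuron core of the low-noise infomax result can be sketched numerically: with vanishing output noise, a bounded invertible transfer function maximizes output entropy when it equals the cumulative distribution of its input (histogram equalization), making the output uniform. This is an illustrative sketch only; the input distribution and sample sizes below are assumptions, and the paper's full result concerns a whole layer and factorial codes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Example input drive to the neuron (choice of gamma is illustrative).
x = rng.gamma(shape=2.0, scale=1.0, size=100_000)

# Infomax transfer function in the zero-noise limit: the empirical CDF
# of the input, so the neuron's output is uniform on [0, 1].
xs = np.sort(x)

def transfer(v):
    # Rank of v among the samples, normalized to [0, 1].
    return np.searchsorted(xs, v, side="right") / xs.size

y = transfer(x)

# A flat histogram confirms maximal output entropy for this neuron.
hist, _ = np.histogram(y, bins=10, range=(0.0, 1.0), density=True)
print(np.allclose(hist, 1.0, atol=0.05))
```

Any strictly increasing input density gives the same flat output histogram, which is the sense in which the optimal transfer function is matched to the environment.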
A Berger-Levy Energy-Efficient Neuron Model with Unequal Synaptic Weights
Abstract

Cited by 1 (1 self)
Abstract—How neurons in the cerebral cortex process and transmit information is a longstanding question in systems neuroscience. To analyze neuronal activity from an information-energy efficiency standpoint, Berger and Levy calculated the maximum Shannon mutual information transfer per unit of energy expenditure of an idealized integrate-and-fire (IIF) neuron whose excitatory synapses all have the same weight. Here, we extend their IIF model to a biophysically more realistic one in which synaptic weights are unequal. Using information theory, random Poisson measures, and the maximum entropy principle, we show that the probability density function (pdf) of interspike interval (ISI) duration induced by the bits-per-joule (bpj) maximizing pdf fΛ(λ) of the excitatory postsynaptic potential (EPSP) intensity remains equal to the delayed gamma distribution of the IIF model. We then show that, in the case of unequal weights, fΛ(·) satisfies an inhomogeneous Cauchy-Euler equation with variable coefficients for which we provide the general solution form.
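The delayed gamma ISI distribution mentioned above is a gamma density shifted by an absolute refractory period. A minimal sketch, assuming illustrative parameter values (the shape k, rate r, and delay d below are hypothetical, not taken from the paper):

```python
import numpy as np

# Delayed gamma ISI model: ISI = d + Gamma(k, 1/r), where d acts as an
# absolute refractory period. Parameter values are illustrative only.
k, r, d = 2.0, 10.0, 0.003  # shape, rate [1/s], refractory delay [s]

rng = np.random.default_rng(1)
isi = d + rng.gamma(shape=k, scale=1.0 / r, size=100_000)

# Every ISI exceeds the refractory delay, and the mean is d + k/r.
print(bool(isi.min() > d))
print(abs(isi.mean() - (d + k / r)) < 0.005)
```

The shift parameter is what distinguishes the "delayed" gamma from an ordinary gamma ISI model: no interval shorter than d can occur, regardless of how the EPSP intensity pdf fΛ(λ) is optimized.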
Entropy and information in neural spike trains: Progress on the sampling problem
, 2003
Abstract
Entropy and information in neural spike trains: Progress on the sampling problem