Results 1–10 of 37
Vector Reconstruction from Firing Rates
, 1994
Abstract

Cited by 112 (7 self)
In a number of systems, including wind detection in the cricket, visual motion perception and coding of arm movement direction in the monkey, and place cell response to position in the rat hippocampus, firing rates in a population of tuned neurons are correlated with a vector quantity. We examine and compare several methods that allow the coded vector to be reconstructed from measured firing rates. In cases where the neuronal tuning curves resemble cosines, linear reconstruction methods work as well as more complex statistical methods requiring more detailed information about the responses of the coding neurons. We present a new linear method, the optimal linear estimator (OLE), that on average provides the best possible linear reconstruction. This method is compared with the more familiar vector method and shown to produce more accurate reconstructions using far fewer recorded neurons. Introduction: To determine how information is represented by nervous systems, we need to understand ...
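The two decoders this abstract compares can be sketched in a few lines of NumPy. This is a minimal illustration under assumed cosine tuning with additive Gaussian noise; the population size, noise level, and rectification are illustrative choices, not the paper's setup, and the OLE here is simply a least-squares fit from rates to the coded vector.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_trials = 8, 500

# Hypothetical cosine-tuned population: preferred directions as unit vectors.
pref = np.linspace(0.0, 2.0 * np.pi, n_neurons, endpoint=False)
c = np.stack([np.cos(pref), np.sin(pref)], axis=1)            # (n_neurons, 2)

# Noisy, rectified cosine tuning for random stimulus directions.
theta = rng.uniform(0.0, 2.0 * np.pi, n_trials)
v = np.stack([np.cos(theta), np.sin(theta)], axis=1)          # coded vectors
rates = np.clip(v @ c.T + 0.2 * rng.standard_normal((n_trials, n_neurons)), 0, None)

# Vector method: sum preferred directions weighted by firing rate.
v_pop = rates @ c

# OLE: least-squares weights mapping rates to the coded vector.
W, *_ = np.linalg.lstsq(rates, v, rcond=None)
v_ole = rates @ W

def mean_angular_error_deg(est):
    """Mean absolute angle between estimated and true coded vectors."""
    cross = est[:, 0] * v[:, 1] - est[:, 1] * v[:, 0]
    dot = (est * v).sum(axis=1)
    return np.degrees(np.abs(np.arctan2(cross, dot))).mean()

print(f"vector method error: {mean_angular_error_deg(v_pop):.1f} deg")
print(f"OLE error:           {mean_angular_error_deg(v_ole):.1f} deg")
```

Both estimators recover the direction to within a few degrees here; the abstract's point is that the OLE's advantage grows when tuning deviates from cosines or the population is small.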
Nonlinear Multivariate Analysis of Neurophysiological Signals
 Progress in Neurobiology
, 2005
Abstract

Cited by 40 (1 self)
Multivariate time series analysis is extensively used in neurophysiology with the aim of studying the relationship between simultaneously recorded signals. Recently, advances in information theory and nonlinear dynamical systems theory have allowed the study of various types of synchronization from time series. In this work, we first describe the multivariate linear methods most commonly used in neurophysiology and show that they can be extended to assess the existence of nonlinear interdependences between signals. We then review the concepts of entropy and mutual information, followed by a detailed description of nonlinear methods based on the concepts of phase synchronization, generalized synchronization and event synchronization. In all cases, we show how to apply these methods to study different kinds of neurophysiological data. Finally, we illustrate the use of the multivariate surrogate-data test for the assessment of the strength (strong or weak) and the type (linear or nonlinear) of interdependence between neurophysiological signals.
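The mutual information the review builds on can detect exactly the nonlinear interdependences that linear correlation misses. A minimal plug-in (histogram) estimator is sketched below; the quadratic coupling, sample size, and bin count are illustrative assumptions, and real neurophysiological data would need bias correction or surrogate testing as the review discusses.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in estimate of I(X;Y) in bits from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
x = rng.standard_normal(20_000)
y_nonlinear = x ** 2 + 0.3 * rng.standard_normal(20_000)   # coupled, yet uncorrelated
y_independent = rng.standard_normal(20_000)

# Linear correlation is blind to the quadratic coupling; MI is not.
print(f"corr(x, y_nonlinear) = {np.corrcoef(x, y_nonlinear)[0, 1]:+.3f}")
print(f"MI(x, y_nonlinear)   = {mutual_information(x, y_nonlinear):.3f} bits")
print(f"MI(x, y_independent) = {mutual_information(x, y_independent):.3f} bits")
```

The quadratically coupled pair has near-zero correlation but substantial mutual information, while the independent pair shows only the small positive bias inherent to the plug-in estimator.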
Detecting and estimating signals in noisy cable structures: II. Information theoretical analysis
, 1999
Abstract

Cited by 40 (5 self)
This is the second in a series of papers which attempt to recast classical single-neuron biophysics in information-theoretical terms. Classical cable theory focuses on analyzing the voltage or current attenuation of a synaptic signal as it propagates from its dendritic input location to the spike initiation zone. On the other hand, we are interested in analyzing the amount of information lost about the signal in this process due to the presence of various noise sources distributed throughout the neuronal membrane. We use a stochastic version of the linear one-dimensional cable equation to derive closed-form expressions for the second-order moments of the fluctuations of the membrane potential associated with different membrane current noise sources: thermal noise, noise due to the random opening and closing of sodium and potassium channels, and noise due to the presence of "spontaneous" synaptic input. We consider two different scenarios. In the signal estimation paradigm, the time cour...
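The flavor of the paper's closed-form second-order moments can be shown in a drastically simplified setting: an isopotential (point-membrane) leaky patch driven by white current noise, i.e. an Ornstein–Uhlenbeck process rather than the full spatial cable. The time constant and noise intensity below are illustrative, not values from the paper; the check is that simulated voltage variance matches the analytic stationary variance σ²τ/2.

```python
import numpy as np

rng = np.random.default_rng(2)
tau = 20e-3        # membrane time constant (s), illustrative value
sigma_w = 1.0      # white current-noise intensity (arbitrary units)
dt, n = 1e-4, 400_000

# Euler-Maruyama integration of dV = -(V / tau) dt + sigma_w dW.
v = np.zeros(n)
kicks = sigma_w * np.sqrt(dt) * rng.standard_normal(n)
for t in range(1, n):
    v[t] = v[t - 1] - dt * v[t - 1] / tau + kicks[t]

var_sim = v[n // 10:].var()            # discard the initial transient
var_theory = sigma_w ** 2 * tau / 2.0  # closed-form stationary variance
print(f"simulated variance:   {var_sim:.5f}")
print(f"closed-form variance: {var_theory:.5f}")
```

The full cable version replaces this single ODE with a spatial PDE, so the moments additionally depend on electrotonic distance between the noise source and the measurement point.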
Neural coding and decoding: communication channels and quantization
 Network: Computation in Neural Systems
, 2001
Abstract

Cited by 36 (8 self)
We present a novel analytical approach for studying neural encoding. As a first step we model a neural sensory system as a communication channel. Using the method of typical sequences in this context, we show that a coding scheme is an almost bijective relation between equivalence classes of stimulus/response pairs. The analysis allows a quantitative determination of the type of information encoded in neural activity patterns and, at the same time, identification of the code with which that information is represented. Due to the high dimensionality of the sets involved, such a relation is extremely difficult to quantify. To circumvent this problem, and to use whatever limited data set is available most efficiently, we use another technique from information theory: quantization. We quantize the neural responses to a reproduction set of small finite size. Among many possible quantizations, we choose one which preserves as much of the informativeness of the original stimulus/response relation as possible, through the use of an information-based distortion function. This method allows us to study coarse but highly informative approximations of a coding scheme model, and then to refine them automatically when more data become available.
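The quantization step can be sketched on a toy joint distribution. The paper's optimization is more sophisticated; the version below is a greedy agglomerative stand-in that repeatedly merges the pair of response classes whose merger loses the least mutual information, which conveys the idea of an information-based distortion without claiming to be the authors' algorithm. The 4×8 joint distribution is synthetic.

```python
import numpy as np
from itertools import combinations

def mi(p):
    """Mutual information in bits of a joint distribution p(stimulus, response)."""
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

def quantize(p, k):
    """Greedily merge response classes (columns of p) down to k reproduction
    classes, keeping as much mutual information as possible at each step."""
    cols = [p[:, [j]] for j in range(p.shape[1])]
    while len(cols) > k:
        def retained(ij):
            keep = [c for t, c in enumerate(cols) if t not in ij]
            return mi(np.hstack(keep + [cols[ij[0]] + cols[ij[1]]]))
        i, j = max(combinations(range(len(cols)), 2), key=retained)
        cols = [c for t, c in enumerate(cols) if t not in (i, j)] + [cols[i] + cols[j]]
    return np.hstack(cols)

# Toy joint distribution: 4 stimulus classes x 8 response patterns.
rng = np.random.default_rng(3)
p = rng.dirichlet(np.ones(32)).reshape(4, 8)
q = quantize(p, 4)
print(f"I(full)      = {mi(p):.3f} bits")
print(f"I(quantized) = {mi(q):.3f} bits")
```

By the data-processing inequality the quantized relation can never carry more information than the original; the point is that a well-chosen small reproduction set loses very little of it.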
Reconstruction of Natural Scenes from Ensemble Responses in the Lateral Geniculate Nucleus
 J. Neurosci
, 1999
Abstract

Cited by 30 (3 self)
A critical test of our understanding of sensory coding, however, is to take the opposite approach: to reconstruct sensory inputs from recorded neuronal responses. The decoding approach can provide an objective assessment of what, and how much, information is available in the neuronal responses. Although the function of the brain is not necessarily to reconstruct sensory inputs faithfully, these studies may lead to new insights into the functions of neuronal circuits in sensory processing (Rieke et al., 1997). The decoding approach has been used to study several sensory systems (Bialek et al., 1991; Theunissen and Miller, 1991; Rieke et al., 1993, 1997; Roddey and Jacobs, 1996; Warland et al., 1997; Dan et al., 1998). Most of these studies aimed to reconstruct temporal signals from the response of a single neuron (Bialek et al., 1991; Rieke et al., 1993, 1995; Roddey and Jacobs, 1996) or a small number of neurons (Warland et al., 1997). An important challenge in understanding the mammalia ...
Decoding Neuronal Firing And Modeling Neural Networks
 Quart. Rev. Biophys
, 1994
Abstract

Cited by 25 (4 self)
Introduction: Biological neural networks are large systems of complex elements interacting through a complex array of connections. Individual neurons express a large number of active conductances (Connors et al., 1982; Adams & Gavin, 1986; Llinás, 1988; McCormick, 1990; Hille, 1992) and exhibit a wide variety of dynamic behaviors on time scales ranging from milliseconds to many minutes (Llinás, 1988; Harris-Warrick & Marder, 1991; Churchland & Sejnowski, 1992; Turrigiano et al., 1994). Neurons in cortical circuits are typically coupled to thousands of other neurons (Stevens, 1989), and very little is known about the strengths of these synapses (although see Rosenmund et al., 1993; Hessler et al., 1993; Smetters & Nelson, 1993). The complex firing patterns of large neuronal populations are difficult to describe, let alone understand. There is little point in accurately modeling each membrane potential in a large neural ...
Adaptive Stochastic Resonance
 Proceedings of the IEEE: special issue on intelligent signal processing
, 1998
Abstract

Cited by 17 (9 self)
This paper shows how adaptive systems can learn to add an optimal amount of noise to some nonlinear feedback systems. Noise can improve the signal-to-noise ratio of many nonlinear dynamical systems. This "stochastic resonance" effect occurs in a wide range of physical and biological systems. The SR effect may also occur in engineering systems in signal processing, communications, and control. The noise energy can enhance the faint periodic signals or faint broadband signals that force the dynamical systems. Most SR studies assume full knowledge of a system's dynamics and its noise and signal structure. Fuzzy and other adaptive systems can learn to induce SR based only on samples from the process. These samples can tune a fuzzy system's if-then rules so that the fuzzy system approximates the dynamical system and its noise response. The paper derives the SR optimality conditions that any stochastic learning system should try to achieve. The adaptive system learns the SR effect as the sys...
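The core SR effect the paper adapts to can be reproduced with a few lines: a subthreshold sinusoid fed through a hard threshold, where an intermediate noise level yields the strongest signal-output correlation. The signal amplitude, threshold, and the three noise levels below are illustrative choices, not from the paper, and this static-threshold demo omits the feedback dynamics and the adaptive/fuzzy learning the paper is actually about.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(4000)
signal = 0.8 * np.sin(2.0 * np.pi * t / 100.0)   # subthreshold: peak < threshold
threshold = 1.0

def signal_output_correlation(noise_std, trials=20):
    """Mean correlation between the hidden signal and the thresholded output."""
    corrs = []
    for _ in range(trials):
        y = (signal + noise_std * rng.standard_normal(t.size) > threshold)
        y = y.astype(float)
        corrs.append(np.corrcoef(signal, y)[0, 1] if y.std() > 0 else 0.0)
    return float(np.mean(corrs))

for noise_std in (0.01, 0.3, 5.0):
    print(f"noise sd {noise_std}: corr = {signal_output_correlation(noise_std):.3f}")
```

With almost no noise the threshold never fires, and with heavy noise it fires indiscriminately; the intermediate level lets threshold crossings track the signal peaks, which is the non-monotone signature of SR that an adaptive system can climb toward.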
Noise-enhanced detection of subthreshold signals with carbon nanotubes
 IEEE Trans. Nanotechnol
, 2006
Abstract

Cited by 7 (6 self)
Electrical noise can help pulse-train signal detection at the nano-level. Experiments on a single-walled carbon nanotube transistor confirmed that a threshold exhibited stochastic resonance (SR) for finite-variance and infinite-variance noise: small amounts of noise enhanced the nanotube detector's performance. The experiments used a carbon nanotube field-effect transistor to detect noisy subthreshold electrical signals. Two new SR hypothesis tests in the Appendix also confirmed the SR effect in the nanotube transistor. Three measures of detector performance showed the SR effect: Shannon's mutual information, the normalized correlation measure, and an inverted bit error rate compared the input and output discrete-time random sequences. The nanotube detector had a threshold-like input–output characteristic in its gate effect. It produced little current for subthreshold digital input voltages that fed the transistor's gate. Three types of synchronized ...
Analysis of Neural Coding Through Quantization with an Information-Based Distortion Measure
Abstract

Cited by 7 (4 self)
We discuss an analytical approach through which the neural symbols and corresponding stimulus space of a neuron or neural ensemble can be discovered simultaneously and quantitatively, making few assumptions about the nature of the code or relevant features. The basis for this approach is to conceptualize a neural coding scheme as a collection of stimulus-response classes akin to a dictionary or 'codebook', with each class corresponding to a spike pattern 'codeword' and its corresponding stimulus feature in the codebook. The neural codebook is derived by quantizing the neural responses into a small reproduction set, and optimizing the quantization to minimize an information-based distortion function. We apply this approach to the analysis of coding in sensory interneurons of a simple invertebrate sensory system. For a simple sensory characteristic (tuning curve), we demonstrate a case for which the classical definition of tuning does not adequately describe the performance of the studied cell. Considering a more involved sensory operation (sensory discrimination), we also show that, for some cells in this system, a significant amount of information is encoded in patterns of spikes that would not be discovered through analyses based on linear stimulus-response measures.
Capacity and energy cost of information in biological and silicon photoreceptors
 Proceedings of the IEEE
, 2001
Abstract

Cited by 6 (2 self)
We outline a theoretical framework to analyze information processing in biological sensory organs and in engineered microsystems. We employ the mathematical tools of communication theory and model natural or synthetic physical structures as microscale communication networks, studying them under physical constraints at two different levels of abstraction. At the functional level, we examine the operational and task specification, while at the physical level, we examine the material specification and realization. Both levels of abstraction are characterized by Shannon's channel capacity, as determined by the channel bandwidth, the signal power, and the noise power. The link between the functional level and the physical level of abstraction is established through models for transformations on the signal, physical constraints on the system, and noise that degrades the signal. As a specific example, we present a comparative study of information capacity (in bits per second) versus energy cost of information (in joules per bit) in a biological and in a silicon adaptive photoreceptor. The communication channel model for each of the two systems is a cascade of linear band-limiting sections followed by additive noise. We model the filters and the noise from first principles whenever possible and phenomenologically otherwise. The parameters for the blowfly model are determined from biophysical data available in the literature, and the parameters of the silicon model are determined from our experimental data. This comparative study is a first step toward a fundamental and quantitative understanding of the trade-offs between system performance and associated costs such as size, reliability, and energy requirements for natural and engineered sensory microsystems.
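The two figures of merit the abstract compares, capacity in bits per second and energy cost in joules per bit, follow directly from Shannon's formula C = B log2(1 + S/N) for a band-limited additive white Gaussian noise channel. The sketch below uses made-up round numbers, not the blowfly or silicon photoreceptor measurements from the paper.

```python
import numpy as np

def awgn_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon capacity C = B * log2(1 + S/N) of a band-limited AWGN channel."""
    return bandwidth_hz * np.log2(1.0 + signal_power / noise_power)

# Illustrative numbers only -- not the measurements reported in the paper.
B = 100.0            # effective bandwidth (Hz)
S, N = 1e-3, 1e-5    # signal and noise power (W)
P_diss = 1e-6        # power dissipated by the front end (W)

C = awgn_capacity(B, S, N)
energy_per_bit = P_diss / C
print(f"capacity:    {C:.0f} bit/s")
print(f"energy cost: {energy_per_bit:.2e} J/bit")
```

In the paper's cascade model, each band-limiting stage contributes its own bandwidth and noise, so the end-to-end capacity is set by the weakest link in the chain rather than a single B and S/N.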