Results 1-10 of 73
Nonlinear Multivariate Analysis of Neurophysiological Signals
Progress in Neurobiology, 2005
Cited by 107 (5 self)
Multivariate time series analysis is extensively used in neurophysiology with the aim of studying the relationship between simultaneously recorded signals. Recently, advances in information theory and nonlinear dynamical systems theory have allowed the study of various types of synchronization from time series. In this work, we first describe the multivariate linear methods most commonly used in neurophysiology and show that they can be extended to assess the existence of nonlinear interdependences between signals. We then review the concepts of entropy and mutual information, followed by a detailed description of nonlinear methods based on the concepts of phase synchronization, generalized synchronization and event synchronization. In all cases, we show how to apply these methods to study different kinds of neurophysiological data. Finally, we illustrate the use of multivariate surrogate data tests for assessing the strength (strong or weak) and the type (linear or nonlinear) of interdependence between neurophysiological signals.
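As a concrete illustration of the phase-synchronization idea surveyed here, the phase-locking value can be computed from instantaneous Hilbert phases. This is a minimal sketch with toy signals and parameters, not the authors' implementation:

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase-locking value: |mean of exp(i * phase difference)|.
    1 means perfect phase synchronization, values near 0 mean none."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * dphi))))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 2000)
common = np.sin(2 * np.pi * 5.0 * t)            # shared 5 Hz rhythm
x = common + 0.1 * rng.standard_normal(t.size)
y = common + 0.1 * rng.standard_normal(t.size)
plv_sync = phase_locking_value(x, y)
plv_rand = phase_locking_value(x, rng.standard_normal(t.size))
print(plv_sync, plv_rand)   # high for the shared rhythm, low for noise
```

In practice one would band-pass filter first and compare the measured value against surrogate data, as the review discusses.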
On Decoding the Responses of a Population of Neurons from Short Time Windows
1999
Cited by 46 (5 self)
The effectiveness of various stimulus identification (decoding) procedures for extracting the information carried by the responses of a population of neurons to a set of repeatedly presented stimuli is studied analytically, in the limit of short time windows. It is shown that in this limit the entire information content of the responses can sometimes be decoded, and when this is not the case, the lost information is quantified. In particular, the mutual information extracted by taking into account only the most likely stimulus in each trial turns out to be, if not equal to, then much closer to the true value than that calculated from the full set of probabilities that each of the possible stimuli was the actual one. The relation between the mutual information extracted by decoding and the percentage of correct stimulus decodings is also derived analytically in the same limit, showing that the metric content index can be estimated reliably from a few cells recorded over brief periods. Computer simulations, as well as the activity of real neurons recorded in the primate hippocampus, serve to confirm these results and illustrate the utility and limitations of the approach.
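The decoding-based information estimate discussed here can be sketched with a plug-in calculation on a confusion matrix of (actual, decoded) stimulus counts. This toy version assumes equiprobable stimuli and is illustrative only, not the paper's analytical treatment:

```python
import numpy as np

def mutual_info_from_confusion(counts):
    """Mutual information (bits) between actual and decoded stimulus,
    estimated from a confusion matrix of trial counts."""
    p = counts / counts.sum()
    ps = p.sum(axis=1, keepdims=True)   # P(actual stimulus)
    pd = p.sum(axis=0, keepdims=True)   # P(decoded stimulus)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (ps @ pd)[nz])))

# Perfect decoding of 4 equiprobable stimuli recovers log2(4) = 2 bits.
perfect = 25.0 * np.eye(4)              # 25 correct trials per stimulus
mi_perfect = mutual_info_from_confusion(perfect)
print(mi_perfect)   # 2.0
```

With decoding errors the off-diagonal counts grow and the extracted information falls below the true value, which is exactly the loss the paper quantifies.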
Neural coding and decoding: communication channels and quantization
Network: Computation in Neural Systems, 2001
Cited by 40 (8 self)
We present a novel analytical approach for studying neural encoding. As a first step we model a neural sensory system as a communication channel. Using the method of typical sequences in this context, we show that a coding scheme is an almost bijective relation between equivalence classes of stimulus/response pairs. The analysis allows a quantitative determination of the type of information encoded in neural activity patterns and, at the same time, identification of the code with which that information is represented. Due to the high dimensionality of the sets involved, such a relation is extremely difficult to quantify. To circumvent this problem, and to use whatever limited data set is available most efficiently, we use another technique from information theory: quantization. We quantize the neural responses to a reproduction set of small finite size. Among many possible quantizations, we choose one which preserves as much of the informativeness of the original stimulus/response relation as possible, through the use of an information-based distortion function. This method allows us to study coarse but highly informative approximations of a coding scheme model, and then to refine them automatically when more data become available.
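The quantization step described above uses an information-based distortion function. A much cruder greedy stand-in, which repeatedly merges the pair of response classes whose fusion loses the least mutual information, conveys the idea (this is not the paper's method):

```python
import numpy as np
from itertools import combinations

def mi(p):
    """Mutual information (bits) of a joint stimulus/response table."""
    p = p / p.sum()
    ps = p.sum(axis=1, keepdims=True)
    pr = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (ps @ pr)[nz])))

def greedy_quantize(p_sr, k):
    """Merge response classes pairwise, always keeping the merge that
    retains the most mutual information, until k classes remain."""
    cols = [p_sr[:, [j]] for j in range(p_sr.shape[1])]
    while len(cols) > k:
        def merged_mi(ij):
            i, j = ij
            rest = [c for t, c in enumerate(cols) if t != i and t != j]
            return mi(np.hstack(rest + [cols[i] + cols[j]]))
        i, j = max(combinations(range(len(cols)), 2), key=merged_mi)
        cols = [c for t, c in enumerate(cols) if t != i and t != j] \
               + [cols[i] + cols[j]]
    return np.hstack(cols)

# Four response classes, but classes {0,1} and {2,3} are redundant:
p_sr = np.array([[0.25, 0.25, 0.0, 0.0],
                 [0.0, 0.0, 0.25, 0.25]])
q2 = greedy_quantize(p_sr, 2)
print(mi(p_sr), mi(q2))   # both 1.0: quantizing to 2 classes is lossless here
```

The reproduction set (`k` classes) plays the role of the paper's small finite reproduction set; the refinement-with-more-data step corresponds to increasing `k`.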
A Unified Approach to the Study of Temporal, Correlational and Rate Coding
Cited by 37 (11 self)
We demonstrate that the information contained in the spike occurrence times of a population of neurons can be broken up into a series of terms, each of which reflects something about potential coding mechanisms. This is possible in the coding regime in which few spikes are emitted in the relevant time window. This approach allows us to study the additional information contributed by spike timing beyond that present in the spike counts; to examine the contributions to the whole information of different statistical properties of spike trains, such as firing rates and correlation functions; and it forms the basis for a new quantitative procedure for the analysis of simultaneous multiple-neuron recordings. It also provides theoretical constraints upon neural coding strategies. We find a transition between two coding regimes, depending upon the size of the relevant observation timescale. For time windows shorter than the timescale of the stimulus-induced response fluctuations, t...
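The count-versus-timing distinction that this decomposition targets can be seen in a toy plug-in calculation (illustrative only; the paper's series expansion is analytical):

```python
import numpy as np

def plugin_mi(p_joint):
    """Plug-in mutual information (bits) of a joint stimulus/response table."""
    p = p_joint / p_joint.sum()
    ps = p.sum(axis=1, keepdims=True)
    pr = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (ps @ pr)[nz])))

# Toy ensemble: both stimuli evoke exactly one spike, but stimulus 0
# places it early in the window and stimulus 1 places it late.
# Count code (columns = spike counts {0, 1}) is blind to this:
count_table = np.array([[0.0, 0.5],
                        [0.0, 0.5]])
# Timing code (columns = early bin vs late bin) recovers it fully:
timing_table = np.array([[0.5, 0.0],
                         [0.0, 0.5]])
mi_count = plugin_mi(count_table)
mi_timing = plugin_mi(timing_table)
print(mi_count, mi_timing)   # 0.0 bits vs 1.0 bit
```

The gap between the two values is the "additional information contributed by spike timing beyond that present in the spike counts" that the abstract refers to.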
The representation of information about faces in the temporal and frontal lobes
Neuropsychologia, 2006
Representational Accuracy of Stochastic Neural Populations
2001
Cited by 26 (5 self)
We show in this article that the choice of a variability model has a major, nontrivial impact on the encoding properties of the neural population. The immense variability of individual response parameters, such as tuning widths or correlation coefficients, has also been neglected in most previous work. Although these parameter variations are always found in empirical data, they were considered functionally insignificant, and hence theoretical studies have almost always assumed uniform parameters throughout the population. We show here that this uniform case is unfavorable, in the sense that introducing parameter variability improves the encoding performance.
Approaches to Information-Theoretic Analysis of Neural Activity
Biol Theory, 2006
Cited by 23 (1 self)
Understanding how neurons represent, process, and manipulate information is one of the main goals of neuroscience. These issues are fundamentally abstract, and information theory plays a key role in formalizing and addressing them. However, application of information theory to experimental data is fraught with many challenges. Meeting these challenges has led to a variety of innovative analytical techniques, with complementary domains of applicability, assumptions, and goals.
Tuning neocortical pyramidal neurons between integrators and coincidence detectors
J Comp Neurosci, 2003
Cited by 21 (0 self)
Do cortical neurons operate as integrators or as coincidence detectors? Despite the importance of this question, no definitive answer has yet been given, because each of the two views can find its own experimental support. Here we investigated this question using models of morphologically reconstructed neocortical pyramidal neurons under in vivo-like conditions. In agreement with experiments, we find that the cell is capable of operating in a continuum between coincidence detection and temporal integration, depending on the characteristics of the synaptic inputs. Moreover, the presence of synaptic background activity at a level comparable to intracellular measurements in vivo can modulate the operating mode of the cell and act as a switch between temporal integration and coincidence detection. These results suggest that background activity can be viewed as an important determinant of the integrative mode of pyramidal neurons. Thus, background activity not only sharpens cortical responses but can also be used to tune an entire network between integration and coincidence detection modes.
Keywords: cerebral cortex, synaptic background, computational model, operating mode
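The integration-versus-coincidence continuum can be reproduced with a toy leaky integrate-and-fire model (all parameters here are hypothetical round numbers, not the morphologically reconstructed models the paper uses): a short membrane time constant makes only near-coincident inputs effective, while a long one lets dispersed inputs summate.

```python
import numpy as np

def lif_spikes(input_times, tau, v_th=1.0, w=0.35, dt=1e-4, t_end=0.2):
    """Count spikes of a leaky integrate-and-fire cell driven by
    delta-pulse synaptic inputs of weight w (toy parameter values)."""
    pulses = np.zeros(int(t_end / dt))
    for t in input_times:
        pulses[round(t / dt)] += w
    v, spikes = 0.0, 0
    for drive in pulses:
        v += -v * (dt / tau) + drive   # leak, then synaptic input
        if v >= v_th:
            spikes += 1
            v = 0.0                    # reset after a spike
    return spikes

sync = [0.050, 0.0505, 0.051, 0.0515]   # near-coincident inputs
spread = [0.020, 0.060, 0.100, 0.140]   # temporally dispersed inputs
# Short membrane time constant: the cell acts as a coincidence detector,
# firing for the synchronous volley but not for the dispersed one.
print(lif_spikes(sync, tau=0.005), lif_spikes(spread, tau=0.005))
# Long membrane time constant: dispersed inputs also summate (integrator).
print(lif_spikes(sync, tau=0.200), lif_spikes(spread, tau=0.200))
```

In the paper the switch between these modes is driven by synaptic background activity rather than by changing the time constant directly, but the underlying trade-off is the same.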
Information-geometric measure for neural spikes
Neural Computation, 2002
Cited by 14 (5 self)
The present study introduces information-geometric measures to analyze neural firing patterns by taking not only second-order but also higher-order interactions among neurons into account. Information geometry provides useful tools and concepts for this purpose, including the orthogonality of coordinate parameters and the Pythagoras relation in the Kullback-Leibler divergence. Based on this orthogonality, we show a novel method to analyze spike firing patterns by decomposing the interactions of neurons of various orders. As a result, purely pairwise, triple-wise, and higher-order interactions are singled out. We also demonstrate the benefits of our proposal by using real neural data, recorded in the prefrontal and parietal cortices of monkeys.
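For two binary neurons, the log-linear decomposition underlying this approach reduces to a single pairwise interaction parameter. A minimal sketch with toy joint distributions (the paper treats general orders and larger populations):

```python
import numpy as np

def pairwise_theta(p):
    """Log-linear interaction parameter theta_12 for two binary neurons,
    from the 2x2 joint distribution p[x1, x2]:
    theta_12 = log(p11 * p00 / (p10 * p01)); 0 iff the neurons are independent."""
    return float(np.log(p[1, 1] * p[0, 0] / (p[1, 0] * p[0, 1])))

# Independent neurons: the interaction term vanishes.
p_ind = np.outer([0.7, 0.3], [0.6, 0.4])
th_ind = pairwise_theta(p_ind)

# Excess synchrony (joint silence and joint firing over-represented):
p_syn = np.array([[0.5, 0.1],
                  [0.1, 0.3]])
th_syn = pairwise_theta(p_syn)
print(th_ind, th_syn)   # ~0 vs log(0.15 / 0.01) = log 15 > 0
```

Because theta_12 is orthogonal to the firing-rate coordinates, it isolates pure pairwise correlation from rate changes, which is the property the decomposition exploits.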
Optimal Stimulus Coding by Neural Populations using Rate Codes
J Comput Neurosci, 2002
Cited by 10 (3 self)
We create a framework based on Fisher information for determining the most effective population coding scheme for representing a continuous-valued stimulus attribute over its entire range. Using this framework, we derive optimal single- and multi-neuron rate codes for homogeneous populations using several statistical models frequently used to describe neural data. We show that each neuron's discharge rate should increase quadratically with the stimulus and that statistically independent neural outputs provide optimal coding. Only cooperative populations can achieve this condition in an informationally effective way.
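The Fisher-information criterion behind this framework can be written in one line for a Poisson neuron observed over a window T (a standard result, not the paper's full derivation), and it shows why a quadratic tuning curve is attractive: f(s) = a*s**2 yields Fisher information 4*a*T, constant across the stimulus range.

```python
import numpy as np

def fisher_info_poisson(f, fprime, s, T=1.0):
    """Fisher information of an independent Poisson neuron with tuning
    curve f(s) observed for time T: I(s) = T * f'(s)**2 / f(s)."""
    return T * fprime(s) ** 2 / f(s)

a, T = 10.0, 0.5                       # hypothetical gain and time window
s = np.linspace(0.1, 1.0, 5)           # stimulus values (avoid f(0) = 0)
I = fisher_info_poisson(lambda u: a * u**2, lambda u: 2 * a * u, s, T)
print(I)    # constant 4*a*T = 20.0 at every stimulus value
```

A constant I(s) means the Cramer-Rao bound on estimation error is uniform over the attribute's range, which is the "entire range" optimality the abstract describes.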