Results 1 - 10 of 32
Maximum likelihood estimation of a stochastic integrate-and-fire neural encoding model
, 2004
"... We examine a cascade encoding model for neural response in which a linear filtering stage is followed by a noisy, leaky, integrate-and-fire spike generation mechanism. This model provides a biophysically more realistic alternative to models based on Poisson (memoryless) spike generation, and can eff ..."
Abstract
-
Cited by 83 (24 self)
- Add to MetaCart
(Show Context)
We examine a cascade encoding model for neural response in which a linear filtering stage is followed by a noisy, leaky, integrate-and-fire spike generation mechanism. This model provides a biophysically more realistic alternative to models based on Poisson (memoryless) spike generation, and can effectively reproduce a variety of spiking behaviors seen in vivo. We describe the maximum likelihood estimator for the model parameters, given only extracellular spike train responses (not intracellular voltage data). Specifically, we prove that the log likelihood function is concave and thus has an essentially unique global maximum that can be found using gradient ascent techniques. We develop an efficient algorithm for computing the maximum likelihood solution, demonstrate the effectiveness of the resulting estimator with numerical simulations, and discuss a method of testing the model’s validity using time-rescaling and density evolution techniques.
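As a rough illustration of the generative side of this cascade model, the sketch below simulates a linear filtering stage feeding a noisy, leaky integrate-and-fire unit. Parameter names and values (k, tau, sigma, v_thresh, v_reset) are illustrative placeholders rather than the paper's, and the estimation step itself (gradient ascent on the concave log-likelihood) is not shown.

```python
import numpy as np

def simulate_lif_cascade(stim, k, tau=20.0, sigma=0.5,
                         v_reset=0.0, v_thresh=1.0, dt=1.0, seed=0):
    """Linear filter -> noisy leaky integrate-and-fire cascade (sketch).

    stim : 1-D stimulus array; k : linear stimulus filter.
    Returns spike times in units of dt.
    """
    rng = np.random.default_rng(seed)
    drive = np.convolve(stim, k)[:len(stim)]      # linear filtering stage
    v, spikes = v_reset, []
    for t, i_t in enumerate(drive):
        # Euler step of the leaky integrator with additive Gaussian noise
        v += (-v / tau + i_t) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        if v >= v_thresh:                         # threshold crossing -> spike
            spikes.append(t * dt)
            v = v_reset                           # reset after each spike
    return np.array(spikes)

# Example: white-noise stimulus passed through an exponential filter
stim = np.random.default_rng(1).standard_normal(5000)
k = 0.05 * np.exp(-np.arange(50) / 10.0)
print(len(simulate_lif_cascade(stim, k)), "spikes")
```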
A neural mass model for MEG/EEG: coupling and neuronal dynamics
- NeuroImage
, 2003
"... Although MEG/EEG signals are highly variable, systematic changes in distinct frequency bands are commonly encountered. These frequency-specific changes represent robust neural correlates of cognitive or perceptual processes (for example, alpha rhythms emerge on closing the eyes). However, their func ..."
Abstract
-
Cited by 81 (21 self)
- Add to MetaCart
Although MEG/EEG signals are highly variable, systematic changes in distinct frequency bands are commonly encountered. These frequency-specific changes represent robust neural correlates of cognitive or perceptual processes (for example, alpha rhythms emerge on closing the eyes). However, their functional significance remains a matter of debate. Some of the mechanisms that generate these signals are known at the cellular level and rest on a balance of excitatory and inhibitory interactions within and between populations of neurons. The kinetics of the ensuing population dynamics determine the frequency of oscillations. In this work we extended the classical nonlinear lumped-parameter model of alpha rhythms, initially developed by Lopes da Silva and colleagues [Kybernetik 15 (1974) 27], to generate more complex dynamics. We show that the whole spectrum of MEG/EEG signals can be reproduced within the oscillatory regime of this model by simply changing the population kinetics. We used the model to examine the influence of coupling strength and propagation delay on the rhythms generated by coupled cortical areas. The main findings were that (1) coupling induces phase-locked activity, with a phase shift of 0 or π when the coupling is bidirectional, and (2) both coupling and propagation delay are critical determinants of the MEG/EEG spectrum. In forthcoming articles, we will use this model to (1) estimate how neuronal interactions are expressed in MEG/EEG oscillations and establish the construct validity of various indices of nonlinear coupling, and (2) generate event-related transients to derive physiologically informed basis functions for statistical modelling of average evoked responses.
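For orientation, lumped-parameter models in this tradition typically describe each population's mean synaptic response with second-order kinetics driven by a sigmoidal firing-rate function; a generic form (not the paper's specific parameterisation) is

$$\ddot{v}(t) = \frac{H}{\tau}\,S\!\big(u(t)\big) - \frac{2}{\tau}\,\dot{v}(t) - \frac{1}{\tau^{2}}\,v(t),
\qquad
S(u) = \frac{2e_{0}}{1 + e^{\,r\,(u_{0}-u)}},$$

where $H$ and $\tau$ are the synaptic gain and rate constant that fix the population kinetics (and hence the dominant oscillation frequency), $u$ is the mean potential of the presynaptic population, and $S(\cdot)$ converts potential into firing rate. The symbols are generic placeholders taken from the neural-mass literature rather than from this article.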
Modelling event-related responses in the brain
- NeuroImage
, 2005
"... The aim of this work was to investigate the mechanisms that shape evoked electroencephalographic (EEG) and magneto-encephalographic (MEG) responses. We used a neuronally plausible model to characterise the dependency of response components on the models parameters. This generative model was a neural ..."
Abstract
-
Cited by 38 (9 self)
- Add to MetaCart
(Show Context)
The aim of this work was to investigate the mechanisms that shape evoked electroencephalographic (EEG) and magneto-encephalographic (MEG) responses. We used a neuronally plausible model to characterise the dependency of response components on the model’s parameters. This generative model was a neural mass model of hierarchically arranged areas using three kinds of inter-area connections (forward, backward and lateral). We investigated how responses, at each level of a cortical hierarchy, depended on the strength of connections or coupling. Our strategy was to systematically add connections and examine the responses of each successive architecture. We did this in the context of deterministic responses and then with stochastic spontaneous activity. Our aim was to show, in a simple way, how event-related dynamics depend on extrinsic connectivity. To emphasise the importance of nonlinear interactions, we tried to disambiguate the components of event-related potentials (ERPs) or event-related fields
Integrate-and-Fire Neurons Driven by Correlated Stochastic Input
, 2002
"... Neurons are sensitive to correlations among synaptic inputs. However, analytical models that explicitly include correlations are hard to solve analytically, so their influence on a neuron’s response has been difficult to ascertain. To gain some intuition on this problem, we studied the firing times ..."
Abstract
-
Cited by 26 (4 self)
- Add to MetaCart
Neurons are sensitive to correlations among synaptic inputs. However, models that explicitly include correlations are hard to solve analytically, so their influence on a neuron’s response has been difficult to ascertain. To gain some intuition on this problem, we studied the firing times of two simple integrate-and-fire model neurons driven by a correlated binary variable that represents the total input current. Analytic expressions were obtained for the average firing rate and coefficient of variation (a measure of spike-train variability) as functions of the mean, variance, and correlation time of the stochastic input. The results of computer simulations were in excellent agreement with these expressions. In these models, an increase in correlation time in general produces an increase in both the average firing rate and the variability of the output spike trains. However, the magnitude of the changes depends differentially on the relative values of the input mean and variance: the increase in firing rate is higher when the variance is large relative to the mean, whereas the increase in variability is higher when the variance is relatively small. In addition, the firing rate always tends to a finite limit value as the correlation time increases toward infinity, whereas the coefficient of variation typically diverges. These results suggest that temporal correlations may play a major role in determining the variability as well as the intensity of neuronal spike trains.
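A minimal sketch of the kind of simulation described here is given below: a leaky integrate-and-fire unit driven by a binary (telegraph) input with adjustable mean, amplitude and correlation time, from which the firing rate and the coefficient of variation of the interspike intervals are estimated. Names and default values are illustrative and not taken from the paper.

```python
import numpy as np

def rate_and_cv(mu=1.1, sigma=0.6, tau_c=5.0, tau_m=20.0,
                v_thresh=1.0, dt=0.1, n_steps=500_000, seed=0):
    """Leaky IF neuron driven by a binary (telegraph) input current.

    The input switches between mu - sigma and mu + sigma; switching at rate
    1/(2*tau_c) gives an autocorrelation decaying with time constant tau_c.
    Returns (mean firing rate, CV of the interspike intervals).
    """
    rng = np.random.default_rng(seed)
    p_switch = dt / (2.0 * tau_c)
    state, v, last_spike, isis = 1.0, 0.0, None, []
    for step in range(n_steps):
        if rng.random() < p_switch:
            state = -state                        # telegraph switching
        i_t = mu + sigma * state                  # binary input current
        v += (-v + i_t) * dt / tau_m              # leaky integration
        if v >= v_thresh:                         # spike and reset
            t = step * dt
            if last_spike is not None:
                isis.append(t - last_spike)
            last_spike, v = t, 0.0
    isis = np.asarray(isis)
    return 1.0 / isis.mean(), isis.std() / isis.mean()

print(rate_and_cv())        # vary tau_c to probe the effect of correlation time
```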
Dynamics of Neuronal Populations: Eigenfunction Theory, Part 1, . . .
- Network: Comput. Neural Syst
, 2003
"... A novel approach to cortical modeling was introduced by Knight et al. (1996). In their presentation cortical dynamics is formulated in terms of in- teracting populations of neurons, a perspective that is in part motivated by modern cortical imaging (For a review see Sirovich and Kaplan (2002)). The ..."
Abstract
-
Cited by 18 (3 self)
- Add to MetaCart
A novel approach to cortical modeling was introduced by Knight et al. (1996). In their presentation cortical dynamics is formulated in terms of interacting populations of neurons, a perspective that is in part motivated by modern cortical imaging (For a review see Sirovich and Kaplan (2002)). The approach
A Simple and Stable Numerical Solution for the Population Density Equation
, 2003
"... this article, I will consider only a gaussian distribution of membrane depolarizations of magnitude p.h/ p 2 .h N h/ 2 2 2 (2.13) or the nonstochastic limit 0, in which case h/; (2.14) where .x/ is the Dirac distribution. I will refer to ..."
Abstract
-
Cited by 17 (3 self)
- Add to MetaCart
this article, I will consider only a gaussian distribution of membrane depolarizations of magnitude $\bar{h}$,

$$p(h) = \frac{1}{\sigma\sqrt{2\pi}}\,\exp\!\left(-\frac{(h-\bar{h})^{2}}{2\sigma^{2}}\right) \qquad (2.13)$$

or the nonstochastic limit $\sigma \to 0$, in which case

$$p(h) = \delta(h-\bar{h}), \qquad (2.14)$$

where $\delta(x)$ is the Dirac distribution. I will refer to
Stochastic models of neuronal dynamics
, 2005
"... Cortical activity is the product of interactions among neuronal populations. Macroscopic electrophysiological phenomena are generated by these interactions. In principle, the mechanisms of these interactions afford constraints on biologically plausible models of electrophysiological responses. In ot ..."
Abstract
-
Cited by 16 (5 self)
- Add to MetaCart
(Show Context)
Cortical activity is the product of interactions among neuronal populations. Macroscopic electrophysiological phenomena are generated by these interactions. In principle, the mechanisms of these interactions afford constraints on biologically plausible models of electrophysiological responses. In other words, the macroscopic features of cortical activity can be modelled in terms of the microscopic behaviour of neurons. An evoked response potential (ERP) is the mean electrical potential measured from an electrode on the scalp, in response to some event. The purpose of this paper is to outline a population density approach to modelling ERPs. We propose a biologically plausible model of neuronal activity that enables the estimation of physiologically meaningful parameters from electrophysiological data. The model encompasses four basic characteristics of neuronal activity and organization: (i) neurons are dynamic units, (ii) driven by stochastic forces, (iii) organized into populations with similar biophysical properties and response characteristics and (iv) multiple populations interact to form functional networks. This leads to a formulation of population dynamics in terms of the Fokker–Planck equation. The solution of this equation is the temporal evolution of a probability density over state-space, representing the distribution of an ensemble of trajectories. Each trajectory corresponds to the changing state of a
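To fix ideas, the population density approach evolves a density $\rho$ over the neurons' state-space under drift and diffusion. For a single state variable $v$ with drift $f(v,t)$ and diffusion coefficient $D(v,t)$, the Fokker–Planck equation reads (the paper's state-space is richer, so this scalar form is only illustrative):

$$\frac{\partial \rho(v,t)}{\partial t} = -\frac{\partial}{\partial v}\Big[f(v,t)\,\rho(v,t)\Big] + \frac{1}{2}\,\frac{\partial^{2}}{\partial v^{2}}\Big[D(v,t)\,\rho(v,t)\Big].$$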
Dynamics of neural populations: Stability and synchrony
- Network: Comput. Neural Syst
, 2006
"... A population formulation of neuronal activity is employed to study an excitatory network of (spiking) neurons receiving external input as well as recurrent feedback. At relatively low levels of feedback, the network exhibits time stationary asynchronous behavior. A stability analysis of this time st ..."
Abstract
-
Cited by 13 (3 self)
- Add to MetaCart
(Show Context)
A population formulation of neuronal activity is employed to study an excitatory network of (spiking) neurons receiving external input as well as recurrent feedback. At relatively low levels of feedback, the network exhibits time stationary asynchronous behavior. A stability analysis of this time stationary state leads to an analytical criterion for the critical gain at which time asynchronous behavior becomes unstable. At instability the dynamics can undergo a supercritical Hopf bifurcation and the population passes to a synchronous state. Under different conditions it can pass to synchrony through a subcritical Hopf bifurcation. And at high gain a network can reach a runaway state, in finite time, after which the network no longer supports bounded solutions. The introduction of time delayed feedback leads to a rich range of phenomena. For example, for a given external input, increasing gain produces transition from asynchrony, to synchrony, to asynchrony and finally can lead to divergence. Time delay is also shown to strongly mollify the amplitude of synchronous oscillations. Perhaps, of general importance, is the result that synchronous behavior can exist only for a narrow range of time delays, which range is an order of magnitude smaller than periods
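A rate-model caricature of the feedback loop studied here (not the population formulation the paper actually uses) makes the roles of gain and delay explicit:

$$\tau\,\dot{r}(t) = -r(t) + \phi\!\big(I_{\mathrm{ext}} + g\,r(t-d)\big),$$

where $g$ is the feedback gain and $d$ the propagation delay. Linearising about the time-stationary (asynchronous) state and asking when a pair of eigenvalues crosses the imaginary axis yields a critical gain beyond which oscillatory (synchronous) solutions appear via a Hopf bifurcation.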
Kinetic theory for neuronal network dynamics
- Communications in Mathematical Sciences
, 2006
"... Abstract. We present a detailed theoretical framework for statistical descriptions of neuronal networks and derive (1+1)-dimensional kinetic equations, without introducing any new parameters, directly from conductance-based integrate-and-fire neuronal networks. We describe the details of derivation ..."
Abstract
-
Cited by 11 (1 self)
- Add to MetaCart
(Show Context)
Abstract. We present a detailed theoretical framework for statistical descriptions of neuronal networks and derive (1+1)-dimensional kinetic equations, without introducing any new parameters, directly from conductance-based integrate-and-fire neuronal networks. We describe the details of derivation of our kinetic equation, proceeding from the simplest case of one excitatory neuron, to coupled networks of purely excitatory neurons, to coupled networks consisting of both excitatory and inhibitory neurons. The dimension reduction in our theory is achieved via novel moment closures. We also describe the limiting forms of our kinetic theory in various limits, such as the limit of mean-driven dynamics and the limit of infinitely fast conductances. We establish accuracy of our kinetic theory by comparing its prediction with the full simulations of the original point-neuron networks. We emphasize that our kinetic theory is dynamically accurate, i.e., it captures very well the instantaneous statistical properties of neuronal networks under time-inhomogeneous inputs.
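The starting point of such a derivation is the single conductance-based integrate-and-fire neuron driven by Poisson input. A minimal sketch of that point-neuron dynamics is below; parameter values are illustrative (not the paper's), with voltages normalised so that rest is 0 and threshold is 1.

```python
import numpy as np

def conductance_if(rate_hz=400.0, g_jump=0.02, tau_g=5.0, tau_m=20.0,
                   v_rest=0.0, v_exc=14.0 / 3.0, v_thresh=1.0,
                   dt=0.1, t_max_ms=2000.0, seed=0):
    """Conductance-based IF neuron with Poisson synaptic input (sketch).

    dv/dt = -(v - v_rest)/tau_m - g*(v - v_exc),   dg/dt = -g/tau_g,
    with g jumping by g_jump at each Poisson input spike.
    """
    rng = np.random.default_rng(seed)
    p_in = rate_hz * dt / 1000.0              # input-spike probability per step
    v, g, out_spikes = v_rest, 0.0, []
    for step in range(int(t_max_ms / dt)):
        if rng.random() < p_in:
            g += g_jump                       # instantaneous conductance jump
        g -= g / tau_g * dt                   # exponential conductance decay
        v += (-(v - v_rest) / tau_m - g * (v - v_exc)) * dt
        if v >= v_thresh:                     # threshold crossing -> spike
            out_spikes.append(step * dt)
            v = v_rest                        # reset
    return np.array(out_spikes)

print(len(conductance_if()), "output spikes in 2 s")
```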
The dynamic field theory and embodied cognitive dynamics
- In
, 2009
"... The goal of this chapter is to explain some of the core concepts of Dynamic Field Theory (DFT) and how this theory provides a formal framework for thinking about embodied cognitive dynamics. The DFT is now 15 years old. In 1993, Gregor Schöner and his colleagues published a proceedings paper present ..."
Abstract
-
Cited by 7 (0 self)
- Add to MetaCart
(Show Context)
The goal of this chapter is to explain some of the core concepts of Dynamic Field Theory (DFT) and how this theory provides a formal framework for thinking about embodied cognitive dynamics. The DFT is now 15 years old. In 1993, Gregor Schöner and his colleagues published a proceedings paper presenting a theory of how eye movements are planned, including their neural bases (Kopecz, Engels, & Schöner, 1993). Since that time, DFT has been extended to a range of topics including the planning of reaching movements (Bastian, Riehle, Erlhagen, & Schöner,
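Although this excerpt does not spell it out, dynamic fields in this framework are usually written as activation functions $u(x,t)$ over a metric feature dimension $x$, evolving under an Amari-type integro-differential equation of roughly this form (generic, not chapter-specific):

$$\tau\,\dot{u}(x,t) = -u(x,t) + h + S(x,t) + \int w(x-x')\,f\big(u(x',t)\big)\,dx',$$

with resting level $h$, external input $S$, a lateral interaction kernel $w$ (local excitation, surround inhibition) and a sigmoidal output nonlinearity $f$.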