Results 1–10 of 42
Reduction of the Hodgkin-Huxley Equations to a Single-Variable Threshold Model
 Neural Computation
, 1997
Abstract

Cited by 67 (22 self)
It is generally believed that a neuron is a threshold element which fires when some variable u reaches a threshold. Here we pursue the question of whether this picture can be justified and study the four-dimensional neuron model of Hodgkin and Huxley as a concrete example. The model is approximated by a response kernel expansion in terms of a single variable, the membrane voltage. The first-order term is linear in the input and has the typical form of an elementary postsynaptic potential. Higher-order kernels take care of nonlinear interactions between input spikes. In contrast to the standard Volterra expansion, the kernels depend on the firing time of the most recent output spike. In particular, a zero-order kernel which describes the shape of the spike and the typical afterpotential is included. Our model neuron fires if the membrane voltage, given by the truncated response kernel expansion, crosses a threshold. The threshold model is tested on a spike train generated by t...
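The threshold picture described here can be sketched in a few lines of Python. The exponential after-potential and alpha-shaped postsynaptic kernel below are illustrative stand-ins, not the kernels fitted from the Hodgkin-Huxley model; all constants are hypothetical:

```python
import math

def eta(s):
    """Zero-order kernel: hyperpolarizing after-potential s ms after the last spike."""
    return -5.0 * math.exp(-s / 4.0) if s >= 0 else 0.0

def eps(s):
    """First-order kernel: an elementary postsynaptic potential (alpha shape, peak 1 at s = 2)."""
    return (s / 2.0) * math.exp(1.0 - s / 2.0) if s >= 0 else 0.0

def simulate(input_spikes, t_end, dt=0.1, threshold=1.0):
    """Fire whenever the truncated kernel expansion crosses the threshold."""
    t, last_spike, out = 0.0, -1e9, []
    while t < t_end:
        u = eta(t - last_spike) + sum(eps(t - tf) for tf in input_spikes if tf <= t)
        if u >= threshold and t - last_spike > 2.0:  # crude refractory guard
            out.append(t)
            last_spike = t
        t += dt
    return out

# Three input spikes arriving close together push u over threshold once;
# the after-potential then keeps the neuron below threshold.
spikes = simulate([5.0, 6.0, 7.0], t_end=30.0)
```

A single input here cannot reach threshold on its own (the kernel peaks at exactly 1.0); only the nonlinear-free superposition of near-coincident inputs does, which is the part the higher-order kernels of the paper would correct.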
On deformation of Poisson manifolds of hydrodynamic type
 Comm. Math. Phys.
Abstract

Cited by 15 (1 self)
We study a class of deformations of infinite-dimensional Poisson manifolds of hydrodynamic type which are of interest in the theory of Frobenius manifolds. We prove two results. First, we show that the second cohomology group of these manifolds, in the Poisson-Lichnerowicz cohomology, is “essentially” trivial. Then, we prove a conjecture of B. Dubrovin about the triviality of homogeneous formal deformations of the above manifolds.
1 Dubrovin’s conjecture
In this paper we solve a problem proposed by B. Dubrovin in the framework of the theory of Frobenius manifolds [2]. It concerns the deformations of Poisson tensors of hydrodynamic type. The challenge is to show that a large class of these deformations is trivial. In an epitomized form the problem can be stated as follows. Let M be a Poisson manifold endowed with a Poisson bivector P0 fulfilling the Jacobi condition [P0, P0] = 0 with respect to the Schouten bracket on the algebra of multivector fields on M. A deformation of P0 is a formal series Pǫ = P0 + ǫP1 + ǫ²P2 + · · · in the space of bivector fields on M satisfying the Jacobi condition
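For orientation, expanding the Jacobi condition of such a formal deformation order by order in ǫ (a standard computation, not specific to this paper) makes the constraints on the coefficients explicit:

```latex
0 = [P_\epsilon, P_\epsilon]
  = \sum_{k \ge 0} \epsilon^k \sum_{i+j=k} [P_i, P_j]
\quad\Longrightarrow\quad
[P_0, P_0] = 0,\;\; 2[P_0, P_1] = 0,\;\; 2[P_0, P_2] + [P_1, P_1] = 0,\;\dots
```

Triviality of the deformation then means, roughly, that Pǫ is obtained from P0 by pushing forward along a formal change of coordinates, so that each constraint is solved without introducing genuinely new structure.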
Artificial Neural Networks With Adaptive Polynomial Activation Function
, 1992
Abstract

Cited by 12 (9 self)
The aim of this work is to study an extended multilayer perceptron made of neurons with an adaptive polynomial activation function. The adaptive polynomial neural network (APNN) gives a reduction in dimensions and computational complexity, both in the learning and forward phases, compared with traditional MLPs with a sigmoidal activation function. Extensive experiments have been carried out on both pattern recognition and data processing problems. The relationship of the APNNs with the Volterra expansion is also discussed.
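The forward pass of one such neuron can be sketched as follows; the polynomial degree, coefficient values, and weights are hypothetical, and in the APNN the coefficients would be adapted by gradient descent alongside the weights:

```python
def apnn_neuron(inputs, weights, bias, coeffs):
    """Weighted sum followed by an adaptive polynomial activation.

    coeffs = [c0, c1, c2, ...] defines the activation c0 + c1*s + c2*s^2 + ...;
    unlike a fixed sigmoid, these coefficients are trainable parameters.
    """
    s = bias + sum(w * x for w, x in zip(weights, inputs))
    return sum(c * s ** k for k, c in enumerate(coeffs))

# Toy example: degree-2 activation 0 + s + 0.3*s^2 on a two-input neuron.
y = apnn_neuron([1.0, 2.0], [0.5, -0.25], 0.1, [0.0, 1.0, 0.3])
```

Because the activation is polynomial, composing such neurons yields a polynomial input-output map, which is what links the APNN to the Volterra expansion mentioned in the abstract.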
A unifying view of Wiener and Volterra theory and polynomial kernel regression
 Neural Computation
, 2006
Abstract

Cited by 10 (4 self)
Volterra and Wiener series are perhaps the best understood nonlinear system representations in signal processing. Although both approaches have enjoyed a certain popularity in the past, their application has been limited to rather low-dimensional and weakly nonlinear systems due to the exponential growth of the number of terms that have to be estimated. We show that Volterra and Wiener series can be represented implicitly as elements of a reproducing kernel Hilbert space by utilizing polynomial kernels. The estimation complexity of the implicit representation is linear in the input dimensionality and independent of the degree of nonlinearity. Experiments show performance advantages in terms of convergence, interpretability, and system sizes that can be handled.
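The implicit-representation point can be illustrated directly: a degree-2 polynomial kernel computes, in time linear in the input dimension, the same inner product as an explicit monomial feature map with quadratically many terms. A minimal sketch for d = 2 (toy vectors; the feature map is the standard one for this kernel, not taken from the paper):

```python
import math

def poly_kernel(x, y, degree=2):
    """Inhomogeneous polynomial kernel (1 + <x, y>)^degree, O(d) work."""
    return (1.0 + sum(a * b for a, b in zip(x, y))) ** degree

def phi(x):
    """Explicit degree-2 monomial feature map matching poly_kernel for d = 2."""
    x1, x2 = x
    r2 = math.sqrt(2.0)
    return [1.0, r2 * x1, r2 * x2, x1 * x1, x2 * x2, r2 * x1 * x2]

x, y = [0.5, -1.0], [2.0, 0.25]
implicit = poly_kernel(x, y)                           # linear in d
explicit = sum(a * b for a, b in zip(phi(x), phi(y)))  # O(d^2) features
assert abs(implicit - explicit) < 1e-12
```

For degree p and dimension d the explicit map has on the order of d^p monomials, which is exactly the exponential growth the kernel trick sidesteps.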
Time Series Forecasting using Wavelets with Predictor-Corrector Boundary Treatment
 in 7th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
, 2001
Abstract

Cited by 5 (1 self)
... a new method, the predictor-corrector technique. Rather than using an ad hoc boundary condition, such as a periodic or reflective boundary condition, we use the predicted values from previous steps for the wavelet transform and then iterate. Preliminary experiments have been carried out on synthetic and real datasets. Promising results have been obtained.
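A heavily simplified sketch of the boundary idea, assuming a one-level Haar transform and a naive linear extrapolation standing in for the predictor (the paper's forecaster and its correction iteration are more elaborate; only the replacement of an ad hoc extension by predicted values is shown):

```python
import math

def haar_level(signal):
    """One level of the Haar wavelet transform: (approximations, details)."""
    a = [(signal[i] + signal[i + 1]) / math.sqrt(2)
         for i in range(0, len(signal) - 1, 2)]
    d = [(signal[i] - signal[i + 1]) / math.sqrt(2)
         for i in range(0, len(signal) - 1, 2)]
    return a, d

def predictor_extension(signal, n_pred=1):
    """Extend the right boundary with predicted (here linearly extrapolated) values."""
    slope = signal[-1] - signal[-2]
    return signal + [signal[-1] + slope * (k + 1) for k in range(n_pred)]

# Odd-length series: the transform at the right edge needs a sample beyond
# the observed data, so we supply a predicted one instead of wrapping or
# reflecting the signal.
series = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
extended = predictor_extension(series)
approx, detail = haar_level(extended)
```

On this linear trend the extrapolated boundary leaves all detail coefficients uniform; a periodic wrap would instead inject a spurious jump at the boundary, which is the artifact the predictor-corrector treatment is designed to avoid.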
Generalized Tomonaga-Schwinger equation from the Hadamard formula
 Phys. Rev. D
Abstract

Cited by 5 (0 self)
A generalized Tomonaga-Schwinger equation, holding on the entire boundary of a finite spacetime region, has recently been considered as a tool for studying particle scattering amplitudes in background-independent quantum field theory. The equation has been derived using lattice techniques under assumptions on the existence of the continuum limit. Here I show that in the context of continuous Euclidean field theory the equation can be derived directly from the functional integral formalism, using a technique based on Hadamard’s formula for the variation of the propagator.
Instantaneous Characterization Of Time-Varying Nonlinear Systems
 IEEE Transactions on Biomedical Engineering
, 1992
Abstract

Cited by 4 (4 self)
A nonlinear system may be characterized by an orthogonal functional power series (FPS) computed from cross-correlations between input and output variables. "Is the response changing over the course of the experiment?" is a fundamental question encountered in the analysis of both FPS and evoked potentials (EPs). Regression on closed-form functions of time produces a time-varying FPS or EP. Evaluation of these functions at a specified time point produces a system characterization for that instant.
INTRODUCTION
Analysis of neurophysiological responses evoked by uniform trains of sensory or shock stimuli is useful both for clinical purposes [3,4,9] and for the elucidation of fundamental processes [5,13]. However, uniform train stimulation provides only limited information; it rarely provides a thorough characterization of the system. This is because biological systems are, in general, nonlinear, i.e., the response to a stimulus depends in a complex way on the current state of t...
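The regression idea can be sketched with a toy example: a hypothetical per-stimulus response amplitude is regressed on a closed-form function of time (here simply a straight line, fitted by ordinary least squares in closed form), and the fit is evaluated at a chosen instant. All data values are illustrative:

```python
times = [0.0, 1.0, 2.0, 3.0, 4.0]         # stimulus times (arbitrary units)
amps  = [1.00, 0.92, 0.81, 0.74, 0.63]    # drifting response amplitudes (toy)

# Closed-form ordinary least squares for amplitude ~ intercept + slope * time.
n = len(times)
mt = sum(times) / n
ma = sum(amps) / n
slope = (sum((t - mt) * (a - ma) for t, a in zip(times, amps))
         / sum((t - mt) ** 2 for t in times))
intercept = ma - slope * mt

def amplitude_at(t):
    """Instantaneous characterization: the fitted amplitude at time t."""
    return intercept + slope * t
```

Evaluating `amplitude_at` at different instants yields the characterization for each instant; in the paper the same regression is applied to every kernel or EP coefficient, not to a single scalar amplitude.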
Nonlinear dynamic modeling of spike train transformations for hippocampal-cortical prostheses
 IEEE Transactions on Biomedical Engineering
, 2007
Abstract

Cited by 4 (0 self)
Abstract—One of the fundamental principles of cortical brain regions, including the hippocampus, is that information is represented in the ensemble firing of populations of neurons, i.e., spatiotemporal patterns of electrophysiological activity. The hippocampus has long been known to be responsible for the formation of declarative, or fact-based, memories. Damage to the hippocampus disrupts the propagation of spatiotemporal patterns of activity through hippocampal internal circuitry, resulting in a severe anterograde amnesia. Developing a neural prosthesis for the damaged hippocampus requires restoring this multiple-input, multiple-output transformation of spatiotemporal patterns of activity. Because the mechanisms underlying synaptic transmission and generation of electrical activity in neurons are inherently nonlinear, any such prosthesis must be based on a nonlinear multiple-input, multiple-output model. In ...
The analysis of nonlinear synaptic transmission
 J. Gen. Physiol.
, 1977
Abstract

Cited by 3 (0 self)
synapse in the lobster cardiac ganglion, a new nonlinear systems analysis technique for discrete-input systems was developed and applied. From the output of the postsynaptic cell in response to randomly occurring presynaptic nerve impulses, a set of kernels, analogous to Wiener kernels, was computed. The kernels up to third order served to characterize, with reasonable accuracy, the input-output properties of the synapse. A mathematical model of the synapse was also tested with a random impulse train and model predictions were compared with experimental synaptic output. Although the model proved to be even more accurate overall than the kernel characterization, there were slight but consistent errors in the model's performance. These were also reflected as differences between model and experimental kernels. It is concluded that a random train analysis provides a comprehensive and objective comparison between model and experiment and automatically provides an arbitrarily accurate characterization of a system's input-output behavior, even in complicated cases where other approaches are impractical.
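The random-train idea can be sketched for a toy discrete-input system; the impulse rate and the single-impulse response below are hypothetical, and the estimator shown is only the first-order step of such an analysis (the paper's kernels extend to third order):

```python
import random

random.seed(0)
T, p = 20000, 0.02                 # time steps, impulse probability per step
h = [1.0, 0.6, 0.36, 0.216]        # true single-impulse response (toy, geometric decay)

# Simulate a linear discrete-input system driven by a random impulse train.
impulses = [1 if random.random() < p else 0 for _ in range(T)]
output = [sum(h[k] * impulses[t - k] for k in range(len(h)) if t - k >= 0)
          for t in range(T)]

# First-order kernel estimate: mean output at lag tau after an impulse.
# Subtracting the overall mean removes the baseline driven by the other
# impulses, leaving only a small residual bias of order the impulse rate.
mean_out = sum(output) / T
k1 = []
for tau in range(len(h)):
    vals = [output[t + tau] for t in range(T - tau) if impulses[t]]
    k1.append(sum(vals) / len(vals) - mean_out)
```

For a linear system the estimate recovers h up to sampling noise; applied to a real synapse, the same averaging (plus its higher-order analogues over impulse pairs and triples) yields the kernel characterization the abstract describes.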