Results 11–20 of 540
Experimental Issues in Coherent Quantum-State Manipulation of Trapped Atomic Ions
, 1998
Abstract

Cited by 41 (9 self)
In this paper, we investigate a subset of these topics which involve the coherent manipulation of quantum states of trapped atomic ions. The focus will be on a proposal to implement quantum logic and quantum computation using trapped ions [1]. However, we will also consider related work on the generation of nonclassical states of motion and entangled states of trapped ions [2–39]. Many of these ideas have been summarized in a recent review [40].
Statistical mechanics of multiple scales of neocortical interactions
 in Neocortical Dynamics and Human EEG Rhythms (edited by P.L. Nunez)
, 1995
Abstract

Cited by 36 (18 self)
14. Statistical mechanics of multiple scales of neocortical interactions
Learning Processes in Neural Networks
, 1991
Abstract

Cited by 33 (7 self)
We study the learning dynamics of neural networks from a general point of view. The environment from which the network learns is defined as a set of input stimuli. At discrete points in time, one of these stimuli is presented and an incremental learning step takes place. If the time between learning steps is drawn from a Poisson distribution, the dynamics of an ensemble of learning processes is described by a continuous-time master equation. A learning algorithm that enables a neural network to adapt to a changing environment must have a nonzero learning parameter. This constant adaptability, however, comes at the cost of fluctuations in the plasticities, such as synapses and thresholds. The ensemble description allows us to study the asymptotic behavior of the plasticities for a large class of neural networks. For small learning parameters we derive an expression for the size of the fluctuations in an unchanging environment. In a changing environment, there is a tradeoff between adap...
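As a rough illustration of the setting this abstract describes — incremental learning steps at Poisson-distributed times, studied over an ensemble of learners — here is a minimal sketch. The one-dimensional weight, the two-stimulus environment, and all parameter values are illustrative assumptions, not taken from the paper.

```python
import random

def poisson_learning(stimuli, eta, rate, t_end, seed=0):
    """One learner: at Poisson-distributed times (rate `rate`) a stimulus is
    drawn at random and an incremental step of size eta is taken toward it.
    The 1-D weight and quadratic error surface are illustrative only."""
    rng = random.Random(seed)
    w, t = 0.0, 0.0
    while True:
        t += rng.expovariate(rate)      # exponential gaps => Poisson step times
        if t > t_end:
            return w
        target = rng.choice(stimuli)    # environment = a set of input stimuli
        w -= eta * (w - target)         # incremental learning step

# An ensemble of independent learning processes: each weight fluctuates around
# the stimulus mean; a nonzero eta keeps the learner adaptable but noisy.
ensemble = [poisson_learning([0.9, 1.1], eta=0.1, rate=5.0, t_end=50.0, seed=s)
            for s in range(200)]
mean_w = sum(ensemble) / len(ensemble)
```

With a nonzero learning parameter the individual weights never settle, which is the adaptability/fluctuation tradeoff the abstract refers to.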
Statistical Mechanical Aids to Calculating Term Structure Models
 Phys. Rev. A
, 1990
Abstract

Cited by 33 (30 self)
This paper describes application of the very fast simulated reannealing and path-integral methodologies to the estimation of the Brennan and Schwartz two-factor term-structure (time-dependent) model of bond prices. It is shown that these methodologies can be utilized to estimate more complicated n-factor nonlinear models. Applications to other systems are stressed.
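Very fast simulated reannealing (VFSR) is known for its exponential cooling schedule T_k = T0 · exp(−c · k^(1/D)) in D parameter dimensions. The sketch below uses that schedule inside an otherwise generic simulated-annealing loop; the Gaussian proposal is a simplification of VFSR's actual generating distribution, and the quadratic test cost is a stand-in for a calibration objective, not the Brennan–Schwartz model.

```python
import math
import random

def vfsr_style_anneal(cost, x0, lo, hi, n_iter=5000, T0=1.0, c=1.0, seed=0):
    """Simulated annealing with the VFSR-style cooling schedule
    T_k = T0 * exp(-c * k**(1/D)).  Proposal and cost are simplified
    placeholders, not the method or model used in the paper."""
    rng = random.Random(seed)
    D = len(x0)
    x, fx = list(x0), cost(x0)
    best, fbest = list(x), fx
    for k in range(1, n_iter + 1):
        T = T0 * math.exp(-c * k ** (1.0 / D))
        step = max(T, 0.05)                 # floor keeps late-stage moves alive
        y = [min(hi, max(lo, xi + rng.gauss(0.0, step))) for xi in x]
        fy = cost(y)
        # Metropolis acceptance at temperature T
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / max(T, 1e-12)):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = list(x), fx
    return best, fbest

# Hypothetical two-parameter calibration target with minimum at (1.5, -0.5):
cost = lambda p: (p[0] - 1.5) ** 2 + (p[1] + 0.5) ** 2
params, err = vfsr_style_anneal(cost, [0.0, 0.0], lo=-5.0, hi=5.0)
```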
On-Line Learning Processes in Artificial Neural Networks
, 1993
Abstract

Cited by 31 (4 self)
We study on-line learning processes in artificial neural networks from a general point of view. On-line learning means that a learning step takes place at each presentation of a randomly drawn training pattern. It can be viewed as a stochastic process governed by a continuous-time master equation. On-line learning is necessary if not all training patterns are available all the time. This occurs in many applications when the training patterns are drawn from a time-dependent environmental distribution. Studying learning in a changing environment, we encounter a conflict between the adaptability and the confidence of the network's representation. Minimization of a criterion incorporating both effects yields an algorithm for on-line adaptation of the learning parameter. The inherent noise of on-line learning makes it possible to escape from undesired local minima of the error potential on which the learning rule performs (stochastic) gradient descent. We try to quantify these often made cl...
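The claim that on-line learning noise lets the process escape undesired local minima can be illustrated with Langevin-style noisy gradient descent on a tilted double-well potential. The potential, starting point, and noise level below are illustrative assumptions, not the paper's model.

```python
import math
import random

def noisy_descent(eta=0.01, temp=0.1, n_steps=20_000, seed=0):
    """Noisy gradient descent on f(w) = w**4/4 - w**2/2 - 0.3*w, started in
    the shallow local minimum near w = -0.8.  The noise term stands in for
    the inherent stochasticity of on-line learning; all values illustrative."""
    rng = random.Random(seed)
    w = -0.8
    amp = math.sqrt(2.0 * eta * temp)       # Langevin noise amplitude
    for _ in range(n_steps):
        grad = w ** 3 - w - 0.3             # gradient of the double well
        w += -eta * grad + amp * rng.gauss(0.0, 1.0)
    return w

# Noise lets most runs cross the barrier and settle near the deeper minimum
# at w = +1.1; noiseless gradient descent would stay in the shallow well.
finals = [noisy_descent(seed=s) for s in range(10)]
```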
Statistical Mechanics of Dissipative Particle Dynamics
, 1995
Abstract

Cited by 30 (1 self)
The stochastic differential equations corresponding to the updating algorithm of Dissipative Particle Dynamics (DPD), and the corresponding Fokker-Planck equation, are derived. It is shown that a slight modification to the algorithm is required before the Gibbs distribution is recovered as the stationary solution to the Fokker-Planck equation. The temperature of the system is then directly related to the noise amplitude by means of a fluctuation-dissipation theorem. However, the correspondingly modified, discrete DPD algorithm is only found to obey these predictions if the length of the timestep is sufficiently reduced. This indicates the importance of time discretisation in DPD. Recently, Hoogerbrugge and Koelman have introduced a new method for simulating hydrodynamic behavior which has been coined Dissipative Particle Dynamics (DPD) [1], [2]. This technique was conceived as an improvement over conventional molecular dynamics (MD) in order to describe complex hydrodynamic behavior with c...
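The dissipative and random parts of a DPD pair force, with the standard weight-function relation w_D(r) = w_R(r)^2 and the fluctuation-dissipation relation σ² = 2γk_BT, can be sketched as follows. The specific weight function and parameter values are common textbook choices, assumed here for illustration.

```python
import math
import random

def dpd_pair_force(ri, rj, vi, vj, gamma, sigma, dt, rc=1.0, rng=random):
    """Dissipative + random parts of a DPD pair force (conservative part
    omitted).  Weights satisfy w_D(r) = w_R(r)**2 and the amplitudes obey
    sigma**2 = 2 * gamma * kBT.  Names and values are illustrative."""
    rij = [a - b for a, b in zip(ri, rj)]
    r = math.sqrt(sum(c * c for c in rij))
    if r >= rc or r == 0.0:
        return [0.0, 0.0, 0.0]
    e = [c / r for c in rij]                    # unit vector from j to i
    w_r = 1.0 - r / rc                          # common choice of w_R
    w_d = w_r * w_r                             # w_D = w_R**2
    v_e = sum((a - b) * c for a, b, c in zip(vi, vj, e))
    theta = rng.gauss(0.0, 1.0)
    f_d = -gamma * w_d * v_e                    # dissipative magnitude
    f_r = sigma * w_r * theta / math.sqrt(dt)   # random part, dt**-0.5 scaling
    return [(f_d + f_r) * c for c in e]

# Fluctuation-dissipation: sigma fixed by gamma and the target temperature.
gamma, kBT = 4.5, 1.0
sigma = math.sqrt(2.0 * gamma * kBT)
f = dpd_pair_force([0.0, 0.0, 0.0], [0.5, 0.0, 0.0],
                   [0.1, 0.0, 0.0], [-0.1, 0.0, 0.0],
                   gamma, sigma, dt=0.01)
```

The dt^(-1/2) scaling of the random part is what makes the timestep dependence the abstract emphasizes explicit.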
Real-time kinetics of gene activity in individual bacteria
 Cell
, 2005
Abstract

Cited by 30 (0 self)
Protein levels have been shown to vary substantially between individual cells in clonal populations. In prokaryotes, the contribution to such fluctuations from the inherent randomness of gene expression has largely been attributed to having just a few transcripts of the corresponding mRNAs. By contrast, eukaryotic studies tend to emphasize chromatin remodeling and burst-like transcription. Here, we study single-cell transcription in Escherichia coli by measuring mRNA levels in individual living cells. The results directly demonstrate transcriptional bursting, similar to that indirectly inferred for eukaryotes. We also measure mRNA partitioning at cell division and correlate mRNA and protein levels in single cells. Partitioning is approximately binomial, and mRNA-protein correlations are weaker earlier in the cell cycle, where cell division has recently randomized the relative concentrations. Our methods further extend protein-based approaches by counting the integer-valued number of transcripts with single-molecule resolution. This greatly facilitates kinetic interpretations in terms of the integer-valued random processes that produce the fluctuations.
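The integer-valued random process behind this abstract — bursty transcript production, first-order decay, and binomial partitioning at division — can be sketched with a Gillespie-style simulation. All rates and the geometric burst-size distribution are illustrative assumptions, not the paper's measured values.

```python
import random

def simulate_cell_cycle(k_burst, mean_burst, k_deg, t_div, seed=None):
    """Bursty transcription over one cell cycle: bursts arrive as a Poisson
    process (rate k_burst), each adding a geometric number of transcripts
    (mean mean_burst); each transcript decays at rate k_deg.  At division
    the m transcripts are split binomially (p = 1/2) between daughters."""
    rng = random.Random(seed)
    m, t = 0, 0.0
    while True:
        total_rate = k_burst + k_deg * m
        t += rng.expovariate(total_rate)        # time to next reaction
        if t > t_div:
            break
        if rng.random() < k_burst / total_rate:
            size = 1                            # geometric burst, mean mean_burst
            while rng.random() > 1.0 / mean_burst:
                size += 1
            m += size
        else:
            m -= 1                              # single-transcript decay
    daughter = sum(1 for _ in range(m) if rng.random() < 0.5)  # binomial split
    return m, daughter

mother, daughter = simulate_cell_cycle(k_burst=1.0, mean_burst=5.0,
                                       k_deg=0.1, t_div=10.0, seed=1)
```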
Systematic strategies for stochastic mode reduction in climate
 J. Atmos. Sci
, 2003
Abstract

Cited by 30 (4 self)
A systematic strategy for stochastic mode reduction is applied here to three prototype “toy” models with nonlinear behavior mimicking several features of low-frequency variability in the extratropical atmosphere. Two of the models involve explicit stable periodic orbits and multiple equilibria in the projected nonlinear climate dynamics. The systematic strategy has two steps: stochastic consistency and stochastic mode elimination. Both aspects of the mode reduction strategy are tested in an a priori fashion in the paper. In all three models the stochastic mode elimination procedure applies in a quantitative fashion for moderately large values of ε ≈ 0.5 or even ε ≈ 1, where the parameter ε roughly measures the ratio of correlation times of unresolved variables to resolved climate variables, even though the procedure is only justified mathematically for ε ≪ 1. The results developed here provide some new perspectives on both the role of stable nonlinear structures in projected nonlinear climate dynamics and the regression fitting strategies for stochastic climate modeling. In one example, a deterministic system with 102 degrees of freedom has an explicit stable periodic orbit for the projected climate dynamics in two variables; however, the complete deterministic system has instead a probability density function with two large isolated peaks on the “ghost” of this periodic orbit, and correlation functions that only weakly “shadow” this periodic orbit. Furthermore, all of these features are predicted in a quantitative fashion by the ...
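The core idea of stochastic mode elimination — replacing a fast unresolved variable by white noise with the same integrated autocorrelation — can be checked on a toy linear slow-fast system. The system, parameters, and the variance comparison below are illustrative assumptions, not one of the paper's climate models.

```python
import math
import random

def slow_fast_variance(tau, s, dt=0.005, n_steps=200_000, seed=0):
    """Slow variable x driven by a fast Ornstein-Uhlenbeck variable y:
        dx = (-x + y) dt,   dy = -(y/tau) dt + s dW.
    Euler-Maruyama integration; returns the empirical variance of x.
    Toy linear system for illustration only."""
    rng = random.Random(seed)
    x = y = 0.0
    acc, count = 0.0, 0
    sq = s * math.sqrt(dt)
    for k in range(n_steps):
        x += (-x + y) * dt
        y += (-y / tau) * dt + sq * rng.gauss(0.0, 1.0)
        if k > n_steps // 10:           # discard the initial transient
            acc += x * x
            count += 1
    return acc / count

# Mode elimination replaces y by white noise with the same integrated
# autocorrelation, predicting Var(x) ~ Var(y) * tau for small tau.
tau, s = 0.1, 1.0
var_y = s * s * tau / 2.0               # stationary variance of the fast variable
predicted = var_y * tau                 # reduced (white-noise) prediction
empirical = slow_fast_variance(tau, s)
```

At this moderate timescale ratio the reduced prediction is already close but not exact, which is in the spirit of the abstract's finding that the procedure works reasonably well even outside the strict ε ≪ 1 regime.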
The dynamics of choice among multiple alternatives
 Journal of Mathematical Psychology
, 2006
Abstract

Cited by 29 (4 self)
We consider neurally based models for decision-making in the presence of noisy incoming data. The two-alternative forced-choice task has been extensively studied, and in that case it is known that mutually inhibited leaky integrators in which leakage and inhibition balance can closely approximate a drift-diffusion process that is the continuum limit of the optimal sequential probability ratio test (SPRT). Here we study the performance of neural integrators in n ≥ 2 alternative choice tasks and relate them to a multihypothesis sequential probability ratio test (MSPRT) that is asymptotically optimal in the limit of vanishing error rates. While a simple race model can implement this ‘max-vs-next’ MSPRT, it requires an additional computational layer, while absolute threshold-crossing tests do not require such a layer. Race models with absolute thresholds perform relatively poorly, but we show that a balanced leaky accumulator model with an absolute crossing criterion can approximate a ‘max-vs-ave’ test that is intermediate in performance between the absolute and max-vs-next tests. We consider free and fixed time response protocols, and show that the resulting mean reaction times under the former and decision times for fixed accuracy under the latter obey versions of Hick's law in the low error rate range, and we interpret this in terms of information gained. Specifically, we derive relationships of the forms log(n − 1), log(n), or log(n + 1) depending on error rates, signal-to-noise ratio, and the test itself. We focus on linearized models, but also consider nonlinear effects of neural activities (firing rates) that are bounded below and show how they modify Hick's law. KEYWORDS: leaky accumulator, drift-diffusion model, neural network, Hick's law, multihypothesis sequential test, sequential ratio test.
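A race model with an absolute threshold — the simplest of the decision mechanisms compared in this abstract — is straightforward to sketch: each of the n accumulators integrates its own noisy evidence and the first to cross a fixed threshold determines the choice and reaction time. Drift values, threshold, and noise level here are illustrative.

```python
import random

def race_decision(drifts, threshold, leak=0.0, dt=0.01, noise=1.0, seed=0,
                  max_steps=100_000):
    """Leaky-accumulator race among n alternatives with an absolute threshold.
    With leak=0 this reduces to a simple race model; parameter values are
    illustrative, not fitted to the paper's tasks."""
    rng = random.Random(seed)
    n = len(drifts)
    x = [0.0] * n
    sq = noise * dt ** 0.5
    for step in range(max_steps):
        for i in range(n):
            # Euler-Maruyama step: drift, leak, and diffusive noise
            x[i] += (drifts[i] - leak * x[i]) * dt + sq * rng.gauss(0.0, 1.0)
        for i in range(n):
            if x[i] >= threshold:
                return i, (step + 1) * dt   # winning alternative, decision time
    return None, max_steps * dt

# Three alternatives; the first carries the strongest evidence.
choice, rt = race_decision([1.0, 0.2, 0.2], threshold=2.0, seed=3)
```

Simulating many trials at several n and plotting mean reaction time against log(n) is one way to probe the Hick's-law behavior the abstract derives analytically.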
Characterization of Subthreshold Voltage Fluctuations in Neuronal Membranes
, 2003
Abstract

Cited by 28 (13 self)
Synaptic noise due to intense network activity can have a significant impact on the electrophysiological properties of individual neurons. This is the case for the cerebral cortex, where ongoing activity leads to strong barrages of synaptic inputs, which act as the main source of synaptic noise affecting neuronal dynamics. Here, we characterize the subthreshold behavior of neuronal models in which synaptic noise is represented by either additive or multiplicative noise, described by Ornstein-Uhlenbeck processes. We derive and solve the Fokker-Planck equation for this system, which describes the time evolution of the probability density function for the membrane potential. We obtain an analytic expression for the membrane potential distribution at steady state and compare this expression with the subthreshold activity obtained in Hodgkin-Huxley-type models with stochastic synaptic inputs. The differences between multiplicative and additive noise models suggest that multiplicative noise is adequate to describe the high-conductance states similar to in vivo conditions. Because the steady-state membrane potential distribution is easily obtained experimentally, this approach provides a possible method to estimate the mean and variance of synaptic conductances in real neurons.
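Multiplicative (conductance) noise of the kind this abstract models can be sketched with a passive membrane driven by a single Ornstein-Uhlenbeck conductance. This is a much-simplified single-conductance version of a point-conductance model; all parameter values (times in ms, potentials in mV) are illustrative assumptions.

```python
import math
import random

def membrane_with_ou_noise(t_end=1000.0, dt=0.1, seed=0):
    """Passive membrane with an excitatory OU conductance (multiplicative
    noise), integrated with Euler-Maruyama; returns the mean subthreshold
    membrane potential after a transient.  Parameters are illustrative."""
    rng = random.Random(seed)
    g_leak, E_leak = 0.05, -70.0            # leak conductance and reversal (mV)
    E_syn = 0.0                             # excitatory reversal potential (mV)
    g0, tau_g, sigma_g = 0.01, 2.7, 0.003   # OU mean, correlation time, std
    v, g = E_leak, g0
    samples = []
    n = int(t_end / dt)
    for k in range(n):
        # OU conductance: dg = -(g - g0)/tau_g dt + sigma_g*sqrt(2/tau_g) dW
        g += (-(g - g0) / tau_g) * dt \
             + sigma_g * math.sqrt(2.0 * dt / tau_g) * rng.gauss(0.0, 1.0)
        g_eff = max(g, 0.0)                 # conductances cannot be negative
        # multiplicative noise: the fluctuating conductance scales (v - E_syn)
        v += (-g_leak * (v - E_leak) - g_eff * (v - E_syn)) * dt
        if k > n // 10:                     # discard the initial transient
            samples.append(v)
    return sum(samples) / len(samples)

mean_v = membrane_with_ou_noise()
```

Collecting a histogram of the sampled potentials, rather than just the mean, gives the steady-state membrane potential distribution that the paper compares against its analytic Fokker-Planck solution.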