Results 1–10 of 14
Correlation learning rule in floating-gate pFET synapses
 in IEEE Int. Symp. Circuits and Systems, 1999
Cited by 18 (7 self)
Abstract
Abstract—We study the weight dynamics of the floating-gate pFET synapse and the effects of the pFET's gate and drain voltages on these dynamics. We show that we can derive a weight update rule such that the equilibrium weight value is proportional to the correlation between the gate and drain voltages. In particular, we want a rule of the form τ(dW/dt) = −W + η[V_g V_d], where V_g is a voltage signal on the gate terminal and V_d is a voltage signal on the drain terminal. We obtain this rule by making a linear approximation to the weight dynamics around a given equilibrium point. We develop this approximation by considering the basic functional form of the system dynamics and then examining the effects of the gate and drain voltages on the specifics of this form. Index Terms—Analog learning rules, analog synapses, electron tunneling, floating-gate circuits, hot-electron injection.
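The abstract's central claim, that the equilibrium weight of a leaky correlation rule is proportional to the gate–drain correlation, can be checked with a toy simulation. The symbols η and τ, the forward-Euler discretization, and the signal statistics below are assumptions for illustration only, not the paper's actual circuit quantities:

```python
import random

# Toy simulation of a leaky correlation-learning rule of the general form
#   tau * dw/dt = -w + eta * x * y
# with x and y standing in for the gate- and drain-terminal voltage signals.
# At equilibrium the weight settles near eta * <x*y>, i.e. a value
# proportional to the correlation between the two signals.

def simulate(eta=2.0, tau=10.0, dt=0.1, steps=200_000, seed=1):
    rng = random.Random(seed)
    w = 0.0
    for _ in range(steps):
        x = rng.gauss(0.0, 1.0)               # gate-side signal
        y = 0.5 * x + rng.gauss(0.0, 0.1)     # correlated drain-side signal
        w += (dt / tau) * (-w + eta * x * y)  # forward-Euler weight update
    return w

# <x*y> = 0.5 here, so w should settle near eta * 0.5 = 1.0
print(simulate())
```

The leaky integrator averages the noisy product x·y over roughly τ/dt samples, which is why the final weight hovers near the true correlation rather than tracking each sample.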
Adaptive CMOS: From Biological Inspiration to Systems-on-a-Chip
 PROCEEDINGS OF THE IEEE, 2002
Computational neurobiology meets semiconductor engineering
 In 30th IEEE International Symposium on Multiple-Valued Logic, Lecture Notes in Artificial Intelligence, 2000
Cited by 5 (0 self)
Abstract
Many believe that the most important result to come out of the last ten years of neural network research is the significant change in perspective in the neuroscience community towards a theory of computational neurobiology and functional neuromodels. Arriving on a fast-moving train from the other direction is semiconductor technology, one of the greatest technology success stories of all time – transistors are now approaching deep submicron (less than 100 nanometers) in size, and we will soon be building silicon chips with over 1 billion transistors. The marriage of these two technologies is creating what Andy Grove (ex-CEO of Intel) refers to as a strategic inflection point. Although previous attempts at merging these technologies were premature, silicon and computational neurobiology are now merging to create an extremely powerful, and radically new, form of computation.
Learning Spike-Based Correlations and Conditional Probabilities in Silicon
 in Advances in Neural Information Processing Systems 14, 2001
Cited by 5 (3 self)
Abstract
We have designed and fabricated a VLSI synapse that can learn a conditional probability or correlation between spike-based inputs and feedback signals. The synapse is low power, compact, provides non-volatile weight storage, and can perform simultaneous multiplication and adaptation. We can calibrate arrays of synapses to ensure uniform adaptation characteristics. Finally, adaptation in our synapse does not necessarily depend on the signals used for computation.
CMOL-Based Cortical Models
"... There are a number of challenges facing the semiconductor industry, and, ..."
unknown title
Abstract
This paper introduces a static multilevel memory cell that was conceived to store state variables in neuromorphic on-chip learning applications. It consists of a capacitance that holds a voltage and an array of 'fusing' amplifiers that are connected as followers. These followers drive their output towards the voltage level of the input like normal followers, but only if the difference between input and output is smaller than about 120 mV. The inputs to this 'fusing' follower array determine the stable voltage levels of the memory cell. All follower outputs are connected to the storage capacitance, and thus the voltage is always driven to the closest stable level. The cell content can be changed by injecting current into the capacitance. This form of storage offers arguably a better compromise between desirable and undesirable properties for neuromorphic learning systems than alternative solutions (e.g. non-volatile analog storage on floating gates or digital static storage in combination with AD/DA conversion), as shall be discussed in the following.
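The "drift to the nearest stable level" behavior this abstract describes can be sketched behaviorally. The level spacing, follower gain, and step count below are invented parameters for illustration; the real cell is a continuous-time analog circuit:

```python
# Behavioral sketch of the multilevel memory cell: each 'fusing' follower
# pulls the storage capacitor toward its own reference level, but only while
# the input-output difference is within the ~120 mV fusing range, so the
# stored voltage always relaxes to the nearest stable level.

FUSE_RANGE = 0.120             # ~120 mV engagement window (from the abstract)
LEVELS = [0.8, 1.0, 1.2, 1.4]  # assumed stable levels set by the follower inputs

def settle(v, gain=0.5, steps=200):
    """Relax a stored voltage v (in volts) toward the nearest stable level."""
    for _ in range(steps):
        v += sum(gain * (lvl - v) for lvl in LEVELS if abs(lvl - v) < FUSE_RANGE)
    return v

print(settle(1.07))  # small disturbance: relaxes back to the 1.0 V level
print(settle(1.15))  # injected charge past the midpoint: captured by 1.2 V
```

Writing the cell then amounts to injecting enough charge to push the capacitor voltage into a neighboring follower's fusing range, after which that follower completes the transition on its own.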
Acknowledgments
"... This thesis concludes my work for the Candidatus Scientiarum (Cand. ..."
The Coming Revolution: The Merging of Computational Neural Science and Semiconductor Engineering
Abstract
This paper discusses an approach that has the potential to move us closer to solving these problems. I begin with a discussion of Intelligent Signal Processing, which attempts to solve complex problems in recognition and control. I then look at biological computing models, which offer insight into new techniques for doing intelligent signal processing. However, these biological models are radically different and require radically different implementation