@MISC{Has_artificialneural, title = {Artificial Neural Network Structures} }


Abstract

… adjustments of the strengths of the synaptic inputs, which led to the incorporation of adjustable synaptic weights on the input lines to excite or inhibit incoming signals.

Figure 3.2 - A Neuron with Hebbian Learning Ability

Figure 3.2 incorporates adjustable synaptic weights (knobs) on the input lines. An input vector x = (x_1, ..., x_N), considered to be a column vector, is linearly combined with the weight vector w = (w_1, ..., w_N) via the inner (dot) product to form the sum

    s = Σ_{n=1}^{N} w_n x_n = w · x    (3-1)

If the sum s is greater than the given threshold, then the output y is 1; otherwise it is 0. This threshold function is unipolar in that it puts out the nonnegative values 0 or 1 (or 0 or V for some voltage V) and complies with the formerly presumed two-valued all-or-nothing principle of biological neurons. Neurons that use bipolar threshold functions, with output values of -1 or 1 (or -V or V for some voltage V), are nowadays called McCulloch-Pitts neurons. For further…
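The weighted-sum-and-threshold computation of Eq. (3-1) can be sketched as follows; this is a minimal illustration assuming NumPy, and the function name, the `theta` parameter, and the `bipolar` flag are labels chosen here, not from the chapter:

```python
import numpy as np

def threshold_neuron(x, w, theta, bipolar=False):
    """Threshold neuron per Eq. (3-1): fire if s = w . x exceeds theta.

    bipolar=False -> unipolar output in {0, 1}
    bipolar=True  -> bipolar output in {-1, 1} (McCulloch-Pitts style)
    """
    s = float(np.dot(w, x))  # s = sum over n of w_n * x_n
    if s > theta:
        return 1
    return -1 if bipolar else 0

# Two inputs, weights 0.6 each, threshold 0.5:
print(threshold_neuron([1, 0], [0.6, 0.6], 0.5))                # s = 0.6 > 0.5, fires: 1
print(threshold_neuron([0, 0], [0.6, 0.6], 0.5, bipolar=True))  # s = 0.0, does not fire: -1
```

Setting `bipolar=True` only changes the "off" value from 0 to -1; the weighted sum and the comparison against the threshold are identical in both cases.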