Results 1–10 of 12
Hidden Patterns in Combined and Adaptive Knowledge Networks
 International Journal of Approximate Reasoning
, 1988
Abstract

Cited by 38 (2 self)
Uncertain causal knowledge is stored in fuzzy cognitive maps (FCMs). FCMs are fuzzy signed digraphs with feedback. The sign (+ or −) of FCM edges indicates causal increase or causal decrease. The fuzzy degree of causality is indicated by a number in [−1, 1]. FCMs learn by modifying their causal connections in sign and magnitude, structurally analogous to the way in which neural networks learn. An appropriate causal learning law for inductively inferring FCMs from time-series data is the differential Hebbian law, which modifies causal connections by correlating time derivatives of FCM node outputs. The differential Hebbian law contrasts with the Hebbian output-correlation learning laws of adaptive neural networks. FCM nodes represent variable phenomena or fuzzy sets. An FCM node nonlinearly transforms weighted summed inputs into numerical output, again in analogy to a model neuron. Unlike expert systems, which are feedforward search trees, FCMs are nonlinear dynamical systems. FCM resonant states are limit cycles, or time-varying patterns. An FCM limit cycle or hidden pattern is an FCM inference. Experts construct FCMs by drawing causal pictures or digraphs. The corresponding connection matrices are used for inferencing. By additively combining augmented connection matrices, any number of FCMs can be naturally combined into a single knowledge network. The credibility w_i in [0, 1] of the ith expert is included in this learning process by multiplying the ith expert's augmented FCM connection matrix by w_i. Combining connection matrices is a simple type of adaptive inference. In general, connection matrices are modified by an unsupervised learning law, such as the
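The inference and combination scheme described in this abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: concept states are binary, the 0.5 threshold is a common simplifying convention, and the expert matrices and credibility weights are invented for the example.

```python
import numpy as np

def fcm_step(W, x):
    # One inference step: each concept thresholds its weighted causal input.
    # W[i, j] is the causal edge from concept i to concept j.
    return (W.T @ x > 0.5).astype(float)

def fcm_infer(W, x0, max_iters=50):
    # Iterate until a state repeats: the fixed point or limit cycle
    # ("hidden pattern") the network settles into is the FCM inference.
    seen, x = [], x0.copy()
    for _ in range(max_iters):
        key = tuple(x)
        if key in seen:
            return x, seen.index(key)
        seen.append(key)
        x = fcm_step(W, x)
    return x, None

def differential_hebbian(W, x_prev, x_curr, lr=0.1):
    # Adapt causal edges by correlating time derivatives of node outputs.
    dx = x_curr - x_prev
    return W + lr * np.outer(dx, dx)

# Two hypothetical experts' FCMs over the same three concepts, combined
# additively after scaling each by its expert's credibility w_i in [0, 1].
W1 = np.array([[0.0, 0.8, 0.0], [0.0, 0.0, -0.6], [0.5, 0.0, 0.0]])
W2 = np.array([[0.0, 0.6, 0.2], [0.0, 0.0, -0.9], [0.4, 0.0, 0.0]])
W = 0.9 * W1 + 0.5 * W2

x_final, cycle_start = fcm_infer(W, np.array([1.0, 0.0, 0.0]))
```

Note that the credibility-weighted sum needs no retraining: combining expert knowledge is just matrix addition, which is what makes the scheme a "simple type of adaptive inference."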
Learning Algorithms in Neural Networks
, 1990
Abstract

Cited by 4 (0 self)
Neural Network models have received increased attention in recent years. Aimed at achieving human-like performance in tasks of the cognitive sciences domain, these models are composed of a highly interconnected mesh of nonlinear computing elements, whose structure is drawn from our current knowledge of biological neural systems. Several Neural Network learning algorithms have been developed in the past years. In these algorithms, a set of rules defines the evolution process undertaken by the synaptic connections of the networks, thus allowing them to learn how to perform specified tasks. In this article, several such algorithms are surveyed. They range from simple associative learning paradigms to more complex reinforcement learning systems. A detailed description of each algorithm is presented, and a discussion of their capabilities and limitations is included.
1 Introduction
The history of artificial neural networks starts with the work by McCulloch and Pitts [64], in which neur...
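As a concrete instance of the simplest associative paradigm such surveys cover, a plain Hebbian outer-product update takes only a few lines. The dimensions and learning rate below are arbitrary choices for illustration, not taken from the article.

```python
import numpy as np

# Plain Hebbian rule: the weight matrix grows in proportion to the
# correlation of pre- and post-synaptic activity.
def hebbian_update(W, x, y, lr=0.01):
    # x: presynaptic activity vector, y: postsynaptic activity vector
    return W + lr * np.outer(y, x)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
y = rng.standard_normal(3)
W = hebbian_update(np.zeros((3, 4)), x, y, lr=1.0)
```

After this single update, `W @ x` equals `y` scaled by `x @ x`, which is the sense in which the weights have "associated" the two activity patterns.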
A Complexvalued Associative Memory for Storing Patterns as Oscillatory States
 Biological Cybernetics
, 1996
Abstract

Cited by 4 (0 self)
A neuron model in which the neuron state is described by a complex number is proposed. A network of these neurons, which can be used as an associative memory, operates in two distinct modes: (i) a fixed-point mode and (ii) an oscillatory mode. Mode selection is done by varying a continuous mode parameter between 0 and 1. At one extreme of the parameter (0), the network has conservative dynamics; at the other (1), the dynamics are dissipative and governed by a Lyapunov function. Patterns can be stored and retrieved at any value of the parameter by (i) a one-step outer-product rule or (ii) adaptive Hebbian learning. In the fixed-point mode patterns are stored as fixed points, whereas in the oscillatory mode they are encoded as phase relations among individual oscillations. By virtue of an instability in the oscillatory mode, the retrieval pattern is stable over a finite interval, the stability interval, and the pattern gradually deteriorates with time beyond this interval. However, at certain val...
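The one-step outer-product storage of a phase-encoded pattern can be illustrated generically. This is a simplified sketch under stated assumptions (a single stored pattern, unit-modulus states, no mode parameter or oscillatory dynamics), not the paper's model.

```python
import numpy as np

def store(patterns):
    # patterns: (P, N) array of unit-modulus complex states;
    # one-step Hermitian outer-product rule.
    P, N = patterns.shape
    return sum(np.outer(p, p.conj()) for p in patterns) / N

def recall(W, probe, steps=10):
    s = probe.copy()
    for _ in range(steps):
        s = W @ s
        s = s / np.abs(s)  # project each component back onto the unit circle
    return s

N = 16
rng = np.random.default_rng(1)
p = np.exp(1j * rng.uniform(0, 2 * np.pi, N))   # stored phases
W = store(p[None, :])
probe = p * np.exp(1j * rng.normal(0, 0.1, N))  # phase-perturbed probe
out = recall(W, probe)
```

The pattern is recovered only up to a global phase rotation, which matches the abstract's point that information lives in the phase relations among units, not in absolute phases.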
A Balanced Differential Learning algorithm in Fuzzy Cognitive Maps
 In: Proceedings of the 16th International Workshop on Qualitative Reasoning 2002
, 2002
Abstract

Cited by 4 (0 self)
Fuzzy Cognitive Maps have become an important means for describing a particular domain by showing the concepts (variables) and the relationships between them. They have been used for several tasks, such as simulating processes, forecasting, or decision support. In general, the task of creating Fuzzy Cognitive Maps is carried out by experts in a given domain, but the automatic creation of Fuzzy Cognitive Maps from raw data is very promising. In this paper we present a new algorithm (the Balanced Differential Algorithm) to learn Fuzzy Cognitive Maps from data. We compare the results obtained from the proposed algorithm with those obtained from the Differential Hebbian algorithm. Based on the results, we conclude that the proposed algorithm seems better at learning patterns and
A Neural Networkbased Associative Memory for Storing Complexvalued Patterns
 In Proc. of IEEE International Conference on Systems, Man and Cybernetics
, 1994
Abstract

Cited by 3 (3 self)
A neural network-based associative memory for storing complex patterns is proposed. Two variations of the model are proposed: (1) a discrete model and (2) a continuous model. The latter approaches the former as a limit. A crude capacity estimate for the discrete model is made. Network weights can be calculated in one step using a complex outer-product rule or can be adjusted adaptively using a Hebbian learning rule. The possible biological significance of the complex neuron state is briefly discussed.
1 Introduction
Artificial neural network models that exhibit "emergent computation" have given a new impetus to neural network research. In a seminal paper, Hopfield demonstrated that a system of simple neuron-like computing elements can exhibit emergent computational characteristics [Hop82]. The system described therein consists of a fully connected network of two-state computational elements, the neurons. Each neuron receives weighted input from all the other neurons, updates its current outp...
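The two weight-setting routes mentioned in the abstract, a one-step complex outer product versus adaptive Hebbian adjustment, can be contrasted in a small sketch. The decaying-Hebbian form and all constants below are assumptions for illustration, not the paper's equations.

```python
import numpy as np

# Incremental Hebbian adjustment that converges to the same Hermitian
# outer-product weights the one-step rule would compute directly.
def hebbian_complex(W, z, lr=0.05):
    # Nudge W toward the outer product z z* (conjugate-transpose pairing).
    return W + lr * (np.outer(z, z.conj()) - W)

N = 8
rng = np.random.default_rng(2)
z = np.exp(1j * rng.uniform(0, 2 * np.pi, N))  # unit-modulus complex pattern

target = np.outer(z, z.conj())                 # one-step outer-product weights
W = np.zeros((N, N), dtype=complex)
for _ in range(200):
    W = hebbian_complex(W, z)
```

After enough adaptive steps the two routes coincide, which is why the abstract can offer them interchangeably.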
Differential ICA
Abstract
As an alternative to conventional Hebb-type unsupervised learning, differential learning has been studied in the domain of Hebb's rule [1] and decorrelation [2]. In this paper we present an ICA algorithm which employs differential learning, hence named differential ICA. We derive the differential ICA algorithm in the framework of maximum likelihood estimation and a random-walk model. A derivation of the algorithm using the natural gradient and a local stability analysis are provided. The usefulness of the algorithm is emphasized in the case of blind separation of temporally correlated sources and is demonstrated through a simple numerical example.
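In rough outline, the idea amounts to running a natural-gradient ICA update on first differences of the observations rather than on the raw signals. The sketch below is generic, not the authors' derivation: the tanh score function, step size, epoch count, and example sources are arbitrary choices.

```python
import numpy as np

def differential_ica(X, lr=0.01, epochs=200):
    dX = np.diff(X, axis=1)          # "differential" signals
    n = X.shape[0]
    W = np.eye(n)
    I = np.eye(n)
    for _ in range(epochs):
        Y = W @ dX
        phi = np.tanh(Y)             # illustrative score function
        # natural-gradient step, averaged over time samples
        W = W + lr * (I - (phi @ Y.T) / dX.shape[1]) @ W
    return W

# Two temporally correlated sources, mixed linearly.
t = np.linspace(0, 20, 2000)
S = np.vstack([np.sin(2 * t), np.sign(np.cos(5 * t))])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S
W = differential_ica(X)
Y = W @ X   # estimated sources, up to scale and permutation
```

Working on differenced signals is what exploits temporal correlation: sources that are merely non-Gaussian but temporally white would gain nothing from the differencing step.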
Connectionist Symbol Processing: Dead or Alive?
, 1999
Abstract
The contributions to this article are of varying nature: position summaries, individual research summaries, historical accounts, discussions of controversial issues, etc. We have not attempted to connect the various pieces together, or to organize them within a coherent framework. Despite this, we think the reader will find this collection useful.
Contributed Article
, 2000
Abstract
A neural network model that can simulate the learning of some simple proportional analogies is presented. These analogies include, for example, (a) red-square:red-circle :: yellow-square:?, (b) apple:red :: banana:?, (c) a:b :: c:?. Underlying the development of this network is a theory for how the brain learns the nature of the association between pairs of concepts. Traditional Hebbian learning of associations is necessary for this process but not sufficient. This is because it simply says, for example, that the concepts "apple" and "red" have been associated, but says nothing about the nature of this relationship. The types of context-dependent inter-level connections in the network suggest a semi-local type of learning that in some manner involves association among more than two nodes or neurons at once. Such connections have been called synaptic triads, and related to potential cell responses in the prefrontal cortex. Some additional types of connections are suggested by the problem of modeling analogies. These types of connections have not yet been verified by brain imaging, but the work herein suggests that they may occur and, possibly, be made and broken quickly in the course of working-memory encoding. These working-memory connections are referred to as differential, delayed and anti-Hebbian connections. With these connections, one can learn transitions such as "keep red the same"; "change red to yellow"; "turn off red"; "turn on yellow," and so forth. Also included in the network is a kind of weight transport so that, for example, red-to-red can be transported to a different instance of color, such as yellow-to-yellow. The network instantiation developed here, based on common connectionist building blocks such as associative learning, competition, and adaptive resonance...
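The idea of a differential connection that records a transition, rather than a static association, can be made concrete in a toy sketch. The node names, values, and learning rate below are invented for illustration and are not the paper's network.

```python
# Differential transition connection: instead of correlating activities
# (plain Hebb), correlate the source node with the *change* in the target
# node, so the sign of the weight encodes the direction of the transition.
def differential_update(w, pre, post_prev, post_curr, lr=0.5):
    return w + lr * pre * (post_curr - post_prev)

# Event: while "square" is active, "red" switches off and "yellow" switches
# on, so the learned weights encode "turn off red" / "turn on yellow".
square = 1.0
red_before, red_after = 1.0, 0.0
yellow_before, yellow_after = 0.0, 1.0

w_square_red = differential_update(0.0, square, red_before, red_after)
w_square_yellow = differential_update(0.0, square, yellow_before, yellow_after)
```

A plain Hebbian rule applied to the same event would merely strengthen both associations; only the differential form distinguishes "turn off" from "turn on".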
Approximately as appeared in: Learning and Computational Neuroscience: Foundations of Adaptive Networks, M. Gabriel and J. Moore, Eds., pp. 497–537. MIT Press, 1990.
 Learning and Computational Neuroscience: Foundations of Adaptive Networks
, 1990
Abstract
In this paper, however, we analyze it from the point of view of animal learning theory. Our intended audience is both animal learning researchers interested in computational theories of behavior and machine learning researchers interested in how their learning algorithms relate to, and may be constrained by, animal learning studies. For an exposition of the TD model from an engineering point of view, see Chapter 13 of this volume.
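To make concrete the prediction-error idea at the heart of the TD model, here is a minimal TD(0) value-learning sketch. The three-state episode, discount factor, and step size are invented for illustration and are not taken from the chapter.

```python
# TD(0): the prediction error delta = r + gamma*V(s') - V(s) drives
# learning of value estimates V, analogous to the conditioning model's
# error-driven association of stimuli with upcoming reinforcement.
def td0(episodes, gamma=0.9, alpha=0.1, n_states=3):
    V = [0.0] * n_states
    for ep in episodes:
        for s, r, s_next in ep:
            target = r + (gamma * V[s_next] if s_next is not None else 0.0)
            V[s] += alpha * (target - V[s])   # delta = target - V(s)
    return V

# CS presented in states 0 and 1; reward (US) delivered on leaving state 2.
episode = [(0, 0.0, 1), (1, 0.0, 2), (2, 1.0, None)]
V = td0([episode] * 200)
# Values approach gamma-discounted reward predictions: ~0.81, ~0.9, ~1.0
```

The earlier the state, the more discounted its prediction, mirroring how conditioned responses to early stimuli are weaker but anticipate the reinforcer.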