Hidden Patterns in Combined and Adaptive Knowledge Networks
 International Journal of Approximate Reasoning
, 1988
Abstract

Cited by 39 (2 self)
Uncertain causal knowledge is stored in fuzzy cognitive maps (FCMs). FCMs are fuzzy signed digraphs with feedback. The sign (+ or −) of FCM edges indicates causal increase or causal decrease. The fuzzy degree of causality is indicated by a number in [−1, 1]. FCMs learn by modifying their causal connections in sign and magnitude, structurally analogous to the way in which neural networks learn. An appropriate causal learning law for inductively inferring FCMs from time-series data is the differential Hebbian law, which modifies causal connections by correlating time derivatives of FCM node outputs. The differential Hebbian law contrasts with Hebbian output-correlation learning laws of adaptive neural networks. FCM nodes represent variable phenomena or fuzzy sets. An FCM node nonlinearly transforms weighted summed inputs into numerical output, again in analogy to a model neuron. Unlike expert systems, which are feedforward search trees, FCMs are nonlinear dynamical systems. FCM resonant states are limit cycles, or time-varying patterns. An FCM limit cycle or hidden pattern is an FCM inference. Experts construct FCMs by drawing causal pictures or digraphs. The corresponding connection matrices are used for inferencing. By additively combining augmented connection matrices, any number of FCMs can be naturally combined into a single knowledge network. The credibility w_i in [0, 1] of the ith expert is included in this learning process by multiplying the ith expert's augmented FCM connection matrix by w_i. Combining connection matrices is a simple type of adaptive inference. In general, connection matrices are modified by an unsupervised learning law, such as the ...
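The abstract's account of FCM inference (iterating a thresholded connection matrix until the state settles into a fixed point or limit cycle) and of credibility-weighted combination can be sketched as follows. The 3-node map, its edge weights, and the credibility values are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical 3-concept FCM; E[i, j] in [-1, 1] is the fuzzy causal
# influence of concept i on concept j (values invented for illustration).
E = np.array([[ 0.0,  0.7, -0.4],
              [ 0.0,  0.0,  0.6],
              [-0.5,  0.0,  0.0]])

def step(state, E):
    # Each node nonlinearly transforms its weighted summed input;
    # a bivalent threshold is the simplest choice of nonlinearity.
    return (state @ E > 0).astype(float)

# Iterate from an initial state; the trajectory settles into a fixed
# point or a limit cycle -- the FCM "inference" or hidden pattern.
state = np.array([1.0, 0.0, 0.0])
trajectory = []
for _ in range(10):
    state = step(state, E)
    trajectory.append(tuple(state))

# Combining experts: weight each augmented connection matrix by the
# expert's credibility w_i in [0, 1] and add. E2 is a stand-in second map.
E2 = -E
combined = 0.9 * E + 0.4 * E2
```

With this particular map the trajectory reaches a fixed point within a few steps; a different sign pattern on the edges can instead produce a limit cycle.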
A Balanced Differential Learning algorithm in Fuzzy Cognitive Maps
 In: Proceedings of the 16th International Workshop on Qualitative Reasoning 2002
, 2002
Abstract

Cited by 4 (0 self)
Fuzzy Conceptual Maps have become an important means of describing a particular domain, showing the concepts (variables) and the relationships between them. They have been used for several tasks such as process simulation, forecasting, and decision support. In general, Fuzzy Conceptual Maps are created by experts in a given domain, but the automatic creation of Fuzzy Conceptual Maps from raw data is very promising. In this paper we present a new algorithm (the Balanced Differential Algorithm) to learn Fuzzy Conceptual Maps from data. We compare the results obtained with the proposed algorithm against those obtained with the Differential Hebbian algorithm. Based on the results, we conclude that the proposed algorithm seems better at learning patterns and ...
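The baseline the paper compares against, the plain differential Hebbian rule, can be sketched as below: each edge strength is driven toward the correlation of concurrent concept changes. The synthetic time series and the learning rate are invented for illustration and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 200, 3

# Synthetic concept activations: concept 1 drives concept 2 with a
# one-step lag, concept 3 is unrelated noise (illustrative data only).
c = np.zeros((T, n))
c[:, 0] = np.sin(np.linspace(0, 8 * np.pi, T))
c[:, 1] = np.roll(c[:, 0], 1)
c[:, 2] = 0.1 * rng.standard_normal(T)

dc = np.diff(c, axis=0)      # time derivatives approximated by differences
E = np.zeros((n, n))
eta = 0.1
for t in range(T - 1):
    # Differential Hebbian update: correlate concurrent changes,
    # with exponential decay toward the running correlation.
    E += eta * (np.outer(dc[t], dc[t]) - E)
np.fill_diagonal(E, 0.0)

# A clearly positive E[0, 1] reflects the learned positive link c1 -> c2.
```

Because the rule only sees concurrent changes, spurious edges can also appear between concepts that happen to move together, which is the kind of weakness a balanced variant would aim to correct.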
Learning Algorithms in Neural Networks
, 1990
Abstract

Cited by 4 (0 self)
Neural network models have received increased attention in recent years. Aimed at achieving human-like performance in tasks from the cognitive sciences, these models are composed of a highly interconnected mesh of nonlinear computing elements whose structure is drawn from our current knowledge of biological neural systems. Several neural network learning algorithms have been developed in past years. In these algorithms, a set of rules defines the evolution process undertaken by the synaptic connections of the networks, thus allowing them to learn how to perform specified tasks. In this article, several such algorithms are surveyed. They range from simple associative learning paradigms to more complex reinforcement learning systems. A detailed description of each algorithm is presented, and a discussion of their capabilities and limitations is included. 1 Introduction The history of artificial neural networks starts with the work by McCulloch and Pitts [64], in which neur...
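The simplest of the surveyed associative paradigms, Hebbian outer-product learning, can be illustrated in a few lines; the stored pattern and the noisy cue are invented for illustration.

```python
import numpy as np

# Hebbian outer-product learning: a weight grows with the
# correlation of the two activities it connects.
p = np.array([1.0, -1.0, 1.0, -1.0])   # one bipolar pattern to store
W = np.outer(p, p) / len(p)            # a single Hebbian increment
np.fill_diagonal(W, 0.0)               # no self-connections

# Associative recall: a cue with one flipped element is mapped back
# toward the stored pattern by one thresholded update.
cue = np.array([1.0, -1.0, 1.0, 1.0])
recalled = np.sign(W @ cue)
```

Here the corrupted fourth element is restored because the remaining correct elements outvote it through the learned correlations.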
A Complex-valued Associative Memory for Storing Patterns as Oscillatory States
 Biological Cybernetics
, 1996
Abstract

Cited by 4 (0 self)
A neuron model in which the neuron state is described by a complex number is proposed. A network of these neurons, which can be used as an associative memory, operates in two distinct modes: (i) fixed-point mode and (ii) oscillatory mode. Mode selection is done by varying a continuous mode parameter between 0 and 1. At one extreme of the parameter the network has conservative dynamics, and at the other the dynamics are dissipative and governed by a Lyapunov function. Patterns can be stored and retrieved at any value of the parameter by (i) a one-step outer-product rule or (ii) adaptive Hebbian learning. In the fixed-point mode patterns are stored as fixed points, whereas in the oscillatory mode they are encoded as phase relations among individual oscillations. By virtue of an instability in the oscillatory mode, the retrieved pattern is stable over a finite interval, the stability interval, and the pattern gradually deteriorates with time beyond this interval. However, at certain val...
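A minimal illustration of the phase-relation idea (numbers invented, not from the paper): in the oscillatory mode the pattern lives in relative phases, so it is unaffected by the common, time-varying global phase of the oscillations.

```python
import numpy as np

# Hypothetical stored phase pattern (kept away from the +/-pi boundary).
phi = np.array([0.0, np.pi / 2, -np.pi / 2])
omega, t = 2 * np.pi, 1.7          # arbitrary frequency and time instant

# Oscillatory network state: every unit shares the global phase omega*t.
z = np.exp(1j * (omega * t + phi))

# Phase differences relative to unit 0 recover the pattern at any t.
rel = np.angle(z / z[0])
```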
Fuzzy Cognitive Maps in Business Analysis and Performance-Driven Change
 IEEE Transactions on Engineering Management
, 2004
Abstract

Cited by 3 (0 self)
Business process reengineering (BPR) has made a significant impact on managers and academics. Despite the rhetoric surrounding BPR, articulated mechanisms that support reasoning about the effect of redesign activities on the performance of the business model are still emerging. This paper describes an attempt to build and operate such a reasoning mechanism as a novel supplement to performance-driven change (PDC) exercises. This new approach proposes the utilization of the fuzzy causal characteristics of fuzzy cognitive maps (FCMs) as the underlying methodology in order to generate a hierarchical and dynamic network of interconnected performance indicators. By using FCMs, the proposed mechanism aims at simulating the operational efficiency of complex process models with imprecise relationships to quantify the impact of performance-driven reengineering activities. This research also establishes generic maps that supplement the strategic planning and business analysis phases of typical redesign projects in order to implement the integration of hierarchical FCMs into PDC activities. Finally, this paper discusses experiments with the proposed mechanism and comments on its usability. Index Terms—Business analysis, business process reengineering (BPR), fuzzy cognitive mapping, performance metrics.
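The kind of simulation this approach builds on, iterating a fuzzy causal map of performance indicators until activations settle into a steady profile, can be sketched as follows. The indicator names, weights, and squashing function are invented for illustration and do not reproduce the paper's maps.

```python
import numpy as np

# Hypothetical mini-map over three performance indicators; E[i, j] is
# the imprecise causal weight of indicator i on indicator j.
# Indicators: 0 = rework rate, 1 = cycle time, 2 = customer satisfaction.
E = np.array([[0.0,  0.6, -0.7],
              [0.0,  0.0, -0.5],
              [0.0,  0.0,  0.0]])

def sigmoid(x):
    # Squash each indicator's causal input into the unit interval.
    return 1.0 / (1.0 + np.exp(-x))

# Iterate the map from an initial scenario; after enough steps the
# state is (numerically) a fixed point: the simulated steady profile.
state = np.array([0.8, 0.5, 0.5])
for _ in range(100):
    state = sigmoid(state @ E)
```

Reading off the steady activations of downstream indicators is how such a map quantifies the knock-on effect of changing an upstream one.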
A Neural Network-based Associative Memory for Storing Complex-valued Patterns
 In Proc. of IEEE International Conference on Systems, Man and Cybernetics
, 1994
Abstract

Cited by 3 (3 self)
A neural network-based associative memory for storing complex patterns is proposed. Two variations of the model are proposed: (1) a discrete model and (2) a continuous model. The latter approaches the former as a limit. A crude capacity estimate for the discrete model is made. Network weights can be calculated in one step using a complex outer-product rule or can be adjusted adaptively using a Hebbian learning rule. The possible biological significance of the complex neuron state is briefly discussed. 1 Introduction Artificial neural network models that exhibit "emergent computation" have given a new impetus to neural network research. In a seminal paper, Hopfield demonstrated that a system of simple neuron-like computing elements can exhibit emergent computational characteristics [Hop82]. The system described therein consists of a fully connected network of two-state computational elements, the neurons. Each neuron receives weighted input from all the other neurons, updates its current outp...
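One-step complex outer-product storage can be sketched along these lines for a discrete model whose neuron states are restricted to the K-th roots of unity; the sizes, K = 4, and the phase perturbation are invented for illustration and are only a reading of the abstract, not the paper's exact formulation.

```python
import numpy as np

K, n = 4, 16
roots = np.exp(2j * np.pi * np.arange(K) / K)   # allowed neuron states

def quantize(z):
    # Map each component to the nearest K-th root of unity.
    return roots[np.argmin(np.abs(z[:, None] - roots[None, :]), axis=1)]

rng = np.random.default_rng(1)
pattern = roots[rng.integers(0, K, n)]          # one complex-valued pattern
W = np.outer(pattern, pattern.conj()) / n       # complex outer-product rule
np.fill_diagonal(W, 0.0)

# One synchronous update pulls a phase-perturbed cue back to the pattern.
cue = pattern.copy()
cue[0] *= np.exp(1j * np.pi / K)                # rotate one phase slightly
recalled = quantize(W @ cue)
```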
Differential Learning and Random Walk Model
Abstract
This paper presents a learning algorithm for differential decorrelation, the goal of which is to find a linear transform that minimizes the concurrent change of associated output nodes. First the algorithm is derived from the minimization of an objective function that measures the differential correlation. Then we show that the differential decorrelation learning algorithm can also be derived in the framework of maximum likelihood estimation of a linear generative model, by assuming a random walk model for the latent variables. The algorithm derivation and a local stability analysis are given, together with a simple numerical example.