## Hidden Patterns in Combined and Adaptive Knowledge Networks (1988)

Venue: International Journal of Approximate Reasoning

Citations: 38 (2 self)

### BibTeX

```bibtex
@ARTICLE{Kosko88hiddenpatterns,
  author  = {Bart Kosko},
  title   = {Hidden Patterns in Combined and Adaptive Knowledge Networks},
  journal = {International Journal of Approximate Reasoning},
  year    = {1988},
  pages   = {377--393}
}
```

### Abstract

Uncertain causal knowledge is stored in fuzzy cognitive maps (FCMs). FCMs are fuzzy signed digraphs with feedback. The sign (+ or −) of an FCM edge indicates causal increase or causal decrease. The fuzzy degree of causality is indicated by a number in [−1, 1]. FCMs learn by modifying their causal connections in sign and magnitude, structurally analogous to the way in which neural networks learn. An appropriate causal learning law for inductively inferring FCMs from time-series data is the differential Hebbian law, which modifies causal connections by correlating time derivatives of FCM node outputs. The differential Hebbian law contrasts with the Hebbian output-correlation learning laws of adaptive neural networks. FCM nodes represent variable phenomena or fuzzy sets. An FCM node nonlinearly transforms weighted summed inputs into numerical output, again in analogy to a model neuron. Unlike expert systems, which are feedforward search trees, FCMs are nonlinear dynamical systems. FCM resonant states are limit cycles, or time-varying patterns. An FCM limit cycle or hidden pattern is an FCM inference. Experts construct FCMs by drawing causal pictures or digraphs. The corresponding connection matrices are used for inferencing. By additively combining augmented connection matrices, any number of FCMs can be naturally combined into a single knowledge network. The credibility w_i in [0, 1] of the i-th expert is included in this learning process by multiplying the i-th expert's augmented FCM connection matrix by w_i. Combining connection matrices is a simple type of adaptive inference. In general, connection matrices are modified by an unsupervised learning law, such as the …
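The inference procedure the abstract describes, thresholded spreading activation through the connection matrix until the state sequence repeats, can be sketched in a few lines. This is a minimal illustration, not the paper's code: the three-node matrix, the hard threshold at zero, and synchronous updating are all assumptions chosen for the demo.

```python
import numpy as np

def fcm_inference(E, state, steps=12):
    """Synchronously iterate a binary FCM: each step thresholds the
    matrix-vector product, and the visited states eventually settle into
    a fixed point or a limit cycle (the FCM's "hidden pattern")."""
    history = [tuple(state)]
    for _ in range(steps):
        state = (state @ E > 0).astype(int)  # hard threshold at 0 (an assumption)
        history.append(tuple(state))
    return history

# Hypothetical 3-node causal cycle: C1 -> +C2 -> +C3 -> +C1.
E = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
hist = fcm_inference(E, np.array([1, 0, 0]))
# The state sequence repeats with period 3: a limit-cycle inference.
```

Turning on C1 yields the repeating pattern C1 → C2 → C3 → C1; that limit cycle, not a single terminal node, is the answer the network settles on.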

### Citations

246 | Absolute stability of global pattern formation and parallel memory storage by competitive neural networks — Cohen, Grossberg (1983)

Citation Context: …ete binary FCMs in synchronous operation resonate on limit cycles [7], or repeating temporal patterns. Rarest of all is equilibration to fixed points. This is global stability. As Cohen and Grossberg [11] phrase (absolute) global stability: the limits of system trajectories exist for all inputs and all choices of system parameters. All input balls rapidly roll down a local "energy" or Lyapunov minimum…

157 | Bidirectional Associative Memories — Kosko (1988)

Citation Context: …ssociative memory recollections. FCM cycles naturally allow feedback to be represented. Abandoning graph search, the FCM (temporal associative memory; see [7]) dynamic system immediately reverberates [7] on an inference or prediction no matter how large the FCM. An arbitrary number of weighted FCMs of arbitrary structure can naturally be combined by summin…

35 | Fuzzy Cognitive Maps — Kosko (1986)

Citation Context: …tion of the ith node C_i is some real number x_i, then the simplest model for causal activation is equivalent to the additive short-term memory model of a neuron's activation:

ẋ_i = −x_i + Σ_j C_j(x_j) e_ji + I_i   (2)

where C_j is a sigmoid function. Several state activation models other than (2) are possible. The first term in (2) is passive causal decay. Something happ…
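The additive activation model (2) can be simulated by simple Euler integration. A sketch under stated assumptions: the logistic sigmoid standing in for C_j, the three-node matrix, and the constant input are all illustrative and do not come from the source.

```python
import numpy as np

def sigmoid(x):
    # Bounded signal function standing in for the paper's sigmoid C_j.
    return 1.0 / (1.0 + np.exp(-x))

def step(x, E, I, dt=0.01):
    """One Euler step of dx_i/dt = -x_i + sum_j C_j(x_j) e_ji + I_i:
    passive decay, plus weighted summed causal input, plus external input.
    (sigmoid(x) @ E)_i computes sum_j C_j(x_j) E[j, i], matching the
    e_ji index convention."""
    return x + dt * (-x + sigmoid(x) @ E + I)

# Hypothetical 3-node FCM driven by a constant input on node 0.
E = np.array([[0.0, 0.8, 0.0],
              [0.0, 0.0, 0.5],
              [-0.6, 0.0, 0.0]])
I = np.array([1.0, 0.0, 0.0])
x = np.zeros(3)
for _ in range(2000):  # integrate to (near) equilibrium
    x = step(x, E, I)
```

With these weights the passive-decay term keeps every activation bounded, and the state relaxes to a fixed point rather than a cycle.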

18 | A drive-reinforcement model of single neuron function: An alternative to the Hebbian neuronal model — Klopf (1986)

Citation Context: …st conclude but that, to some extent, Q_1 "causes" Q_2? The greater the concomitant or lagged variation in frequency and magnitude, the bolder the causal conjecture. A differential Hebbian learning law [1, 3, 4, 6, 10] is the minimal unsupervised learning law for measuring change. It correlates time derivatives of node activations or of node outputs, or some mix thereof. For example,

ė_ij = −e_ij + ċ_i ċ_j

…

15 | Differential Hebbian learning — Kosko (1986)

Citation Context: …st conclude but that, to some extent, Q_1 "causes" Q_2? The greater the concomitant or lagged variation in frequency and magnitude, the bolder the causal conjecture. A differential Hebbian learning law [1, 3, 4, 6, 10] is the minimal unsupervised learning law for measuring change. It correlates time derivatives of node activations or of node outputs, or some mix thereof. For example,

ė_ij = −e_ij + ċ_i ċ_j

…
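The differential Hebbian law cited in the two entries above correlates velocities rather than levels. A discrete-Euler sketch; the step size and the constant derivative signals are illustrative assumptions:

```python
def diff_hebbian_step(e, ci_dot, cj_dot, dt=0.1):
    """One Euler step of de_ij/dt = -e_ij + ci_dot * cj_dot:
    passive decay of the causal edge plus a product of the two
    concepts' time derivatives (concomitant variation)."""
    return e + dt * (-e + ci_dot * cj_dot)

e_pos = e_neg = 0.0
for _ in range(100):
    e_pos = diff_hebbian_step(e_pos, 1.0, 1.0)    # concepts move together
    e_neg = diff_hebbian_step(e_neg, 1.0, -1.0)   # concepts move oppositely
# e_pos approaches +1 and e_neg approaches -1: the edge's sign tracks
# the sign of the correlated change, its magnitude the persistence.
```

Concepts whose derivatives agree grow a positive causal edge; opposed derivatives grow a negative one, which is exactly the sign-and-magnitude learning the abstract describes.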

7 | Fuzzy knowledge combination — Kosko (1986)

Citation Context: …ion through Fᵀ is a crude attempt to reverse the causal arrow of time. It produces a rough "backward chaining" inference. COMBINING FUZZY KNOWLEDGE NETWORKS Any set of FCMs can be naturally combined [4, 5]. Each expert can draw a different-size FCM with different FCM causal concepts. There is no restriction on the number of experts or on the number of concepts. Indeed, the more experts the better. We a…
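The combination scheme this context refers to, augment each expert's matrix to the union concept set, scale by credibility, and add, can be sketched as follows. The expert matrices, concept names, and credibility weights here are hypothetical.

```python
import numpy as np

def combine_fcms(expert_fcms, weights, concepts):
    """Additively combine weighted expert FCMs into one knowledge network.

    Each expert's connection matrix is augmented to the union concept list
    (absent concepts contribute zero rows and columns), multiplied by that
    expert's credibility w_i in [0, 1], and the results are summed."""
    index = {c: k for k, c in enumerate(concepts)}
    combined = np.zeros((len(concepts), len(concepts)))
    for (matrix, labels), w in zip(expert_fcms, weights):
        idx = [index[c] for c in labels]
        augmented = np.zeros_like(combined)
        augmented[np.ix_(idx, idx)] = matrix  # place expert's block in the union frame
        combined += w * augmented
    return combined

# Two hypothetical experts over overlapping concept sets.
expert_a = (np.array([[0.0, 1.0], [0.0, 0.0]]), ["rain", "flood"])
expert_b = (np.array([[0.0, 1.0], [0.0, 0.0]]), ["flood", "damage"])
combined = combine_fcms([expert_a, expert_b], weights=[1.0, 0.5],
                        concepts=["rain", "flood", "damage"])
```

Because the scheme is a weighted sum, there is no restriction on how many experts contribute or on which concepts each one uses, and individual disagreements wash out as the expert sample grows.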

2 | Adaptive Inference (monograph, Verac Inc.) — Kosko (1985)

Citation Context: …onnection matrices are used for inferencing. By additively combining augmented connection matrices, any number of FCMs can be naturally combined into a single knowledge network. The credibility w_i in [0, 1] of the i-th expert is included in this learning process by multiplying the i-th expert's augmented FCM connection matrix by w_i. Combining connection matrices is a simple type of adaptive inference. In…

2 | Fuzzy associative memories. In: Fuzzy Expert Systems — Kosko (1987)

Citation Context: …valued FCMs, are easier to get from experts. Simple FCMs are also usually more reliable, because experts are more likely to agree on causal signs than on magnitudes. The FCM matrix combination scheme [4] described below allows simple signed FCMs to be combined into a nonsimple FCM that naturally represents causal magnitudes as the expert sample size increases. FCM inference proceeds by nonlinear spre…

1 | Vision as causal activation and association — Kosko, Limm (1985)

Citation Context: …ess reliable knowledge bases. A directed graph is the minimal knowledge representation structure that overcomes the difficulties of search trees. In general a fuzzy cognitive map (FCM) (see [1–5] and [6]) is a fuzzy signed digraph with feedback. An FCM is the feedback generalization of a search tree. An FCM graphically represents uncertain causal reasoning. Its matrix representation allows causal inf…

1 | South Africa is changing — Williams (1986)

Citation Context: …es of the signals S_j(y_j) and S_i(x_i) as they flow back and forth over the pathway m_ij. For instance, the simple additive model (2) must be extended to the model

ẋ_i = −x_i + Σ_j C_j(x_j) e_ji + Σ_j S_j(y_j) m_ji + I_i   (8)

The globally stable form of the learning law (3) includes a Hebbian product as well as a differential Hebbian product:

ṁ_ij = −m_ij + S_i(x_i) S_j(y_j) + Ṡ_i(x_i) Ṡ_j(y_j)   (9)

stated in two-field or hete…

1 | A System of Logic — Mill (1843)

Citation Context: …an FCM in sign and magnitude when applied to data generated from that FCM? A good answer is: those laws that measure changes in the environmental parameters or, in the terminology of John Stuart Mill [9], that measure concomitant variation. The "causes" of a phenomenon's behavior are the variables of which that phenomenon's behavior is a function: B = f(v_1, v_2, …). If changes in a variable quantit…

1 | Global stability in neural networks — Kosko (1987)

Citation Context: …But now to prove global stability we must make a crucial assumption on signal accelerations. They must approximate signal velocities:

S̈_i(x_i) ≈ Ṡ_i(x_i) and S̈_j(x_j) ≈ Ṡ_j(x_j) for all i, j   (13)

(More generally, signal velocities and accelerations must agree, or tend to agree, in sign.) If (13) holds, then L̇ ≤ 0. And if all a_i > 0 and S_i′ > 0, then L̇ = 0 iff ẋ_i = ẏ_j = ṁ_ij = 0 for all i an…