## Recurrent Multilayer Perceptrons for Identification and Control: The Road to Applications (1995)

Citations: 21 (3 self)

### BibTeX

@MISC{Tutschku95recurrentmultilayer,
  author = {K. Tutschku},
  title = {Recurrent Multilayer Perceptrons for Identification and Control: The Road to Applications},
  year = {1995}
}

### Abstract

This study investigates the properties of artificial recurrent neural networks. Particular attention is paid to the question of how these nets can be applied to the identification and control of non-linear dynamic processes. Since such processes can only insufficiently be modelled by conventional methods, different approaches are required. Neural networks are considered useful for this purpose due to their ability to approximate a wide class of continuous functions. Among the numerous network structures, the recurrent multilayer perceptron (RMLP) architecture is particularly promising from an application point of view. This architecture has the well-known properties of multilayer perceptrons and, moreover, the ability to incorporate temporal behaviour. Departing from the original process description, the applicability of RMLPs is investigated and different learning algorithms for this network class are outlined. Furthermore, besides the conventional alg...

### Citations

1783 |
Introduction to the theory of neural computation
- Hertz, Krogh, et al.
- 1991
Citation Context ...number of subnets is L = 2 + number of hidden subnets. The layer numbering starts with 0 and ends at L-1. b) Node Structure: The node type of RMLP nets is based on the usual McCulloch and Pitts model, cf. [HKP91]. In detail, the network input $net_{i,j}$ to node j in layer i is defined as: $net_{i,j}(n) = \sum_{k=1}^{N_{i-1}} w^{f,i}_{k,j}\, y_{i-1,k}(n) + \sum_{k=1}^{N_i} w^{r,i}_{k,j}\, y_{i,k}(n-1)$, where $y_{i,j}$ is the activation of node j in layer i, $w^{f,i}_{k,j}$ are the feed-forward weights and $w^{r,i}_{k,j}$ the recurrent weights... |
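The net-input formula in the context above can be sketched in code. A minimal NumPy sketch; the function name `rmlp_layer` and the tanh nonlinearity are illustrative choices, not taken from the paper:

```python
import numpy as np

def rmlp_layer(y_prev, y_same_old, Wf, Wr):
    """Activations of RMLP layer i at time n (illustrative sketch).

    Net input per the citation context:
      net_{i,j}(n) = sum_k Wf[k,j] * y_{i-1,k}(n) + sum_k Wr[k,j] * y_{i,k}(n-1)

    y_prev:     activations of layer i-1 at time n,  shape (N_{i-1},)
    y_same_old: activations of layer i at time n-1,  shape (N_i,)
    Wf: feed-forward weights, shape (N_{i-1}, N_i)
    Wr: recurrent weights,    shape (N_i, N_i)
    """
    net = Wf.T @ y_prev + Wr.T @ y_same_old
    # Smooth sigmoidal node in the McCulloch-Pitts spirit (tanh is our choice here)
    return np.tanh(net)
```

With zero recurrent weights the layer reduces to an ordinary feed-forward MLP layer, which is the sense in which RMLPs generalize multilayer perceptrons.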

1544 | Finding Structure in Time - Elman - 1990 |

1232 |
Multilayer feedforward networks are universal approximators
- Hornik, Stinchcombe, et al.
- 1989
Citation Context ...ral network architecture is the ability to learn functions arbitrarily close. Hornik, Stinchcombe and White have shown that the well-known Stone-Weierstrass theorem can be applied to neural networks, [HSW89]. They stated that standard feed-forward multilayer networks with at least one hidden layer can approximate any Borel measurable function to any desired degree of accuracy, provided ... |
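As an informal illustration of this approximation property (the result in [HSW89] is an existence theorem, not a training recipe), a one-hidden-layer tanh network fitted by plain batch gradient descent visibly reduces its error on a smooth target. All sizes, seeds and step sizes below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(-1.0, 1.0, 64).reshape(-1, 1)   # inputs
t = np.sin(np.pi * x)                            # smooth target to approximate

H = 16                                           # hidden units (arbitrary)
W1 = rng.normal(0, 1.0, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 1)); b2 = np.zeros(1)
lr = 0.05

def forward(x):
    h = np.tanh(x @ W1 + b1)                     # one hidden layer
    return h, h @ W2 + b2                        # linear output

_, y0 = forward(x)
err0 = np.mean((y0 - t) ** 2)                    # error at initialization

for _ in range(2000):                            # plain batch gradient descent
    h, y = forward(x)
    g = 2 * (y - t) / len(x)                     # d(MSE)/dy
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = g @ W2.T * (1 - h ** 2)                 # backprop through tanh
    gW1 = x.T @ gh; gb1 = gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, y1 = forward(x)
err1 = np.mean((y1 - t) ** 2)                    # error after training
```

The theorem guarantees only that weights achieving a given accuracy exist for sufficiently many hidden units; it says nothing about gradient descent finding them, which is why the training loop above is an illustration rather than a proof.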

457 |
Identification and control of dynamical systems using neural networks
- Narendra, Parthasarathy
- 1990
Citation Context ...algebra and linear ordinary differential equations to describe the processes. Methods for the adaptive identification of unknown process parameters of linear time-invariant processes are well known, [NP90]. However, applying these techniques to real-world applications often fails, since most of the processes reveal a nonlinear and dynamic behaviour. Therefore, different modeling techniques have to b... |

417 | A learning algorithm for continually running fully recurrent neural networks
- Williams, Zipser
- 1989
Citation Context ...length. [figure residue: weights $w_{11}$, $w_{22}$ of a network unfolded over time steps n = 1 ... 4] 4.3 Real-Time Recurrent Learning: A learning method for general recurrent networks without duplicating the units has been proposed by Williams and Zipser, [WZ89]. The algorithm allows updating the weights while the sequence is presented. It is therefore called a real-time method. The real-time ability also constitutes a major advantage of the Real-Time Recurr... |
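A minimal sketch of the RTRL idea for a single fully recurrent tanh layer, using the standard sensitivity recursion associated with Williams and Zipser; the function name, dimensions, and learning rate are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def rtrl_step(W, y, x, target, P, lr=0.1):
    """One Real-Time Recurrent Learning update (illustrative sketch).

    W: weights, shape (K, K+M) for K units and M external inputs
    y: current unit activations, shape (K,)
    x: external input at this step, shape (M,)
    P: sensitivities P[k, i, j] = d y_k / d W[i, j], shape (K, K, K+M)
    Returns (y_new, P_new, W_new); weights are updated while the
    sequence is running, which is what makes the method "real-time".
    """
    K = len(y)
    z = np.concatenate([y, x])            # every unit sees all units + inputs
    y_new = np.tanh(W @ z)
    d = 1 - y_new ** 2                    # tanh derivative at the new state
    # Sensitivity recursion: p_k <- d_k * (delta_{ki} z_j + sum_l W[k,l] p_l)
    P_new = np.einsum('kl,lij->kij', W[:, :K], P)
    P_new[np.arange(K), np.arange(K), :] += z
    P_new = d[:, None, None] * P_new
    # Gradient of the instantaneous error E = 0.5 * ||target - y||^2
    err = target - y_new
    W_new = W + lr * np.einsum('k,kij->ij', err, P_new)
    return y_new, P_new, W_new
```

The price of avoiding unit duplication is the sensitivity tensor P, whose size grows as K^2(K+M); this O(K^4)-per-step cost is the usual objection to RTRL for large networks.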

270 | Statistical properties of MPEG video traffic and their impact on traffic modeling - Rose - 1995 |

269 |
Nonlinear Dynamical Control Systems
- Nijmeijer, van der Schaft
- 1990
Citation Context ...e vector case the system is of multiple-input/multiple-output (MIMO) type. Time-invariant Linear System: A time-invariant linear system is defined in discrete time by the following linear equations, cf. [NvdS90]: $x(n+1) = A x(n) + B u(n)$ (1a), $y(n) = C x(n) + D u(n)$ (1b), or in the continuous case by linear differential equations: $\dot{x}(t) = A x(t) + B u(t)$ (2a), $y(t) = C x(t) + D u(t)$ (2b), where $A$, $B$, $C$ and $D$ are properly dime... |
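Equations (1a)-(1b) can be simulated directly by iterating the state recursion. A small sketch, with `simulate_lti` a hypothetical helper name:

```python
import numpy as np

def simulate_lti(A, B, C, D, u_seq, x0):
    """Simulate the discrete-time LTI system of Eq. (1):
    x(n+1) = A x(n) + B u(n),   y(n) = C x(n) + D u(n).
    Returns the output sequence y(0..N-1) and the final state."""
    x = np.asarray(x0, dtype=float)
    ys = []
    for u in u_seq:
        u = np.atleast_1d(u)
        ys.append(C @ x + D @ u)   # output at time n (Eq. 1b)
        x = A @ x + B @ u          # state update to time n+1 (Eq. 1a)
    return np.array(ys), x
```

For example, the scalar choice A = B = C = [[1]], D = [[0]] is a discrete integrator: with a constant unit input, the output counts 0, 1, 2, 3, ...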

246 | Attractor dynamics and parallelism in a connectionist sequential machine - Jordan |

85 | A note on the power of threshold circuits - Allender - 1989 |

73 |
Neurocontrol of nonlinear dynamical systems with kalman filter trained recurrent networks
- Puskorius, Feldkamp
- 1994
Citation Context ...ecurrent multilayer perceptron (RMLP). This architecture was originally presented in [PSY88] and [FPT90]. The generalization has been proposed by Puskorius and Feldkamp and can be found in detail in [PF94]. A general RMLP consists of a sequence of cascaded subnetworks. Each subnet consists of layers of nodes. All subnets are interconnected by feed-forward links and no recurrent connections between the ... |

65 | Non-linear system identification using neural networks - Chen, Billings, et al. - 1990 |

63 |
Training multilayer perceptrons with the extended kalman algorithm
- Singhal, Wu
- 1989
Citation Context ...has to be trained are well known, one takes advantage of the approximation ability of neural nets and directly applies control theory methods as learning algorithms. For this purpose, Singhal and Wu [SW89] suggested the use of the Extended Kalman Filter algorithm from estimation theory for training the nets. 5.1 State Model of a Neural Network: The process behaviour is characterized by a state mod... |
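A sketch of a single global EKF weight update in the spirit of [SW89], including the small artificial process noise q discussed in [PF91]. The interface, the scalar-output restriction, and the function name are simplifications for illustration, not the paper's formulation:

```python
import numpy as np

def ekf_update(w, P, H, d, y, R=1.0, q=1e-4):
    """One global Extended Kalman Filter weight update (illustrative sketch).

    w: weight-vector estimate, shape (W,)
    P: error covariance of the estimate, shape (W, W)
    H: Jacobian of the network output w.r.t. w, shape (W,) (scalar output)
    d: desired output, y: current network output (both scalars)
    R: measurement-noise variance
    q: artificial process noise (typical range 1e-6 .. 1e-2, cf. [PF91]),
       which helps keep the recursion from getting stuck in local minima.
    """
    a = float(H @ P @ H + R)              # innovation variance
    K = (P @ H) / a                       # Kalman gain, shape (W,)
    w_new = w + K * (d - y)               # correct the weights by the residual
    P_new = P - np.outer(K, H @ P) + q * np.eye(len(w))
    return w_new, P_new
```

For a linear "network" y = w·x the Jacobian is H = x and the recursion reduces to recursive least squares, so the estimate converges to the true weights after a handful of samples; for a real neural net, H comes from backpropagation at the current estimate.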

47 | A multilayered neural network controller
- Psaltis, Sideris, et al.
- 1988
Citation Context ...ork architecture that satisfies the requirements stated in the previous Section is the generalized version of the recurrent multilayer perceptron (RMLP). This architecture was originally presented in [PSY88] and [FPT90]. The generalization has been proposed by Puskorius and Feldkamp and can be found in detail in [PF94]. A general RMLP consists of a sequence of cascaded subnetworks. Each subnet consists o... |

47 |
Decoupled extended Kalman filter training of feedforward layered networks
- Puskorius, Feldkamp
- 1991
Citation Context ...that introduces artificial noise into the Kalman recursion. The elements of Q(n) are in the range of $10^{-6}$ to $10^{-2}$. The artificial noise prevents the process from getting stuck in local minima, cf. [PF91]. 6 Application and Conclusion: Recurrent multilayer perceptrons have proven to be a powerful neural network architecture. They can incorporate temporal behaviour and are able to approximate arbitrary ... |

44 | Folded Petersen cube networks: New competitors for the hypercubes - Öhring, Das - 1996 |

35 | Discrete-time analysis technique and application to usage parameter control modelling in ATM systems - Tran-Gia - 1993 |

33 | Statistical properties of MPEG video traffic and their impact on traffic modeling in ATM systems - Rose - 1995 |

32 | On the power of number-theoretic operations with respect to counting - Hertrampf, Vollmer, et al. - 1995 |

30 | Dimensioning of a peak cell rate monitor algorithm using discrete-time analysis - Hübner - 1993 |

29 | On Balanced vs. Unbalanced Computation Trees - Hertrampf, Wagner - 1994 |

29 |
Optimal filtering algorithms for fast learning in feedforward neural networks
- Shah, Palmieri, et al.
- 1992
Citation Context ...ion To linearize the non-linear model, a Taylor series approximation is used. The non-linear function $h_n(\cdot)$ can be expanded around the current estimate of the parameter vector $\hat{w}(n-1)$, cf. [SPD92]. The observation function becomes: $\tilde{d}(n) = h_n(\hat{w}(n-1), \tilde{u}(n)) + H^T(n)\,(\tilde{w}_0 - \hat{w}(n-1)) + \rho(n) + e(n)$ (34), where $H(n) = \left.\partial h_n(\tilde{w}, \tilde{u}(n)) / \partial \tilde{w}\right|_{\tilde{w} = \hat{w}(n-1)}$ (... |

26 | Incomplete Hypercubes: Embeddings of Tree-Related Networks - Öhring, Das - 1995 |

26 | Steady-State Analysis of the Rate-Based Congestion Control Mechanism for - Ritter - 1995 |

25 | Analysis of polling systems with general input process and finite capacity - Tran-Gia - 1990 |

25 | Growing Context-Sensitive Languages and Automata - Buntrock - 1993 |

24 | Performance analysis of the CRMA protocol in high-speed networks - Dittmann - 1990 |

24 | Performance analysis of a batch service system operating in pull mode - Grob - 1991 |

24 | Classes of counting functions and complexity theoretic operators - Wagner - 1991 |

24 | Discrete-time analysis of the output process of an ATM multiplexer with periodic input - Hübner - 1991 |

24 | Vector language: Simple description of hard instances - Wagner - 1992 |

23 | Application of the discrete transforms in performance modeling and analysis - Tran-Gia - 1989 |

23 | Die Vektor-Sprache: Einfachste Mittel zur kompakten Beschreibung endlicher Objekte - Wagner - 1989 |

23 | A transformation system for chain code picture languages: Properties and algorithms - Gutbrod - 1990 |

23 | Performance analysis of a batch service queue arising out of manufacturing systems modeling - Gold, Tran-Gia - 1990 |

23 | Communication network routing using neural nets – numerical aspects and alternative approaches - Mandel - 1991 |

23 | On complexity classes and algorithmically random languages - Book, Wagner - 1991 |

23 | A graphical user interface for genetic algorithms - Dabs, Schoof - 1995 |

23 | Combining KARL and Configurable Role Limiting Methods for Configuring Elevator Systems - Poeck, Fensel, et al. - 1994 |

22 | Bounded query classes - Wagner - 1989 |

22 | Relations among mod-classes - Hertrampf - 1989 |

22 | Number-of-query hierarchies - Wagner - 1989 |

22 | Approximate performance analysis of the DQDB access protocol - Tran-Gia, Stock - 1989 |

22 | Workshop über Komplexitätstheorie, effiziente Algorithmen und Datenstrukturen - Wagner - 1989 |

22 | On the power of uniform families of constant depth threshold circuits - Allender, Hertrampf - 1990 |

22 | Structure and importance of logspace-MOD-classes - Buntrock, Damm, Hertrampf, Meinel - 1990 |

22 | Complexity and approximation theoretical properties of rational functions which map two intervals into two other ones - Huckenbeck - 1990 |

22 | On random oracle separations - Book - 1990 |

22 | Analysis of a batch service system with two heterogeneous servers - Gold, Bleckert - 1991 |

22 | Structure and performance of neural nets in broadband system admission control - Gropp - 1991 |

22 | On growing context-sensitive languages - Buntrock, Lorys - 1992 |