Results 1–10 of 44
Neural Net Architectures for Temporal Sequence Processing
, 1994
Cited by 120 (0 self)
I present a general taxonomy of neural net architectures for processing time-varying patterns. This taxonomy subsumes many existing architectures in the literature, and points to several promising architectures that have yet to be examined. Any architecture that processes time-varying patterns requires two conceptually distinct components: a short-term memory that holds on to relevant past events and an associator that uses the short-term memory to classify or predict. My taxonomy is based on a characterization of short-term memory models along the dimensions of form, content, and adaptability. Experiments on predicting future values of a financial time series (US dollar-Swiss franc exchange rates) are presented using several alternative memory models. The results of these experiments serve as a baseline against which more sophisticated architectures can be compared. Neural networks have proven to be a promising alternative to traditional techniques for nonlinear temporal prediction t...
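The memory/associator split described in this abstract can be made concrete with the simplest short-term memory in such a taxonomy: a tapped delay line that holds the last few raw values, from which an associator predicts the next one. A minimal sketch (the function name and the delay depth are illustrative, not from the paper):

```python
import numpy as np

def tapped_delay_memory(series, depth):
    """Delay-line short-term memory: row t holds the `depth` most recent
    values, which an associator can then map to a prediction target."""
    series = np.asarray(series, dtype=float)
    X = np.array([series[t - depth:t] for t in range(depth, len(series))])
    y = series[depth:]            # next-value prediction targets
    return X, y

X, y = tapped_delay_memory(np.arange(10.0), depth=3)
# X[0] is [0., 1., 2.] and its target y[0] is 3.0
```

In the taxonomy's terms this memory has fixed form (a buffer), raw content (unweighted past values), and no adaptability; richer memories replace the buffer with decaying or adaptive kernels.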
Diagrammatic Derivation of Gradient Algorithms for Neural Networks
 In Neural Computation
, 1994
Cited by 19 (1 self)
Deriving gradient algorithms for time-dependent neural network structures typically requires numerous chain rule expansions, diligent bookkeeping, and careful manipulation of terms. In this paper, we show how to use the principle of Network Reciprocity to derive such algorithms via a set of simple block diagram manipulation rules. The approach provides a common framework to derive popular algorithms, including backpropagation and backpropagation-through-time, without a single chain rule expansion. Additional examples are provided for a variety of complicated architectures to illustrate both the generality and the simplicity of the approach. 1 Introduction Deriving the appropriate gradient descent algorithm for a new network architecture or system configuration normally involves brute-force derivative calculations. For example, the celebrated backpropagation algorithm for training feedforward neural networks was derived by repeatedly applying chain rule expansions backward through the ne...
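For a toy two-weight network, the brute-force chain-rule derivation that the block-diagram approach is meant to replace looks like this; the hand-derived gradients can be checked against finite differences (all names and values here are illustrative):

```python
import numpy as np

def loss(w1, w2, x, y):
    h = np.tanh(w1 * x)           # hidden unit
    return (w2 * h - y) ** 2      # squared output error

def grads(w1, w2, x, y):
    """Hand-derived chain rule for the scalar net above: exactly the
    bookkeeping that grows tedious for larger time-dependent structures."""
    h = np.tanh(w1 * x)
    e = w2 * h - y
    g2 = 2 * e * h                        # dL/dw2
    g1 = 2 * e * w2 * (1 - h ** 2) * x   # dL/dw1, one chain-rule expansion
    return g1, g2

# central-difference check of the analytic gradients
w1, w2, x, y, eps = 0.7, -1.3, 0.5, 0.2, 1e-6
g1, g2 = grads(w1, w2, x, y)
n1 = (loss(w1 + eps, w2, x, y) - loss(w1 - eps, w2, x, y)) / (2 * eps)
n2 = (loss(w1, w2 + eps, x, y) - loss(w1, w2 - eps, x, y)) / (2 * eps)
```

Even this two-weight case needs one expansion per weight; the paper's point is that diagram manipulation rules produce the same gradients mechanically.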
Time Series Prediction with Multilayer Perceptron, FIR and Elman Neural Networks
 In Proceedings of the World Congress on Neural Networks
, 1996
Cited by 13 (0 self)
Multilayer perceptron (MLP), FIR, and Elman neural networks were compared in four different time series prediction tasks. The time series include load in an electric network, fluctuations in a far-infrared laser, a numerically generated series, and the behaviour of sunspots. The FIR neural network was trained with the temporal backpropagation learning algorithm. Results show that the efficiency of the learning algorithm is a more important factor than the network model used. The Elman network models the electric network load series better than the MLP network, and in the other prediction tasks it performs similarly to the MLP network. The FIR network performs adequately, but not as well as the Elman network. 1. Introduction In this paper we study neural network architectures that are capable of learning temporal features in data in time series prediction. The feedforward multilayer perceptron (MLP) network is used frequently in time series prediction. The MLP network, however, has the major l...
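The Elman architecture compared in this abstract differs from a plain MLP by keeping a copy of the previous hidden layer as context. A single forward step can be sketched as follows (dimensions, weight scales, and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 1, 4

# randomly initialised weights for a 1-input, 4-hidden-unit Elman net
W_in = rng.normal(scale=0.5, size=(n_hid, n_in))
W_rec = rng.normal(scale=0.5, size=(n_hid, n_hid))   # context connections
W_out = rng.normal(scale=0.5, size=(1, n_hid))

def elman_step(x, h_prev):
    """One step: the new hidden state sees both the current input and
    the previous hidden state fed back through the context units."""
    h = np.tanh(W_in @ x + W_rec @ h_prev)
    return W_out @ h, h

h = np.zeros(n_hid)
outputs = []
for x_t in [0.1, 0.4, -0.2]:          # a short input sequence
    y_t, h = elman_step(np.array([x_t]), h)
    outputs.append(float(y_t[0]))
```

Because `h` persists across steps, the prediction at each time depends on the whole input history, not just a fixed window as in the MLP or FIR cases.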
A Unifying View of Some Training Algorithms for Multilayer Perceptrons with FIR Filter Synapses
 Neural Networks for Signal Processing 4
, 1995
Cited by 9 (1 self)
Recent interest has arisen in deriving various neural network architectures for modelling time-dependent signals. A number of algorithms have been published for multilayer perceptrons with synapses described by finite impulse response (FIR) and infinite impulse response (IIR) filters (the latter case is also known as Locally Recurrent Globally Feedforward Networks). The derivations of these algorithms have used different approaches in calculating the gradients, and in this note we present a short but unifying account of how these different algorithms compare for the FIR case, both in derivation and in performance. New algorithms are subsequently presented. Simulations have been performed to benchmark these algorithms. In this note, results are compared for the Mackey-Glass chaotic time series against a number of other methods, including a standard multilayer perceptron and a local approximation method. INTRODUCTION As a means of capturing time-dependent signals in a nonlin...
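In an FIR-synapse MLP, each connection is a short filter rather than a scalar weight, so a single synapse computes a causal convolution over its input history. A minimal sketch (not the paper's notation; the tap values are illustrative):

```python
import numpy as np

def fir_synapse(x, taps):
    """FIR synapse: y[t] = sum_k taps[k] * x[t-k]. The connection has
    memory over the last len(taps) inputs instead of one scalar weight."""
    return np.convolve(x, taps)[:len(x)]   # truncate to causal output

impulse = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
taps = np.array([0.5, 0.3, -0.2])
response = fir_synapse(impulse, taps)
# the impulse response reproduces the tap weights: [0.5, 0.3, -0.2, 0, 0]
```

A scalar-weight synapse is the special case `taps = [w]`; the gradient algorithms the note unifies differ mainly in how the error is propagated back through these filter taps.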
Multiresolution FIR NeuralNetworkBased Learning Algorithm Applied to Network Traffic Prediction
 IEEE Transactions on Systems, Man and Cybernetics Part C: Applications and Review
, 2006
Cited by 8 (5 self)
Abstract—In this paper, a multiresolution finite-impulse-response (FIR) neural-network-based learning algorithm using the maximal overlap discrete wavelet transform (MODWT) is proposed. The multiresolution learning algorithm employs the analysis framework of wavelet theory, which decomposes a signal into wavelet coefficients and scaling coefficients. The translation-invariant property of the MODWT allows alignment of events in a multiresolution analysis with respect to the original time series and therefore preserves the integrity of some transient events. A learning algorithm is also derived for adapting the gain of the activation functions at each level of resolution. The proposed multiresolution FIR neural-network-based learning algorithm is applied to network traffic prediction (real-world aggregate Ethernet traffic data) with comparable results. These results indicate that the generalization ability of the FIR neural network is improved by the proposed multiresolution learning algorithm. Index Terms—Finite-impulse-response (FIR) neural networks, multiresolution learning, network traffic prediction, wavelet transforms, wavelets.
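The translation-invariant property of the MODWT mentioned above can be illustrated with one level of a Haar MODWT: because there is no decimation, the coefficient series has the same length as the input and shifts along with it. A sketch assuming circular boundary handling and the common 1/2-normalised Haar filter pair (not the paper's implementation):

```python
import numpy as np

def modwt_haar_level1(x):
    """One MODWT level with Haar filters and circular boundaries.
    Output length equals input length (no downsampling)."""
    x = np.asarray(x, dtype=float)
    v = 0.5 * (x + np.roll(x, 1))   # scaling (smooth) coefficients
    w = 0.5 * (x - np.roll(x, 1))   # wavelet (detail) coefficients
    return w, v

x = np.array([2.0, 4.0, 1.0, 3.0, 5.0, 2.0, 0.0, 1.0])
w, v = modwt_haar_level1(x)
# circularly shifting the input shifts the coefficients by the same
# amount, so transient events stay aligned with the original series
w_shifted, _ = modwt_haar_level1(np.roll(x, 2))
```

With this filter pair the transform also preserves energy across the two coefficient sets, which is one reason the decomposition is a well-behaved input representation for the FIR network at each resolution level.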
A NeuroWavelet Strategy for Web Traffic Forecasting
Cited by 7 (4 self)
Recently, statistical examinations of World Wide Web (Web) traces have shown evidence that Web traffic arising from file transfers exhibits behaviour consistent with the notion of self-similarity. Essentially, self-similarity indicates that significant burstiness is present on a wide range of time scales. We conjecture that Web traffic exhibits characteristics spanning different time scales and investigate how such traffic can be modeled by nonparametric methods. For this purpose, we present a forecasting strategy based on the wavelet decomposition of the original time series into varying scales of temporal resolution, and apply it to Web traffic data. The wavelet transform provides a sensible decomposition of the data so that the underlying temporal structures of the original time series become more tractable. An extensive set of HTTP logs is converted to a univariate traffic time series on the basis of the average number of bytes transferred over a one-minute period. We f...
Time-series predictions using constrained formulations for neural-network training and cross validation
 In Proc. Int'l Conf. on Intelligent Information Processing, 16th IFIP World Computer Congress
, 2000
Cited by 6 (3 self)
In this paper, we formulate the training of artificial neural networks for time-series prediction as a constrained optimization problem, instead of the traditional formulation as an unconstrained optimization. A constrained formulation is better because violated constraints can provide more guidance during a search than the traditional sum-of-squared errors in an unconstrained formulation. Using a constrained formulation, we propose to add constraints on validation errors in order to monitor the prediction quality during training of an ANN. We then solve the constrained problem using our newly developed constrained simulated annealing (CSA) algorithm. Experimental results on the sunspot and laser time series show that constrained formulations on training and cross-validation, when solved by CSA, lead to better prediction quality and fewer weights than previous designs.
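The idea of turning validation error into a constraint can be sketched with a penalty-form objective optimised by a generic simulated-annealing loop. This is a toy stand-in, not the authors' CSA algorithm; the one-weight linear model, the threshold `tau`, and the penalty weight `lam` are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(w, x_tr, y_tr, x_va, y_va, tau=0.05, lam=10.0):
    """Training error plus a penalty whenever the validation-error
    constraint (valid_err <= tau) is violated, so a breached constraint
    actively steers the search."""
    train_err = np.mean((y_tr - w * x_tr) ** 2)
    valid_err = np.mean((y_va - w * x_va) ** 2)
    return train_err + lam * max(0.0, valid_err - tau)

# toy data: y = 2x + noise, split into training and validation parts
x = rng.normal(size=40)
y = 2.0 * x + 0.1 * rng.normal(size=40)
x_tr, y_tr, x_va, y_va = x[:30], y[:30], x[30:], y[30:]

w, T = 0.0, 1.0
cur = objective(w, x_tr, y_tr, x_va, y_va)
w_best, best = w, cur
for _ in range(2000):
    cand = w + T * rng.normal()               # propose a neighbour
    val = objective(cand, x_tr, y_tr, x_va, y_va)
    if val < cur or rng.random() < np.exp((cur - val) / T):
        w, cur = cand, val                    # accept (always if better)
    if cur < best:
        w_best, best = w, cur                 # remember best weight seen
    T *= 0.995                                # geometric cooling
# w_best typically lands near the true slope 2.0
```

The annealer can accept uphill moves early on, which is what lets it escape regions where the validation constraint is violated instead of merely averaging the violation into a single error sum.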
Neural Network Methods In Analysing And Modelling Time Varying Processes
, 2003
Cited by 5 (0 self)
Helsinki University of Technology, Department of Electrical and Communications Engineering, Laboratory of Computational Engineering. Distribution:
Myoelectric Signal Classification Using a Finite Impulse Response Neural Network
Cited by 3 (0 self)
Recent work by Hudgins [1] has proposed a neural-network-based approach to classifying the myoelectric signal (MES) elicited at the onset of movement of the upper limb. A standard feedforward artificial network was trained (using the backpropagation algorithm) to discriminate amongst four classes of upper-limb movements from the MES, acquired from the biceps and triceps muscles. The approach has demonstrated a powerful means of classifying limb function intent from the MES during natural muscular contraction, but the static nature of the network architecture fails to fully characterize the dynamic structure inherent in the MES. It has been demonstrated [2] that a finite-impulse-response (FIR) network has the ability to incorporate the temporal structure of a signal, representing the relationships between events in time and providing translation invariance of the relevant feature set. The application of this network architecture to limb function discrimination from the MES is described here.
Foetal ECG Recovery Using Dynamic Neural Networks