Results 1–10 of 414,840
Bidirectional recurrent neural networks
IEEE Transactions on Signal Processing, 1997
Cited by 85 (2 self)
Abstract: In the first part of this paper, a regular recurrent neural network (RNN) is extended to a bidirectional recurrent neural network (BRNN). The BRNN can be trained without the limitation of using input information just up to a preset future frame. This is accomplished by training it simultaneously ...
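The bidirectional scheme this abstract describes can be sketched as two ordinary RNNs run over the same sequence in opposite directions, with the two states combined at each step so every output can draw on the full input, not just the past. A minimal plain-Python illustration with tanh units; all dimensions and weight values here are made up for the example:

```python
import math, random

random.seed(0)

def matvec(W, v):
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def rnn_pass(xs, Wx, Wh, d_h):
    """Simple tanh RNN: h_t = tanh(Wx x_t + Wh h_{t-1}); returns all hidden states."""
    h = [0.0] * d_h
    states = []
    for x in xs:
        h = [math.tanh(a + b) for a, b in zip(matvec(Wx, x), matvec(Wh, h))]
        states.append(h)
    return states

def brnn_states(xs, fwd, bwd, d_h):
    """One RNN reads the sequence forward, another backward; the state at step t
    concatenates the forward state with the backward state, so it summarizes
    both the inputs before t and the inputs after t."""
    hf = rnn_pass(xs, *fwd, d_h)
    hb = rnn_pass(xs[::-1], *bwd, d_h)[::-1]   # re-align backward states with time
    return [f + b for f, b in zip(hf, hb)]     # list concat = state concat

d_in, d_h, T = 3, 4, 5
rand_mat = lambda r, c: [[random.uniform(-0.5, 0.5) for _ in range(c)] for _ in range(r)]
fwd = (rand_mat(d_h, d_in), rand_mat(d_h, d_h))
bwd = (rand_mat(d_h, d_in), rand_mat(d_h, d_h))
xs = [[random.uniform(-1, 1) for _ in range(d_in)] for _ in range(T)]

states = brnn_states(xs, fwd, bwd, d_h)
print(len(states), len(states[0]))   # → 5 8  (T steps, each of size 2 * d_h)
```

Training would fit both weight sets jointly; the point of the sketch is only that each combined state depends on the entire sequence rather than on a prefix ending at a preset frame.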
Stability of recurrent neural networks
2006
Cited by 2 (1 self)
Abstract: This article studies the existence, uniqueness, and bounds of almost periodic solutions for recurrent neural networks with fractional distributed delays. We show that the zero solution is asymptotically and globally exponentially stable by using a generalized Halanay inequality and a Lyapunov functional method ...
RECURRENT NEURAL NETWORK
Abstract: An RNN can in principle map from the entire history of previous inputs to each output. The idea is that the recurrent connections allow a memory of previous inputs to persist in the network's internal state, and thereby influence the network's output. ...
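The memory effect this excerpt describes, where the internal state carries a compressed summary of the entire input history, can be shown with a one-unit toy recurrence (the weight values here are arbitrary illustration values, not from any source):

```python
import math

def step(h, x, w_x=0.8, w_h=0.5):
    """One recurrence step: fold the current input into the running state."""
    return math.tanh(w_x * x + w_h * h)

# Run the recurrence over a short input sequence.
h = 0.0
for x in [1.0, -0.5, 0.25]:
    h = step(h, x)

# Same sequence except for the *first* input: the final state still differs,
# i.e. early inputs persist in the state and influence later outputs.
h2 = 0.0
for x in [0.0, -0.5, 0.25]:
    h2 = step(h2, x)

print(h != h2)   # → True
```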
Multidimensional recurrent neural networks
In Proceedings of the 2007 International Conference on Artificial Neural Networks, 2007
Cited by 15 (6 self)
Abstract: Recurrent neural networks (RNNs) have proved effective at one-dimensional sequence learning tasks, such as speech and online handwriting recognition. Some of the properties that make RNNs suitable for such tasks, for example robustness to input warping and the ability to access contextual ...
Dynamic recurrent neural networks
1990
Cited by 34 (3 self)
Abstract: We survey learning algorithms for recurrent neural networks with hidden units and attempt to put the various techniques into a common framework. We discuss fixpoint learning algorithms, namely recurrent backpropagation and deterministic Boltzmann Machines, and non-fixpoint algorithms, namely backpropagation ...
Learning with Recurrent Neural Networks
Lecture Notes in Control and Information Sciences 254, 1999
Cited by 25 (16 self)
Abstract: This thesis examines so-called folding neural networks as a mechanism for machine learning. Folding networks form a generalization of partial recurrent neural networks such that they are able to deal with tree-structured inputs instead of simple linear lists. In particular, they can handle classification ...
Recurrent Neural Networks For . . .
2000
Abstract: Recurrent neural networks have been used for time-series prediction with good results. In this dissertation we compare recurrent neural networks with time-delayed feed-forward networks, feed-forward networks, and linear regression models to see which architecture can make the most accurate predictions ...
An Evolutionary Algorithm that Constructs Recurrent Neural Networks
IEEE Transactions on Neural Networks
Cited by 261 (14 self)
Abstract: Standard methods for inducing both the structure and weight values of recurrent neural networks fit an assumed class of architectures to every task. This simplification is necessary because the interactions between network structure and function are not well understood. Evolutionary computation, which ...
A Learning Algorithm for Continually Running Fully Recurrent Neural Networks
1989
Cited by 529 (4 self)
Abstract: The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal supervised learning tasks. These algorithms have: (1) the advantage that they do not require a precisely ...
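The gradient-following algorithm this abstract derives is what is now known as real-time recurrent learning (RTRL): sensitivities of every unit output with respect to every weight are carried forward alongside the network state, giving exact gradients without storing the past. A minimal sketch under assumed details (tanh units, a single concatenated weight matrix, a squared error on one unit at the final step; the network sizes and seed are arbitrary):

```python
import math, random

random.seed(1)
n, m, T = 3, 2, 4                       # units, external inputs, time steps
W = [[random.uniform(-0.5, 0.5) for _ in range(m + n)] for _ in range(n)]
xs = [[random.uniform(-1, 1) for _ in range(m)] for _ in range(T)]
target = 0.3

def forward(W):
    """Fully recurrent net: every unit sees all inputs and all unit outputs."""
    y = [0.0] * n
    for x in xs:
        z = x + y                        # concatenated input to every unit
        y = [math.tanh(sum(w * v for w, v in zip(W[k], z))) for k in range(n)]
    return y

# RTRL: carry p[k][i][j] = d y_k / d W[i][j] forward alongside the state.
y = [0.0] * n
p = [[[0.0] * (m + n) for _ in range(n)] for _ in range(n)]
for x in xs:
    z = x + y
    y_new = [math.tanh(sum(w * v for w, v in zip(W[k], z))) for k in range(n)]
    p_new = [[[0.0] * (m + n) for _ in range(n)] for _ in range(n)]
    for k in range(n):
        fprime = 1.0 - y_new[k] ** 2     # tanh'(s) in terms of the output
        for i in range(n):
            for j in range(m + n):
                recur = sum(W[k][m + l] * p[l][i][j] for l in range(n))
                p_new[k][i][j] = fprime * (recur + (z[j] if k == i else 0.0))
    y, p = y_new, p_new

# Gradient of E = 0.5 * (y_0(T) - target)^2 with respect to each weight:
grad = [[(y[0] - target) * p[0][i][j] for j in range(m + n)] for i in range(n)]

# Check one entry against a central finite difference (the "exact form" claim):
eps = 1e-6
W[1][2] += eps
e_plus = 0.5 * (forward(W)[0] - target) ** 2
W[1][2] -= 2 * eps
e_minus = 0.5 * (forward(W)[0] - target) ** 2
W[1][2] += eps
print(abs(grad[1][2] - (e_plus - e_minus) / (2 * eps)) < 1e-6)   # → True
```

The per-step cost of carrying all n·n·(m+n) sensitivities is what makes RTRL exact but expensive; the practical variants the abstract alludes to trade some of that cost away.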
Interpretation Of Recurrent Neural Networks
1997
Abstract: This paper addresses techniques for interpretation and characterization of trained recurrent nets for time series problems. In particular, we focus on assessment of effective memory and suggest an operational definition of memory. Further, we discuss the evaluation of learning curves. Various numerical experiments on time series prediction problems are used to illustrate the potential of the suggested methods.