Results

**1 - 3** of **3**

### Predicting Sequential Data with LSTMs Augmented with Strictly 2-Piecewise Input Vectors

2016


**Abstract.** Recurrent neural networks such as Long Short-Term Memory (LSTM) are often used to learn from various kinds of time-series data, especially those that involve long-distance dependencies. We introduce a vector representation for the Strictly 2-Piecewise (SP-2) formal languages, which encode certain kinds of long-distance dependencies using subsequences. These vectors are added to the LSTM architecture as an additional input. Through experiments with the problems in the SPiCe dataset …
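The SP-2 representation described above tracks which length-2 subsequences have appeared in the input so far. The abstract does not give the exact encoding, but a minimal sketch of one plausible version is a binary indicator vector over all ordered symbol pairs, which could then be concatenated with the ordinary per-timestep input to the LSTM (the function name and encoding details here are assumptions, not the paper's definition):

```python
from itertools import product

def sp2_vector(prefix, alphabet):
    """Hypothetical SP-2 feature vector: one entry per ordered pair
    (x, y) of alphabet symbols, set to 1 iff x occurs strictly before
    y somewhere in the prefix (i.e. xy is a 2-subsequence)."""
    pairs = list(product(alphabet, repeat=2))
    seen = set()       # 2-subsequences observed so far
    observed = set()   # individual symbols observed so far
    for sym in prefix:
        for prev in observed:
            seen.add((prev, sym))
        observed.add(sym)
    return [1 if p in seen else 0 for p in pairs]

# over alphabet "ab", the pair order is (a,a), (a,b), (b,a), (b,b)
print(sp2_vector("ab", "ab"))  # → [0, 1, 0, 0]
```

Because the vector depends only on the set of 2-subsequences seen so far, it can be updated incrementally at each timestep before being fed to the network alongside the current symbol.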

### lif.univ-mrs.fr


**Abstract.** We define two proper subclasses of subsequential functions based on the concept of Strict Locality (McNaughton and Papert, 1971; Rogers and Pullum, 2011; Rogers et al., 2013) for formal languages. They are called …
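The abstract is cut off before naming the two subclasses, so the following is only an illustration of the general idea of a strictly local string *function*: the output written at each position depends on a bounded window of the input rather than on unbounded context. The helper name, the window size of 2, and the toy rewrite rule are all assumptions for the sketch, not definitions from the paper:

```python
def apply_local2(rules, string):
    """Apply a string function whose output at each position depends
    only on the current input symbol and the one immediately before it
    ('#' marks the left word boundary). Symbols with no matching rule
    are copied unchanged."""
    out = []
    prev = "#"
    for sym in string:
        out.append(rules.get((prev, sym), sym))
        prev = sym
    return "".join(out)

# toy local rewrite: 'n' becomes 'm' immediately after 'a'
print(apply_local2({("a", "n"): "m"}, "anna"))  # → "amna"
```

The key property illustrated is that the rewrite decision never consults anything outside the two-symbol window, which is what makes such functions a strict subclass of the subsequential functions.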