Resonance and the Perception of Musical Meter
 CONNECTION SCIENCE
, 1994
"... Many connectionist approaches to musical expectancy and music composition let the question of "What next?" overshadow the equally important question of "When next?". One cannot escape the latter question, one of temporal structure, when considering the perception of musical meter ..."
Abstract

Cited by 115 (5 self)
Many connectionist approaches to musical expectancy and music composition let the question of "What next?" overshadow the equally important question of "When next?". One cannot escape the latter question, one of temporal structure, when considering the perception of musical meter. We view the perception of metrical structure as a dynamic process where the temporal organization of external musical events synchronizes, or entrains, a listener's internal processing mechanisms. This article introduces a novel connectionist unit, based upon a mathematical model of entrainment, capable of phase- and frequency-locking to periodic components of incoming rhythmic patterns. Networks of these units can self-organize temporally structured responses to rhythmic patterns. The resulting network behavior embodies the perception of metrical structure. The article concludes with a discussion of the implications of our approach for theories of metrical structure and musical expectancy.
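The entrainment idea in the abstract above can be sketched as an adaptive oscillator that nudges its expected beat time (phase) and its period toward observed onsets. This is an illustrative coupling rule, not the paper's actual unit model; the learning rates and the full phase reset are assumptions.

```python
def entrain(onsets, period=1.0, eta_phase=1.0, eta_period=0.2):
    """Toy entraining unit: at each input onset it corrects its
    expected beat time (phase) and its period toward the input.
    eta_phase=1.0 amounts to a full phase reset; both rates are
    illustrative, not taken from the paper."""
    phase = period  # time of the first expected beat
    history = []
    for t in onsets:
        err = t - phase                    # timing error at this onset
        period += eta_period * err         # frequency (period) correction
        phase += period + eta_phase * err  # phase correction
        history.append(period)
    return history

# Regular onsets every 0.6 time units: starting from period 1.0,
# the unit frequency-locks to the 0.6 periodicity of the input.
onsets = [0.6 * k for k in range(1, 40)]
periods = entrain(onsets)
```

With a periodic input the period estimate converges geometrically to the input's inter-onset interval, which is the frequency-locking behaviour the abstract describes.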
DRAMA, a Connectionist Architecture for Control and Learning in Autonomous Robots
 Adaptive Behavior
, 1999
"... this paper gives ..."
Learning to Forget: Continual Prediction with LSTM
 NEURAL COMPUTATION
, 1999
"... Long ShortTerm Memory (LSTM, Hochreiter & Schmidhuber, 1997) can solve numerous tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of LSTM networks processing continual input streams that are not a priori segmented into subsequenc ..."
Abstract

Cited by 59 (25 self)
Long Short-Term Memory (LSTM; Hochreiter & Schmidhuber, 1997) can solve numerous tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of LSTM networks processing continual input streams that are not a priori segmented into subsequences with explicitly marked ends at which the network's internal state could be reset. Without resets, the state may grow indefinitely and eventually cause the network to break down. Our remedy is a novel, adaptive "forget gate" that enables an LSTM cell to learn to reset itself at appropriate times, thus releasing internal resources. We review illustrative benchmark problems on which standard LSTM outperforms other RNN algorithms. All algorithms (including LSTM) fail to solve continual versions of these problems. LSTM with forget gates, however, easily solves them in an elegant way.
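The failure mode and the remedy described above can be illustrated with a scalar toy cell. The weights and the single shared gate nonlinearity are assumptions, not the paper's equations: on an unsegmented stream, the state of a cell without a forget gate grows without bound, while a forget-gated cell settles.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(c, x, use_forget_gate):
    """One scalar LSTM cell-state update. Toy weights: every gate
    just squashes the raw input (illustrative only)."""
    i = sigmoid(x)                               # input gate
    g = math.tanh(x)                             # candidate value
    f = sigmoid(x) if use_forget_gate else 1.0   # forget gate (or none)
    return f * c + i * g

# A long continual stream with no segment boundaries and no resets.
c_plain = c_forget = 0.0
for x in [0.5] * 500:
    c_plain = lstm_step(c_plain, x, use_forget_gate=False)
    c_forget = lstm_step(c_forget, x, use_forget_gate=True)
# c_plain drifts upward step after step; c_forget approaches a
# fixed point because the forget gate keeps releasing old state.
```

The fixed point exists because the forget gate multiplies the old state by a factor below one at every step, exactly the "learning to reset" effect the abstract claims.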
Hybrid Neural Systems
, 2000
"... This chapter provides an introduction to the field of hybrid neural systems. Hybrid neural systems are computational systems which are based mainly on artificial neural networks but also allow a symbolic interpretation, or interaction with symbolic components. In this overview, we will describe rece ..."
Abstract

Cited by 49 (10 self)
This chapter provides an introduction to the field of hybrid neural systems. Hybrid neural systems are computational systems which are based mainly on artificial neural networks but also allow a symbolic interpretation, or interaction with symbolic components. In this overview, we will describe recent results of hybrid neural systems. We will give a brief overview of the main methods used, outline the work that is presented here, and provide additional references. We will also highlight some important general issues and trends.
On the Analysis of Pattern Sequences by Self-Organizing Maps
, 1994
"... This thesis is organized in three parts. In the first part, the SelfOrganizing Map algorithm is introduced. The discussion focuses on the analysis of the SelfOrganizing Map algorithm. It is shown that the nonlinear nature of the algorithm makes it difficult to analyze the algorithm except in some ..."
Abstract

Cited by 31 (0 self)
This thesis is organized in three parts. In the first part, the Self-Organizing Map algorithm is introduced. The discussion focuses on the analysis of the Self-Organizing Map algorithm. It is shown that the nonlinear nature of the algorithm makes it difficult to analyze the algorithm except in some trivial cases. In the second part the Self-Organizing Map algorithm is applied to several pattern sequence analysis tasks. The first application is a voice quality analysis system. It is shown that the Self-Organizing Map algorithm can be applied to voice analysis by providing the visualization of certain deviations. The key point in the applicability of the Self-Organizing Map algorithm is the topological nature of the mapping; similar voice samples are mapped to nearby locations in the map. The second application is a speech recognition system. Through several experiments it is demonstrated that by collecting some time-dependent features and using them in conjunction with the basic Self-Organ...
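The topological property the thesis relies on, similar inputs mapping to nearby units, can be demonstrated with a minimal one-dimensional SOM on scalar data. The map size, rates, and decay schedule below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, n_units=10, epochs=40):
    """Minimal 1-D Self-Organizing Map on scalar data. Each sample
    updates the best-matching unit and, more weakly, its map
    neighbours; the shrinking neighbourhood orders the codebook."""
    w = rng.uniform(0.0, 1.0, n_units)            # codebook weights
    idx = np.arange(n_units)
    for e in range(epochs):
        lr = 0.5 * (1.0 - e / epochs) + 0.02      # decaying learning rate
        sigma = 3.0 * (1.0 - e / epochs) + 0.3    # shrinking neighbourhood
        for x in data:
            bmu = int(np.argmin(np.abs(w - x)))   # best-matching unit
            h = np.exp(-((idx - bmu) ** 2) / (2 * sigma ** 2))
            w += lr * h * (x - w)                 # neighbourhood update
    return w

data = rng.uniform(0.0, 1.0, 200)
w = train_som(data)
# After training, unit index and codebook value are (nearly)
# monotonically related: neighbouring units code similar inputs.
```

The neighbourhood function is what produces the ordering; the winner-take-all rule alone would give ordinary vector quantization with no topology.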
Dynamical Recurrent Neural Networks: Towards Environmental Time Series Prediction
, 1995
"... Dynamical Recurrent Neural Networks (DRNN) (Aussem 1994) are a class of fully recurrent networks obtained by modeling synapses as autoregressive filters. By virtue of their internal dynamic, these networks approximate the underlying law governing the time series by a system of nonlinear difference e ..."
Abstract

Cited by 27 (8 self)
Dynamical Recurrent Neural Networks (DRNN) (Aussem 1994) are a class of fully recurrent networks obtained by modeling synapses as autoregressive filters. By virtue of their internal dynamics, these networks approximate the underlying law governing the time series by a system of nonlinear difference equations of internal variables. They therefore provide history-sensitive forecasts without having to be explicitly fed with external memory. The model is trained by a local and recursive error propagation algorithm called temporal recurrent backpropagation. The efficiency of the procedure benefits from the exponential decay of the gradient terms backpropagated through the adjoint network. We assess the predictive ability of the DRNN model with meteorological and astronomical time series recorded around the candidate observation sites for the future VLT telescope. The hope is that reliable environmental forecasts provided with the model will allow the modern telescopes to be preset, a few hou...
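The core idea, synapses as autoregressive filters that give the unit internal memory, can be sketched with a single synapse and one AR coefficient. The coefficient and the placement of the output nonlinearity are illustrative assumptions, not the paper's model.

```python
import math

def ar_synapse(x, a=0.5, w=1.0):
    """Synapse modelled as a first-order autoregressive filter:
    its state feeds back on itself, so the unit's output depends
    on input history without an external delay line."""
    s, out = 0.0, []
    for x_t in x:
        s = a * s + w * x_t        # internal (recurrent) filter state
        out.append(math.tanh(s))   # unit nonlinearity
    return out

# An impulse at t = 0 leaves a decaying trace in the response:
# the synapse itself carries the history.
y = ar_synapse([1.0] + [0.0] * 9)
```

This is what the abstract means by history-sensitive forecasts without external memory: the delay structure lives inside the synapse, not in a tapped input window.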
Unsupervised Neural Network Learning Procedures . . .
, 1996
"... In this article, we review unsupervised neural network learning procedures which can be applied to the task of preprocessing raw data to extract useful features for subsequent classification. The learning algorithms reviewed here are grouped into three sections: informationpreserving methods, densi ..."
Abstract

Cited by 25 (1 self)
In this article, we review unsupervised neural network learning procedures which can be applied to the task of preprocessing raw data to extract useful features for subsequent classification. The learning algorithms reviewed here are grouped into three sections: information-preserving methods, density estimation methods, and feature extraction methods. Each of these major sections concludes with a discussion of successful applications of the methods to real-world problems.
A survey of hybrid ANN/HMM models for automatic speech recognition
 Neurocomputing
, 2001
"... In spite of the advances accomplished throughout the last decades, automatic speech recognition (ASR) is still a challenging and di$cult task. In particular, recognition systems based on hidden Markov models (HMMs) are e!ective under many circumstances, but do su!er from some major limitations that ..."
Abstract

Cited by 23 (0 self)
In spite of the advances accomplished throughout the last decades, automatic speech recognition (ASR) is still a challenging and difficult task. In particular, recognition systems based on hidden Markov models (HMMs) are effective under many circumstances, but do suffer from some major limitations that limit applicability of ASR technology in real-world environments. Attempts were made to overcome these limitations with the adoption of artificial neural networks (ANN) as an alternative paradigm for ASR, but ANN were unsuccessful in dealing with long time sequences of speech signals. Between the end of the 1980s and the beginning of the 1990s, some researchers began exploring a new research area, combining HMMs and ANNs within a single, hybrid architecture. The goal in hybrid systems for ASR is to take advantage of the properties of both HMMs and ANNs, improving flexibility and recognition performance. A variety of different architectures and novel training algorithms have been proposed in the literature. This paper reviews a number of significant hybrid models for ASR, putting together approaches and techniques from a highly specialized and non-homogeneous literature. Efforts concentrate on describing and referencing architectures and algorithms, their ...
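One widely used hybrid scheme of the kind this survey covers replaces the HMM's emission densities with network outputs: the ANN's frame posteriors are divided by the state priors to obtain scaled likelihoods. The numbers below are hypothetical.

```python
import numpy as np

def scaled_likelihoods(posteriors, priors):
    """Hybrid ANN/HMM emission scores: P(state | frame) from the
    network, divided by P(state), is proportional to the class-
    conditional likelihood P(frame | state) the HMM decoder needs
    (by Bayes' rule, up to the state-independent factor P(frame))."""
    return posteriors / priors

# Hypothetical 3-state example: one frame of ANN posteriors and
# state priors estimated from training alignments.
posteriors = np.array([0.7, 0.2, 0.1])
priors = np.array([0.5, 0.3, 0.2])
scores = scaled_likelihoods(posteriors, priors)
```

This division matters because a frequent state would otherwise dominate decoding purely through its prior rather than through acoustic evidence.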
Neural Networks for Time Series Processing
 Neural Network World
, 1996
"... This paper provides an overview over the most common neural network types for time series processing, i.e. pattern recognition and forecasting in spatiotemporal patterns. Emphasis is put on the relationships between neural network models and more classical approaches to time series processing, in p ..."
Abstract

Cited by 23 (0 self)
This paper provides an overview of the most common neural network types for time series processing, i.e. pattern recognition and forecasting in spatio-temporal patterns. Emphasis is put on the relationships between neural network models and more classical approaches to time series processing, in particular forecasting. The paper begins with an introduction to the basics of time series processing, and discusses feedforward as well as recurrent neural networks with respect to their ability to model nonlinear dependencies in spatio-temporal patterns.
Recurrent SOM with Local Linear Models in Time Series Prediction
 In 6th European Symposium on Artificial Neural Networks
, 1998
"... Recurrent SelfOrganizing Map (RSOM) is studied in three different time series prediction cases. RSOM is used to cluster the series into local data sets, for which corresponding local linear models are estimated. RSOM includes recurrent difference vector in each unit which allows storing context fro ..."
Abstract

Cited by 23 (0 self)
The Recurrent Self-Organizing Map (RSOM) is studied in three different time series prediction cases. RSOM is used to cluster the series into local data sets, for which corresponding local linear models are estimated. RSOM includes a recurrent difference vector in each unit, which allows it to store context from past input vectors. A multilayer perceptron (MLP) network and an autoregressive (AR) model are used to compare the prediction results. In the studied cases RSOM shows promising results.
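The recurrent difference vector mentioned above can be sketched as a leaky mismatch trace kept by each unit; the winning unit is then chosen from this trace rather than from the current sample alone. The codebook, leak rate, and data below are illustrative assumptions.

```python
import numpy as np

def rsom_bmu(x_seq, codebook, alpha=0.5):
    """RSOM response: each unit keeps a leaky difference vector
    mixing the current mismatch with past mismatches, so the
    winning unit reflects temporal context, not just the last
    sample. alpha is the leak rate (illustrative value)."""
    y = np.zeros(len(codebook))
    for x in x_seq:
        y = (1.0 - alpha) * y + alpha * (x - codebook)
    return int(np.argmin(np.abs(y)))

# A scalar sequence hovering around 1.0 drives the difference
# trace of the unit with codebook value 1.0 toward zero, so that
# unit wins; its cluster would then get its own local linear model.
codebook = np.array([0.0, 1.0])
bmu = rsom_bmu([0.9, 1.1, 1.05], codebook)
```

Setting alpha to 1.0 recovers an ordinary SOM response; smaller values lengthen the temporal context that influences the winner.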