Results 1 - 10 of 21

An experimental unification of reservoir computing methods

by D. Verstraeten, B. Schrauwen, M. D'Haene, D. Stroobandt, 2007
"... Three different uses of a recurrent neural network (RNN) as a reservoir that is not trained but instead read out by a simple external classification layer have been described in the literature: Liquid State Machines (LSMs), Echo State Networks (ESNs) and the Backpropagation Decorrelation (BPDC) lea ..."
Abstract - Cited by 70 (10 self)
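The common setup across LSMs, ESNs and BPDC is that the recurrent network stays fixed and only a linear readout is fit. Below is a minimal echo state network sketch of that idea in numpy; the sizes, spectral radius, ridge constant, and the toy delay task are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

# Fixed random reservoir, rescaled so the spectral radius stays below 1
# (a common heuristic for the echo state property).
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))

def run_reservoir(u):
    """Drive the untrained reservoir with inputs u (T x n_in); return states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x.copy())
    return np.array(states)

# Only the linear readout is trained, here by ridge regression.
u = rng.uniform(-1, 1, size=(1000, n_in))
y_target = np.roll(u[:, 0], 3)          # toy task: recall the input 3 steps back
X = run_reservoir(u)
w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y_target)
y_pred = X @ w_out
```

The three method families in this entry differ mainly in the neuron model (spiking vs. analog) and in how the readout is obtained (batch regression vs. online adaptation), which is what makes a unified experimental comparison possible.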

Improving reservoirs using Intrinsic Plasticity

by Benjamin Schrauwen, Marion Wardermann, David Verstraeten, Jochen J. Steil, Dirk Stroobandt, 2007
"... The benefits of using Intrinsic Plasticity (IP), an unsupervised, local, biologically inspired adaptation rule that tunes the probability density of a neuron’s output towards an exponential distribution – thereby realizing an information maximization – have already been demonstrated. In this work, w ..."
Abstract - Cited by 14 (1 self)
... that the rule converges to the expected distributions, even in random recurrent networks. The IP rule is evaluated in a Reservoir Computing setting, a temporal processing technique that uses random, untrained recurrent networks as excitable media, where the network's state is fed to a linear ...
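For a concrete picture of the rule this entry evaluates, the sketch below applies one published form of intrinsic plasticity, the gradient rule Triesch derived for a fermi (logistic) neuron with an exponential target distribution of mean mu. The constants and the per-sample update are assumptions of this sketch, not necessarily the exact variant used in the paper.

```python
import numpy as np

def ip_step(a, b, x, eta=1e-3, mu=0.2):
    """One intrinsic plasticity update of gain a and bias b for input x."""
    y = 1.0 / (1.0 + np.exp(-(a * x + b)))                # fermi activation
    db = eta * (1.0 - (2.0 + 1.0 / mu) * y + y * y / mu)  # bias update
    da = eta / a + db * x                                 # coupled gain update
    return a + da, b + db, y

# Driving one neuron with random input pushes its output histogram toward
# an exponential distribution with mean ~mu (information maximization).
a, b = 1.0, 0.0
for x in np.random.default_rng(1).normal(size=20000):
    a, b, y = ip_step(a, b, x)
```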

Optimizing microcircuits through reward modulated STDP

by Prashant Joshi, Jochen Triesch
"... What is reservoir computing? Figure 1: The initial ”liquid computing ” model of [1] and its subsequent expansion by allowing feedback [2] from trained linear readouts (dashed line).The circuit itself is a generic recurrent circuit, based on biological data (not constructed for any particular task). ..."
Abstract
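Reward-modulated STDP is commonly formalized with a per-synapse eligibility trace that accumulates the usual STDP window and is converted into a weight change only when a global reward signal arrives. The sketch below is that generic textbook form; the time constants, amplitudes, and discrete-time simulation are assumptions, not the authors' microcircuit model.

```python
import numpy as np

TAU_PLUS, TAU_MINUS = 20.0, 20.0   # STDP trace time constants (ms)
A_PLUS, A_MINUS = 0.010, 0.012     # potentiation / depression amplitudes
TAU_E, ETA = 500.0, 0.1            # eligibility decay (ms), learning rate

def rstdp_run(pre_spikes, post_spikes, reward, steps, dt=1.0, w=0.5):
    """One synapse; pre/post_spikes are sets of integer steps, reward maps step -> float."""
    x_pre = x_post = e = 0.0       # pre/post spike traces and eligibility trace
    for t in range(steps):
        x_pre *= np.exp(-dt / TAU_PLUS)
        x_post *= np.exp(-dt / TAU_MINUS)
        e *= np.exp(-dt / TAU_E)
        if t in pre_spikes:        # pre after post: mark for depression
            x_pre += 1.0
            e -= A_MINUS * x_post
        if t in post_spikes:       # post after pre: mark for potentiation
            x_post += 1.0
            e += A_PLUS * x_pre
        w += ETA * reward(t) * e * dt   # reward gates the eligibility trace
    return w

w = rstdp_run({10, 30}, {12, 32}, lambda t: 1.0 if t == 40 else 0.0, steps=100)
```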

Learning Flexible Full Body Kinematics for Humanoid Tool Use

by Matthias Rolf, Jochen J. Steil, Michael Gienger - LAB-RS, 2010
"... We show that inverse kinematics of different tools can be efficiently learned with a single recurrent neural network. Our model exploits all upper body degrees of freedom of the Honda humanoid robot research platform. Both hands are controlled at the same time with parametrized tool geometry. We sho ..."
Abstract - Cited by 2 (2 self)
... show that generalization both in space and across tools is possible from very little training data. The network even permits extrapolation beyond the training data. For training we use an efficient online scheme for recurrent reservoir networks utilizing supervised backpropagation-decorrelation ...

Efficient exploration and learning of whole body kinematics

by Matthias Rolf, Jochen J. Steil, Michael Gienger - IEEE 8th International Conference on Development and Learning, 2009
"... We present a neural network approach to early motor learning. The goal is to explore the needs for bootstrapping the control of hand movements in a biologically plausible learning scenario. The model is applied to the control of hand postures of the humanoid robot ASIMO by means of full upper body m ..."
Abstract - Cited by 15 (9 self)
... movements. For training, we use an efficient online scheme for recurrent reservoir networks consisting of supervised backpropagation-decorrelation output adaptation and an unsupervised intrinsic plasticity reservoir optimization. We demonstrate that the network can acquire accurate inverse models ...
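The online scheme named here adapts only the output weights while intrinsic plasticity shapes the reservoir. As a hedged stand-in for the backpropagation-decorrelation update itself, the sketch below uses a generic normalized-LMS step on the readout; the true BPDC rule has the same readout-only, state-normalized flavor but differs in its error term.

```python
import numpy as np

def online_readout_step(w_out, x, y_target, eta=0.5, eps=1e-4):
    """One online update of linear readout weights from reservoir state x."""
    y = w_out @ x                                     # current readout output
    err = y_target - y                                # instantaneous error
    w_out += eta * np.outer(err, x) / (x @ x + eps)   # state-normalized step
    return w_out, y
```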

Memory Capacity of Input-Driven Echo State Networks at the Edge of Chaos

by Peter Barančok, Igor Farkaš
"... Abstract. Reservoir computing provides a promising approach to efficient training of recurrent neural networks, by exploiting the computational properties of the reservoir structure. Various approaches, ranging from suitable initialization to reservoir optimization by training have been proposed. I ..."
Abstract
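The memory capacity studied in this entry is usually measured in Jaeger's sense: for each delay k, a linear readout is trained to reconstruct the input from k steps ago, and the squared correlations between target and reconstruction are summed. A sketch of that measurement, assuming reservoir states have already been collected (for instance as in the ESN sketch near the top of this list):

```python
import numpy as np

def memory_capacity(states, u, max_delay=40, ridge=1e-6):
    """states: T x N reservoir states driven by the scalar input series u."""
    T, N = states.shape
    mc = 0.0
    for k in range(1, max_delay + 1):
        X, y = states[k:], u[:-k]     # pair state at time t with u(t - k)
        w = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)
        r = np.corrcoef(y, X @ w)[0, 1]
        mc += r * r                   # MC_k is the squared correlation
    return mc
```

The edge-of-chaos question then amounts to how this sum behaves as the reservoir's spectral radius or input scaling is swept across the stability boundary.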

Support vector echo-state machine for chaotic time-series prediction

by Zhiwei Shi, Min Han - IEEE Transactions on Neural Networks
"... Abstract: A novel chaotic time series prediction method based on support vector machines and echo state mechanisms is proposed. The basic idea is replacing “kernel trick ” with “reservoir trick ” in dealing with nonlinearity, that is, performing linear support vector regression in the high dimension ..."
Abstract - Cited by 12 (0 self)
... dimension "reservoir" state space; the solution benefits from the structural risk minimization principle, and we call the result SVESMs (Support Vector Echo State Machines). SVESMs belong to a special class of recurrent neural networks with a convex objective function, and their solution is global ...
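Read literally, the "reservoir trick" expands the input with a fixed random reservoir and then solves a linear SVR in the state space. The sketch below follows that reading; scikit-learn's LinearSVR, the toy series, and all hyperparameters are assumptions standing in for the paper's own solver.

```python
import numpy as np
from sklearn.svm import LinearSVR

rng = np.random.default_rng(2)
n_res = 200

# Fixed random reservoir (the "reservoir trick" replaces the kernel expansion).
W = rng.normal(size=(n_res, n_res))
W *= 0.95 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-0.1, 0.1, size=n_res)

u = np.sin(0.3 * np.arange(2000)) + 0.1 * rng.normal(size=2000)  # toy series
x, states = np.zeros(n_res), []
for u_t in u:
    x = np.tanh(W @ x + w_in * u_t)
    states.append(x.copy())
states = np.array(states)

# Linear support vector regression in the reservoir state space,
# here for one-step-ahead prediction.
svr = LinearSVR(C=1.0, epsilon=1e-3, max_iter=10000).fit(states[:-1], u[1:])
pred = svr.predict(states[:-1])
```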

Supervised and evolutionary learning of echo state networks

by Fei Jiang, Hugues Berry, Marc Schoenauer - Parallel Problem Solving from Nature - PPSN X, 10th International Conference, 2008
"... Abstract. A possible alternative to topology fine-tuning for Neural Network (NN) optimization is to use Echo State Networks (ESNs), recurrent NNs built upon a large reservoir of sparsely randomly connected neurons. The promises of ESNs have been fulfilled for supervised learning tasks, but unsupervi ..."
Abstract - Cited by 10 (1 self)

Minimum Complexity Echo State Network

by Ali Rodan - IEEE Transactions on Neural Networks
"... Abstract—Reservoir computing (RC) refers to a new class of state-space models with a fixed state transition structure (the “reservoir”) and an adaptable readout form the state space. The reservoir is supposed to be sufficiently complex so as to capture a large number of features of the input stream ..."
Abstract
... to the standard echo state network methodology. The (short-term) memory capacity of linear cyclic reservoirs can be made arbitrarily close to the proven optimal value. Index Terms: Reservoir computing, Echo state networks, Simple recurrent neural networks, Memory capability, Time series prediction ...
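One concrete minimum-complexity construction from this line of work is the simple cycle reservoir: units connected in a single ring that all share one weight r, with input weights of a single magnitude v whose signs follow a deterministic aperiodic sequence. The sketch below uses digits of pi to fix the signs; the threshold and sizes are assumptions.

```python
import numpy as np

def cycle_reservoir(n, r=0.9, v=0.5):
    """Deterministic ring reservoir with shared weight r and input scale v."""
    W = np.zeros((n, n))
    for i in range(n):
        W[(i + 1) % n, i] = r        # unidirectional cycle, identical weights
    digits = "1415926535897932384626433832795028841971"  # decimals of pi
    signs = np.array([1.0 if int(d) >= 5 else -1.0 for d in digits[:n]])
    w_in = v * signs                 # one magnitude, deterministic signs
    return W, w_in

W, w_in = cycle_reservoir(40)
```

Because nothing is random, such reservoirs are fully reproducible, and their linear variants admit exact analysis of short-term memory capacity.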

Optoelectronic systems trained with backpropagation through time

by Michiel Hermans, Joni Dambre, Peter Bienstman - IEEE Trans. Neural, 2014
"... Abstract — Delay-coupled optoelectronic systems form promising can-didates to act as powerful information processing devices. In this brief, we consider such a system that has been studied before in the context of reservoir computing (RC). Instead of viewing the system as a random dynamical system, ..."
Abstract - Cited by 1 (1 self)
... we see it as a true machine-learning model which can be fully optimized. We use a recently introduced extension of backpropagation through time, an optimization algorithm originally designed for recurrent neural networks, and use it to let the network perform a difficult phoneme recognition task. ...