Results 1 - 10 of 2,113

Distortion invariant object recognition in the dynamic link architecture

by Martin Lades, Jan C. Vorbrüggen, Joachim Buhmann, Christoph v. d. Malsburg, Rolf P. Würtz, Wolfgang Konen - IEEE Transactions on Computers, 1993
"... We present an object recognition system based on the Dynamic Link Architecture, which is an extension to classical Artificial Neural Networks. The Dynamic Link Architecture ex-ploits correlations in the fine-scale temporal structure of cellular signals in order to group neurons dynamically into hig ..."
Cited by 637 (80 self)

Contour enhancement, short-term memory, and constancies in reverberating neural networks

by Stephen Grossberg - Studies in Applied Math, 1973
"... A model of the nonlinear dynamics of reverberating on-center off-surround networks of nerve cells, or of cell populations, is analysed. The on-center off-surround anatomy allows patterns to be processed across populations without saturating the populations ' response to large inputs. The signal ..."
Cited by 245 (93 self)

The induction of dynamical recognizers

by Jordan B. Pollack - Machine Learning, 1991
"... A higher order recurrent neural network architecture learns to recognize and generate languages after being "trained " on categorized exemplars. Studying these networks from the perspective of dynamical systems yields two interesting discoveries: First, a longitudinal examination of the le ..."
Cited by 225 (14 self)

The Upstart Algorithm: A Method for Constructing and Training Feedforward Neural Networks

by Marcus Frean - Neural Computation, Vol. 2, 1990
"... A general method for building and training multilayer perceptrons composed of linear threshold units is proposed. A simple recursive rule is used to build the structure of the network by adding units as they are needed, while a modified perceptron algorithm is used to learn the connection strengths. ..."
Cited by 192 (1 self)
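
The snippet above names the two ingredients of the method: a recursive rule that adds threshold units as they are needed, and a modified perceptron rule that trains each unit. As a rough illustration only (not Frean's actual recursive Upstart construction, which adds "daughter" units to correct a parent unit's errors), the Python sketch below trains a single linear threshold unit with a pocket-style perceptron rule, the kind of base learner such constructive methods are built from; the function name, toy data, and parameters are my own.

```python
import numpy as np

def pocket_perceptron(X, y, epochs=200, seed=0):
    """Train one linear threshold unit with a pocket-style perceptron rule.

    X: (n_samples, n_features) inputs, y: 0/1 targets.
    Returns the best weight vector (bias appended) seen during training.
    """
    rng = np.random.default_rng(seed)
    Xb = np.hstack([X, np.ones((len(X), 1))])      # absorb the bias term
    w = np.zeros(Xb.shape[1])
    best_w, best_correct = w.copy(), 0
    for _ in range(epochs):
        for i in rng.permutation(len(Xb)):
            pred = 1 if Xb[i] @ w > 0 else 0
            if pred != y[i]:                        # classic perceptron update
                w += (y[i] - pred) * Xb[i]
        correct = int(np.sum((Xb @ w > 0).astype(int) == y))
        if correct > best_correct:                  # keep ("pocket") the best weights so far
            best_w, best_correct = w.copy(), correct
    return best_w

# Toy usage on a linearly separable problem (logical AND of two inputs).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])
w = pocket_perceptron(X, y)
print((np.hstack([X, np.ones((4, 1))]) @ w > 0).astype(int))   # -> [0 0 0 1]
```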

Kernel-based methods for hyperspectral image classification

by Gustavo Camps-Valls, Lorenzo Bruzzone - IEEE Transactions on Geoscience and Remote Sensing, 2005
"... Abstract—This paper presents the framework of kernel-based methods in the context of hyperspectral image classification, illustrating from a general viewpoint the main characteristics of different kernel-based approaches and analyzing their properties in the hyperspectral domain. In particular, we a ..."
Cited by 150 (25 self)
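
The entry above concerns kernel machines applied per pixel to hyperspectral data. As a generic, hedged sketch (not the specific formulations compared in the paper), the code below fits an RBF-kernel support vector classifier with scikit-learn to a synthetic stand-in for a hyperspectral scene; the number of bands, the number of classes, and the hyperparameters C and gamma are arbitrary choices of mine.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for a hyperspectral cube: 200 spectral bands per pixel,
# 3 land-cover classes. A real scene's pixel spectra would replace this.
rng = np.random.default_rng(0)
n_pixels, n_bands, n_classes = 1500, 200, 3
centers = rng.normal(0, 1, (n_classes, n_bands))
labels = rng.integers(0, n_classes, n_pixels)
spectra = centers[labels] + rng.normal(0, 0.8, (n_pixels, n_bands))

X_train, X_test, y_train, y_test = train_test_split(
    spectra, labels, test_size=0.3, random_state=0)

# Per-band standardization followed by an RBF-kernel SVM classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```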

An Efficient Gradient-Based Algorithm for On-Line Training of Recurrent Network Trajectories

by Ronald J. Williams, Jing Peng - Neural Computation, 1990
"... A novel variant of a familiar recurrent network learning algorithm is described. This algorithm is capable of shaping the behavior of an arbitrary recurrent network as it runs, and it is specifically designed to execute efficiently on serial machines. 1 Introduction Artificial neural networks having ..."
Cited by 145 (3 self)
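
The abstract above says only that the algorithm shapes a running recurrent network and is efficient on serial machines; one standard way to keep such an online gradient computation cheap is to backpropagate through a bounded window of recent time steps. The sketch below shows that generic truncated-history scheme for a tiny tanh network. It is my own illustration rather than the paper's exact bookkeeping, and all sizes, the learning rate, and the delayed-sine toy task are assumptions.

```python
import numpy as np

class TruncatedOnlineRNN:
    """Tiny tanh RNN trained online with gradients truncated to the last h steps."""

    def __init__(self, n_in=1, n_hid=8, h=5, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0, 0.3, (n_hid, n_in))
        self.W_rec = rng.normal(0, 0.3, (n_hid, n_hid))
        self.w_out = rng.normal(0, 0.3, n_hid)
        self.h, self.lr = h, lr
        self.states = [np.zeros(n_hid)]   # s_0, then a sliding window of states
        self.inputs = []                  # matching window of inputs

    def step(self, x):
        """Advance one time step and return the scalar prediction."""
        s = np.tanh(self.W_in @ x + self.W_rec @ self.states[-1])
        self.states.append(s)
        self.inputs.append(x)
        del self.states[:-(self.h + 1)]   # keep only the last h+1 states
        del self.inputs[:-self.h]         # and the last h inputs
        return self.w_out @ s

    def update(self, y_target):
        """One gradient step, backpropagating through at most h stored steps."""
        s_t = self.states[-1]
        err = self.w_out @ s_t - y_target              # d(squared error)/d(output)
        delta = err * self.w_out * (1 - s_t ** 2)      # gradient at s_t's pre-activation
        gW_in = np.zeros_like(self.W_in)
        gW_rec = np.zeros_like(self.W_rec)
        for k in range(1, min(self.h, len(self.inputs)) + 1):
            s_prev, x_k = self.states[-k - 1], self.inputs[-k]
            gW_in += np.outer(delta, x_k)
            gW_rec += np.outer(delta, s_prev)
            delta = (self.W_rec.T @ delta) * (1 - s_prev ** 2)
        self.w_out -= self.lr * err * s_t
        self.W_in -= self.lr * gW_in
        self.W_rec -= self.lr * gW_rec

# Toy usage: learn online to reproduce the input sine wave delayed by 3 steps.
net = TruncatedOnlineRNN()
for t in range(2000):
    y_pred = net.step(np.array([np.sin(0.1 * t)]))
    net.update(np.sin(0.1 * (t - 3)))
print("last prediction:", y_pred, "target:", np.sin(0.1 * (1999 - 3)))
```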

A critical role for the right fronto-insular cortex in switching between central-executive and default-mode networks

by Devarajan Sridharan, Daniel J. Levitin, Vinod Menon - Proc Natl Acad Sci USA, 2008
"... Cognitively demanding tasks that evoke activation in the brain's central-executive network (CEN) have been consistently shown to evoke decreased activation (deactivation) in the default-mode network (DMN). The neural mechanisms underlying this switch between activation and deactivation of larg ..."
Cited by 178 (1 self)
"... Causality Analysis (GCA), to provide information about the dynamics and directionality of signaling in cortical circuits. In the second experiment, we investigated the generality of network switching mechanisms involving the FIC by examining brain responses elicited during a visual ..."

Dynamic clustering for acoustic target tracking in wireless sensor networks

by Wei-peng Chen, Jennifer C. Hou, Lui Sha, 2003
"... In the paper, we devise and evaluate a fully decentralized, light-weight, dynamic clustering algorithm for target tracking. Instead of assuming the same role for all the sensors, we envision a hierarchical sensor network that is composed of (a) a static backbone of sparsely placed high-capability se ..."
Cited by 138 (1 self)

Analog Computation via Neural Networks

by Hava T. Siegelmann, Eduardo D. Sontag - Theoretical Computer Science, 1994
"... We pursue a particular approach to analog computation, based on dynamical systems of the type used in neural networks research. Our systems have a fixed structure, invariant in time, corresponding to an unchanging number of "neurons". If allowed exponential time for computation, they turn ..."
Cited by 96 (10 self)

Gradient-Based Learning Algorithms for Recurrent Networks and Their Computational Complexity

by Ronald J. Williams, David Zipser, 1995
"... Introduction 1.1 Learning in Recurrent Networks Connectionist networks having feedback connections are interesting for a number of reasons. Biological neural networks are highly recurrently connected, and many authors have studied recurrent network models of various types of perceptual and memory pr ..."
Cited by 149 (4 self)