Results 11 - 20 of 51,226
Incremental Communication for Multilayer Neural Networks: Error Analysis
1995
"... Artificial neural networks (ANNs) involve a large amount of inter-node communications. To reduce the communication cost as well as the time of learning process in ANNs, we earlier proposed an incremental inter-node communication method. In the incremental communication method, instead of communicati ..."
Abstract
-
Cited by 2 (1 self)
- Add to MetaCart
of communicating the full magnitude of the output value of a node, only the increment or decrement to its previous value is sent on a communication link. In this paper, the effects of the limited precision incremental communication method on the convergence behavior and performance of multilayer neural networks
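The excerpt states the mechanism concretely: a node transmits only the (limited-precision) increment or decrement of its output rather than the full value. A minimal Python sketch of that general idea follows; the quantization step and the IncrementalLink class are illustrative assumptions, not details from the paper.

```python
import numpy as np

class IncrementalLink:
    """Toy model of the incremental communication idea: instead of sending a
    node's full output, send only the quantized change since the last update.
    The quantization step `q` stands in for 'limited precision' (an assumption;
    the paper's exact precision scheme is not given in the excerpt)."""

    def __init__(self, q=1.0 / 256):
        self.q = q
        self.last_sent = 0.0       # sender's record of the last transmitted value
        self.receiver_value = 0.0  # receiver's reconstruction of the output

    def send(self, output):
        """Quantize the increment/decrement and 'transmit' it."""
        delta = output - self.last_sent
        delta_q = self.q * np.round(delta / self.q)  # limited-precision increment
        self.last_sent += delta_q                    # sender tracks what was actually sent
        self.receiver_value += delta_q               # receiver accumulates increments
        return delta_q

# Usage: the receiver tracks the node output to within one quantization step.
link = IncrementalLink(q=1.0 / 256)
for t, output in enumerate(np.tanh(np.linspace(-2, 2, 5))):
    sent = link.send(output)
    print(f"t={t}: output={output:+.4f} sent_delta={sent:+.4f} "
          f"receiver={link.receiver_value:+.4f}")
```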
Multilayer Neural Networks: One Or Two Hidden Layers?
1996
"... We study the number of hidden layers required by a multilayer neural network with threshold units to compute a function f from R d to f0; 1g. In spite of similarity with the characterization of linearly separable Boolean functions, this problem presents a higher level of complexity. Gibson charact ..."
Cited by 6 (2 self)
Backpropagation Without Multiplier for Multilayers Neural Networks
1996
"... When multilayer neural networks are implemented with digital hardware, which allows full exploitation of the well developed digital VLS1 technologies, the multiply operations in each neuron between the weights and the inputs can create a bottleneck in the system, because the digital multiplier ..."
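The excerpt only identifies the bottleneck (digital multipliers between weights and inputs); the paper's actual remedy is not shown. One common multiplier-free scheme, assumed here purely for illustration, restricts weights to signed powers of two so that each weight-input product reduces to an arithmetic shift:

```python
import numpy as np

def quantize_to_power_of_two(w, min_exp=-8, max_exp=0):
    """Round each weight to the nearest signed power of two (assumption: this
    is one standard multiplier-free scheme, not necessarily the paper's)."""
    sign = np.sign(w)
    exp = np.clip(np.round(np.log2(np.abs(w) + 1e-12)), min_exp, max_exp)
    return sign, exp.astype(int)

def shift_forward(x_fixed, sign, exp):
    """Accumulate weight-input 'products' using shifts only.
    x_fixed is an integer (fixed-point) input vector; since exp <= 0, the
    product w*x becomes x >> (-exp), avoiding a digital multiplier."""
    acc = 0
    for xi, si, ei in zip(x_fixed, sign, exp):
        acc += int(si) * (int(xi) >> int(-ei))   # shift replaces multiply
    return acc

# Usage: compare the shift-based neuron with an ordinary floating-point dot product.
rng = np.random.default_rng(0)
w = rng.uniform(-1, 1, size=8)
x = rng.integers(0, 256, size=8)              # 8-bit fixed-point inputs
sign, exp = quantize_to_power_of_two(w)
approx = shift_forward(x, sign, exp)
exact = float(np.dot(w, x))
print(f"shift-based accumulation: {approx}, exact dot product: {exact:.1f}")
```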
Domains of Solutions and Replica Symmetry Breaking in Multilayer Neural Networks
1994
"... The relationship between the geometrical structure of weight space and replica symmetry breaking (RSB) in multilayer neural networks is studied using a toy model. The distribution of sizes of the disconnected domains of solution space is computed analytically and compared to the RSB calculation of t ..."
Cited by 3 (0 self)
Multilayer Neural Networks and Nearest Neighbor Classifier Performances for Image Annotation
(IJACSA) International Journal of Advanced Computer Science and Applications
2012
"... Abstract—The explosive growth of image data leads to the research and development of image content searching and indexing systems. Image annotation systems aim at annotating automatically animage with some controlled keywords that can be used for indexing and retrieval of images. This paper presents ..."
Abstract
-
Cited by 1 (1 self)
- Add to MetaCart
presents a comparative evaluation of the image content annotation system by using the multilayer neural networks and the nearest neighbour classifier. The region growing segmentation is used to separate objects, the Hu moments, Legendre moments and Zernike moments which are used in as feature descriptors
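The described pipeline (segment objects, extract moment invariants, compare a multilayer network against a nearest-neighbour classifier) can be sketched with off-the-shelf tools. The sketch below uses OpenCV Hu moments and scikit-learn classifiers on toy shapes; the library choices and toy data are assumptions, and the Legendre/Zernike moments and the region-growing step are omitted.

```python
import numpy as np
import cv2                                   # OpenCV for image moments
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

def hu_features(mask):
    """Compute the 7 Hu moment invariants of a binary object mask
    (log-scaled, a common practice to compress their dynamic range)."""
    hu = cv2.HuMoments(cv2.moments(mask.astype(np.uint8))).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

# Toy 'segmented objects': filled squares vs. filled circles (stand-ins for the
# region-growing output described in the excerpt).
def square(size=64, r=20):
    m = np.zeros((size, size)); m[10:10 + r, 10:10 + r] = 1; return m

def circle(size=64, r=15):
    y, x = np.ogrid[:size, :size]
    return ((x - 32) ** 2 + (y - 32) ** 2 <= r ** 2).astype(float)

X = np.array([hu_features(square(r=r)) for r in range(10, 30)] +
             [hu_features(circle(r=r)) for r in range(8, 28)])
y = np.array([0] * 20 + [1] * 20)

# The two classifiers compared in the paper: nearest neighbour vs. multilayer network.
knn = KNeighborsClassifier(n_neighbors=1).fit(X, y)
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print("1-NN training accuracy:", knn.score(X, y))
print("MLP training accuracy:", mlp.score(X, y))
```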
Multilayer Neural Network-Burg Combination for Acoustical Detection of Buried Objects
2008
"... Abstract: A Burg technique is employed to model the long wavelength localization and imaging problem. A Burg method is used as a high resolution and stable technique. The idea of in-line holography is used to increase the ratio of the signal to noise due to the effect of concealing media that decrea ..."
Abstract
- Add to MetaCart
that decreases the value of the received signal. The performance is enhanced by using multilayer neural network for noise reduction. The aim of using multilayer neural network is to extract the essential knowledge from a noisy training data. Theoretical and experimental results have showed that preprocessing
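The Burg method mentioned in the excerpt is the standard autoregressive (AR) spectral-estimation recursion, and the sketch below implements only that step. The in-line holography setup and the neural-network denoising stage are not reproduced, and the AR(2) test signal is an illustrative assumption.

```python
import numpy as np

def burg_ar(x, order):
    """Estimate AR(p) coefficients with Burg's method (the standard recursion,
    shown here to illustrate the 'high resolution and stable' modelling step).
    Model convention: x[t] + a1*x[t-1] + ... + ap*x[t-p] = e[t]."""
    x = np.asarray(x, dtype=float)
    a = np.zeros(order)
    f = x[1:].copy()     # forward prediction errors
    b = x[:-1].copy()    # backward prediction errors
    for m in range(order):
        # Reflection coefficient minimizing forward + backward error power.
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))
        # Levinson-style update of the AR coefficients.
        prev = a[:m].copy()
        a[m] = k
        a[:m] = prev + k * prev[::-1]
        # Update and shorten the error sequences for the next order.
        f, b = (f + k * b)[1:], (b + k * f)[:-1]
    return a

# Usage: recover the coefficients of a known AR(2) process from a noisy realization.
rng = np.random.default_rng(1)
true_a = np.array([-1.6, 0.8])               # x[t] = 1.6*x[t-1] - 0.8*x[t-2] + e[t]
x = np.zeros(2000)
for t in range(2, len(x)):
    x[t] = -true_a[0] * x[t - 1] - true_a[1] * x[t - 2] + rng.normal()
print("true AR coefficients:       ", true_a)
print("Burg-estimated coefficients:", np.round(burg_ar(x, 2), 3))
```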
A Relaxation/Regression Algorithm for Efficient Training of Multilayer Neural Networks
"... A new method for training multilayer neural networks has been developed. This method combines the speed of a least squares approach with the iterative nature of backpropagation. This method converges quickly, typically within 10 iterations, where back propagation can take tens of thousands of iterat ..."
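The excerpt gives only the flavour of the algorithm: least squares speed plus an iterative loop of roughly 10 passes. The sketch below is a generic hybrid in that spirit, re-solving the output layer by linear least squares while taking gradient steps on the hidden layer; it is not the paper's relaxation/regression algorithm, whose details are not given here.

```python
import numpy as np

# Generic 'least squares + iterative' hybrid for a one-hidden-layer network
# (illustration only, NOT the paper's algorithm): the output weights are
# re-solved exactly by linear regression each iteration, while the hidden
# weights take an ordinary backprop-style gradient step.

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])   # toy regression target

n_hidden = 20
W1 = rng.normal(scale=0.5, size=(2, n_hidden))
b1 = np.zeros(n_hidden)

for it in range(10):                                     # "typically within 10 iterations"
    H = np.tanh(X @ W1 + b1)                             # hidden activations
    Hb = np.hstack([H, np.ones((len(X), 1))])            # add bias column
    w2, *_ = np.linalg.lstsq(Hb, y, rcond=None)          # least-squares solve of output layer
    pred = Hb @ w2
    err = pred - y
    # One gradient step on the hidden layer, holding the solved output layer fixed.
    dH = np.outer(err, w2[:-1]) * (1.0 - H ** 2)
    W1 -= 0.05 * X.T @ dH / len(X)
    b1 -= 0.05 * dH.mean(axis=0)
    print(f"iter {it}: RMSE = {np.sqrt(np.mean(err ** 2)):.4f}")
```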
Synthesis and Performance Analysis of Multilayer Neural Network Architectures
1992
"... this paper we present various approaches for automatic topology--optimization of backpropagation networks. First of all, we review the basics of genetic algorithms which are our essential tool for a topology search. Then we give a survey of backprop and the topological properties of feedforward netw ..."
Abstract
-
Cited by 21 (1 self)
- Add to MetaCart
implemtation of artificial neural networks. On one hand, training is accelerated because fewer connections must be trained per epoch. The same holds for the real time behaviour during recall. Therefore we can postulate that optimizing the architecture is as well important as finding new learnin...
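The excerpt names genetic algorithms as the essential tool for the topology search. A minimal illustration follows: a tiny GA over hidden-layer sizes whose fitness is the validation accuracy of a quickly trained scikit-learn MLP. The encoding, mutation operator, and hyperparameters are assumptions for illustration, not the operators surveyed in the paper.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Minimal genetic search over network topologies (here, just hidden-layer sizes).
X, y = make_moons(n_samples=400, noise=0.25, random_state=0)
Xtr, Xva, ytr, yva = train_test_split(X, y, random_state=0)
rng = np.random.default_rng(0)

def fitness(sizes):
    """Validation accuracy of a quickly trained MLP with the given topology."""
    net = MLPClassifier(hidden_layer_sizes=sizes, max_iter=300, random_state=0)
    net.fit(Xtr, ytr)
    return net.score(Xva, yva)

def mutate(sizes):
    """Randomly grow or shrink one layer, keeping sizes in a small range."""
    sizes = list(sizes)
    i = rng.integers(len(sizes))
    sizes[i] = int(np.clip(sizes[i] + rng.integers(-4, 5), 2, 32))
    return tuple(sizes)

# Population of candidate topologies: one or two hidden layers of 2..32 units.
pop = [tuple(int(v) for v in rng.integers(2, 33, size=rng.integers(1, 3)))
       for _ in range(8)]
for gen in range(5):
    scored = sorted(pop, key=fitness, reverse=True)
    best = scored[0]
    print(f"generation {gen}: best topology {best}, fitness {fitness(best):.3f}")
    # Keep the top half, refill the rest with mutated copies of the survivors.
    survivors = scored[:4]
    pop = survivors + [mutate(s) for s in survivors]
```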
Likelihood ratio of unidentifiable models and multilayer neural networks
2003
"... This paper discusses the behavior of the maximum likelihood estimator, in the case that the true parameter cannot be identified uniquely. Among many statistical models with unidentifiability, neural network models are the main concern of this paper. It has been known in some models with unidentifiab ..."
Cited by 18 (2 self)
Analyzing the Performance of Multilayer Neural Networks for Object Recognition
"... Abstract. In the last two years, convolutional neural networks (CNNs) have achieved an impressive suite of results on standard recognition datasets and tasks. CNN-based features seem poised to quickly replace engineered representations, such as SIFT and HOG. However, compared to SIFT and HOG, we und ..."
Cited by 14 (1 self)