Results 1–10 of 13
Survey of Neural Transfer Functions
Neural Computing Surveys, 1999
Cited by 42 (21 self)
The choice of transfer functions may strongly influence the complexity and performance of neural networks. Although sigmoidal transfer functions are the most common, there is no a priori reason why models based on such functions should always provide optimal decision borders. A large number of alternative transfer functions has been described in the literature. A taxonomy of activation and output functions is proposed, and advantages of various non-local and local neural transfer functions are discussed. Several less-known types of transfer functions and new combinations of activation/output functions are described. Universal transfer functions, parametrized to change from localized to delocalized type, are of greatest interest. Other types of neural transfer functions discussed here include functions with activations based on non-Euclidean distance measures, bicentral functions, formed from products or linear combinations of pairs of sigmoids, and extensions of such functions making rotations...
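The contrast between delocalized (sigmoidal) and localized transfer functions, and the bicentral functions built from products of pairs of sigmoids that the abstract mentions, can be sketched as follows (function names and the slope/width parameters are illustrative, not taken from the survey):

```python
import numpy as np

def sigmoid(a):
    """Delocalized transfer function: splits input space with a soft hyperplane."""
    return 1.0 / (1.0 + np.exp(-a))

def gaussian(a):
    """Localized transfer function: responds only near zero activation."""
    return np.exp(-a * a)

def bicentral(x, b=0.0, s=1.0, slope=4.0):
    """Bicentral function: product of a pair of sigmoids, giving a soft
    window of width ~2*s centered at b.  Large s makes it nearly
    delocalized, small s localized."""
    return sigmoid(slope * (x - b + s)) * sigmoid(-slope * (x - b - s))
```

Near the center b the bicentral window is close to 1 and it decays toward 0 far away, so a single unit can carve out a localized region that a plain sigmoid cannot.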
Classification, Association and Pattern Completion Using Neural Similarity Based Methods
APPLIED MATH. & COMP. SCIENCE, 2000
Cited by 18 (17 self)
A framework for Similarity-Based Methods (SBMs) includes many classification models as special cases: neural networks of the Radial Basis Function Network type, Feature Space Mapping neuro-fuzzy networks based on separable transfer functions, Learning Vector Quantization, variants of the k-nearest-neighbor methods and several new models that may be presented in a network form. Multilayer Perceptrons (MLPs) use scalar products to compute weighted activation of neurons, combining soft hyperplanes to provide decision borders. Distance-based multilayer perceptrons (DMLPs) evaluate similarity of inputs to weights, offering a natural generalization of standard MLPs. A cluster-based initialization procedure determining the architecture and values of all adaptive parameters is described. Networks...
Towards comprehensive foundations of computational intelligence
In: Duch W, Mandziuk J, Eds, Challenges for Computational Intelligence, 2007
Cited by 16 (12 self)
Although computational intelligence (CI) covers a vast variety of different methods, it still lacks an integrative theory. Several proposals for CI foundations are discussed: computing and cognition as compression, meta-learning as search in the space of data models, (dis)similarity-based methods providing a framework for such meta-learning, and a more general approach based on chains of transformations. Many useful transformations that extract information from features are discussed. Heterogeneous adaptive systems are presented as a particular example of transformation-based systems, and the goal of learning is redefined to facilitate creation of simpler data models. The need to understand data structures leads to techniques for logical and prototype-based rule extraction, and to generation of multiple alternative models, while the need to increase the predictive power of adaptive models leads to committees of competent models. Learning from partial observations is a natural extension towards reasoning based on perceptions, and an approach to intuitive solving of such problems is presented. Throughout the paper neurocognitive inspirations are frequently used and are especially important in modeling of the higher cognitive functions. Promising directions such as liquid and laminar computing are identified and many open problems presented.
A framework for similarity-based classification methods
Intelligent Information Systems VII, 1998
Cited by 9 (7 self)
A general framework for similarity-based (SB) classification methods is presented. Neural networks, such as the Radial Basis Function (RBF) and Multilayer Perceptron (MLP) models, are special cases of SB methods. Many new versions of minimal distance methods are derived from this framework.
Search and Global Minimization in Similarity-Based Methods
In: Int. Joint Conference on Neural Networks (IJCNN), 1999
Cited by 7 (5 self)
The class of similarity-based methods (SBM) covers most neural models and many other classifiers. Performance of such methods is significantly improved if irrelevant features are removed and feature weights introduced, scaling their influence on the calculation of similarity. Several methods for feature selection and weighting are described. As an alternative to global minimization procedures, computationally efficient best-first search methods are advocated. Although these methods can be used with any SBM classifier, they have been tested using the k-NN method, since it is relatively fast and for some databases gives excellent results. A few illustrative examples show significant improvements due to feature weighting and selection. Introduction. Many neural, pattern recognition and machine learning methods developed in the past use explicitly or implicitly similarity measures during the training and classification process (although our focus here is on classification the sam...
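A minimal sketch of the kind of best-first forward search for feature selection, wrapped around a k-NN classifier, that this abstract describes (the function names and the leave-one-out scoring choice are assumptions for illustration, not the paper's exact procedure):

```python
import numpy as np
from collections import Counter

def knn_accuracy(X, y, features, k=3):
    """Leave-one-out accuracy of k-NN using only the selected features."""
    Xm = X[:, features]
    correct = 0
    for i in range(len(X)):
        d = np.linalg.norm(Xm - Xm[i], axis=1)
        d[i] = np.inf                       # exclude the query point itself
        nn = np.argsort(d)[:k]
        pred = Counter(y[j] for j in nn).most_common(1)[0][0]
        correct += pred == y[i]
    return correct / len(X)

def best_first_selection(X, y, k=3):
    """Greedy best-first forward search: repeatedly add the single feature
    that most improves leave-one-out k-NN accuracy; stop when nothing helps."""
    n_features = X.shape[1]
    selected, best = [], 0.0
    while True:
        candidates = [(knn_accuracy(X, y, selected + [f], k), f)
                      for f in range(n_features) if f not in selected]
        if not candidates:
            break
        acc, f = max(candidates)
        if acc <= best:
            break
        selected.append(f)
        best = acc
    return selected, best
```

On data where one feature separates the classes and another is pure noise, the search keeps the informative feature and discards the noisy one, which is exactly the improvement mechanism the abstract claims.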
Distance-based Multilayer Perceptrons
1999
Cited by 6 (5 self)
Neural network models are presented as special cases of a framework for general Similarity-Based Methods (SBMs). Distance-based multilayer perceptrons (DMLPs) with non-Euclidean metric functions are described. DMLPs evaluate similarity to prototypes, making the interpretation of the results easier. Renormalization of the input data in the extended feature space brings dramatic changes in the shapes of decision borders. An illustrative example showing these changes is provided.
Minimal Distance Neural Methods
1998
Cited by 6 (5 self)
A general framework for minimal distance methods is presented. Radial Basis Function (RBF) and Multilayer Perceptron (MLP) neural networks are included in this framework as special cases. New versions of minimal distance methods are formulated. A few of them have been tested on real-world datasets, obtaining very encouraging results.
A Framework for Similarity-Based Methods
1998
Cited by 5 (3 self)
Similarity-based methods (SBM) are a generalization of the minimal distance (MD) methods which form a basis of many machine learning and pattern recognition methods. Investigation of similarity leads to a fruitful framework in which many classification methods are accommodated. The probability p(C|X; M) of assigning class C to vector X, given the classification model M, depends on adaptive parameters of the model and procedures used in calculation, such as: the number of reference vectors taken into account in the neighborhood of X, the maximum size of the neighborhood, parameterization of the similarity measures, the weighting function estimating contributions of neighboring reference vectors, the procedure used to create a set of reference vectors from the training data, the total cost function minimized at the training stage, and the kernel function scaling the influence of the error on the total cost function. SBM may include several models M_l and an interpolation procedure to ...
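How p(C|X; M) depends on the model's adaptive parameters can be illustrated with a toy estimator that uses two of the parameters enumerated above, the neighborhood size k and a Gaussian weighting function for the neighbors (a hedged sketch; the kernel form and all names are assumptions, not the paper's definitions):

```python
import numpy as np

def p_class_given_x(x, X_ref, y_ref, classes, k=5, h=1.0):
    """Estimate p(C|X; M) from the k nearest reference vectors, weighting
    each neighbor's vote by a Gaussian function exp(-(d/h)^2) of its
    distance d to the query, then normalizing over the classes."""
    d = np.linalg.norm(X_ref - x, axis=1)
    nn = np.argsort(d)[:k]                    # neighborhood of X: k references
    w = np.exp(-(d[nn] / h) ** 2)             # weighting function per neighbor
    p = np.array([w[y_ref[nn] == c].sum() for c in classes])
    return p / p.sum()
```

Changing k, h, the distance function, or the set of reference vectors each yields a different model M, which is the sense in which the framework accommodates many classifiers as parameter choices.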
Neural networks in non-Euclidean metric spaces
1999
Cited by 2 (1 self)
Multilayer Perceptrons (MLPs) use scalar products to compute the weighted activation of neurons, providing decision borders built from combinations of soft hyperplanes. The weighted fan-in activation function corresponds to the Euclidean distance functions used to compute similarities between input and weight vectors. Replacing the fan-in activation function by a non-Euclidean distance function offers a natural generalization of the standard MLP model, providing more flexible decision borders. An alternative way leading to similar results is based on renormalization of the input vectors using non-Euclidean norms in extended feature spaces. Both approaches influence the shapes of decision borders dramatically, making it possible to reduce the complexity of MLP networks.
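The replacement of the fan-in scalar product by a distance-based activation can be sketched as follows (a minimal illustration under assumed names; the exponent p selects a Minkowski metric, with p=2 giving Euclidean and other values the non-Euclidean generalizations discussed here):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def mlp_neuron(x, w, b):
    """Standard MLP neuron: sigmoid of the weighted fan-in (scalar product),
    giving a soft hyperplane as the decision border."""
    return sigmoid(np.dot(w, x) + b)

def dmlp_neuron(x, w, t, p=2.0):
    """Distance-based neuron: sigmoid of (threshold - Minkowski distance
    between input and weight vector).  Output is high when x lies within
    distance ~t of the prototype w, so the border is a soft hypersphere
    (or another shape for non-Euclidean p)."""
    d = np.sum(np.abs(x - w) ** p) ** (1.0 / p)
    return sigmoid(t - d)
```

An input equal to the prototype w drives the distance-based neuron toward its maximum output, while inputs far from w drive it toward zero, which is what makes the weights interpretable as prototypes.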
Neural Networks from Similarity-Based Perspective
In: New Frontiers in Computational Intelligence and its Applications. Ed. M. Mohammadian, IOS, 2000
Cited by 1 (1 self)
A framework for Similarity-Based Methods (SBMs) includes many neural network models as special cases. Multilayer Perceptrons (MLPs) use scalar products to compute weighted activation of neurons, combining soft hyperplanes to provide decision borders. The scalar product is replaced by a distance function between the inputs and the weights, offering a natural generalization of the standard MLP model to the distance-based multilayer perceptron (DMLP) model. DMLPs evaluate the similarity of inputs to weights, making the interpretation of their mappings easier. A cluster-based initialization procedure determining the architecture and values of all adaptive parameters is described. DMLP networks are useful not only for classification and approximation, but also as associative memories, in problems requiring pattern completion, offering an efficient way to deal with missing values. Non-Euclidean distance functions may also be introduced by normalization of the input vectors in an extended fe...