## Evolutionary optimization of radial basis function classifiers for data mining applications (2005)

Venue: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics

Citations: 9 (0 self)

### BibTeX

@ARTICLE{Buchtala05evolutionaryoptimization,
  author  = {Oliver Buchtala and Manuel Klimek and Bernhard Sick},
  title   = {Evolutionary optimization of radial basis function classifiers for data mining applications},
  journal = {IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics},
  year    = {2005},
  volume  = {35},
  number  = {5},
  pages   = {928--947}
}

### Abstract

In many data mining applications that address classification problems, feature and model selection are considered key tasks. That is, appropriate input features of the classifier must be selected from a given (and often large) set of possible features, and structure parameters of the classifier must be adapted with respect to these features and a given data set. This paper describes an evolutionary algorithm (EA) that performs feature and model selection simultaneously for radial basis function (RBF) classifiers. In order to reduce the optimization effort, various techniques are integrated that accelerate and improve the EA significantly: hybrid training of RBF networks, lazy evaluation, consideration of soft constraints by means of penalty terms, and temperature-based adaptive control of the EA. The feasibility and the benefits of the approach are demonstrated by means of four data mining problems: intrusion detection in computer networks, biometric signature verification, customer acquisition with direct marketing methods, and optimization of chemical production processes. It is shown that, compared to earlier EA-based RBF optimization techniques, the runtime is reduced by up to 99% while error rates are lowered by up to 86%, depending on the application. The algorithm is independent of specific applications, so many ideas and solutions can be transferred to other classifier paradigms.

Index Terms: Data mining, evolutionary algorithm (EA), feature selection, model selection, radial basis function (RBF) network.

### Citations

5369
Neural Networks for Pattern Recognition
- Bishop
- 1995
Citation Context: ...Potential feature selection algorithms are described in [9], [10]. In general, filter and wrapper approaches can be distinguished. The problem of model selection for neural networks is discussed in [11]–[13] in greater detail. Usually, these techniques are categorized as being either constructive (growing techniques), destructive (pruning techniques), or hybrid. The subject of data mining and knowle...

3085
UCI repository of machine learning databases
- Blake, Merz
- 1998
Citation Context: ...lidated by only one single data set. Apart from the investigations shown here we also carried out experiments with generic data and data taken from the Proben1 or the UCI benchmark collections [104], [105]. The final outcome of the EA is the best network as described in Section IV-C2. Therefore, the classification error rates of this network (or lift factors in the case of DM) on validation data used b...

2221
Genetic Algorithms + Data Structures = Evolution Programs, 3rd revised and extended ed.
- Michalewicz
- 1996
Citation Context: ...vector machines (SVM) with Gaussian kernel functions [1], [6]. 2) Evolutionary algorithms (EA, [7], [8]) are used for architecture optimization (combined feature and model selection) of the RBF networks. Here, this class of optimization algorithms is chosen because the search space is high-dimensional ...

2126
Pattern Classification
- Duda, Hart, et al.
- 2001
Citation Context: ...an equal a-priori distribution of classes. That means, the loss of correct classification is zero, whereas the loss of errors for each class is proportional to the a-priori probability of that class [93]. Hence, the classification rate of a certain network with respect to a validation set is ... In this case, the basis fitness is ... In a sample subset selection problem appropriate input patterns must be s...

2070
Some methods for classification and analysis of multivariate observations
- MacQueen
- 1967
Citation Context: ...the cluster centers (cf. definition of empirical variance). The original k-means algorithm (cf. [97] and [98]) is modified in the following way. A stable initialization of prototypes prior to the first k-means step leads to better results and a deterministic behavior of this clustering algorithm. In ...
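The modified training step described in this context can be sketched as follows. The deterministic initialization used below (evenly spaced samples after sorting) is an illustrative assumption; the paper's exact prototype-initialization rule is not quoted here.

```python
import numpy as np

def kmeans_deterministic(X, k, iters=100):
    """Lloyd/MacQueen-style k-means with a deterministic prototype
    initialization, so repeated runs on the same data yield identical
    clusters (the behavior the citation context attributes to the
    modified algorithm; the actual initialization rule may differ)."""
    # deterministic init: k evenly spaced samples after sorting by first axis
    order = np.argsort(X[:, 0], kind="stable")
    idx = np.linspace(0, len(X) - 1, k).astype(int)
    centers = X[order][idx].astype(float)
    for _ in range(iters):
        # assignment step: nearest center per sample
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        # update step: cluster means (keep old center for empty clusters)
        new_centers = np.array([X[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels
```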

1724
An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods
- Cristianini, Shawe-Taylor
- 2000
Citation Context: ...vector machines (SVM) with Gaussian kernel functions [1], [6]. 2) Evolutionary algorithms (EA, [7], [8]) are used for architecture optimization (combined feature and model selection) of the RBF networks. Here, this class of optimization algorithms is chosen bec...

941
Evolutionary Algorithms in Theory and Practice
- Bäck
- 1996
Citation Context: ...vector machines (SVM) with Gaussian kernel functions [1], [6]. 2) Evolutionary algorithms (EA, [7], [8]) are used for architecture optimization (combined feature and model selection) of the RBF networks. Here, this class of optimization algorithms is chosen because the search space is high-dimensi...

922
Least squares quantization in PCM
- Lloyd
- 1982
Citation Context: ...er centers (cf. definition of empirical variance). The original k-means algorithm (cf. [97] and [98]) is modified in the following way. A stable initialization of prototypes prior to the first k-means step leads to better results and a deterministic behavior of this clustering algorithm. In essence, ...

730
A direct adaptive method for faster back-propagation learning: The RPROP algorithm
- Riedmiller, Braun
- 1993
Citation Context: ...that combines modified k-means, QR decomposition (QRD), and SCG (Section IV-C1a) have already been investigated in detail in [85], [86] but with Backpropagation (BP) and Resilient Propagation (RPROP, [128]) instead of SCG. The conclusions drawn there can be transferred to the slightly modified training concept applied here. Therefore, only the main findings ...

666
Numerical Methods for Least Squares Problems
- Björck
- 1996
Citation Context: ...e pseudo-inverse that defines the solution of the least-squares problem is not computed explicitly, but the least-squares problem is solved by means of an efficient, numerically stable algorithm (cf. [99] and [100]) such as QR decomposition. If Gaussian basis functions are used (for other permissible functions see [2]) and the centers are chosen to be a subset of the training data and distinct, the ps...
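The numerically stable route this context alludes to can be sketched in a few lines. `Phi` (hidden-layer activations) and `T` (targets) are hypothetical names chosen for illustration.

```python
import numpy as np

def output_weights_via_qr(Phi, T):
    """Solve the linear least-squares problem min_W ||Phi @ W - T|| via
    QR decomposition, avoiding an explicit pseudo-inverse (forming
    Phi^T Phi would square the condition number)."""
    Q, R = np.linalg.qr(Phi)            # reduced QR: Phi = Q R
    return np.linalg.solve(R, Q.T @ T)  # solve the triangular system R W = Q^T T

# hypothetical sizes: 6 samples, 3 basis functions, 2 output classes
rng = np.random.default_rng(42)
Phi = rng.normal(size=(6, 3))
W_true = rng.normal(size=(3, 2))
W = output_weights_via_qr(Phi, Phi @ W_true)
```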

519
Fast Learning in Networks of Locally-Tuned Processing Units
- Moody, Darken
- 1989

484
Multivariate functional interpolation and adaptive networks
- Broomhead, Lowe
- 1988
Citation Context: ...at the two problems should be addressed simultaneously to achieve the best classification results. In this article we start with the following assumptions. 1) Radial basis function networks (RBF, [1]–[4]) are used for classification. Here, these neural networks are trained to estimate posterior probabilities of class membership by means of mixtures of Gaussian basis functions and hyperplanes. From a ...
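A minimal forward pass matching this description (Gaussian bumps in the hidden layer, a linear output layer) might look like the following sketch; all names and shapes are illustrative assumptions.

```python
import numpy as np

def rbf_forward(x, centers, radii, W, b):
    """Gaussian RBF network output for one input vector x: hidden
    activations are Gaussians around the centers; the output layer is
    linear (the 'hyperplanes' in the description above)."""
    sq_dist = np.sum((centers - x) ** 2, axis=1)
    phi = np.exp(-sq_dist / (2.0 * radii ** 2))  # one activation per center
    return phi @ W + b
```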

466
Evolving artificial neural networks
- Yao
- 1999
Citation Context: ...tive (pruning techniques), or hybrid. The subject of data mining and knowledge discovery with EA is addressed in [14]. The combination of EA and neural networks in general is investigated in [15] and [16]. Here, we discuss examples of the combination of EA and RBF networks. Altogether, we investigated 64 publications ([17]–[80]) where EA or closely related techniques (genetic algorithms, evolution str...

292
A Data Mining Framework for Building Intrusion Detection Model
- Lee, Stolfo, et al.
- 1999
Citation Context: ...ssible. Typically (see Figs. 9 and 10), the EA converges after about 10–20 cycles for this application and parameter setting. The DARPA data set has been used by many researchers up to now (see [110]–[112], for instance). It was also the basis for the KDD Cup 1999 that has been won by an approach based on decision trees with a combination of bagging and boosting (overall classification error 7.67%) [11...

286
Feature Selection for Knowledge Discovery and Data Mining
- Liu, Motoda
- 1998
Citation Context: ...s survey will motivate a new approach. A. Related Work This article focuses on “feature selection” and “model selection” for RBF networks. Potential feature selection algorithms are described in [9], [10]. In general, filter and wrapper approaches can be distinguished. The problem of model selection for neural networks is discussed in [11]–[13] in greater detail. Usually, these techniques are categori...

208
A theory of networks for approximation and learning
- Poggio, Girosi
- 1989
Citation Context: ...I. CLASSIFICATION WITH RADIAL BASIS FUNCTION NETWORKS Radial basis function (RBF) networks combine a number of different concepts from approximation theory, clustering, and neural network theory [1], [2], [11]. A key advantage of RBF networks for practitioners is the clear and understandable interpretation of the functionality of basis functions. Also, fuzzy rules may be extracted from RBF networks (...

205
Proben1: A set of neural network benchmark problems and benchmarking rules
- Prechelt
- 1994
Citation Context: ...rily validated by only one single data set. Apart from the investigations shown here we also carried out experiments with generic data and data taken from the Proben1 or the UCI benchmark collections [104], [105]. The final outcome of the EA is the best network as described in Section IV-C2. Therefore, the classification error rates of this network (or lift factors in the case of DM) on validation data...

173
A review of evolutionary artificial neural networks
- Yao
- 1993
Citation Context: ...a network architecture—such as the existence of single connections or the assignment of a feature to a specific input node—can be seen as a high-level specification scheme (weak encoding scheme) [89]–[91]. An overview of various neural network encoding strategies is provided in [92]. 2) Initialization of the Population: The feature vector and the number of hidden nodes of individuals in the initial po...

161
The 1999 DARPA off-line intrusion detection evaluation
- Lippmann, Haines, et al.
- 2000
Citation Context: ...uch as misuse detection or anomaly detection. The most important evaluations of IDS performed up to now were supported by the Defense Advanced Research Projects Agency (DARPA) in 1998 and 1999 [108], [109]. Here, the data of the first DARPA IDS evaluation is used. RBF networks are applied to classify a communication as being either a certain attack or normal behavior (the latter includes other attack t...

145
Combinations of Genetic Algorithms and Neural Networks: A Survey of the State of the Art
- Schaffer, Whitley, et al.
- 1992
Citation Context: ..., destructive (pruning techniques), or hybrid. The subject of data mining and knowledge discovery with EA is addressed in [14]. The combination of EA and neural networks in general is investigated in [15] and [16]. Here, we discuss examples of the combination of EA and RBF networks. Altogether, we investigated 64 publications ([17]–[80]) where EA or closely related techniques (genetic algorithms, evol...

135
Data Mining and Knowledge Discovery with Evolutionary Algorithms
- Freitas
- 2002
Citation Context: ...e techniques are categorized as being either constructive (growing techniques), destructive (pruning techniques), or hybrid. The subject of data mining and knowledge discovery with EA is addressed in [14]. The combination of EA and neural networks in general is investigated in [15] and [16]. Here, we discuss examples of the combination of EA and RBF networks. Altogether, we investigated 64 publication...

135
Evaluating Intrusion Detection Systems: The 1998 DARPA Off-line Intrusion Detection Evaluation
- Lippmann, Fried, et al.
- 2000
Citation Context: ...are possible. Typically (see Figs. 9 and 10), the EA converges after about 10–20 cycles for this application and parameter setting. The DARPA data set has been used by many researchers up to now (see [110]–[112], for instance). It was also the basis for the KDD Cup 1999 that has been won by an approach based on decision trees with a combination of bagging and boosting (overall classification error 7.67...

101
Neural Networks: A Comprehensive Foundation
- Haykin
- 1999
Citation Context: ...s that the two problems should be addressed simultaneously to achieve the best classification results. In this article we start with the following assumptions. 1) Radial basis function networks (RBF, [1]–[4]) are used for classification. Here, these neural networks are trained to estimate posterior probabilities of class membership by means of mixtures of Gaussian basis functions and hyperplanes. Fro...

101
Feature Extraction, Construction and Selection: A Data Mining Perspective
- Liu, Motoda
- 1998
Citation Context: ...f this survey will motivate a new approach. A. Related Work This article focuses on “feature selection” and “model selection” for RBF networks. Potential feature selection algorithms are described in [9], [10]. In general, filter and wrapper approaches can be distinguished. The problem of model selection for neural networks is discussed in [11]–[13] in greater detail. Usually, these techniques are ca...

74
Pruning algorithms - a survey
- Reed
- 1993

73
Genetic evolution of the topology and weight distribution of neural networks
- Maniezzo
- 1994
Citation Context: ...s of a network architecture—such as the existence of single connections or the assignment of a feature to a specific input node—can be seen as a high-level specification scheme (weak encoding scheme) [89]–[91]. An overview of various neural network encoding strategies is provided in [92]. 2) Initialization of the Population: The feature vector and the number of hidden nodes of individuals in the initi...

64
Intrusion detection using neural networks and support vector machines
- Mukkamala, Janoski, et al.

50
Testing and evaluating computer intrusion detection systems
- Durst, Champion, et al.
- 1999
Citation Context: ...ction such as misuse detection or anomaly detection. The most important evaluations of IDS performed up to now were supported by the Defense Advanced Research Projects Agency (DARPA) in 1998 and 1999 [108], [109]. Here, the data of the first DARPA IDS evaluation is used. RBF networks are applied to classify a communication as being either a certain attack or normal behavior (the latter includes other a...

49
Cooperative-competitive genetic evolution of Radial Basis Function centers and widths for time series prediction
- Whitehead, Choate
- 1996
Citation Context: ...1) The computation of weights (particularly centers and radii of basis functions and/or output weights) by EA is suggested in the majority of the publications. The centers are optimized in [17]–[20], [23], [28]–[33], [37]–[42], [45], [46], [48]–[55], [57], [58], [62], [63], [65]–[71], and [73]–[79], for instance. Radii are considered in [17], [19]–[23], [26]–[28], [30], [32], [38], [45], [46], [50], [...

36
Combined genetic algorithm optimization and regularized orthogonal least squares learning for radial basis function networks
- Chen, Wu, et al.
- 1999
Citation Context: ...[71], [73], [75], and [77], for instance. Other architecture parameters are the type of basis functions [57], [58], the training time (epoch number) [36], and parameters of training algorithms [26], [27], [36], [63]. Examples of such parameters are learning rate and momentum or a regularization parameter used for training with regularized orthogonal least squares training. 3) Feature selection for RB...

33
Radial basis function network configuration using mutual information and the orthogonal least squares algorithm
- Zheng, Billings
- 1996
Citation Context: ...ptimized in about half of the publications. The number of centers (hidden neurons) is considered in [17]–[20], [25], [29]–[31], [33], [40], [42], [45], [46], [50], [52], [54], [57], [62], [67], [68], [70], [71], [73], [75], and [77], for instance. Other architecture parameters are the type of basis functions [57], [58], the training time (epoch number) [36], and parameters of training algorithms [26],...

30
Winning the KDD99 classification cup: Bagged boosting
- Pfahringer
Citation Context: ...12], for instance). It was also the basis for the KDD Cup 1999 that has been won by an approach based on decision trees with a combination of bagging and boosting (overall classification error 7.67%) [113]. A direct comparison, however, would not be fair because many additional attack types had to be detected etc. Our own work with this data set includes a comparison of various neural and fuzzy classif...

26
Understanding interactions among genetic algorithm parameters, in Foundations of Genetic Algorithms
- Deb, Agrawal
- 1998
Citation Context: ...ntation of an individual is a pivotal step as the representation interacts highly with the choice of evolutionary operators. Thus, the representation has a commensurate impact on the success of an EA [88]. Here, the genotype of an individual is represented by the following. 1) The feature vector: This vector is a binary vector, where each bit indicates the presence (“1”) or absence (“0”) of one of the...
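The representation sketched in this context (a binary feature vector plus structural parameters) can be illustrated as follows. The field names, the hidden-node field, and the mutation operator are assumptions for illustration, not the paper's exact encoding.

```python
import random
from dataclasses import dataclass

@dataclass
class Genotype:
    """Illustrative genotype: a binary feature mask plus a model-selection
    parameter (number of hidden RBF nodes); names are hypothetical."""
    feature_mask: list  # bit i: 1 = feature i fed to the classifier, 0 = absent
    n_hidden: int       # number of hidden (RBF) nodes

def mutate_features(g: Genotype, flip_prob: float, rng=random) -> Genotype:
    """Bit-flip mutation on the feature vector; each bit is flipped
    independently with probability flip_prob."""
    mask = [bit ^ 1 if rng.random() < flip_prob else bit
            for bit in g.feature_mask]
    return Genotype(mask, g.n_hidden)
```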

25
Evolving Space-Filling Curves to Distribute Radial Basis Functions Over an Input Space
- Whitehead, Choate
- 1994
Citation Context: ...raint: Primary objectives of a penalty on the training time are to reduce the overall runtime of the EA and to make the network structures smaller (indirect penalty of feature and center numbers, cf. [63]). As a consequence, more solutions can be investigated with the same temporal effort. It can be expected that with a severe penalty the error rates on validation and test data increase but it is not ...

24
Multiobjective evolutionary optimization of the size, shape, and position parameters of radial basis function networks for function approximation
- González, Rojas, et al.
- 2003
Citation Context: ...f basis functions and/or output weights) by EA is suggested in the majority of the publications. The centers are optimized in [17]–[20], [23], [28]–[33], [37]–[42], [45], [46], [48]–[55], [57], [58], [62], [63], [65]–[71], and [73]–[79], for instance. Radii are considered in [17], [19]–[23], [26]–[28], [30], [32], [38], [45], [46], [50], [51], [54], [56], [58], [60], [62], [63], [65]–[69], [71], and [...

22
Tuning a Neuro-Fuzzy Controller by Genetic Algorithm
- Seng, Khalid, et al.
- 1999
Citation Context: ...neural networks [57], RBF networks with dynamic receptive fields [59], beta basis function neural networks [65], [66], functional link networks [38], or a neuro-fuzzy controller based on RBF networks [51]. 1) The computation of weights (particularly centers and radii of basis functions and/or output weights) by EA is suggested in the majority of the publications. The centers are optimized in [17]–[20]...

22
Three learning phases for radial-basis-function networks
- Schwenker, Kestler, Palm
- 2001
Citation Context: ...n combination with methods for the solution of linear least-squares problems (e.g., k-means and singular value decomposition). An overview of various training methods for RBF networks is given in [85]–[87]. For a classification problem with a set of classes ..., each class is typically assigned its own output neuron (i.e., ... and ...). For training purposes, an orthogonal representation of classes is used at th...

20
Evolutionäre Algorithmen: Verfahren, Operatoren, Hinweise aus der Praxis [Evolutionary Algorithms: Methods, Operators, Practical Guidelines]
- Pohlheim
- 1999
Citation Context: ...tor of a network is defined by ... and the basis fitness is ... 4) Selection for Reproduction: Two selection mechanisms are used here in combination (see Section IV-C3). Stochastic universal sampling (SUS) [96] prefers fitter individuals but also gives a chance to worse individuals. With elitist selection, only the fittest individuals are taken...
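The stochastic universal sampling mechanism mentioned in this context can be sketched compactly; the function name is an illustrative assumption.

```python
import numpy as np

def sus_select(fitness, n, rng=None):
    """Stochastic universal sampling: place n equally spaced pointers on
    the cumulative-fitness wheel with a single random offset. Fitter
    individuals are selected more often, but weaker ones keep a chance
    proportional to their fitness."""
    rng = rng or np.random.default_rng()
    cum = np.cumsum(np.asarray(fitness, dtype=float))
    step = cum[-1] / n
    pointers = rng.uniform(0.0, step) + step * np.arange(n)
    # side="right" so a pointer landing exactly on a boundary moves past
    # zero-fitness individuals
    return np.searchsorted(cum, pointers, side="right")
```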

18
Measuring lift quality in database marketing, ACM SIGKDD Explorations Newsletter 2(2) (2000), pp. 76–80
- Piatetsky-Shapiro, Steingold
- 1998
Citation Context: ...r the higher, more interesting percentile of scores is defined by a so-called cut-off point. The lift factor is then used to assess the suitability of an individual (network) for an application [94], [95]. Here, the lift factor is defined by the quotient of the response rate of the trained network for a certain class and the percentage of input patterns associated with this class in the overall set of...
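The quotient described in this context can be sketched directly; the 20% default cut-off below is an arbitrary illustrative choice, not a value from the paper.

```python
def lift_factor(scores, responded, cutoff=0.2):
    """Lift at a cut-off point: the response rate inside the top `cutoff`
    fraction of network scores, divided by the response rate over the
    whole pattern set (the quotient described in the context above)."""
    n_top = max(1, int(len(scores) * cutoff))
    # rank patterns by network score, highest first
    ranked = sorted(zip(scores, responded), reverse=True)
    top_rate = sum(r for _, r in ranked[:n_top]) / n_top
    base_rate = sum(responded) / len(responded)
    return top_rate / base_rate
```

A lift of 1.0 means the network ranks no better than random selection; higher is better.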

16
Second-Order Methods for Neural Networks
- Shepherd
- 1997
Citation Context: ...conjugate gradient (SCG) learning algorithm is applied [101]. Basically, SCG is a very fast and efficient combination of a Conjugate Gradient algorithm and a Model Trust Region approach (cf. [11] and [102]). Also, any first-order (such as Backpropagation or Resilient Propagation) or second-order algorithm (such as conjugate gradient techniques) may be applied. A similar training concept (three-phase le...

15
Biometrics: Advanced Identity Verification: The Complete Guide
- Ashbourn
Citation Context: ...tion methods belong to another, very important category of techniques that ensure data and information security. Here, an example in the field of signature verification is investigated (cf. [116] and [117]). A person provides her/his signature together with a personal identification number (PIN) and a verification system must decide whether the signature corresponds to this PIN or not. Possible applica...

13
RBF neural network, basis functions and genetic algorithm
- Maillard, Gueriot
- 1997
Citation Context: ...adii of basis functions and/or output weights) by EA is suggested in the majority of the publications. The centers are optimized in [17]–[20], [23], [28]–[33], [37]–[42], [45], [46], [48]–[55], [57], [58], [62], [63], [65]–[71], and [73]–[79], for instance. Radii are considered in [17], [19]–[23], [26]–[28], [30], [32], [38], [45], [46], [50], [51], [54], [56], [58], [60], [62], [63], [65]–[69], [71],...

10
Introduction to Scientific Data Mining: Direct Kernel Methods & Applications, in Computationally Intelligent Hybrid Systems: The Fusion ...
- Embrechts
Citation Context: ...te posterior probabilities of class membership by means of mixtures of Gaussian basis functions and hyperplanes. From a structural viewpoint, RBF networks are closely related to direct kernel methods [5] and support ...

10
Detection of small objects in clutter using a GA-RBF neural network
- Leung, Dubash, et al.
- 2002
Citation Context: ...], [23], [28]–[33], [37]–[42], [45], [46], [48]–[55], [57], [58], [62], [63], [65]–[71], and [73]–[79], for instance. Radii are considered in [17], [19]–[23], [26]–[28], [30], [32], [38], [45], [46], [50], [51], [54], [56], [58], [60], [62], [63], [65]–[69], [71], and [74]–[79], and weights in the output layer in [18], [19], [51], [56], [57], [60], and [61]. Particular parameters of basis function typ...

10
Extracting interpretable fuzzy rules from RBF networks
- Jin, Sendhoff
- 2003
Citation Context: ...1]. A key advantage of RBF networks for practitioners is the clear and understandable interpretation of the functionality of basis functions. Also, fuzzy rules may be extracted from RBF networks (cf. [84]) for deployment in an expert system. The RBF networks used here may be defined as follows (see Fig. 1) [13]. 1) RBF networks have three layers of nodes: input layer, hidden layer, and output layer...

9
Neuronale Netze: Optimierung durch Lernen und Evolution [Neural Networks: Optimization through Learning and Evolution]
- Braun
- 1997
Citation Context: ...mbination of EA and neural networks in general is investigated in [15] and [16]. Here, we discuss examples of the combination of EA and RBF networks. Altogether, we investigated 64 publications ([17]–[80]) where EA or closely related techniques (genetic algorithms, evolution strategies, or immunity-based approaches, for instance) are applied to optimize RBF networks or closely related paradigms in som...

8
Effectiveness of feature extraction in neural network architectures for novelty detection
- Addison, Wermter, et al.
- 1999
Citation Context: ...instance) are applied to optimize RBF networks or closely related paradigms in some respect. Related paradigms are hyper basis function networks [21], probabilistic neural networks [22], [29], [31], [47], second-order multilayer perceptrons (MLP) [36], hybrid RBF-MLP networks [25], [49], Volterra polynomial basis function networks [40], [52], projection neural networks [57], RBF networks with dynamic...

8
Genetically optimized neuro-fuzzy IPFC for damping modal oscillations of power system
- Mishra, Dash, et al.
- 2002
Citation Context: ...3], [26]–[28], [30], [32], [38], [45], [46], [50], [51], [54], [56], [58], [60], [62], [63], [65]–[69], [71], and [74]–[79], and weights in the output layer in [18], [19], [51], [56], [57], [60], and [61]. Particular parameters of basis function types (e.g., shaping parameters) are optimized in [57], [59], [65], and [66]. Weights that are not optimized by an EA are adjusted by means of various other t...

8
Network Intrusion Detection, 3rd ed.
- Northcutt, Novak
- 2003
Citation Context: ...ion security. Various techniques such as authentication (see the following subsection), data encryption, firewalls, or intrusion detection systems (IDS) help to protect against attacks (cf. [106] and [107]). For more than a decade, the utilization of soft...

8
Predictive Modeling in Automotive Direct Marketing: Tools, Experiences and Open Issues
- Gersten, Wirth, et al.
- 2000
Citation Context: ...ning application investigated here arose in the Sales Department of DaimlerChrysler AG, when a direct mailing campaign targeting the launch of the new Mercedes Benz E-Class was planned (cf. [124] and [125]). Promising addressees had to be selected on the basis of so-called micro-geographical data, i.e., aggregated information on small geographical units (micro-cells) such as the size of the city, the s...