Results 1–8 of 8
A Comparison of Prediction Accuracy, Complexity, and Training Time of Thirty-three Old and New Classification Algorithms
, 2000
Cited by 225 (8 self)

Abstract:
Twenty-two decision tree, nine statistical, and two neural network algorithms are compared on thirty-two datasets in terms of classification accuracy, training time, and (in the case of trees) number of leaves. Classification accuracy is measured by mean error rate and mean rank of error rate. Both criteria place a statistical, spline-based algorithm called Polyclass at the top, although it is not statistically significantly different from twenty other algorithms. Another statistical algorithm, logistic regression, is second with respect to the two accuracy criteria. The most accurate decision tree algorithm is Quest with linear splits, which ranks fourth and fifth, respectively. Although spline-based statistical algorithms tend to have good accuracy, they also require relatively long training times. Polyclass, for example, is third last in terms of median training time. It often requires hours of training compared to seconds for other algorithms. The Quest and logistic regression algor...
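The two accuracy criteria used in the study can be illustrated with a short sketch (the classifier names and error rates below are hypothetical, not the study's data): mean error rate averages a classifier's error over the datasets, while mean rank of error rate ranks the classifiers within each dataset and averages those ranks.

```python
# Sketch of the two accuracy criteria: mean error rate and mean rank of
# error rate, computed for hypothetical classifiers over several datasets.
def mean_error_and_rank(error_rates):
    """error_rates: dict mapping classifier name -> list of error rates,
    one per dataset (all lists the same length)."""
    names = list(error_rates)
    n_datasets = len(next(iter(error_rates.values())))
    mean_error = {c: sum(e) / n_datasets for c, e in error_rates.items()}
    # Rank classifiers within each dataset (1 = lowest error); ties share
    # the average of the tied ranks, as is standard for rank comparisons.
    rank_sums = {c: 0.0 for c in names}
    for d in range(n_datasets):
        col = sorted(names, key=lambda c: error_rates[c][d])
        i = 0
        while i < len(col):
            j = i
            while j + 1 < len(col) and error_rates[col[j + 1]][d] == error_rates[col[i]][d]:
                j += 1
            avg_rank = (i + 1 + j + 1) / 2  # average of tied positions
            for k in range(i, j + 1):
                rank_sums[col[k]] += avg_rank
            i = j + 1
    mean_rank = {c: rank_sums[c] / n_datasets for c in names}
    return mean_error, mean_rank

# Hypothetical error rates for three classifiers on four datasets:
errs = {"polyclass": [0.10, 0.20, 0.15, 0.05],
        "logistic":  [0.12, 0.18, 0.15, 0.07],
        "tree":      [0.15, 0.25, 0.20, 0.06]}
me, mr = mean_error_and_rank(errs)
```

Note that the two criteria can disagree: a classifier with one catastrophic dataset can have a poor mean error rate yet a good mean rank, which is why the study reports both.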
Support vector machines for speech recognition
 Proceedings of the International Conference on Spoken Language Processing
, 1998
Cited by 114 (2 self)

Abstract:
Statistical techniques based on hidden Markov Models (HMMs) with Gaussian emission densities have dominated signal processing and pattern recognition literature for the past 20 years. However, HMMs trained using maximum likelihood techniques suffer from an inability to learn discriminative information and are prone to overfitting and overparameterization. Recent work in machine learning has focused on models, such as the support vector machine (SVM), that automatically control generalization and parameterization as part of the overall optimization process. In this paper, we show that SVMs provide a significant improvement in performance on a static pattern classification task based on the Deterding vowel data. We also describe an application of SVMs to large vocabulary speech recognition, and demonstrate an improvement in error rate on a continuous alphadigit task (OGI Alphadigits) and a large vocabulary conversational speech task (Switchboard). Issues related to the development and optimization of an SVM/HMM hybrid system are discussed.
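The margin idea at the heart of the SVM can be sketched in a few lines (a simplified illustration, not the paper's SVM/HMM hybrid): a linear classifier is updated whenever a training point falls inside the margin, i.e. its functional margin is below 1. A full SVM additionally penalizes the weight norm and is usually solved in the dual with kernels; those parts are omitted here, and the toy data below stands in for a static pattern task, not the Deterding vowel set.

```python
# Hinge-loss ("perceptron with margin") sketch of large-margin training.
def train_margin_classifier(X, y, lr=0.1, epochs=200):
    """X: feature vectors; y: labels in {-1, +1}. Updates whenever a
    point lies on the wrong side of the margin (functional margin < 1)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # functional margin y * (w.x + b); update if inside the margin
            if yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b) < 1:
                w = [wj + lr * yi * xj for wj, xj in zip(w, xi)]
                b += lr * yi
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Linearly separable toy data (two features, two classes):
X = [[1.0, 2.0], [2.0, 1.5], [1.5, 1.8], [4.0, 4.5], [5.0, 4.0], [4.5, 5.0]]
y = [-1, -1, -1, 1, 1, 1]
w, b = train_margin_classifier(X, y)
```

On separable data this update rule stops changing once every point has margin at least 1, which is the "control of generalization" intuition the abstract refers to in its simplest form.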
MDL-based Decision Tree Pruning
, 1995
Cited by 68 (1 self)

Abstract:
This paper explores the application of the Minimum Description Length principle for pruning decision trees. We present a new algorithm that intuitively captures the primary goal of reducing the misclassification error. An experimental comparison is presented with three other pruning algorithms. The results show that the MDL pruning algorithm achieves good accuracy, small trees, and fast execution times.

Introduction. Construction or "induction" of decision trees from examples has been the subject of extensive research in the past [Breiman et al. 84, Quinlan 86]. It is typically performed in two steps. First, training data is used to grow a decision tree. Then in the second step, called pruning, the tree is reduced to prevent "overfitting". There are two broad classes of pruning algorithms. The first class includes algorithms like cost-complexity pruning [Breiman et al. 84], that use a separate set of samples for pruning, distinct from the set used to grow the tree. In many cases, ...
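The MDL pruning decision can be sketched as a two-part code-length comparison (the coding scheme below is a simplified illustration, not the paper's exact one, and the 12-bit split cost is a hypothetical figure): a subtree is collapsed to a leaf whenever the leaf's description length is no larger than the subtree's.

```python
import math

# Two-part MDL sketch: cost of a model (tree structure) plus cost of the
# data given the model (which training examples the tree misclassifies).

def leaf_cost(n, errors):
    """Bits to state the majority class (1 bit for two classes), encode the
    error count (log2(n+1)), and identify which of the n examples are
    misclassified (log2 of n-choose-errors)."""
    return 1.0 + math.log2(n + 1) + math.log2(math.comb(n, errors))

def should_prune(split_bits, child_costs, n, errors_as_leaf):
    """Prune if a single leaf is cheaper to encode than the subtree
    (its split description plus the cost of each child leaf)."""
    subtree_cost = split_bits + sum(child_costs)
    return leaf_cost(n, errors_as_leaf) <= subtree_cost

# A node with 100 examples that would misclassify 8 of them as a leaf; its
# two children are pure over 60 and 40 examples, and the split itself takes
# a hypothetical 12 bits to describe.
prune = should_prune(12.0, [leaf_cost(60, 0), leaf_cost(40, 0)], 100, 8)
```

In this example the subtree is kept: identifying 8 misclassified examples out of 100 costs far more bits than the split plus two pure leaves, which matches the intuition that pruning should only remove structure that fails to pay for itself in reduced error.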
An Empirical Comparison of Decision Trees and Other Classification Methods
, 1998
Cited by 17 (1 self)

Abstract:
Twenty-two decision tree, nine statistical, and two neural network classifiers are compared on thirty-two datasets in terms of classification error rate, computational time, and (in the case of trees) number of terminal nodes. It is found that the differences among the average error rates of a majority of the classifiers are not statistically significant, but the computational times of the classifiers differ over a wide range. The statistical POLYCLASS classifier, based on a logistic regression spline algorithm, has the lowest average error rate. However, it is also one of the most computationally intensive. The classifier based on standard polytomous logistic regression and a decision tree classifier using the QUEST algorithm with linear splits have the second lowest average error rates and are about 50 times faster than POLYCLASS. Among decision tree classifiers with univariate splits, the classifiers based on the C4.5, INDCART, and QUEST algorithms have the best combination of error rate and speed, althoug...
Stochastic Complexity and Its Applications
 In Workshop on Model Uncertainty and Model Robustness. Online
Cited by 3 (0 self)

Abstract:
Introduction. One can make a strong case that an unsurpassed model of any set of observations, generated by some physical machinery, is provided by the shortest program for a universal computer with which the data can be reproduced. Indeed, such a program must take advantage of all the constraints that the data have, and hence it will capture the relevant properties of the machinery, provided of course that the data set is large enough to reflect them. Unfortunately, such a program, or even its length, the celebrated Kolmogorov complexity, cannot be found by algorithmic means, which has the devastating implication that, even though we can estimate it from above, we cannot assess the goodness of the estimate. And this puts an end to the dreams of basing inductive inference on the Kolmogorov complexity. The problem of non-computability can be overcome, while retaining the idea of measuring the strength of constraints by code length, if we select a smaller class of `codes' as `
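The move from uncomputable Kolmogorov complexity to a restricted class of codes can be illustrated with a toy two-part code (an assumption chosen for illustration, not Rissanen's full stochastic-complexity formulation): a binary string is coded either literally, at one bit per symbol, or by first stating its number of ones and then which arrangement it is; the resulting code length measures how strongly constrained the data is.

```python
import math

# Toy two-part code for binary strings: model = the count of ones k,
# data given model = the index of the string among all strings with k ones.

def literal_bits(s):
    """Baseline: one bit per symbol, no model at all."""
    return len(s)

def two_part_bits(s):
    """log2(n+1) bits for the count k, then log2(n choose k) bits to pick
    the particular arrangement of ones among all strings with that count."""
    n, k = len(s), s.count("1")
    return math.log2(n + 1) + math.log2(math.comb(n, k))

constrained = "1" * 28 + "0" * 4  # highly regular: almost all ones
balanced = "10" * 16              # k = n/2: the least constrained count
```

The regular string compresses below its literal length under the two-part code, while the balanced one does not: the count k carries almost no constraint when k is near n/2, so the code honestly reports that no structure was found.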
Towards Practical Machine Learning Techniques
 in Proceedings of the First Congress on Computing in Civil Engineering
, 1994
Cited by 3 (0 self)

Abstract:
Most research on the application of machine learning to engineering problems has solved artificial problems. While such research has claimed results that would improve practice, these results have never been put to work by engineers themselves in solving their problems. A different approach to research on machine learning applications is presented, and a system design that may result in tangible practical results is outlined. The development of this system is underway.

INTRODUCTION. In order to make machine learning (ML) techniques usable for engineers, a methodological shift is required in the way ML research is perceived, planned, and executed. Past investigations that dealt with the development of ML techniques for solving engineering problems mostly developed ideas that were tested on simplified artificial problems. Thus, researchers could not demonstrate that their ideas had practical implications. With no direct connection between research and practice, researchers w...
Modeling and Debugging Engineering Decision Procedures With Machine Learning
, 1996
Abstract:
This paper reports on the use of machine learning systems for modeling existing engineering decision procedures. In this activity, various models of an existing decision procedure are constructed by using different machine learning systems as well as by changing their operational parameters and input. Individual models serve to focus on different aspects of the decision procedure, and their combined use thus improves the understanding of the decision procedure, which, in turn, can assist in its evaluation and subsequent debugging and improvement. This important modeling role of machine learning systems is exemplified by modeling an existing decision procedure that is used by engineers in selecting among available techniques for modeling groundwater flow and contaminant transport in a process of environmental decision making. This decision procedure was corrected and improved in the course of this work. The example demonstrates the practical utility of the modeling role of machine learnin...