Results 1–10 of 2,917
Linear Regression in Regression Tree Leaves
In Proceedings of ECAI-92, 1992
"... The advantage of using linear regression in the leaves of a regression tree is analysed in the paper. It is shown how this modification affects the construction, pruning and interpretation of a regression tree. The modification is tested on artificial and real-life domains where its impact on ..."
Cited by 77 (0 self)
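The model-tree idea in the entry above, a regression tree whose leaves hold linear fits rather than constant predictions, can be sketched in a few lines. Everything below is an illustrative toy, not the paper's algorithm: a single split on one numeric feature is chosen to minimise the summed squared error of the two linear leaf fits, and all function names are made up.

```python
def fit_line(xs, ys):
    """Closed-form simple linear regression: returns (slope, intercept)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    if sxx == 0:
        return 0.0, my
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return slope, my - slope * mx

def sse_of_linear_fit(xs, ys):
    """Residual sum of squares of the best linear fit on (xs, ys)."""
    a, b = fit_line(xs, ys)
    return sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))

def fit_model_tree(xs, ys):
    """Depth-1 'model tree': pick the split minimising the leaves' linear-fit SSE."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    xs = [xs[i] for i in order]
    ys = [ys[i] for i in order]
    best = None
    for k in range(2, len(xs) - 1):  # keep at least 2 points per leaf
        cost = sse_of_linear_fit(xs[:k], ys[:k]) + sse_of_linear_fit(xs[k:], ys[k:])
        if best is None or cost < best[0]:
            best = (cost, xs[k])
    _, threshold = best
    left = [(x, y) for x, y in zip(xs, ys) if x < threshold]
    right = [(x, y) for x, y in zip(xs, ys) if x >= threshold]
    left_fit = fit_line([x for x, _ in left], [y for _, y in left])
    right_fit = fit_line([x for x, _ in right], [y for _, y in right])

    def predict(x):
        a, b = left_fit if x < threshold else right_fit
        return a * x + b
    return predict

# Piecewise-linear target: y = x on [0, 5), y = 10 - x on [5, 10)
data_x = [i / 2 for i in range(20)]
data_y = [x if x < 5 else 10 - x for x in data_x]
model = fit_model_tree(data_x, data_y)
```

On this piecewise-linear target the single split recovers both segments exactly; a full model tree would apply the same criterion recursively and add the pruning the abstract mentions.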
Structural Regression Trees
1996
"... In many real-world domains the task of machine learning algorithms is to learn a theory predicting numerical values. In particular several standard test domains used in Inductive Logic Programming (ILP) are concerned with predicting numerical values from examples and relational and mostly non-determinate background knowledge. However, so far no ILP algorithm except one can predict numbers and cope with non-determinate background knowledge. (The only exception is a covering algorithm called FORS.) In this paper we present Structural Regression Trees (SRT), a new algorithm which can be applied to the above ..."
Cited by 67 (10 self)
Kernel Regression Trees
 Proceedings of the poster papers of the European Conference on Machine Learning. University of Economics, Faculty of Informatics and Statistics
"... This paper presents a novel method for learning in domains with continuous target variables. The method integrates regression trees with kernel regression models. The integration is done by adding kernel regressors at the tree leaves producing what we call kernel regression trees. The approach is mo ..."
Cited by 9 (1 self)
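The leaf-level estimator in kernel regression trees is ordinary kernel (Nadaraya-Watson) regression over the training cases stored at a leaf. A minimal sketch of that estimator, with a Gaussian kernel and an illustrative bandwidth not taken from the paper:

```python
import math

def nadaraya_watson(train, x, bandwidth=1.0):
    """Gaussian-kernel (Nadaraya-Watson) regression estimate at query x.

    `train` is a list of (x_i, y_i) pairs; the prediction is a weighted
    average of the y_i, weighted by a Gaussian kernel on the distance |x - x_i|.
    """
    weights = [math.exp(-((x - xi) ** 2) / (2 * bandwidth ** 2)) for xi, _ in train]
    total = sum(weights)
    return sum(w * yi for w, (_, yi) in zip(weights, train)) / total

# Illustrative leaf: three training cases stored at one tree leaf.
leaf_cases = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
estimate = nadaraya_watson(leaf_cases, 1.0)
```

In a kernel regression tree, each leaf would call this with only its own subset of the training data, so the kernel smooths locally within the partition the tree has found.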
Piecewise-polynomial regression trees
Statistica Sinica, 1994
"... A nonparametric function estimation method called SUPPORT (“Smoothed and Unsmoothed Piecewise-Polynomial Regression Trees”) is described. The estimate is typically made up of several pieces, each piece being obtained by fitting a polynomial regression to the observations in a subregion of the data ..."
Cited by 51 (8 self)
Pruning regression trees with MDL
1998
"... Pruning is a method for reducing the error and complexity of induced trees. There are several approaches to pruning decision trees, while regression trees have attracted less attention. We propose a method for pruning regression trees based on the sound foundations of the MDL principle. We develop ..."
Quantile Regression Trees
2011
"... Robust approaches to data mining form a crucial part of data mining methods. We propose a quantile regression tree method that is more robust to outliers than standard regression trees that use a splitting criterion based on the sum of squared errors. The splitting criterion we propose is based on a ..."
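The key change described above, replacing the sum-of-squared-errors split criterion with a quantile-based one, can be sketched with the pinball (quantile) loss. This is a hedged toy for a single numeric feature, using a crude empirical quantile; the paper's actual criterion may differ:

```python
def pinball_loss(ys, q):
    """Sum of quantile (pinball) losses around a crude empirical q-quantile."""
    pred = sorted(ys)[int(q * (len(ys) - 1))]
    return sum(q * (y - pred) if y >= pred else (1 - q) * (pred - y) for y in ys)

def best_split(xs, ys, q=0.5):
    """Threshold on one numeric feature minimising the children's pinball loss."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    xs = [xs[i] for i in order]
    ys = [ys[i] for i in order]
    best = None
    for k in range(1, len(xs)):
        cost = pinball_loss(ys[:k], q) + pinball_loss(ys[k:], q)
        if best is None or cost < best[0]:
            best = (cost, xs[k])
    return best[1]

# Two regimes in y; with q = 0.5 the split lands between them.
threshold = best_split([1, 2, 3, 10, 11, 12], [1, 1, 1, 9, 9, 9])
```

With q = 0.5 this scores each child by absolute deviation from its median rather than squared deviation from its mean, which is what makes the criterion less sensitive to extreme target values.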
Boosting and instability for regression trees
2002
"... The AdaBoost-like algorithm for boosting CART regression trees is considered. The boosting predictors sequence is analyzed on various data sets and the behaviour of the algorithm is investigated. An instability index of a given estimation method with respect to some training sample is defined. Based ..."
Cited by 5 (2 self)
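The flavour of boosting regression trees can be shown with a deliberately simplified residual-fitting loop over depth-1 trees (stumps). Note this is gradient-boosting-style residual fitting, not the AdaBoost-like reweighting scheme the entry refers to; the learning rate and all names are illustrative:

```python
def fit_stump(xs, ys):
    """Depth-1 regression tree: a threshold plus two leaf means, chosen by SSE."""
    best = None
    for t in sorted(set(xs))[1:]:
        left = [y for x, y in zip(xs, ys) if x < t]
        right = [y for x, y in zip(xs, ys) if x >= t]
        ml = sum(left) / len(left)
        mr = sum(right) / len(right)
        sse = sum((y - ml) ** 2 for y in left) + sum((y - mr) ** 2 for y in right)
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x < t else mr

def boost(xs, ys, rounds=50, lr=0.5):
    """Additive ensemble: each new stump fits the residuals of the current model."""
    stumps = []
    residuals = list(ys)
    for _ in range(rounds):
        s = fit_stump(xs, residuals)
        stumps.append(s)
        residuals = [r - lr * s(x) for x, r in zip(xs, residuals)]
    return lambda x: sum(lr * s(x) for s in stumps)

# Step-function target; the ensemble converges to it geometrically.
model = boost([0, 1, 2, 3], [0.0, 0.0, 1.0, 1.0])
```

The "instability" the abstract studies shows up here as sensitivity of each fitted stump to the training sample: small perturbations of the data can move the chosen threshold, which is precisely what makes trees effective base learners for boosting.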
Adapting Peepholing to Regression Trees
"... This paper presents an adaptation of the peepholing method to regression trees. Peepholing was described by Catlett [3] as a means to overcome the major computational bottleneck of growing classification trees. This method involves two major steps: shortlisting and blinkering. The former h ..."
Visualisation of Regression Trees
2007
"... The regression tree [1] has been used as a tool for exploring multivariate data sets for some time. As in multiple linear regression, the technique is applied to a data set consisting of a continuous response variable y and a set of predictor variables {x1, x2,..., xk} which may be continuous or cat ..."
Using regression trees to . . .
2002
"... Software faults are defects in software modules that might cause failures. Software developers tend to focus on faults, because they are closely related to the amount of rework necessary to prevent future operational software failures. The goal of this paper is to predict which modules are fault-prone and to do it early enough in the life cycle to be useful to developers. A regression tree is an algorithm represented by an abstract tree, where the response variable is a real quantity. Software modules are classified as fault-prone or not, by comparing the predicted value to a threshold. A classification ..."
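The final classification step described above, comparing a regression tree's numeric prediction against a threshold, amounts to one comparison per module. The module names, predicted values, and cutoff below are invented for illustration:

```python
# Hypothetical per-module fault predictions from a regression tree,
# turned into fault-prone / not-fault-prone labels via a threshold.
predicted_faults = {"mod_a": 0.2, "mod_b": 3.7, "mod_c": 1.4}
THRESHOLD = 1.0  # illustrative cutoff, e.g. chosen from project history

labels = {m: ("fault-prone" if v > THRESHOLD else "not fault-prone")
          for m, v in predicted_faults.items()}
```

Raising or lowering the threshold trades missed fault-prone modules against wasted inspection effort, which is why the choice would be driven by the project's cost of rework.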