## k. Results indicate that this procedure is very effective in estimating good feature weights (Table 4.8). Particularly the results obtained in the (1994)

### BibTeX

@MISC{Tasks94k.results,
  author = {Banded Sinusoidal Tasks},
  title  = {k. Results indicate that this procedure is very effective in estimating good feature weights (Table 4.8). Particularly the results obtained in the},
  year   = {1994}
}

### Abstract

Distance-based algorithms may compute distances that do not reflect the optimal distance between two data points. For example, two input features may be identical; the effect of these two identical features is equivalent to a single feature with twice the weight during distance calculations. The larger weight is justified only if the feature contains more information with respect to the desired outputs than the other features. Otherwise, the larger weight will degrade classification accuracy. De-correlation of input features may therefore improve the classification accuracy of distance-based algorithms.

Table 4.8. The performance of the weighted-vote kNN algorithm without feature weights (kNN_wv), with computed feature weights (kNN_wv FW_MI), or learned feature weights (kNN_wv FW_VSM).
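The equivalence described above — a duplicated feature acting like a single feature with twice the weight — can be checked directly with a weighted Euclidean distance. This is a minimal illustrative sketch, not the implementation evaluated in the cited work; the function name and example vectors are hypothetical.

```python
import math

def weighted_euclidean(x, y, w):
    """Euclidean distance with a per-feature weight applied to each squared difference."""
    return math.sqrt(sum(wi * (xi - yi) ** 2 for xi, yi, wi in zip(x, y, w)))

# Two points whose third feature duplicates the first:
a = [1.0, 2.0, 1.0]
b = [4.0, 6.0, 4.0]

# Distance including the duplicate, with all weights equal to 1 ...
d_dup = weighted_euclidean(a, b, [1.0, 1.0, 1.0])

# ... equals the distance on the de-duplicated points with weight 2 on the first feature.
d_w = weighted_euclidean([1.0, 2.0], [4.0, 6.0], [2.0, 1.0])

assert math.isclose(d_dup, d_w)
```

Because the duplicated feature silently doubles its own influence, its weight is only warranted when the feature is genuinely more informative — which is why the abstract argues that de-correlating inputs can help distance-based classifiers.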