M-Estimator and D-Optimality Model Construction Using Orthogonal Forward Regression
"... Abstract—This correspondence introduces a new orthogonal forward regression (OFR) model identification algorithm using D-optimality for model structure selection and is based on an M-estimators of parameter estimates. M-estimator is a classical robust parameter estimation technique to tackle bad dat ..."
Cited by 4 (1 self)
Abstract—This correspondence introduces a new orthogonal forward regression (OFR) model identification algorithm that uses D-optimality for model structure selection and is based on M-estimators of the parameters. The M-estimator is a classical robust parameter-estimation technique for tackling bad data conditions such as outliers; computationally, it can be derived using an iteratively reweighted least squares (IRLS) algorithm. D-optimality is a model-structure robustness criterion from experimental design that tackles ill-conditioning in the model structure. OFR, often based on the modified Gram–Schmidt procedure, is an efficient method that carries out structure selection and parameter estimation simultaneously. The basic idea of the proposed approach is to incorporate an IRLS inner loop into the modified Gram–Schmidt procedure. In this manner, the OFR algorithm for parsimonious model structure determination is extended to bad data conditions with improved performance, since the derived parameter M-estimators are inherently robust to outliers. Numerical examples are included to demonstrate the effectiveness of the proposed algorithm. Index Terms—Forward regression, Gram–Schmidt, identification, M-estimator, model structure selection.
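The abstract's central mechanism, an IRLS inner loop that downweights outlying observations, can be illustrated in isolation. The following is a minimal sketch of M-estimation via IRLS for a linear-in-the-parameters model; the Huber weight function, the tuning constant c = 1.345 and the MAD-based scale estimate are common textbook choices rather than details taken from the paper, and the Gram–Schmidt structure-selection outer loop is omitted.

import numpy as np

def huber_weights(r, c=1.345):
    # Huber weights: 1 for small residuals, c/|r| beyond the threshold c.
    a = np.abs(r)
    w = np.ones_like(a)
    w[a > c] = c / a[a > c]
    return w

def irls_m_estimate(X, y, n_iter=20, c=1.345):
    # Robust parameter estimates for y = X @ theta + noise.
    theta = np.linalg.lstsq(X, y, rcond=None)[0]    # ordinary LS start
    for _ in range(n_iter):
        r = y - X @ theta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale (MAD)
        w = huber_weights(r / s, c)
        Xw = X * w[:, None]                         # solve X'WX theta = X'Wy
        theta = np.linalg.solve(Xw.T @ X, Xw.T @ y)
    return theta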
Orthogonal-Least-Squares Forward Selection for Parsimonious Modelling from Data, 2009
"... The objective of modelling from data is not that the model simply fits the training data well. Rather, the goodness of a model is characterized by its generalization capability, interpretability and ease for knowledge extraction. All these desired properties depend crucially on the ability to constr ..."
Abstract
The objective of modelling from data is not that the model simply fits the training data well. Rather, the goodness of a model is characterized by its generalization capability, interpretability and ease of knowledge extraction. All these desired properties depend crucially on the ability of the modelling process to construct appropriately parsimonious models, and a basic principle in practical nonlinear data modelling is the parsimonious principle of seeking the smallest possible model that explains the training data. There exists a vast body of work in the area of sparse modelling, and a widely adopted approach is based on linear-in-the-parameters data modelling, which includes the radial basis function network, the neurofuzzy network and all the sparse kernel modelling techniques. A well-tested strategy for parsimonious modelling from data is the orthogonal least squares (OLS) algorithm for forward selection modelling, which is capable of constructing sparse models that generalise well. This contribution continues this theme and provides a unified framework for sparse modelling from data that covers regression and classification, which belong to supervised learning, and probability density function estimation, which is an unsupervised learning problem. The OLS forward selection method based on leave-one-out test criteria is presented within this unified data-modelling framework. Examples from regression, classification and density estimation applications are used to illustrate the effectiveness of this generic approach to parsimonious modelling from data.
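To make the OLS forward-selection strategy concrete, here is a hedged sketch of the greedy step it is built on: at each iteration, the candidate regressor that yields the largest error reduction after Gram–Schmidt orthogonalisation against the terms already selected is added to the model. Stopping at a fixed term count n_terms is a simplification for illustration; the contribution itself uses leave-one-out test criteria to decide when to stop.

import numpy as np

def ols_forward_select(P, y, n_terms):
    # P: (N, M) matrix of candidate regressors, y: (N,) target vector.
    N, M = P.shape
    selected, basis = [], []              # chosen indices, orthogonal basis
    for _ in range(n_terms):
        best = (-np.inf, None, None)
        for j in range(M):
            if j in selected:
                continue
            q = P[:, j].astype(float).copy()
            for qk in basis:              # orthogonalise against chosen terms
                q -= (qk @ q) / (qk @ qk) * qk
            if q @ q < 1e-12:             # numerically dependent, skip
                continue
            err = (q @ y) ** 2 / (q @ q)  # error reduction (up to 1 / y'y)
            if err > best[0]:
                best = (err, j, q)
        selected.append(best[1])
        basis.append(best[2])
    return selected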
A Novel Continuous Forward Algorithm for RBF Neural Modelling
"... Abstract—A continuous forward algorithm (CFA) is proposed for nonlinear modelling and identification using radial basis function (RBF) neural networks. The problem considered here is simultaneous network construction and parameter optimization, well-known to be a mixed integer hard one. The proposed ..."
Abstract—A continuous forward algorithm (CFA) is proposed for nonlinear modelling and identification using radial basis function (RBF) neural networks. The problem considered here is simultaneous network construction and parameter optimization, well known to be a hard mixed-integer problem. The proposed algorithm performs these two tasks within an integrated analytic framework, and offers two important advantages. First, the model performance can be significantly improved through continuous parameter optimization. Second, the neural representation can be built without generating and storing all candidate regressors, leading to significantly reduced memory usage and computational complexity. Computational complexity analysis and simulation results confirm the effectiveness of the approach. Index Terms—Radial basis function (RBF) neural network, parameter optimization, network construction, nonlinear systems, modelling and identification.
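As an illustration of the model class the CFA constructs (not of the algorithm itself, whose incremental construction and continuous centre/width optimization do not fit in a few lines), the sketch below fits a Gaussian RBF network with fixed centres by linear least squares; the Gaussian basis and the single shared width are illustrative assumptions, not choices taken from the paper.

import numpy as np

def rbf_design(X, centres, width):
    # Phi[i, j] = exp(-||x_i - c_j||^2 / width^2), Gaussian basis functions.
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / width ** 2)

def fit_rbf_weights(X, y, centres, width):
    # Output-layer weights by linear least squares; prediction is Phi @ w.
    Phi = rbf_design(X, centres, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w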