Results 1 – 6 of 6
Predicting Time Series with Support Vector Machines, 1997
Abstract

Cited by 161 (14 self)
Support Vector Machines are used for time series prediction and compared to radial basis function networks. We make use of two different cost functions for Support Vectors: training with (i) an ε-insensitive loss and (ii) Huber's robust loss function, and discuss how to choose the regularization parameters in these models. Two applications are considered: data from (a) a noisy (normal and uniform noise) Mackey-Glass equation and (b) the Santa Fe competition (set D). In both cases Support Vector Machines show excellent performance. In case (b) the Support Vector approach improves the best known result on the benchmark by 29%.

1 Introduction. Support Vector Machines have become a subject of intensive study (see e.g. [3, 14]). They have been applied successfully to classification tasks such as OCR [14, 11] and more recently also to regression [5, 15]. In this contribution we use Support Vector Machines in the field of time series prediction and we find that they show an excel...
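The two cost functions contrasted in this abstract are easy to state directly. A minimal NumPy sketch follows, with illustrative `eps` and `delta` values that are assumptions for the example, not the paper's settings:

```python
import numpy as np

def eps_insensitive(residual, eps=0.1):
    # Zero inside the eps-tube around the target, linear outside it.
    return np.maximum(np.abs(residual) - eps, 0.0)

def huber(residual, delta=1.0):
    # Quadratic for small residuals, linear for large ones (robust to outliers).
    a = np.abs(residual)
    return np.where(a <= delta, 0.5 * a ** 2, delta * (a - 0.5 * delta))

r = np.array([-2.0, -0.05, 0.0, 0.05, 2.0])
print(eps_insensitive(r))  # residuals inside the tube cost nothing
print(huber(r))            # large residuals are penalized only linearly
```

The ε-insensitive loss ignores errors smaller than ε, which is what makes the resulting Support Vector expansion sparse; Huber's loss instead blends a quadratic core with linear tails for robustness to outliers.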
Sparse Regression Ensembles in Infinite and Finite Hypothesis Spaces, 2000
Abstract

Cited by 21 (8 self)
We examine methods for constructing regression ensembles based on a linear program (LP). The ensemble regression function consists of linear combinations of base hypotheses generated by some boosting-type base learning algorithm. Unlike the classification case, for regression the set of possible hypotheses producible by the base learning algorithm may be infinite. We explicitly tackle the issue of how to define and solve ensemble regression when the hypothesis space is infinite. Our approach is based on a semi-infinite linear program that has an infinite number of constraints and a finite number of variables. We show that the regression problem is well posed for infinite hypothesis spaces in both the primal and dual spaces. Most importantly, we prove that there exists an optimal solution to the infinite hypothesis-space problem consisting of a finite number of hypotheses. We propose two algorithms for solving the infinite and finite hypothesis problems. One uses a column generation simplex-type algorithm and the other adopts an exponential barrier approach. Furthermore, we give sufficient conditions for the base learning algorithm and the hypothesis set to be used for infinite regression ensembles. Computational results show that these methods are extremely promising.
"Sparse Regression Ensembles in Infinite and Finite Hypothesis Spaces," Machine Learning, 2002
Abstract

Cited by 20 (8 self)
We examine methods for constructing regression ensembles based on a linear program (LP). The ensemble regression function consists of linear combinations of base hypotheses generated by some boosting-type base learning algorithm. Unlike the classification case, for regression the set of possible hypotheses producible by the base learning algorithm may be infinite. We explicitly tackle the issue of how to define and solve ensemble regression when the hypothesis space is infinite. Our approach is based on a semi-infinite linear program that has an infinite number of constraints and a finite number of variables. We show that the regression problem is well posed for infinite hypothesis spaces in both the primal and dual spaces. Most importantly, we prove that there exists an optimal solution to the infinite hypothesis-space problem consisting of a finite number of hypotheses. We propose two algorithms for solving the infinite and finite hypothesis problems. One uses a column generation simplex-type algorithm and the other adopts an exponential barrier approach. Furthermore, we give sufficient conditions for the base learning algorithm and the hypothesis set to be used for infinite regression ensembles. Computational results show that these methods are extremely promising.
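The finite-hypothesis LP that this abstract builds on can be illustrated concretely. The sketch below (not the paper's algorithm — the semi-infinite case adds column generation on top of a problem like this) fits nonnegative ensemble weights over a fixed pool of base hypotheses by minimizing the L1 training error plus a regularizer on the weights; the RBF pool, `nu`, and the synthetic data are all assumptions for illustration:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)

# Finite pool of base hypotheses: RBF bumps at fixed centers.
centers = np.linspace(0.0, 1.0, 10)
H = np.exp(-((x[:, None] - centers[None, :]) ** 2) / 0.02)  # shape (50, 10)

# Target: a sparse nonnegative combination of two bumps, plus noise.
y = 1.5 * H[:, 3] + 0.8 * H[:, 7] + 0.02 * rng.standard_normal(x.size)

n, m = H.shape
nu = 0.01  # regularization weight on the ensemble coefficients
# Variables [a (m), u (n), v (n)], all >= 0, with H a - y = u - v,
# so u + v equals the absolute residual at the optimum.
c = np.concatenate([np.full(m, nu), np.ones(2 * n)])
A_eq = np.hstack([H, -np.eye(n), np.eye(n)])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))

a = res.x[:m]
mae = np.mean(np.abs(H @ a - y))
print(f"active hypotheses: {(a > 1e-6).sum()}, train MAE: {mae:.4f}")
```

Because the objective puts an L1-style price `nu` on each coefficient, the solver tends to leave most weights at zero, which is the sparsity the title refers to.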
Methods and techniques of complex systems science: An overview, 2003
Abstract

Cited by 17 (0 self)
In this chapter, I review the main methods and techniques of complex systems science. As a ...
International Standard Book Number 0-309-06351-5
Abstract
... Academy of Sciences and its affiliated institutions with the goal of making books on science, technology, and health more widely available to professionals and the public. Joseph Henry was one of the founders of the National Academy of Sciences and a leader of early American science. Any opinions, findings, conclusions, or recommendations expressed in this volume are those of the author and do not necessarily reflect the views of the National Academy of Sciences or its affiliated institutions.
Nonlinear Analysis on Interspike Interval Time Series from Foreign Exchange Rates
Abstract
Abstract — In conventional chaotic time series analysis, a single observed time series is used to build nonlinear models. However, using multivariable information can be expected to improve prediction accuracy, so it is important to find interacting variables and to extend the model with them. In this paper, we apply this concept to analyzing the time series of exchange rates in the interbank markets. We confirm that the movement of prices interacts with the interval time between deals and the spread between offer and bid prices. We also confirm that adopting multiple variables leads to improved prediction accuracy.