Results 1–10 of 11
Predicting Time Series with Support Vector Machines
, 1997
"... . Support Vector Machines are used for time series prediction and compared to radial basis function networks. We make use of two different cost functions for Support Vectors: training with (i) an ffl insensitive loss and (ii) Huber's robust loss function and discuss how to choose the regulariza ..."
Abstract

Cited by 185 (14 self)
Support Vector Machines are used for time series prediction and compared to radial basis function networks. We make use of two different cost functions for Support Vectors: training with (i) an ε-insensitive loss and (ii) Huber's robust loss function, and discuss how to choose the regularization parameters in these models. Two applications are considered: data from (a) a noisy (normal and uniform noise) Mackey-Glass equation and (b) the Santa Fe competition (set D). In both cases Support Vector Machines show excellent performance. In case (b) the Support Vector approach improves the best known result on the benchmark by 29%. 1 Introduction Support Vector Machines have become a subject of intensive study (see e.g. [3, 14]). They have been applied successfully to classification tasks such as OCR [14, 11] and more recently also to regression [5, 15]. In this contribution we use Support Vector Machines in the field of time series prediction and find that they show an excel...
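The two cost functions named in this abstract can be sketched in a few lines of Python. The function and parameter names (`eps`, `delta`) are illustrative defaults, not values from the paper:

```python
# Sketch of the two Support Vector cost functions compared in the abstract:
# the epsilon-insensitive loss ignores residuals inside an eps-wide tube,
# while Huber's loss is quadratic near zero and linear in the tails.

def eps_insensitive(residual: float, eps: float = 0.1) -> float:
    """L_eps(r) = max(0, |r| - eps): errors inside the eps-tube cost nothing."""
    return max(0.0, abs(residual) - eps)

def huber(residual: float, delta: float = 1.0) -> float:
    """Quadratic for |r| <= delta, linear beyond, so outliers are down-weighted."""
    r = abs(residual)
    if r <= delta:
        return 0.5 * r * r
    return delta * (r - 0.5 * delta)
```

Both losses grow only linearly for large residuals, which is what gives Support Vector regression its robustness to outliers compared with a plain squared-error fit.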
A Comparison Between Neural-Network Forecasting Techniques - Case Study: River Flow Forecasting
 IEEE Transactions on Neural Networks
, 1999
"... Estimating the flows of rivers can have significant economic impact, as this can help in agricultural water management and in protection from water shortages and possible flood damage. The first goal of this paper is to apply neural networks to the problem of forecasting the flow of the River Nile i ..."
Abstract

Cited by 41 (1 self)
Estimating the flows of rivers can have significant economic impact, as this can help in agricultural water management and in protection from water shortages and possible flood damage. The first goal of this paper is to apply neural networks to the problem of forecasting the flow of the River Nile in Egypt. The second goal of the paper is to utilize the time series as a benchmark to compare several neural-network forecasting methods. We compare four different methods to preprocess the inputs and outputs, including a novel method proposed here based on the discrete Fourier series. We also compare three different methods for the multi-step-ahead forecast problem: the direct method, the recursive method, and the recursive method trained using a backpropagation-through-time scheme. We also include a theoretical comparison between these three methods. The final comparison is between different methods to perform longer-horizon forecasts, including ways to partition the problem into several subproblems of forecasting K steps ahead. Index Terms: Backpropagation, Fourier series, multi-step-ahead prediction, neural networks, Nile River, river flow forecasting, seasonal time series, time series prediction.
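The direct-versus-recursive distinction drawn in this abstract can be illustrated with a deliberately tiny no-intercept least-squares predictor. All names below are illustrative, and the AR(1)-style model is a stand-in for the paper's neural networks:

```python
# Contrast between the recursive and direct strategies for an h-step-ahead
# forecast. Recursive: fit one one-step model and feed its own predictions
# back h times. Direct: fit a separate model per target horizon h.

def fit_slope(xs, ys):
    """Least-squares slope a for y ~ a*x (no intercept)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def recursive_forecast(series, h):
    """One model y[t+1] ~ a*y[t], iterated h times on its own output."""
    a = fit_slope(series[:-1], series[1:])
    y = series[-1]
    for _ in range(h):
        y = a * y
    return y

def direct_forecast(series, h):
    """A horizon-specific model y[t+h] ~ a_h*y[t], applied once."""
    a_h = fit_slope(series[:-h], series[h:])
    return a_h * series[-1]
```

For a series that exactly follows the model the two strategies coincide; they diverge once the one-step model is misspecified, which is the trade-off the paper analyses.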
Local Learning for Iterated Time Series Prediction
 In
, 1999
"... We introduce and discuss a local method to learn onestepahead predictors for iterated time series forecasting. For each single onestepahead prediction, our method selects among different alternatives a local model representation on the basis of a local crossvalidation procedure. In the literatur ..."
Abstract

Cited by 20 (7 self)
We introduce and discuss a local method to learn one-step-ahead predictors for iterated time series forecasting. For each single one-step-ahead prediction, our method selects among different alternatives a local model representation on the basis of a local cross-validation procedure. In the literature, local learning is generally used for function estimation tasks which do not take temporal behavior into account. Our technique extends this approach to the problem of long-horizon prediction by proposing a local model selection based on an iterated version of the PRESS leave-one-out statistic. To show the effectiveness of our method, we present results obtained on two time series from the Santa Fe competition and on a time series proposed in a recent international contest. 1 Introduction The use of local memory-based approximators for time series analysis has been the focus of numerous studies in the literature [5, 14]. Memory-based approaches do not estimate a global model...
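The idea of picking a local model by a leave-one-out (PRESS-style) criterion can be sketched for the simplest local model, a k-nearest-neighbour average, where the leave-one-out residual has a closed form. The helper names are illustrative, not the paper's:

```python
# Leave-one-out selection of a neighbourhood size k for a local-average
# predictor. For the k-neighbour mean, removing point i and re-predicting it
# gives the closed-form residual (y_i - mean_k) * k / (k - 1), so no refit
# is needed per left-out point.

def loo_error(neighbour_targets, k):
    """Mean squared leave-one-out residual of the k-neighbour average.

    `neighbour_targets` is assumed sorted by distance to the query point.
    """
    ys = neighbour_targets[:k]
    mean_k = sum(ys) / k
    residuals = [(y - mean_k) * k / (k - 1) for y in ys]
    return sum(r * r for r in residuals) / k

def select_k(neighbour_targets, candidates):
    """Pick the neighbourhood size with the smallest leave-one-out error."""
    return min(candidates, key=lambda k: loo_error(neighbour_targets, k))
```

The paper's method iterates such a criterion over the forecast horizon; the sketch above shows only the single-step selection step.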
Predicting Time Series with a Local Support Vector Regression Machine
 In ACAI 99
, 1999
"... Recently, a new Support Vector Regression (SVR) algorithm has been proposed, this approach, calledSV regression allows the SVR to adjust its accuracy parameter automatically. In this paper, we combineSV regression with a local approach in order to obtain accurate estimations of both the function a ..."
Abstract

Cited by 6 (0 self)
Recently, a new Support Vector Regression (SVR) algorithm, called ν-SV regression, has been proposed; it allows the SVR to adjust its accuracy parameter automatically. In this paper, we combine ν-SV regression with a local approach in order to obtain accurate estimations of both the function and the noise distribution. This approach seems to be extremely useful when the noise distribution depends on the input values. We illustrate the properties of the algorithm with a toy example and benchmark our approach on a 100,000-point time series (Santa Fe Competition data set D), obtaining state-of-the-art performance on this data set.
The Use of Domain Knowledge in Feature Construction for Financial Time Series Prediction
 PORTUGUESE CONFERENCE ON ARTIFICIAL INTELLIGENCE (EPIA 2001)
, 2001
"... Most of the existing data mining approaches to time series prediction use data preparation techniques involving an embed of the most recent values of the time series, following the traditional linear autoregressive methodologies. However, in many time series prediction tasks the alternative appro ..."
Abstract

Cited by 2 (0 self)
Most of the existing data mining approaches to time series prediction use data preparation techniques involving an embedding of the most recent values of the time series, following the traditional linear autoregressive methodologies. However, in many time series prediction tasks the alternative approach, which uses derived features constructed from the raw data with the help of domain theories, can produce significant prediction improvements. This is particularly noticeable when the available data includes multivariate information but the aim is still the prediction of one particular time series, a situation that occurs frequently in financial time series prediction. This paper presents a method of feature construction based on domain knowledge that uses multivariate time series information and improves the accuracy of next-day stock quote prediction, compared with the traditional embedding of historical values extracted from the original data.
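The contrast between a raw lag embedding and domain-derived features can be illustrated with two tiny feature builders. The moving average and momentum used here are generic examples of domain-inspired indicators, not the paper's actual constructions:

```python
# Two ways to turn a price history into a feature vector for next-day
# prediction: a plain lag embedding of recent values versus features derived
# with (financial) domain knowledge.

def lag_embedding(series, order):
    """The last `order` raw values, most recent first."""
    return list(reversed(series[-order:]))

def domain_features(series, window=3):
    """A moving average over `window` days and a momentum term
    (today's value minus the value `window` days ago)."""
    avg = sum(series[-window:]) / window
    momentum = series[-1] - series[-1 - window]
    return [avg, momentum]
```

The point of the paper is that a learner fed the second representation can outperform one fed the first, even though the derived features contain strictly less raw information.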
A Multi-Step-Ahead Prediction Method Based on Local Dynamic Properties
 M. VERLEYSEN ED., ESANN’2000
, 2000
"... ..."
"... the fact that a collection of chapters can never be as homogeneous as a book conceived by a single person. We have tried to compensate for this by the selection and refereeing process of the submissions. In addition, we have written an introductory chapter describing the SV algorithm in some detail ..."
Abstract
the fact that a collection of chapters can never be as homogeneous as a book conceived by a single person. We have tried to compensate for this through the selection and refereeing process of the submissions. In addition, we have written an introductory chapter describing the SV algorithm in some detail (chapter 1), and added a roadmap (chapter 2) which describes the actual contributions that follow in chapters 3 through 20. Bernhard Schölkopf, Christopher J.C. Burges, Alexander J. Smola. Berlin, Holmdel, July 1998. 1 Introduction to Support Vector Learning The goal of this chapter, which describes the central ideas of SV learning, is twofold. First, we want to provide an introduction for readers unfamiliar with this field. Second, this introduction serves as a source of the basic equations for the chapters of this book. For more exhaustive treatments, we refer the interested reader to Vapnik (1995); Schölkopf (1997); Burges (1998). 1.1
Time Series Prediction
"... Support Vector Machines are used for time series prediction and compared to radial basis function networks. We make use of two dierent cost functions for Support Vectors: training with (i) an " insensitive loss and (ii) Huber's robust loss function and discuss how to choose the regularizat ..."
Abstract
Support Vector Machines are used for time series prediction and compared to radial basis function networks. We make use of two different cost functions for Support Vectors: training with (i) an ε-insensitive loss and (ii) Huber's robust loss function, and discuss how to choose the regularization parameters in these models. Two applications are considered: data from (a) a noisy Mackey-Glass system (normal and uniform noise) and (b) the Santa Fe Time Series Competition (set D). In both cases, Support Vector Machines show excellent performance. In case (b), the Support Vector approach improves the best known result on the benchmark by 29%. 1.1
Combining Support Vector and Mathematical . . .
 ADVANCES IN KERNEL METHODS  SUPPORT VECTOR LEARNING
, 1998
"... ..."