Financial time series forecasting using support vector machines
Author: Kyoung-jae Kim, 2003 Elsevier B.V.
Posted: 19-Dec-2015
Financial time series forecasting using support vector machines
Author: Kyoung-jae Kim
2003 Elsevier B.V.
Outline
• Introduction to SVM
• Introduction to datasets
• Experimental settings
• Analysis of experimental results
Linear separability
• In general, two groups are linearly separable in n-dimensional space if they can be separated by an (n − 1)-dimensional hyperplane.
Support Vector Machines
• Maximum-margin hyperplane
Formalization
• Training data: D = {(x_i, c_i)}, with class labels c_i ∈ {−1, 1}
• Hyperplane: w·x − b = 0
• Parallel bounding hyperplanes: w·x − b = 1 and w·x − b = −1
Objective
• Minimize ||w|| (in w, b)
• subject to c_i(w·x_i − b) ≥ 1 (for any i = 1, …, n)
A 2-D case
• In 2-D:
– Training data:
  x_i      c_i
  <1, 1>    1
  <2, 2>    1
  <2, 1>   -1
  <3, 2>   -1
– Bounding hyperplanes: -2x+2y+1=-1 and -2x+2y+1=1
– Separating hyperplane: -2x+2y+1=0
– w = <-2, 2>, b = -1, margin = sqrt(2)/2
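The numbers on this slide can be checked directly; a minimal sketch in NumPy (not part of the original slides):

```python
import numpy as np

# Training points and labels from the slide; all four lie on the
# bounding hyperplanes, so all four are support vectors.
X = np.array([[1, 1], [2, 2], [2, 1], [3, 2]], dtype=float)
c = np.array([1, 1, -1, -1])

# Separating hyperplane w.x - b = 0 with w = <-2, 2>, b = -1.
w = np.array([-2.0, 2.0])
b = -1.0

# Each point satisfies c_i * (w.x_i - b) >= 1, with equality for support vectors.
margins = c * (X @ w - b)
print(margins)                 # [1. 1. 1. 1.]

# Distance between the two bounding hyperplanes: 2 / ||w|| = sqrt(2)/2.
print(2 / np.linalg.norm(w))   # 0.7071...
```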
Not linearly separable
• No hyperplane can separate the two groups
Soft Margin
• Choose a hyperplane that splits the examples as cleanly as possible
• Still maximize the distance to the nearest cleanly split examples
• Introduce an error cost C: a margin violation of degree d is penalized by d·C
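The soft-margin objective can be written out explicitly; a sketch in NumPy, reusing the 2-D example plus one hypothetical overlapping point (the function name and the extra point are mine, not from the slides):

```python
import numpy as np

def soft_margin_cost(w, b, X, c, C):
    """Soft-margin SVM objective: 0.5*||w||^2 + C * sum(d_i), where the slack
    d_i = max(0, 1 - c_i*(w.x_i - b)) measures how badly point i misses its margin."""
    d = np.maximum(0.0, 1.0 - c * (X @ w - b))
    return 0.5 * (w @ w) + C * d.sum(), d

# The four 2-D example points, plus one hypothetical point on the wrong side.
X = np.array([[1, 1], [2, 2], [2, 1], [3, 2], [2.5, 1.5]], dtype=float)
c = np.array([1, 1, -1, -1, 1])
w, b = np.array([-2.0, 2.0]), -1.0

cost, d = soft_margin_cost(w, b, X, c, C=1.0)
print(d)      # only the added point has nonzero slack: [0. 0. 0. 0. 2.]
print(cost)   # 0.5*||w||^2 + 1.0*2 = 6.0
```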
Higher dimensions
• Separation might be easier
Kernel Trick
• Building maximal-margin hyperplanes in high-dimensional feature space depends only on inner products, which are expensive to compute there
• Use a kernel function that is evaluated in low dimensions but behaves like an inner product in high dimensions
Kernels
• Polynomial: K(p, q) = (p·q + c)^d
• Radial basis function: K(p, q) = exp(−γ||p − q||²)
• Gaussian radial basis: K(p, q) = exp(−||p − q||² / 2δ²)
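The kernel trick can be seen concretely for the polynomial kernel with c = 0, d = 2: it equals an ordinary inner product after an explicit degree-2 feature map. A sketch in NumPy (function names are mine):

```python
import numpy as np

def poly_kernel(p, q, c=0.0, d=2):
    # Polynomial kernel K(p, q) = (p.q + c)^d
    return (p @ q + c) ** d

def gaussian_rbf(p, q, delta2=25.0):
    # Gaussian radial basis K(p, q) = exp(-||p - q||^2 / (2 * delta^2))
    return np.exp(-np.sum((p - q) ** 2) / (2 * delta2))

def phi(p):
    # Explicit degree-2 feature map for 2-D input: poly_kernel with c=0, d=2
    # equals the inner product in this 3-D feature space.
    return np.array([p[0] ** 2, np.sqrt(2) * p[0] * p[1], p[1] ** 2])

p, q = np.array([1.0, 2.0]), np.array([3.0, 0.5])
print(poly_kernel(p, q))   # 16.0 -- computed in the original 2-D space
print(phi(p) @ phi(q))     # 16.0 -- same value via the explicit 3-D map
```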
Tuning parameters
• Error weight
– C
• Kernel parameters
– δ²
– d
– c0
Underfitting & Overfitting
• Underfitting
• Overfitting
• Goal: high generalization ability
Datasets
• Input variables: 12 technical indicators
• Target attribute: Korea composite stock price index (KOSPI)
• 2928 trading days: 80% for training, 20% for holdout
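For time-series data, the 80/20 split should be chronological rather than shuffled; a sketch with a random stand-in for the features (the actual KOSPI data is not reproduced here):

```python
import numpy as np

# Hypothetical placeholder: 2928 trading days x 12 technical indicators,
# with a +1/-1 direction label per day.
rng = np.random.default_rng(0)
X = rng.normal(size=(2928, 12))
y = np.where(rng.normal(size=2928) > 0, 1, -1)

# Chronological split: the first 80% of days train, the last 20% hold out.
split = int(0.8 * len(X))
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]
print(len(X_train), len(X_test))   # 2342 586
```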
Settings (1/3)
• SVM
– kernels
• polynomial kernel
• Gaussian radial basis function (parameter δ²)
– error cost C
Settings (2/3)
• BP-Network
– layers: 3
– number of hidden nodes: 6, 12, 24
– learning epochs per training example: 50, 100, 200
– learning rate: 0.1
– momentum: 0.1
– input nodes: 12
Settings (3/3)
• Case-Based Reasoning
– k-NN: k = 1, 2, 3, 4, 5
– distance evaluation: Euclidean distance
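The CBR component amounts to k-nearest-neighbour retrieval under Euclidean distance; a minimal sketch in NumPy (function name and toy cases are mine):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=1):
    """Case-based reasoning via k-NN: retrieve the k stored cases nearest
    to x by Euclidean distance and take a majority vote of their labels."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    values, counts = np.unique(y_train[nearest], return_counts=True)
    return values[np.argmax(counts)]

# Toy case base: two cases of class -1, one of class +1.
X_train = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
y_train = np.array([-1, -1, 1])
print(knn_predict(X_train, y_train, np.array([4.0, 4.5]), k=1))  # nearest case is [5, 5] -> 1
```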
Experimental results
• The results of SVMs with various C, where δ² is fixed at 25
– too small C: underfitting*
– too large C: overfitting*
* F.E.H. Tay, L. Cao, Application of support vector machines in financial time series forecasting, Omega 29 (2001) 309–317
Experimental results
• The results of SVMs with various δ², where C is fixed at 78
– small δ²: overfitting*
– large δ²: underfitting*
* F.E.H. Tay, L. Cao, Application of support vector machines in financial time series forecasting, Omega 29 (2001) 309–317
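A parameter sweep of this shape can be sketched with scikit-learn's `SVC` on synthetic stand-in data (the real KOSPI features are not reproduced, and mapping the slide's δ² onto scikit-learn's `gamma` via gamma = 1/(2δ²) is my reading of the Gaussian RBF form):

```python
import numpy as np
from sklearn.svm import SVC

# Tiny synthetic stand-in: 200 days x 12 indicators, label driven by feature 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))
y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(int)
X_tr, X_te, y_tr, y_te = X[:160], X[160:], y[:160], y[160:]

# Sweep C with delta^2 fixed at 25, mirroring the experiment. scikit-learn's
# RBF is exp(-gamma*||p-q||^2), so the slide's exp(-||p-q||^2 / 2*delta^2)
# corresponds to gamma = 1 / (2 * delta^2).
delta2 = 25.0
scores = {}
for C in [0.1, 1, 10, 78, 1000]:
    clf = SVC(kernel="rbf", gamma=1.0 / (2 * delta2), C=C).fit(X_tr, y_tr)
    scores[C] = clf.score(X_te, y_te)
print(scores)   # holdout accuracy per C
```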
Experimental results and conclusion
• SVM outperforms BPN and CBR
• SVM minimizes structural risk
• SVM provides a promising alternative for financial time-series forecasting
• Open issue: parameter tuning