Neural Networks
Neural Network Application For
Predicting Stock Index Volatility
Using High Frequency Data
Project No CFWin03

Presented by: Venkatesh Manian
Professor: Dr. Ruppa K. Tulasiram
cs74.757
1
May 30, 2003
Outline
• Introduction and Motivation
• Background
• Problem Statement
• Solution Strategy and Implementation
• Results
• Conclusion and Future Work
Introduction and Motivation
• An index is defined as "a statistical measure of the changes in a portfolio of stocks representing a portion of the overall market" [3].
• Hol and Koopman [2] calculate volatility using high-frequency intraday returns.
• The noise present in the daily squared return series decreases as the sampling frequency of the returns increases.
• Pierre et al. [7] report that price changes are correlated only over a short period of time, whereas "absolute value has the ability to show correlation on time up to many years".
• Neural networks have demonstrated predictive capability.
Background
• Schwert [9] points out the importance of intraday data on stock prices for keeping track of market trends.
  – Market decline on October 13, 1987.
• Refenes [8] describes the different problems and their solution strategies. He says that neural networks are used in cases where the behavior of the system cannot be predicted.
Problem Statement
• The goal of this project is to predict the volatility of a stock index using Radial Basis Function (RBF) neural networks.
• The project focuses on the following aspects:
  – Using high-frequency intraday returns so as to reduce the noise present in the input.
  – Using RBF networks that can determine at runtime the number of hidden nodes needed for predicting volatility, so as to reduce the problems involved in using too many or too few hidden nodes.
• Prediction of stock index volatility is also tested using a multilayer feedforward network. In this case the sigmoid function is used as the activation function.
Solution Strategy and Implementation
• Collection of the stock index value at five-minute intervals.
• Intraday returns are calculated by subtracting successive log prices.
• Overnight returns are calculated in the same way as the intraday returns, using the closing price of the index and the opening price of the index on the following day.
• The daily realized volatility is calculated as the cumulative sum of squared intraday returns.
• The realized volatility is used as the input of the neural network, and the future volatility of the stock index is predicted.
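The return and realized-volatility calculations above can be sketched as follows. This is a minimal illustration; the five-minute price values are hypothetical, not the project's actual data.

```python
import math

# Hypothetical five-minute index values for one trading day.
prices = [1050.0, 1051.2, 1049.8, 1052.5, 1053.1]

# Intraday returns: differences of successive log prices.
log_prices = [math.log(p) for p in prices]
intraday_returns = [log_prices[i + 1] - log_prices[i]
                    for i in range(len(log_prices) - 1)]

# Daily realized volatility: cumulative sum of squared intraday returns.
realized_volatility = sum(r * r for r in intraday_returns)

print(realized_volatility)
```

An overnight return would be computed the same way, from the previous close and the next day's opening price.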
Algorithm
– Radial Basis Function Networks
[Figure: RBF network architecture — input nodes I1…In, hidden nodes H1–H4, and a single output node o]
Cont..
• Calculated intraday values and their corresponding realized volatility.
• Normalized input values.
Cont..
• Normalization is done using the following equation:
  z = (x − mean) / standard deviation
• Input Data
[Figure: realized volatility plotted against day]
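The normalization above is a standard z-score transform. A minimal sketch, using hypothetical volatility values:

```python
import statistics

# Hypothetical realized-volatility series.
data = [0.012, 0.015, 0.011, 0.020, 0.017]

mean = statistics.mean(data)
std = statistics.stdev(data)

# z = (x - mean) / standard deviation
normalized = [(x - mean) / std for x in data]
print(normalized)
```

The transformed series has zero mean and unit standard deviation, which keeps the network inputs in a comparable range.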
Prediction using RBF network
• Configuration of the network:
  – Number of input nodes is ten.
  – Initially the number of hidden nodes is set to zero.
  – Number of output nodes is set to one.
• Due to the high computational complexity of the system, the size of the network has to be kept minimal.
  – The number of input nodes cannot be increased beyond 15, because for each hidden node added to the network, the number of parameters to be updated in each equation of the Extended Kalman Filter is
    k(nx + ny + 1) + ny,
    where k is the number of hidden nodes, nx is the number of inputs, and ny is the number of outputs (one in this case).
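The parameter-count formula above can be checked directly. The breakdown in the comment (centers, widths, and output weights per hidden node) is my interpretation, not stated on the slide:

```python
def ekf_param_count(k, nx, ny):
    """Parameters updated per Extended Kalman Filter equation:
    k hidden nodes, nx inputs, ny outputs (assumed breakdown: each
    hidden node carries nx center values, ny output weights and a
    width, plus ny output biases)."""
    return k * (nx + ny + 1) + ny

# This project's configuration: 10 inputs, 1 output, 4 hidden nodes.
print(ekf_param_count(4, 10, 1))  # -> 49
```

The count grows linearly in k, which is why the slide stresses keeping the network small.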
Cont..
Learning in the RBF network
• Learning in this network involves assigning centers coarsely at first and then fine-tuning them.
• Learning is driven by the difference between the expected value and the network output.
• A window size is set to check whether the normalized output value of each hidden node stays below a threshold value for a particular duration. If the normalized output value of a particular hidden node remains below the threshold value for a duration equal to the window size, that hidden node is pruned.
• The major problem due to the presence of noise in the input data is overfitting, which shows up as an increase in the number of hidden nodes as the number of patterns grows. The root mean square value of the output error is calculated to mitigate this overfitting problem.
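The pruning rule above can be sketched as follows. The function name, window size, and threshold are illustrative assumptions, not values from the project:

```python
def prune_hidden_nodes(history, window=20, threshold=0.01):
    """history: one list of normalized output values per hidden node.
    A node whose output stays below `threshold` for `window`
    consecutive patterns is pruned; returns indices of nodes kept."""
    keep = []
    for idx, outputs in enumerate(history):
        recent = outputs[-window:]
        # Prune only when a full window of consistently small outputs exists.
        if len(recent) == window and all(abs(o) < threshold for o in recent):
            continue
        keep.append(idx)
    return keep

# Node 0 is active; node 1 has been inactive for the whole window.
history = [[0.5] * 20, [0.001] * 20]
print(prune_hidden_nodes(history))  # -> [0]
```

Nodes with too little history are kept by default, so freshly added nodes are not pruned prematurely.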
Cont..
Problems encountered
• Initially I did not use normalized inputs; instead I reduced the scale by dividing each input by 1000. This experiment gave somewhat favorable results.
• The number of hidden nodes learned in this case was four, with 200 input patterns and 10 input nodes.
• Since normalization is the standard way to reduce the range of the input values, each input datum was then normalized with respect to the mean and standard deviation of the data.
• After normalizing, the network started to overfit the data. I tried updating the values of different parameters, but I was unable to control this problem.
• Hence I used a different network for prediction, with the sigmoid function as the activation function.
ANN using Sigmoid Function
Algorithm
• In this case all connections are associated with weights.
• The weighted sum is given to each node of the next layer, which applies the sigmoid function.
• The output of the output node is compared with the expected value, and the output error is calculated.
• This error value is propagated back into the network to adjust the weights.
[Figure: feedforward network — input nodes I1…In, hidden nodes H1–H4, and a single output node o]
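The forward pass and backpropagation steps above can be sketched for a single hidden layer. The layer sizes, learning rate, and training data below are illustrative assumptions, not the project's configuration:

```python
import math
import random

random.seed(0)
n_in, n_hidden = 3, 4
lr = 0.1

# Randomly initialized connection weights.
w_hidden = [[random.uniform(-1, 1) for _ in range(n_in)]
            for _ in range(n_hidden)]
w_out = [random.uniform(-1, 1) for _ in range(n_hidden)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(x, target):
    # Forward pass: each node applies the sigmoid to its weighted sum.
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_hidden]
    y = sigmoid(sum(w * hi for w, hi in zip(w_out, h)))

    # Output error, propagated back to adjust the weights.
    delta_out = (target - y) * y * (1 - y)
    for j in range(n_hidden):
        delta_h = delta_out * w_out[j] * h[j] * (1 - h[j])
        w_out[j] += lr * delta_out * h[j]
        for i in range(n_in):
            w_hidden[j][i] += lr * delta_h * x[i]
    return (target - y) ** 2

x, target = [0.2, -0.5, 0.9], 0.7
errors = [train_step(x, target) for _ in range(200)]
print(errors[0], errors[-1])  # squared error shrinks over training
```

Repeating the step drives the output toward the target, which is the behavior the error backpropagation rule is designed to produce.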
Results
• I trained the network so as to obtain a minimum error in the testing phase. MAPE (mean absolute percentage error) is used as the evaluation metric.
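MAPE is the mean of the absolute errors relative to the actual values, expressed as a percentage. A minimal sketch with hypothetical values:

```python
def mape(actual, predicted):
    """Mean absolute percentage error: mean of |a - p| / |a|, as a %."""
    return 100.0 * sum(abs(a - p) / abs(a)
                       for a, p in zip(actual, predicted)) / len(actual)

print(mape([0.02, 0.04], [0.01, 0.05]))  # -> 37.5
```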
Results
– Using test data
[Table: network output on test data]
• The table above gives the output of the network on the test data.
Conclusion and Future Work
• I used high-frequency intraday data to predict the value of volatility.
• The method used for prediction in this project is a neural network.
• Since I did not get favorable results in this case, I will seek help in solving the problem caused by overfitting of the data.
• I will also try to find a way to get better results using the ANN with the sigmoid activation function.
• I will also work on a better algorithm that can overcome the memory problems involved in using large amounts of data.
References
1. Andersen, T. and T. Bollerslev (1997). "Intraday periodicity and volatility persistence in financial markets". Journal of Empirical Finance 4, 115–158.
2. Eugene Hol and Siem Jan Koopman. "Stock Index Volatility Forecasting with High Frequency Data". No. 02-068/4 in Tinbergen Institute Discussion Papers, Tinbergen Institute.
3. Investopedia.com. http://www.investopedia.com/university/indexes/index1.asp
4. JingTao Yao and Chew Lim Tan. "Guidelines for Financial Forecasting with Neural Networks". In Proceedings of the International Conference on Neural Information Processing, Shanghai, China, pages 772–777, 2001.
5. Iebeling Kaastra and Milton S. Boyd. "Forecasting Futures Trading Volume Using Neural Networks". Journal of Futures Markets, 15(8):953–970, December 1995.
6. P. Sarachandran, N. Sundarajan and Lu Ying Wei. "Radial Basis Function Neural Networks with Sequential Learning". World Scientific Publishing Co. Pte. Ltd, March 1999.
7. Pierre Cizeau, Yanhui Liu, Martin Meyer, C.-K. Peng and H. Eugene Stanley. "Volatility distribution in the S&P500 stock index". arXiv:cond-mat/9708143, August 1997.
8. Apostolos-Paul Refenes. "Neural Networks in the Capital Markets". John Wiley and Sons, London, 1995.
9. G. William Schwert. "Stock Market Volatility". Financial Analysts Journal, pages 23–34, May–June 1990.
10. Yahoo Finance. http://finance.yahoo.com/
Thank You
Network Training
• Two types of network were considered in this project:
  – Radial Basis Function (RBF) network
  – Artificial Neural Network with sigmoid activation function