
Article Information

  • Title: Forecasting bank stock market prices with a Hybrid method: the case of Alpha Bank/Vertybiniu popieriu kainu prognozavimas Hibridiniu metodu: Alpha Bank pavyzdys.
  • Authors: Koutroumanidis, Theodoros; Ioannou, Konstantinos; Zafeiriou, Eleni
  • Journal: Journal of Business Economics and Management
  • Print ISSN: 1611-1699
  • Year: 2011
  • Issue: March
  • Language: English
  • Publisher: Vilnius Gediminas Technical University
  • Abstract: Forecasting in Finance and Economics has been a subject of extended study within the last decade (McAdam, McNellis 2005). In most cases the forecasts are based on time series modelling (Borovkova et al. 2003). Forecasting can also be based on the application of different methods, like bootstrapping (De Peretti 2003; Hatemi, Roca 2006) and Artificial Neural Networks (Kiani, Kastens 2008). The combined use of Bootstrap methods and Artificial Neural Networks (ANNs) has also been used in forecasting in the past (Focarelli 2005). In our study we present a different method, which relies on the application of ANNs for the estimation of the (1-α)100% confidence interval (C.I.) of the predicted values of the estimated time series. The estimation of this time series is based on the application of the Bootstrap method to the residuals.
  • Keywords: Artificial neural networks; Banking industry; Financial markets; Stock price forecasting; Time series analysis

Forecasting bank stock market prices with a Hybrid method: the case of Alpha Bank/Vertybiniu popieriu kainu prognozavimas Hibridiniu metodu: Alpha Bank pavyzdys.


Koutroumanidis, Theodoros; Ioannou, Konstantinos; Zafeiriou, Eleni


1. Introduction

Forecasting in Finance and Economics has been a subject of extended study within the last decade (McAdam, McNellis 2005). In most cases the forecasts are based on time series modelling (Borovkova et al. 2003). Forecasting can also be based on the application of different methods, like bootstrapping (De Peretti 2003; Hatemi, Roca 2006) and Artificial Neural Networks (Kiani, Kastens 2008). The combined use of Bootstrap methods and Artificial Neural Networks (ANNs) has also been used in forecasting in the past (Focarelli 2005). In our study we present a different method, which relies on the application of ANNs for the estimation of the (1-α)100% confidence interval (C.I.) of the predicted values of the estimated time series. The estimation of this time series is based on the application of the Bootstrap method to the residuals.

In detail, we apply a hybrid method that combines the application of ANNs with the Upper Confidence Limit (UCL) and Lower Confidence Limit (LCL) of the (1-α)100% C.I. obtained by applying the Bootstrap method to the residuals, aiming at the prediction of the confidence intervals.

The paper is organised as follows: Section 2 reviews the literature; Sections 3 and 4 outline artificial neural networks and the bootstrap method; Section 5 describes the hybrid methodology used to generate predictions; Sections 6 to 8 present the empirical results and evaluate forecasting accuracy; and Section 9 provides some concluding remarks.

2. Literature review

Claeskens and Keilgom (2003) construct bootstrap confidence bands for regression curves. Kolsrud (2007) proposes principles and methods for the construction of a time-simultaneous prediction band for a univariate time series. The methods are based entirely on a learning sample of time trajectories and make no parametric assumption about its distribution. The expected coverage probability of a band can be estimated with a bootstrap procedure. In Kim (2002), the construction of bootstrap prediction intervals is based on the percentile and percentile-t methods, employing the standard bootstrap as well as the bootstrap-after-bootstrap method.

The use of GARCH models in modelling the behaviour of stock prices has grown considerably in recent years (Teresiene 2009; Aktan et al. 2010). Tambakis and Van Royen (2002), on the other hand, used the bootstrap methodology to estimate the data's conditional predictability using GARCH models. This result is then compared to predictability under a random walk and a model using the prediction bias in uncovered interest parity (UIP). Mark (1995) suggested bootstrapping for testing the null hypothesis of no predictability. Based on the bootstrap tests, the author found strong evidence favouring the forecast accuracy of the monetary model relative to the random walk. Thombs and Schucany (1990) developed a method of calculating bootstrap conditional prediction intervals for autoregressive models.

McCullough (1994) applies the bootstrap method to estimating forecast intervals for an AR(p) model. Ankenbrand and Tomassini (1996) present an integrated approach for modelling the behaviour of financial markets with ANNs. Fernandez-Rodriguez et al. (2000) investigate the profitability of a simple technical trading rule based on ANNs. Ioannou et al. (2009) used ANNs to predict the future prices of fuelwood. It is obvious that within the last decade there has been an increasing interest in surveying the predictable components in stock prices (Fama 1991). Patterns in asset prices improved stock-market forecastability with different techniques (Fernandez-Rondriquez et al. 1997). One of the approaches that improved the ability to forecast security markets is ANNs (Van Eyden 1995; Gencay, Stengos 1998a). Brock et al. (1992) used bootstrap simulations of various null asset pricing models and found that simple technical trading rule profits cannot be explained by popular statistical models of stock index returns. Dogan (2007) proposed the bootstrapping method for confidence interval estimation and hypothesis testing in system dynamics models and provided an overview of the issues related to the proper application of bootstrapping in dynamic models.

Gencay and Stengos (1998b, 1999) confirm the predictive power of simple technical trading rules in forecasting current returns using feedforward network and NN regressions, while Gencay and Stengos (1997) and Gencay and Stengos (1998c) find evidence of nonlinear predictability in stock market returns by using the past buy and sell signals of moving average rules. Regarding foreign exchange markets, Le Baron (1992) and Le Baron (1998) use the bootstrap methodology to demonstrate the statistical significance of technical trading rules against several parametric null models of exchange rates. Furthermore, Le Baron (1999) and Sosvilla-Rivero et al. (1999) find that excess returns from extrapolative technical trading rules in foreign exchange markets are high during periods of central bank intervention. Gencay (1999), using feedforward network and NN regressions, finds statistically significant forecast improvements for current returns over the random walk model of foreign exchange returns.

Skabar and Cloete (2002) describe a methodology in which neural networks can be used indirectly, through a genetic-algorithm-based weight optimisation procedure, to determine buy and sell points for financial commodities traded on a stock exchange. A number of studies applied the simulation of trading agents based on ANNs (White 1988; Kimoto et al. 1990; Weigend and Gershenfeld 1994). The traditional approach to supervised neural network weight optimisation is the well-known back propagation algorithm (Rumelhart, McClelland 1986), while Beltratti and Terna (1996) suggest the use of genetic search for neural network weight optimisation in this field.

Ruiz and Pascual (2002) review the application of bootstrap procedures to inference and prediction in financial time series, noting that basic bootstrap methods are not always adequate in this context. Korajczyk (1985) presents one of the earliest applications of bootstrap methods to financial problems. Given that the basic bootstrap techniques were originally developed for independent observations, bootstrap inference does not have the desired properties when applied to raw returns; Bookstaber and McDonald (1987), Chatterjee and Pari (1990), Hsieh and Miller (1990) and Levich and Thomas (1993) present the problems of surveys where returns are bootstrapped directly. Maddala and Li (1996) pointed out the shortcomings in the application of bootstrap methods in finance. On the other hand, Thombs and Schucany (1990), as well as Kim (2002), argue that bootstrap-based methods can also be used to obtain prediction densities and intervals for future values of a given variable without making distributional assumptions about the innovations, while at the same time allowing the variability due to parameter estimation to be introduced into the estimated prediction densities. Mizuno et al. (1998) apply ANNs to the Tokyo stock exchange to predict buying and selling signals, with an overall prediction rate of 63%. Sexton et al. (1998) concluded that the use of momentum and starting learning at random points may solve the problems that can occur in training processes. Phua et al. (2000) applied neural networks with genetic algorithms to the stock exchange market of Singapore and predicted the market direction with an accuracy of 81%. In Turkey, ANNs are mostly used for predicting financial failures (Yildiz 2001); there is no empirical survey concerning the prediction of Turkish stock market values, with the exception of Egeli et al. (2003), who used artificial neural networks to forecast Istanbul Stock Exchange (ISE) market index values.

What must also be mentioned is the greater predictive performance of ANNs compared with other conventional models, such as autoregressive models, as reported in the current literature. According to Al-Saba and El-Amin (1999), Tseng et al. (2002), Gutierrez-Estrada et al. (2003), Koutroumanidis et al. (2009) and Prybutok et al. (2000), Artificial Neural Networks tend to perform better and produce better predictions than autoregressive integrated moving average (ARIMA) models.

Le Baron and Weigend (1997), using a bootstrap or resampling method, compare the uncertainty in the solution stemming from the data splitting with neural-network-specific uncertainties (parameter initialization, choice of number of hidden units, etc.).

Parisi et al. (2008) analyze recursive and rolling neural network models to forecast one-step-ahead sign variations in the gold price. Different combinations of techniques and sample sizes are studied for feedforward and Ward neural networks. White and Racine (2001) suggest tests for the individual and joint irrelevance of network inputs. Tests of this type can be used to determine whether an input or group of inputs belongs in a particular model, permitting valid statistical inference to be based on estimated feedforward neural network models. The approaches employ well-known statistical resampling techniques. Lento and Gradojevic (2007) determine the profitability of technical trading rules by evaluating their ability to outperform the naive buy-and-hold trading strategy. The bootstrap methodology is used to determine the statistical significance of the results.

3. Artificial neural networks

A neural network consists of a number of elements called neurons. Each neuron receives a number of signals as input. The neuron's internal structure can be in one of several possible states; it receives the input signals and produces a single output, which is a function of those input signals. Each signal transmitted from one neuron to another through the network is coupled with a weight value, w, which indicates how strongly the two neurons are connected. This value fluctuates over a specific interval, for example between -1 and 1, although the interval is an arbitrary choice that depends on the problem we want to solve. The weight value expresses the importance of the contribution of the specified signal to the structure of the network for the two neurons it connects: the larger w is, the larger the contribution of the signal.

The primary aim of an Artificial Neural Network is to solve specific problems that we present to it, or to perform certain tasks (such as image recognition). To solve these problems or perform these tasks, the network must be trained; the main characteristic of neural networks is that they learn by training. By "training" a neural network we mean that we provide some inputs and obtain some outputs. Inputs are, in essence, the presentation to the network of signals taking arithmetic values, e.g. a binary number consisting of 0s and 1s. The numbers given at the input of the network constitute a prototype; for a given problem many prototypes may be required. To each prototype corresponds a correct answer, which is the signal we must receive at the output, i.e. the objective.

When the network stops changing its weight values, we assume that training is complete; this happens because the error has become (nearly) zero. Typically, the architecture of an ANN consists of the Input Layer, where we provide the data; the Hidden Layer, where the data are processed, which may consist of several levels; and finally the Output Layer, from which we read the results of the network (Argyrakis 2001).
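As a toy illustration of the weighted-sum-plus-activation behaviour described above (a hedged sketch only; the networks in this study were built with the NeuralWare Predict package, and the function and parameter names here are illustrative), a single neuron with a sigmoid activation can be written in VBA as:

Function NeuronOutput(x() As Double, w() As Double, bias As Double) As Double
   ' Toy single-neuron forward pass: output = sigmoid(bias + sum of w(i)*x(i))
   Dim i As Long, s As Double
   s = bias
   For i = LBound(x) To UBound(x)
      s = s + w(i) * x(i)            ' weight w(i) couples input signal i to the neuron
   Next i
   NeuronOutput = 1 / (1 + Exp(-s))  ' sigmoid activation, output in (0, 1)
End Function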

4. Bootstrap method

Bootstrapping is the practice of estimating properties of an estimator (such as its variance) by measuring those properties when sampling from an approximating distribution. One standard choice for an approximating distribution is the empirical distribution of the observed data. In the case where a set of observations can be assumed to be from an independent and identically distributed population, this can be implemented by constructing a number of resamples of the observed dataset (and of equal size to the observed dataset), each of which is obtained by random sampling with replacement from the original dataset.

It may also be used for constructing hypothesis tests. It is often used as an alternative to inference based on parametric assumptions when those assumptions are in doubt, or where parametric inference is impossible or requires very complicated formulas for the calculation of standard errors (Fig. 1).

[FIGURE 1 OMITTED]

The idea of the bootstrap is depicted in Fig. 1. Suppose that a researcher wants to assess the statistical accuracy of a sample statistic: he can take N bootstrap samplings and compute the statistic from each bootstrap sampling. The values of the bootstrap statistics are then used to evaluate the statistical accuracy of the original sample statistic (Teknomo 2006).

The advantage of bootstrapping over other analytical methods is its great simplicity: it is straightforward to apply the bootstrap to derive estimates of standard errors and confidence intervals for complex estimators of complex parameters of a distribution, such as percentile points, proportions, odds ratios, and correlation coefficients (Efron 1982).
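As a minimal sketch of this resampling loop (illustrative names; this is not the Excel add-in used later in the paper), the following VBA function estimates the standard error of the sample mean from nBoot bootstrap resamples drawn with replacement via Rnd:

Function BootstrapSE(data() As Double, nBoot As Long) As Double
   ' Bootstrap standard error of the sample mean (sketch).
   ' data() holds the observed sample; nBoot is the number of resamples.
   Dim n As Long, b As Long, i As Long
   Dim m As Double, sumM As Double, sumM2 As Double
   n = UBound(data) - LBound(data) + 1
   Randomize
   For b = 1 To nBoot
      m = 0
      For i = 1 To n
         ' Draw one observation with replacement: Int(Rnd * n) is 0..n-1
         m = m + data(LBound(data) + Int(Rnd * n))
      Next i
      m = m / n
      sumM = sumM + m
      sumM2 = sumM2 + m * m
   Next b
   ' Standard deviation of the nBoot bootstrap means
   BootstrapSE = Sqr((sumM2 - sumM * sumM / nBoot) / (nBoot - 1))
End Function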

5. Hybrid method

In this study we propose a hybrid methodology that allows us to forecast the confidence intervals of the predicted values of a time series. This method combines ANNs and Bootstrap methods; the two methodologies are combined through Excel and Visual Basic for Applications. The application is implemented in the following steps:

1st Step: An ANN is applied in order to fit the real values of the time series and to make forecasts of its future values.

2nd Step: First, the residuals are estimated, and the Bootstrap method is applied to them. Application of the Breusch-Godfrey LM test, as well as the ARCH and Breusch-Pagan tests, did not trace any problem of autocorrelation or heteroskedasticity. This result implies that a bootstrap sample of the residuals e* of size N can be considered a random independent and identically distributed sample drawn with replacement from the empirical distribution function (EDF) of the residuals. Furthermore, Bootstrap-based methods can also be used to obtain prediction densities and intervals for future values of a given variable without making distributional assumptions about the innovations, while at the same time allowing the variability due to parameter estimation to be introduced into the estimated prediction densities (Kim 2001; Thombs, Schucany 1990).

The sample e* can be considered a randomly resampled version of the residuals e: its elements are the same as those of the original data set, but some may appear once, some twice or more, and others may not appear at all. Suppose B independent bootstrap time series of the residuals e*1, e*2, ..., e*B are generated, each consisting of N data values drawn with replacement from e (Mooney, Duval 1993). This means that for every real value of the time series we obtain B+1 randomly distributed residuals.
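A hedged sketch of this resampling step (the function name and the B x N array layout are assumptions, not the paper's actual VBA module): build a matrix whose B rows are bootstrap series drawn with replacement from the residual vector:

Function BootstrapResidualSeries(e() As Double, B As Long) As Double()
   ' Build B bootstrap time series of residuals, each of length N,
   ' by sampling e() with replacement (1-based input assumed).
   Dim out() As Double, n As Long, b As Long, i As Long
   n = UBound(e)
   ReDim out(1 To B, 1 To n)
   Randomize
   For b = 1 To B
      For i = 1 To n
         out(b, i) = e(1 + Int(Rnd * n))  ' draw one residual with replacement
      Next i
   Next b
   BootstrapResidualSeries = out
End Function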

3rd Step: For every residual, B new bootstrapped residuals are calculated. Within this process the (1-α)100% Bootstrap C.I. of each residual is estimated, a process that is repeated for all B+1 residuals. Based on the Bootstrap C.I. (B.C.I.), we estimate the C.I. of the predictions. The technique of bootstrapping the residuals has been used extensively in the past (Shao, Tu 1995; Efron, Tibshirani 1986; Hall 1986, 1988; Beran 1988; Franklin, Wasserman 1992; Simar, Wilson 1998; Glaz, Sison 1999; Bjornstad, Falck 2001; Tribouley 2004; Chou 2006; Pesavento, Rossi 2006; Kapetanios 2008; Xiong, Li 2008; Charitos et al. 2009; Li et al. 2009; Annaert et al. 2009; Kascha, Mertens 2009; Barnes et al. 2009).

4th Step: Based on the process described above, two new time series of the upper and lower limits of the B.C.I. are generated. By applying an ANN to the upper and lower limits of the B.C.I., we can make forecasts of the upper and lower limits of the C.I. of each predicted value, respectively. Consequently, we can estimate the C.I. of the predicted values of the initial time series.

6. Application of the hybrid method--results

This methodology is applied to a time series of the stock prices of Alpha Bank for the period from 28/01/2004 to 30/11/2005 (daily prices), on which the initial ANN is used. To implement an accuracy test for the forecasts, we used the last twenty observations of our time series together with statistical tests: for the evaluation of forecasting accuracy, the RMSE, MAPE, NOF and Theil's U statistic were used. An ANN is created using as input the real values of the Alpha Bank stock price for the same period (28/01/2004 to 30/11/2005), through which we estimate the forecasted values of the Alpha Bank stock prices. The residuals are then estimated from the ANN's output. To train the neural network we used the Kalman filter; the network consists of one input neuron, 24 hidden neurons and 1 output neuron. During the ANN development we created several other ANNs with different numbers of neurons; the network described here provided the best results. The application we used (NeuralWare Predict) automatically uses parts of the time series for training, testing and validation, thus protecting the network from overfitting.

Kalman filters are based on linear dynamical systems discretised in the time domain. They are modelled on a Markov chain built on linear operators perturbed by Gaussian noise. The state of the system is represented as a vector of real numbers. At each discrete time increment, a linear operator is applied to the state to generate the new state, with some noise mixed in, and optionally some information from the controls on the system, if they are known. Then another linear operator, mixed with more noise, generates the visible outputs from the hidden state. The Kalman filter may be regarded as analogous to the hidden Markov model, with the key difference that the hidden state variables are continuous (as opposed to discrete in the hidden Markov model). Additionally, the hidden Markov model can represent an arbitrary distribution for the next value of the state variables, in contrast to the Gaussian noise model used by the Kalman filter; there is a strong duality between the equations of the Kalman filter and those of the hidden Markov model (Haykin 2001). The Kalman filter model assumes that the true state at time k evolves from the state at k-1 according to:

x_k = F_k x_{k-1} + B_k u_k + w_k, (1)

where: F_k is the state transition model, applied to the previous state x_{k-1}; B_k is the control-input model, applied to the control vector u_k; w_k is the process noise, assumed to be drawn from a zero-mean multivariate normal distribution with covariance Q_k:

w_k ~ N(0, Q_k). (2)

At time k an observation (or measurement) z_k of the true state x_k is made according to:

z_k = H_k x_k + v_k, (3)

where: H_k is the observation model, which maps the true state space into the observed space; v_k is the observation noise, assumed to be zero-mean Gaussian white noise with covariance R_k:

v_k ~ N(0, R_k). (4)

The initial state and the noise vectors at each step {x_0, w_1, ..., w_k, v_1, ..., v_k} are all assumed to be mutually independent. In order to use the Kalman filter to estimate the internal state of a process given only a sequence of noisy observations, one must model the process within the framework of the Kalman filter. This means specifying the matrices F_k, H_k, Q_k, R_k, and sometimes B_k, for each time step k, as described below. The Kalman filter is an efficient recursive filter that estimates the state of a dynamic system from a series of incomplete and noisy measurements: only the estimated state from the previous time step and the current measurement are needed to compute the estimate for the current state. In contrast to batch estimation techniques, no history of observations and/or estimates is required. It is unusual in being purely a time-domain filter; most filters (for example, a low-pass filter) are formulated in the frequency domain and then transformed back to the time domain for implementation. The state of the filter is represented by two variables:

x̂_{k|k} - the estimate of the state at time k;

P_{k|k} - the error covariance matrix (a measure of the estimated accuracy of the state estimate).

The Kalman filter has two distinct phases: Predict and Update. The predict phase uses the estimate from the previous time step to produce an estimate of the current state. In the update phase, measurement information from the current time step is used to refine the prediction in order to arrive at a new, (hopefully) more accurate estimate.

Predict Phase:

x̂_{k|k-1} = F_k x̂_{k-1|k-1} + B_k u_k (predicted state), (5)

P_{k|k-1} = F_k P_{k-1|k-1} F_k^T + Q_k (predicted estimate covariance). (6)

Update Phase:

ỹ_k = z_k - H_k x̂_{k|k-1} (innovation or measurement residual), (7)

S_k = H_k P_{k|k-1} H_k^T + R_k (innovation (or residual) covariance), (8)

K_k = P_{k|k-1} H_k^T S_k^{-1} (optimal Kalman gain), (9)

x̂_{k|k} = x̂_{k|k-1} + K_k ỹ_k (updated state estimate), (10)

P_{k|k} = (I - K_k H_k) P_{k|k-1} (updated estimate covariance). (11)
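To make the recursion concrete, here is a minimal one-dimensional sketch of the predict/update cycle in VBA (illustrative only: scalar state, no control term, assumed names; this is not the NeuralWare training code):

Sub ScalarKalman(z() As Double, F As Double, H As Double, _
                 Q As Double, R As Double, x As Double, P As Double)
   ' Scalar Kalman filter: state x, variance P, observations z().
   ' Model: x_k = F*x_{k-1} + w,  z_k = H*x_k + v  (B_k*u_k omitted).
   Dim k As Long, y As Double, S As Double, K As Double
   For k = LBound(z) To UBound(z)
      ' Predict
      x = F * x                 ' Eq. (5), with no control input
      P = F * P * F + Q         ' Eq. (6)
      ' Update
      y = z(k) - H * x          ' Eq. (7): innovation
      S = H * P * H + R         ' Eq. (8): innovation variance
      K = P * H / S             ' Eq. (9): Kalman gain
      x = x + K * y             ' Eq. (10): updated state
      P = (1 - K * H) * P       ' Eq. (11): updated variance
   Next k
   ' x and P are passed ByRef, so they return the final estimates.
End Sub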

For the ANN studied here, the sigmoid function was used as the activation function of each neuron. Because of this, the values of the data variables in the model must be normalized onto the range [0, 1] before applying the ANN methodology. This was achieved through the following scaling:

V*_b = (V_b - V_min,b) / (V_max,b - V_min,b), (12)

where: V_b are the values of the data variables; V*_b is the scaled value of the variable; V_min,b is the minimum value of variable V_b minus 15%; V_max,b is the maximum value of variable V_b plus 15%.

Hence the scaled series lie in the range [0, 1]. This scaling has the advantage of mapping the desired range of a variable onto the full working range of the network input; moreover, the scaled series lie in the central zone of the sigmoid function, where the function is approximately linear. Therefore, during the model validation described next, the problem of output-signal saturation that can sometimes be encountered in ANN applications is avoided.
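A minimal sketch of the scaling in Eq. (12) (function name illustrative; interpreting the padding as 15% of the magnitude of the series minimum and maximum is our assumption):

Function ScaleSeries(v() As Double) As Double()
   ' Scale a series onto [0, 1] per Eq. (12), padding the observed
   ' range so scaled values avoid the sigmoid's saturated tails.
   Dim i As Long, vmin As Double, vmax As Double, out() As Double
   vmin = v(LBound(v)): vmax = v(LBound(v))
   For i = LBound(v) To UBound(v)
      If v(i) < vmin Then vmin = v(i)
      If v(i) > vmax Then vmax = v(i)
   Next i
   vmin = vmin - 0.15 * Abs(vmin)   ' minimum minus 15% (assumed reading)
   vmax = vmax + 0.15 * Abs(vmax)   ' maximum plus 15% (assumed reading)
   ReDim out(LBound(v) To UBound(v))
   For i = LBound(v) To UBound(v)
      out(i) = (v(i) - vmin) / (vmax - vmin)
   Next i
   ScaleSeries = out
End Function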

With the use of the Bootstrap method, 120 time series of residuals are generated, so that to every real value correspond 120 residuals randomly distributed around it. The Bootstrap technique is applied through a special menu, called the Bootstrap menu: the Bootstrap method is activated using as input a time series determined by the user. The technique is realized through sampling with replacement, and a number of time series is generated, the number of which is also determined by the user. The menu is connected internally to Visual Basic for Applications (VBA) code that implements the method using the RAND function, resampling the residuals for every real value of the time series. This application can determine the number of bootstrap time series generated from the initial time series (Fig. 2).

[FIGURE 2 OMITTED]

Then, using Insertion sort (Knuth 1998), we estimate the 95% C.I. of the 120 generated residuals, and thus the Upper and Lower Confidence Limits of the 95% C.I. of the predicted values. The insertion technique sorts each series by repeatedly taking the next item and inserting it into the final data structure in its proper order with respect to the items already inserted.

An example of the code used to implement the insertion technique is shown below:
Module InsertionSortModule
  Sub InsertionSort(ByRef a() As Integer)
     Dim i, j, key As Integer
     For i = 1 To a.Length - 1
        key = a(i)
        j = i - 1
        ' Shift larger items one slot right to open a place for key
        Do While j >= 0 AndAlso a(j) > key
           a(j + 1) = a(j)
           j -= 1
        Loop
        a(j + 1) = key
     Next
  End Sub
End Module
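Once each set of 120 bootstrapped values is sorted in ascending order, the 95% percentile limits can be read off directly; a minimal sketch, assuming 1-based arrays and the simple empirical-quantile index rule (names illustrative):

Sub PercentileLimits(sorted() As Double, alpha As Double, _
                     lcl As Double, ucl As Double)
   ' Read the (1-alpha)100% percentile limits off a sorted, 1-based array.
   ' For B = 120 and alpha = 0.05 this returns the 3rd and 118th values.
   Dim B As Long, k As Long
   B = UBound(sorted)
   k = Int(B * alpha / 2)
   If k < 1 Then k = 1
   lcl = sorted(k)
   ucl = sorted(B - k + 1)
End Sub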


Two new time series are created: one of the Upper Confidence Limit (UCL) and one of the Lower Confidence Limit (LCL) of the (1-α)100% C.I. of the predicted prices. Consequently, we use the initial ANN to calculate twenty new values of the Alpha Bank stock price, using as input the closing prices of EFG; in addition, using an ANN whose inputs are the upper and lower values of the residuals calculated with the Bootstrap, we calculate the new expected upper and lower limits of the forecasted prices given by the initial ANN.

The results of the methodology are presented in Fig. 3, which shows the observed and predicted Alpha Bank stock prices, together with the Upper Confidence Limit (UCL) and the Lower Confidence Limit (LCL) of the (1-α)100% C.I. of the predicted prices.

[FIGURE 3 OMITTED]

Table 1 presents the forecasted prices and the Upper Confidence Limit (UCL) and Lower Confidence Limit (LCL) of the (1-α)100% C.I. of the forecasted prices for the last 20 observations of the sample.

Figure 4 depicts the forecasted prices and the Upper Confidence Limit (UCL) and Lower Confidence Limit (LCL) of the (1-α)100% C.I. of the forecasted prices for the last twenty observations of the sample.

[FIGURE 4 OMITTED]

As the figure shows, the observed and forecasted prices are bounded by the confidence limits, giving an indication of the accuracy of the methodology. The following section presents the quantitative evaluation of the forecasting accuracy of this methodology.

7. Evaluation of forecasting accuracy of the forecasted prices

The validity of the output of the ANN model was tested with the application of different criteria: Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Normalized Objective Function (NOF) and Theil's U statistic. RMSE, MAPE and NOF should be as close to 0.0 as possible for the forecast to be considered satisfactory; when NOF is less than 1.0, the theoretical method is reliable and can be used with sufficient accuracy (Hession et al. 1994; Kornecki, Sabbagh 1999; Tsihrintzis et al. 1998). Theil's U statistic must be less than one.

According to the results, we obtained the following values of the forecasting-accuracy criteria: RMSE = 0.8192, MAPE = 4.4819%, NOF = 0.048087 and Theil's U statistic = 0.048082.

The NOF is the ratio of the RMSE to the overall mean ⟨ŷ_t⟩ of the data forecasted by the model (Tsihrintzis et al. 1998), defined as:

NOF = RMSE / ⟨ŷ_t⟩, (13)

where:

⟨ŷ_t⟩ = (1/M) Σ_{t=1}^{M} ŷ_t (14)

is the average value of the model output data.

Theil's U statistic is defined as:

U = sqrt( (1/M) Σ_{t=1}^{M} (y_t - ŷ_t)^2 ) / sqrt( (1/M) Σ_{t=1}^{M} ŷ_t^2 ), (15)

where ŷ_t are the forecasted prices, y_t the observed prices, and M the number of forecasted prices.
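A short sketch computing the four criteria as defined in Eqs. (13)-(15) (names illustrative; 1-based arrays of observed and forecasted prices of equal length assumed):

Sub ForecastCriteria(obs() As Double, fc() As Double, _
                     RMSE As Double, MAPE As Double, _
                     NOF As Double, TheilU As Double)
   ' Forecast-accuracy criteria used in the text (results ByRef).
   Dim t As Long, M As Long
   Dim sse As Double, sape As Double, sumFc As Double, sumF2 As Double
   M = UBound(obs)
   For t = 1 To M
      sse = sse + (obs(t) - fc(t)) ^ 2
      sape = sape + Abs((obs(t) - fc(t)) / obs(t))
      sumFc = sumFc + fc(t)
      sumF2 = sumF2 + fc(t) ^ 2
   Next t
   RMSE = Sqr(sse / M)
   MAPE = 100 * sape / M                    ' in percent
   NOF = RMSE / (sumFc / M)                 ' Eqs. (13)-(14)
   TheilU = Sqr(sse / M) / Sqr(sumF2 / M)   ' Eq. (15) as reconstructed
End Sub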

8. Evaluation of forecasting accuracy of the (1-α)100% confidence intervals of the forecasted prices

For the evaluation of forecasting accuracy of the (1-α)100% confidence intervals of the forecasted prices, the following statistical test is introduced.

We define the following distances:

|Observed price - Forecasted price|, |Forecasted price - UCL|, |Forecasted price - LCL| for all the forecasted prices.

We also define min{ |Forecasted price - UCL|, |Forecasted price - LCL| } for all the forecasted prices.

If the probability

P( |Observed price - Forecasted price| ≤ min{ |Forecasted price - UCL|, |Forecasted price - LCL| } ) ≥ 1 - α

for all the forecasted prices, then we can agree that the (1-α)100% confidence intervals of the forecasted prices give a satisfactory forecast.

The absolute values of the differences (Observed - Forecasted), (Forecasted - UCL) and (Forecasted - LCL) for all the forecasted prices are given in Table 2.

Consequently, given that the probability

P( |Observed price - Forecasted price| ≤ min{ |Forecasted price - UCL|, |Forecasted price - LCL| } ) ≥ 0.95

for all the forecasted prices, we may argue that the (1-α)100% confidence intervals of the forecasted prices give a satisfactory forecast.
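A hedged sketch of this check (illustrative names; it simply counts how often the observed-to-forecast distance falls inside the smaller of the two limit distances):

Function CoverageShare(obs() As Double, fc() As Double, _
                       ucl() As Double, lcl() As Double) As Double
   ' Share of forecasts with |obs - fc| <= min(|fc - UCL|, |fc - LCL|).
   ' A satisfactory (1-alpha)100% interval should give a share >= 1 - alpha.
   Dim t As Long, hits As Long, dU As Double, dL As Double, dMin As Double
   For t = LBound(obs) To UBound(obs)
      dU = Abs(fc(t) - ucl(t))
      dL = Abs(fc(t) - lcl(t))
      dMin = dU: If dL < dMin Then dMin = dL
      If Abs(obs(t) - fc(t)) <= dMin Then hits = hits + 1
   Next t
   CoverageShare = hits / (UBound(obs) - LBound(obs) + 1)
End Function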

9. Conclusion--Discussion

The hybrid method described above is used here for the first time in Finance. The methodology is combinative, since the Bootstrap method and ANNs are jointly used for the determination of the (1-α)100% C.I. The method, completed in four steps, was used to estimate the C.I. of the predicted values of stock prices. The main objective of this survey was to estimate the C.I. of the predicted values of the initial time series, and its main accomplishment was to strengthen the validity of the (1-α)100% confidence interval of the forecasted prices. We used different forecasting criteria, namely RMSE, MAPE, NOF and Theil's U statistic; to test the forecasting ability of the methodology, we calculated RMSE = 0.8192, MAPE = 4.4819%, NOF = 0.048087 and Theil's U statistic = 0.048082, all of which confirm a satisfactory forecast.

In the future, this method could be further developed with the use of another programming language in order to create a stand-alone software toolbox for the prediction of stock prices.

doi: 10.3846/16111699.2011.555388

References

Aktan, B.; Korsakiene, R.; Smaliukiene, R. 2010. Time-varying volatility modelling of Baltic stock markets, Journal of Business Economics and Management 11(3): 511-532. doi:10.3846/jbem.2010.25

Al-Saba, T.; El-Amin, I. M. 1999. Artificial neural networks as applied to long-term demand forecasting, Artificial Intelligence in Engineering 13(2): 189-197. doi:10.1016/S0954-1810(98)00018-1

Annaert, J.; Van Osselaerand, S.; Verstraete, B. 2009. Performance evaluation of portfolio insurance strategies using stochastic dominance criteria, Journal of Banking and Finance 33(2): 272-280. doi:10.1016/j.jbankfin.2008.08.002

Ankenbrand, T.; Tomassini, M. 1996. Forecasting financial multivariate time series with neural networks, Neuro-Fuzzy Systems International Symposium 13(9): 95-101. doi:10.1109/ISNFS.1996.603826

Ankenbrand, T.; Tomassini, M. 1996. Predicting multivariate financial time series using neural networks: The Swiss bond case, in Computational Intelligence for Financial Engineering: Proceedings of the IEEE/IAFE 1996, March 24-26. Selected papers, 27-3. doi:10.1109/CIFER.1996.501819

Argyrakis, P. 2001. Neural Networks and Applications. Patra, EAP Editions (Greek Open University).

Barnes, A. P.; Moran, D.; Topp, K. 2009. The scope for regulatory incentives to encourage increased efficiency of input use by farmers, Journal of Environmental Management 90(2): 808-814. doi:10.1016/j.jenvman.2008.01.017

Beltratti, A.; Margarita, S.; Terna, P. 1996. Neural Networks for Economic and Financial Modelling. London: Thomson Computer Press. 248 p.

Beran, R. 1988. Prepivoting test statistics: A bootstrap view of asymptotic refinements, Journal of the American Statistical Association 83(403): 687-697. doi:10.2307/2289292

Bjornstad, O. N.; Falck, W. 2001. Nonparametric spatial covariance functions: estimation and testing, Environmental and Ecological Statistics 8: 53-70. doi:10.1023/A:1009601932481

Bookstaber, R. M.; McDonald, J. B. 1987. A general distribution for describing security price returns, Journal of Business 60(3): 401-424. doi:10.1086/296404

Borovkova, S.; Dehling, H.; Renkema, J.; Tulleken, H. 2003. A potential-field approach to financial time series modelling, Computational Economics 22: 139-161. doi:10.1023/A:1026181713294

Brock, W.; Lakonish, J.; Le Baron, B. 1992. Simple technical rules and the stochastic properties of stock returns, Journal of Finance 47: 1731-1764. doi:10.2307/2328994

Claeskens, G.; Keilgom, J. 2003. Bootstrap confidence bands for regression curves and their derivatives, Annals of Statistics 31(6): 1852-1884. doi:10.1214/aos/1074290329

Charitos, T.; Van der Gaag, L. C.; Visscher, S.; Schurink, K. A. M.; Lucas, P. J. F. 2009. A dynamic Bayesian network for diagnosing ventilator-associated pneumonia in ICU patients, Expert Systems with Applications 36(2): 1249-1258. doi:10.1016/j.eswa.2007.11.065

Chatterjee, S.; Pari, R. A. 1990. Bootstrapping the number of factors in the arbitrage pricing theory, Journal of Financial Research 13(1): 15-21.

Chou, C. Y.; Lin, C. Y.; Chang, C. L.; Chen, C. H. 2006. On the bootstrap confidence intervals of the process incapability index Cpp, Reliability Engineering and System Safety 91: 452-459. doi:10.1016/j.ress.2005.03.004

De Peretti, C. 2003. Bilateral bootstrap tests for long memory: an application to the silver market, Computational Economics 22(2): 187-212. doi:10.1023/A:1026129729224

Dogan, G. 2007. Bootstrapping for confidence interval estimation and hypothesis testing for parameters of system dynamics models, System Dynamics Review 23(4): 415-436. doi:10.1002/sdr.362

Efron, B. 1982. The jackknife, the bootstrap, and other resampling plans, in CBMS-NSF Regional Conference Series in Applied Mathematics 38, SIAM.

Efron, B.; Tibshirani, R. 1986. Bootstrap method for standard errors, confidence intervals, and other measures of statistical accuracy, Statistical Science 1(1): 54-75. doi:10.1214/ss/1177013815

Egeli, B.; Ozturan, M.; Badur, B. 2003. Stock Market Prediction Using Artificial Neural Networks. Department of Management Information Systems, Bogazici University, Istanbul, Turkey.

Fama, E. F. 1991. Efficient capital markets: II, Journal of Finance 46(5): 1575-1617. doi:10.2307/2328565

Fernandez-Rodriguez, F.; Gonzalez-Martel, Ch.; Sosvilla-Rivero, S. 2000. On the profitability of technical trading rules based on artificial neural networks: Evidence from the Madrid stock market, Economics Letters 69(1): 89-94. doi:10.1016/S0165-1765(00)00270-6

Fernandez-Rondriquez, F.; Sosvilla-Rivero, S.; Garcia Artiles, M. D. 1997. Using nearest-neighbour predictors to forecast the Spanish stock markets, Investigaciones Económicas 21: 75-91.

Focarelli, D. 2005. Bootstrap bias-correction procedure in estimating long-run relationships from dynamic panels, with an application to money demand in the Euro area, Economic Modelling 22(2): 305-325. doi:10.1016/j.econmod.2003.12.007

Franklin, L. A.; Wasserman, G. S. 1992. Bootstrap lower confidence limits for capability indices, Journal of Quality Technology 24(4): 196-240.

Gencay, R. 1998a. Optimization of technical trading strategies and the profitability in security markets, Economics Letters 59(2): 249-254. doi:10.1016/S0165-1765(98)00051-2

Gencay, R. 1998b. The predictability of security returns with simple technical trading rules, Journal of Empirical Finance 5(4): 347-359. doi:10.1016/S0927-5398(97)00022-4

Gencay, R. 1999. Linear, non-linear and essential foreign exchange rate prediction with some simple technical trading rules, Journal of International Economics 47(1): 91-107. doi:10.1016/S0022-1996(98)00017-8

Gencay, R.; Stengos, T. 1997a. Technical trading rules and the size of the risk premium in security returns, Studies in Nonlinear Dynamics and Econometrics 2(2): 23-32. doi:10.2202/1558-3708.1026

Gencay, R.; Stengos, T. 1998b. Moving average rules, volume and predictability of security returns with feedforward networks, Journal of Forecasting 17: 401-414.

Glaz, J.; Sison, C. P. 1999. Simultaneous confidence intervals for multinomial proportions, Journal of Statistical Planning Inference 82(1): 251-262. doi:10.1016/S0378-3758(99)00047-6

Gutierrez-Estrada, J. C.; de Pedro-Sanz, E.; Lopez-Luque, R.; Pulido-Calvo, I. 2003. Comparison between traditional methods and artificial neural networks for ammonia concentration forecasting in an eel (Anguilla Anguilla L.) intensive rearing system, Aquacultural Engineering 31(3): 183-203. doi:10.1016/j.aquaeng.2004.03.001

Hall, P. 1988. Theoretical comparison of bootstrap confidence intervals, Annals of Statistics 16(3): 927-953. doi:10.1214/aos/1176350933

Hall, P. 1986. On the number of bootstrap simulations required to construct a confidence interval, The Annals of Statistics 14(4): 1453-1462. doi:10.1214/aos/1176350169

Hatemi, J. A.; Roca, E. 2006. A re-examination of international portfolio diversification based on evidence from leveraged bootstrap methods, Economic Modelling 23(6): 993-1007. doi:10.1016/j.econmod.2006.04.009

Haykin, S. 2001. Kalman filtering and neural networks. John Wiley and Sons Editions. doi:10.1002/0471221546

Hession, W. C.; Shanholtz, V. O.; Mostaghimi, S.; Dillaha, T. A. 1994. Uncalibrated performance of the finite element storm hydrograph model, Transactions of ASAE 37: 777-783.

Hsieh, D. A.; Miller, M. H. 1990. Margin regulation and stock market volatility, Journal of Finance 45: 3-29. doi:10.2307/2328807

Ioannou, K.; Arabatzis, G.; Lefakis, P. 2009. Predicting the prices of forest energy resources with the use of artificial neural networks (ANNs): the case of conifer fuel wood in Greece, Journal of Environmental Protection and Ecology 10(3): 678-694.

Kim, J. H. 2002. Bootstrap prediction intervals for autoregressive models of unknown or infinite lag order, Journal of Forecasting 21(4): 265-280. doi:10.1002/for.823

Kapetanios, G. 2008. Bootstrap based tests for deterministic time-varying coefficients in regression models, Computational Statistics and Data Analysis 53(2): 534-545. doi:10.1016/j.csda.2008.09.006

Kascha, C.; Mertens, K. 2009. Business cycle analysis and VARMA models, Journal of Economic Dynamics and Control 33(2): 267-282. doi:10.1016/j.jedc.2008.05.006

Kiani, K.; Kastens, T. 2008. Testing Forecast Accuracy of Foreign Exchange Rates: Predictions from Feed Forward and Various Recurrent Neural Network Architectures, Computational Economics 32: 383-406. doi:10.1007/s10614-008-9144-4

Kimoto, T.; Asakawa, K.; Yoda, M.; Takeoka, M. 1990. Stock market prediction system with modular neural networks, in Proc. IEEE International Joint Conference on Neural Networks, I1-I6.

Kim, J. H. 2001. Bootstrap-after-bootstrap prediction intervals for autoregressive models, Journal of Business and Economic Statistics 19: 117-128. doi:10.1198/07350010152472670

Knuth, D. 1998. The Art of Computer Programming, Volume 3: Sorting and Searching. Second Edition. Addison-Wesley. ISBN 0-201-89685-0. Section 5.2.1: Sorting by Insertion, 80-105.

Kolsrud, D. 2007. Time-Simultaneous Prediction Band for a Time Series, Journal of Forecasting 26: 171-188. doi:10.1002/for.1020

Korajczyk, R. A. 1985. The pricing of forward contracts for foreign exchange, Journal of Political Economy 93: 346-368. doi:10.1086/261303

Kornecki, T. S.; Sabbagh, G. J.; Storm, D. E. 1999. Evaluation of runoff, erosion and phosphorus modelling system- SIMPLE, Journal of the American Water Resources Association 35: 807-820. doi:10.1111/j.1752-1688.1999.tb04176.x

Koutroumanidis, T.; Ioannou, K.; Arabatzis, G. 2009. Predicting fuelwood prices in Greece with the use of ARIMA models, artificial neural networks and a hybrid ARIMA-ANN model, Energy Policy 37(9): 2327-2634. doi:10.1016/j.enpol.2009.04.024

Le Baron, B. 1992. Do moving average rule results imply nonlinearities in foreign exchange markets? Working Paper 9222, SSRI, University of Wisconsin-Madison.

Le Baron, B. 1998. Technical trading rules and regime shifts in foreign exchange, in Acar, E.; Satchell, S. (Eds.). Advanced Trading Rules, 5-40. Butterworth-Heinemann, London.

Le Baron, B. 1999. Technical trading rule profitability and foreign exchange intervention, Journal of International Economics 49: 125-143. doi:10.1016/S0022-1996(98)00061-0

Lento, C.; Gradojevic, N. 2007. The Profitability of Technical Trading Rules: A Combined Signal Approach, Journal of Applied Business Research 23(1):13-28.

Levich, R. M.; Thomas, L. R. 1993. The significance of technical trading-rule profits in the foreign exchange market: a bootstrap approach, Journal of International Money and Finance 12: 451-474. doi:10.1016/0261-5606(93)90034-9

Li, J.; Daniel, R.; Pettyjohn, J. 2009. Approximate and generalized pivotal quantities for deriving confidence intervals for the offset between two clocks, Statistical Methodology 6(1): 97-107. doi:10.1016/j.stamet.2008.04.002

Maddala, G. S.; Li, H. 1996. Bootstrap based tests in financial models, in Maddala, G. S.; Rao, C. R. (Eds.). Handbook of Statistics 14: 463-488. doi:10.1016/S0169-7161(96)14017-7

Mark, N. C. 1995. Exchange rates and fundamentals: evidence on long-horizon predictability, American Economic Review 85: 201-218.

McAdam, P.; McNellis, P. 2005. Forecasting inflation with thick models and neural networks, Economic Modelling 22: 848-867. doi:10.1016/j.econmod.2005.06.002

McCullough, B. D. 1994. Bootstrapping forecast intervals: an application to AR(p) models, Journal of Forecasting 13: 51-66. doi:10.1002/for.3980130107

Mizuno, H.; Kosaka, M.; Yajima, H.; Komoda, N. 1998. Application of neural network to technical analysis of stock market prediction, Studies in Informatics and Control 7(3): 111-120.

Mooney, C. Z.; Duval, R. D. 1993. Bootstrapping: a nonparametric approach to statistical inference, in Sage University paper series on quantitative applications in the social sciences, 7-95.

Parisi, A.; Parisi, F.; Diaz, D. 2008. Forecasting gold price changes: Rolling and recursive neural network models, Journal of Multinational Financial Management 18(5): 477-487. doi:10.1016/j.mulfin.2007.12.002

Pesavento, S.; Rossi, B. 2006. Small-sample confidence intervals for multivariate impulse response functions at long horizons, Journal of Applied Econometrics 21: 1135-1155. doi:10.1002/jae.894

Phua, P. K. H.; Ming, D.; Lin, W. 2000. Neural networks with genetic algorithms for stocks prediction, in Fifth Conference of the Association of Asian-Pacific Operations Research Societies, 5-7 July, Singapore.

Prybutok, V. R.; Junsub, Y.; Mitchell, D. 2000. Comparison of neural network models with ARIMA and regression models for prediction of Houston's daily maximum ozone concentrations, European Journal of Operational Research 122: 31-40. doi:10.1016/S0377-2217(99)00069-7

Ruiz, E.; Pascual, L. 2002. Bootstrapping Financial Time Series, Journal of Economic Surveys 16(3): 271-300. doi:10.1111/1467-6419.00170

Rumelhart, D. E.; McClelland, J. L. 1986. Parallel distributed processing: exploration in the microstructure of cognition. Cambridge, MIT Press.

Sexton, R. S.; Dorsey, R. E.; Johnson, J. D. 1998. Toward global optimization of neural networks: a comparison of the genetic algorithm and backpropagation, Decision Support Systems 22: 171-185. doi:10.1016/S0167-9236(97)00040-7

Shao, J.; Tu, D. 1995. The Jackknife and Bootstrap. Springer, New York.

Simar, L.; Wilson, P. W. 1998. Sensitivity analysis of efficiency scores: how to bootstrap in nonparametric frontier models, Management Science 44: 49-61. doi:10.1287/mnsc.44.1.49

Skabar, A.; Cloete, I. 2002. Neural Networks, Financial Trading and the Efficient Markets Hypothesis, Twenty-Fifth Australasian Computer Science Conference (ACSC2002), Melbourne, Australia, in Conferences on Research and Practice in Information Technology, Vol. 4.

Sosvilla-Rivero, S.; Andrada-Felix, J.; Fernandez-Rodriguez, F. 1999. Further evidence on technical analysis and profitability of foreign exchange intervention, Working Paper 99-01, FEDEA.

Tambakis, D.; Van Royen, N. A. 2002. Conditional predictability of daily exchange rates, Journal of Forecasting 21(5): 301-315. doi:10.1002/for.834

Teresiene, D. 2009. Lithuanian stock market analysis using a set of GARCH models, Journal of Business Economics and Management 10(4): 349-360. doi:10.3846/1611-1699.2009.10.349-360

Teknomo, K. 2006. Transportation Research Part F: Traffic Psychology and Behaviour. 9(1):15-27. doi:10.1016/j.trf.2005.08.006

Tribouley, K. 2004. Adaptive simultaneous confidence intervals in non-parametric estimation, Statistics & Probability Letters 69(1): 37-51. doi:10.1016/j.spl.2004.04.008

Tseng, F. M.; Yu, H. C.; Tzeng, G. H. 2002. Combining neural network model with seasonal time series ARIMA model, Technological Forecasting and Social Change 69: 71-87. doi:10.1016/S0040-1625(00)00113-X

Thombs, L.; Schucany, W. 1990. Bootstrap prediction intervals for autoregression, Journal of the American Statistical Association 85(410): 486-492.

Tsihrintzis, V. A.; John, D. L.; Tremblay, P. J. 1998. Hydrodynamic modelling of wetlands for flood detention, Water Resources Management 12: 251-269. doi:10.1023/A:1008031011773

Van Eyden, R. J. 1995. The Application of Neural Networks in the Forecasting of Share Prices, Finance and Technology Publishing. Haymarket, VA.

Weigend, A. S.; Gershenfeld, N. A. (Eds.). 1994. Time Series Prediction: Forecasting the Future and Understanding the Past. Reading: Addison-Wesley.

White, H. 1988. Economic prediction using neural networks: the case of the IBM daily stock returns, in Proc. IEEE International Conference on Neural Networks, 451-458. doi:10.1109/ICNN.1988.23959

White, H.; Racine, J. 2001. Statistical inference, the bootstrap, and neural-network modelling with application to foreign exchange rates, IEEE Transactions on Neural Networks 12: 657-671. doi:10.1109/72.935080

Xiong, S.; Li, G. 2008. Some results on the convergence of conditional distributions, Statistics & Probability Letters 78(18): 3249-3253. doi:10.1016/j.spl.2008.06.026

Yildiz, B. 2001. Finansal Basarisizligin Ongorulmesinde Yapay Sinir Agi Kullanimi ve Halka Acik Sirketlerde Ampirik Bir Uygulama (Use of Artificial Neural Networks in Prediction of Financial Failures), Journal of IMKB 5(17): 51-67.

Theodoros Koutroumanidis (1), Konstantinos Ioannou (2), Eleni Zafeiriou (3)

(1, 3) Democritus University of Thrace, Department of Agricultural Development, Pantazidou 193, Orestiada 68200, Greece

(2) Laboratory of Forest Informatics, School of Forestry and Natural Environment, Aristotle University of Thessaloniki, Box 247, 54 124 Thessaloniki, Greece

E-mails: (1) tkoutrou@agro.duth.gr; (2) ioannou.konstantinos@gmail.com;

(3) ezafir@agro.duth.gr (corresponding author)

Received 12 May 2010; accepted 17 December 2010

Theodoros KOUTROUMANIDIS was born in Komotini. He is a Professor at Democritus University of Thrace, with a PhD from the Department of Civil Engineering of the Polytechnic School of Thrace. His scientific interests relate to time series analysis (mainly ARIMA models) and fuzzy logic analysis. He has published in cited journals such as Forest Policy and Economics, Journal of Hydrology and Energy Policy, and has been a reviewer for several journals.

Konstantinos IOANNOU holds a PhD and an MSc in Forestry and specializes in the development of information systems using modern computer languages and statistical tools. In particular, he works in the field of Artificial Intelligence, creating and studying decision support systems, expert systems and artificial neural networks. He has extensive teaching experience in universities, where he teaches information technologies, geographical information systems and computer-aided design. His published work includes 13 papers in international scientific journals, most of them indexed in the Web of Science. Additionally, he has published 14 papers in Greek and international conferences on the planning and development of natural resources with the use of information systems and statistical tools.

Eleni ZAFEIRIOU was born in Thessaloniki, Greece, in 1973. She is a Lecturer at Democritus University of Thrace with a PhD in Applied Econometrics; her scientific interests relate to time series analysis, cointegration and chaotic behaviour, among others. She has published in cited journals such as Forest Policy and Economics and Journal of Hydrology, and has been a reviewer for journals including Computational Economics and Operational Research. She has been employed as a scientific partner in various projects carried out by the Aristotle University of Thessaloniki.
Table 1. Forecasted prices of the Alpha Bank stock, with the Upper Confidence Limit (UCL) and the Lower Confidence Limit (LCL) of the (1-α)100% C.I. of the forecasted prices for the last twenty observations of our sample

Observed   Forecasted    Upper Confidence    Lower Confidence
prices     prices        Limit of the 95%    Limit of the 95%
                         (C.I.) of the       (C.I.) of the
                         forecasted prices   forecasted prices

17.93      17.315485     18.42955947         16.09441543
17.89      17.50035477   18.60822296         16.27662981
17.96      17.57427025   18.67965317         16.34950793
18.11      17.18260002   18.30110979         15.96350431
17.86      17.12118912   18.24173307         15.9030273
17.87      16.95606422   18.08198738         15.74048936
17.79      17.0229435    18.14670765         15.80630672
17.86      17.0229435    18.14670765         15.80630672
17.63      16.94869041   18.07484949         15.73323393
17.61      16.91200066   18.03932714         15.69713652
18.03      16.95606422   18.08198738         15.74048936
18.14      17.3791275    18.49106801         16.15713298
17.99      17.18260002   18.30110979         15.96350431
17.84      16.76817131   17.89993477         15.55569184
17.74      16.80368614   17.93437803         15.59060836
17.53      16.5951004    17.73177969         15.38562953
17.66      16.73997498   17.872576           15.52797508
17.66      16.73997498   17.872576           15.52797508
17.91      17.12118912   18.24173307         15.9030273
17.66      16.84669304   17.97606468         15.63289917

Table 2. The absolute values of the differences (Observed - Forecasted), (Forecasted - UCL), (Forecasted - LCL) for all the forecasted prices

Absolute                  Absolute      Absolute
(Observed - Forecasted)   (For - UCL)   (For - LCL)

0.614515                 1.1140745     1.22106957
0.389645                 1.1078682     1.22372496
0.38573                  1.1053829     1.22476232
0.9274                   1.1185098     1.21909571
0.738811                 1.120544      1.21816182
0.913936                 1.1259232     1.21557486
0.767056                 1.1237642     1.21663678
0.837056                 1.1237642     1.21663678
0.68131                  1.1261591     1.21545648
0.697999                 1.1273265     1.21486414
1.073936                 1.1259232     1.21557486
0.760873                 1.1119405     1.22199452
0.8074                   1.1185098     1.21909571
1.071829                 1.1317635     1.21247947
0.936314                 1.1306919     1.21307778
0.9349                   1.1366793     1.20947087
0.920025                 1.132601      1.2119999
0.920025                 1.132601      1.2119999
0.788811                 1.120544      1.21816182
0.813307                 1.1293716     1.21379387