# Long Memory in the Volatility of Indian Financial Market

An Empirical Analysis

Professorial Dissertation, 2014, 109 pages

## Excerpt

## Table of Contents

List of Tables

List of Figures

Chapter 1: Introduction

1.1 Volatility and long memory

1.2 Structure of the book

Chapter 2: Literature Review

2.1. Long range dependence in the financial time series

Chapter 3: Long memory tests

3.1. Long memory in a financial time series

3.2. R/S analysis

3.3. Modified R/S analysis (R/S-AL)

3.4. Detrending moving average analysis (DMA)

3.5. Generalized Hurst exponent

3.6. Lo’s modified R/S analysis

3.7. Detrended fluctuation analysis (DFA)

3.8. Local Whittle method

3.9. Exact Local Whittle test

3.10. Discrete wavelet transform

3.11. Aggregated variance method

3.12. Geweke and Porter-Hudak (GPH) (1983) test

3.13. The autoregressive fractionally integrated moving average (ARFIMA) model

3.14. The fractionally integrated generalized autoregressive conditional heteroskedasticity (FIGARCH) model

3.15. The fractionally integrated exponential generalized autoregressive conditional heteroskedasticity (FIEGARCH) model

3.16. The fractionally integrated asymmetric power autoregressive conditional heteroskedasticity (FIAPARCH) model

Chapter 4: Long memory in the volatility of the Indian stock market

4.1. Abstract

4.2. Data description and computational details

4.3. Empirical results

4.3.1. Evidence from semi-parametric long memory test

4.3.2. Evidence from the FIGARCH model

4.4. Conclusion

Chapter 5: Long memory in the volatility of the Indian exchange rates

5.1. Abstract

5.2. Monte Carlo experiment

5.3. Data description and computational details

5.4. Empirical Results

5.4.1. Evidence from aggregated variance method

5.4.2. Evidence from semi-parametric (Geweke and Porter-Hudak (GPH) (1983)) long memory test

5.5. Conclusion

Chapter 6: Asymmetry and long memory in the volatility of the Indian banking sector

6.1. Abstract

6.2. Data and computational details

6.3. Empirical results

6.3.1. Long memory in absolute daily returns and squared daily returns

6.3.2. Results of GARCH family models across the periods

6.3.3. Impact of the sub-prime crisis on the volatility of the CNX Bank Nifty index

6.3.4. Asymmetric long memory characteristics in volatility

6.4. Conclusion

References

## List of Tables

Descriptive Statistics of Stock Returns

Results of Local Whittle test for estimation of parameter d

Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH models for S&P CNX Nifty under Gaussian distribution

Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH models for CNX 100 under Gaussian distribution

Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH models for CNX 500 under Gaussian distribution

Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH models for CNX Nifty Junior under Gaussian distribution

Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH models for CNX Midcap under Gaussian distribution

Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH models for CNX Smallcap under Gaussian distribution

Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH models for S&P CNX Nifty under Student-t distribution

Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH models for CNX 100 under Student-t distribution

Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH models for CNX 500 under Student-t distribution

Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH models for CNX Nifty Junior under Student-t distribution

Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH models for CNX Midcap under Student-t distribution

Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH models for CNX Smallcap under Student-t distribution

Standard deviation and mean absolute error for all the tests for varying sample size

Descriptive Statistics of INR exchange rates

Results of Aggregate Variance method test for estimation of Hurst exponent H

Results of GPH test for estimation of Hurst exponent H

Descriptive statistics of stock returns

The Hurst exponent for the sub-sample periods for various volatility proxies

Ranking the market efficiency during the sub-periods and the overall period based on the estimated Hurst exponents

Estimation results and diagnostics for the pre-crisis period

Estimation results and diagnostics for the crisis period

Estimation results and diagnostics for the post-crisis period

Estimation results and diagnostics for the whole period

Estimation results for the FIGARCH model

Estimation results for the FIAPARCH model

## List of Figures

Time plots of returns and prices data series for all the indices

ACF (Autocorrelation function) plots for all the indices. Horizontal dashed lines represent [figure not included in this excerpt]

Plots of the mean value and 95% confidence band (2.5% and 97.5% quantiles) of the estimated Hurst exponents for the Gaussian white noise and the fractional Gaussian noise (for H = 0.4, 0.5 and 0.6) for aggregated variance method (AVM) and Geweke and Porter-Hudak (GPH) approaches

Comparison of the estimated power-law exponents for AVM and GPH approaches with ideal (H_in = H_out) values

Price and Return plots for all Indian exchange rates

ACF (Autocorrelation function) plots for all the exchange rates. Horizontal dashed lines represent [figure not included in this excerpt]

Time plots of returns and prices data series

## Chapter 1: Introduction

### 1.1 Volatility and long memory

Volatility is considered an important ingredient of quantitative finance, and a plethora of literature exists on estimating, modeling and forecasting it. If we want to estimate daily volatility using daily closing prices, the most widely used estimators are the demeaned squared daily returns and the demeaned absolute daily returns, but these estimates of volatility are very noisy, inefficient and biased. Another way of estimating volatility more precisely is to use intraday high frequency data. However, in many cases high frequency data is not available at all, or is available only over shorter intervals. High frequency data is generally very expensive and requires considerable computational resources for analysis; it also suffers from market microstructure issues, which makes volatility estimation using such data highly complex. Several studies have highlighted the importance of volatility estimators that utilize the opening, high, low and closing prices of an asset, because these give rise to much more efficient estimates of volatility than those based on conventional return data. The opening, high, low and closing prices are also readily available for most tradable assets and indices in financial markets and potentially contain more information for estimating volatility than the close-to-close return data that is conventionally used.

The Indian stock market has grown rapidly and significantly in the last decade and provides enormous opportunities to investors seeking high returns. The Indian equity markets now have a global presence and have attracted the attention of global investors. Hence, it is essential to study the behavior of the volatility of returns from the Indian stock market. Long memory in the volatility of stock returns is one of the important stylized facts to consider, and it has triggered the interest of researchers, market participants, practitioners and regulators seeking to unravel the behavior of the equity markets. Being an emerging market, the Indian markets are characterized by high volatility, thin trading, high returns and various frictions. High volatility in returns, in one sense, highlights the vulnerability of the markets and hence of the respective economy.

The fluctuation in exchange rates can significantly impact the returns of an asset in foreign currency. The Indian exchange rate market has grown significantly in the last decade and provides enormous opportunities to investors. Like in other emerging markets, foreign investors also face higher risk when investing in India. So, to earn higher returns, it is essential to analyze the behavior of the volatility of Indian exchange rates relative to liquid currencies like US dollar, GBP, Euro and Japanese yen to develop meaningful investment and trading strategies and to mitigate the associated risks. The Indian banking sector has also experienced significant growth in the last decade and has become an important investment target, by providing enormous investment opportunities to investors and portfolio managers. Hence, it is essential to study the behavior of the volatility of returns from the Indian banking sector.

Volatility is an important input to various finance applications, including portfolio selection and allocation, derivatives pricing, futures hedging, risk management, the implementation of trading strategies, asset pricing and asset allocation. It is to be noted that the volatility of a market is not directly observable; hence the literature on volatility is devoted to procedures for extracting volatility from observable data. This has resulted in the development of various volatility estimators and models to measure and forecast volatility.

Market inefficiency refers to the fact that the market does not react immediately as new information flows in but responds to it gradually over a period of time.

The violation of the efficient market hypothesis supports the presence of persistence (long memory) or anti-persistence (mean reversion) in the stock market. In economics and finance, whether or not asset prices display long-range dependence remains an important area of research because of its importance for capital market theories. The analysis related to the long memory property is realized through the estimation of the power-law scaling exponent or the Hurst exponent. The term power-law scaling exponent has its origin in physics (Hurst, 1951) but also finds application in financial markets (Mandelbrot, 1971, 1997). The subject of detecting long memory in a given time series was first studied by Hurst (1951), an English hydrologist, who proposed the concept of the Hurst exponent based on Einstein's contributions to Brownian motion in physics, to deal with problems related to reservoir control near the Nile river dam. Scaling exponents have characteristics that reflect facts having a bearing on market efficiency. The presence of long memory in the evolution of asset prices describes the higher-order correlation structure in the series and supports the possibility of predicting its behaviour in a market setting.

The study of long-range dependence in financial time series has a long history and has remained an active topic of research in economics and finance (Mandelbrot (1971), Greene and Fielitz (1977), Cutland et al. (1995), Baillie et al. (1996)). The analysis related to the long memory property can be realized through the estimation of the fractional integration parameter or the Hurst exponent. It has been observed that the squared return, the absolute return and the logarithm of the squared return of financial assets or exchange rates exhibit serial correlations that show hyperbolic decay similar to that of an I(d) process (Taylor (1986)). This persistence has a major impact on the future volatility of stock markets, exchange rates and the banking sector under the influence of shocks.

Asymmetric long memory in the volatility of stock returns is an important area of research. The asymmetric response of volatility to news means that falling prices result in greater volatility than rising prices of the same magnitude. This phenomenon is known as the leverage effect (Black, 1976 and Nelson, 1991). After the pioneering work of Black (1976), many studies have tried to examine the source of the asymmetry of volatility. The literature offers two possible explanations for the asymmetry in volatility, the first being the operating and financial leverage effect (Black, 1976) and the second being the volatility feedback effect (Bekaert and Wu, 2000). The EGARCH model (Nelson, 1991), GJR-GARCH model (Glosten, Jagannathan and Runkle, 1993) and APARCH model (Ding, Granger and Engle, 1993) are popular asymmetric GARCH class models.

A variety of measures of long-range dependence are used in the finance literature. In the time domain, long memory of a financial time series is associated with a hyperbolic decay of the autocovariance function. In the frequency domain, on the other hand, the presence of long memory is highlighted by a spectral density function that approaches infinity near the zero frequency; in other words, such series display power at low frequencies (Lo, 1991; Di Sario et al., 2009). Such developments in the literature have led researchers to develop stochastic models that can capture long memory characteristics of financial time series, such as the fractionally integrated I(d) time series models introduced to economics and finance by Granger (1980), Granger and Joyeux (1980) and Hosking (1981). The commonly used measure of long memory is the Hurst exponent (H), or self-similarity parameter, a dimensionless parameter for which diverse estimation methodologies exist. The concept of the Hurst exponent finds applications in many research fields, including financial studies, due to the groundbreaking work of Mandelbrot (1963, 1997) and Peters (1991, 1994). The Hurst exponent lies in the range 0 < H < 1. If the Hurst exponent is 0.5, the process is said to follow a random walk. When the Hurst exponent is greater than 0.5, it suggests positive long-range autocorrelation in the return series, or persistence in the stock price series. On the other hand, when the Hurst exponent is smaller than 0.5, it suggests negative autocorrelation in the returns, or mean reversion in the stock price series. The second measure, d, is the fractional integration parameter, which can be estimated by fitting an ARFIMA(p,d,q) model on a volatility series or by applying fractionally integrated GARCH class models on the log-differenced series.

As highlighted by fractional integration theory, the fractional difference parameter need not be an integer value (for example, 0 or 1) but may take a fractional value (see Baillie, 1996). The fractional differencing parameter indicates the order of integration of the financial time series. A fractionally integrated time series differs from both stationary and unit-root processes in that, for d ∈ (0, 0.5), the process is persistent (i.e., it reflects long memory) yet mean reverting. When d ≥ 0.5 the time series is considered non-stationary, and when d ∈ (−0.5, 0) the time series is considered anti-persistent. In this book, we highlight the popular approaches in the asset pricing literature to examine long memory in volatility.
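To make the role of the fractional difference parameter concrete, the operator (1 − L)^d can be expanded into an infinite series of lag weights. The sketch below is a minimal illustration in Python with numpy (the function name is ours); it computes the weights by the standard recursion and shows that for integer d the weights cut off, whereas for fractional d they decay hyperbolically:

```python
import numpy as np

def fracdiff_weights(d, n_terms):
    """Weights of the binomial expansion of (1 - L)^d.

    Uses the standard recursion pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k.
    For integer d the weights cut off after d lags; for fractional d they
    decay hyperbolically (in magnitude roughly like k**(-1 - d)), which is
    the long memory signature discussed in the text.
    """
    w = np.empty(n_terms)
    w[0] = 1.0
    for k in range(1, n_terms):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

# d = 1 reproduces ordinary first differencing: weights 1, -1, 0, 0, ...
print(fracdiff_weights(1.0, 4))
# d = 0.4 spreads small, slowly shrinking weights over every lag
print(fracdiff_weights(0.4, 6))
```

For d = 1 the weights reduce to ordinary first differencing, while for d = 0.4 every lag receives a small, slowly shrinking weight; this slow decay is what makes a fractionally integrated process persistent yet mean reverting.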

The long memory characteristics of financial time series are widely studied and have implications for various economics and finance theories. The most important financial implication relates to the violation of the weak form of market efficiency, which encourages traders, investors and portfolio managers to develop models for making predictions and to construct and implement speculative trading and investment strategies. In an efficient market, the price of an asset should follow a random walk process in which each price change is unaffected by lagged price changes and has no memory. Random walks in stock prices and exchange rates present important challenges to market participants and analysts; if the random walk model holds, then prediction by analysts is like that of astrologers. Traditional random walk tests (based on studies before the 1980s) of asset returns were primarily based on the serial correlation of price changes. The pioneering works of Kendall and Hill (1953) and Fama (1965) present strong and voluminous evidence in favour of the random walk hypothesis, supporting the weak-form efficient market hypothesis. Past studies have provided evidence in support of the hypothesis that nominal exchange rate series follow random walks. See, for instance, Bachelier (1900), Cootner (1964), Samuelson (1965), Malkiel and Fama (1970), Giddy and Dufey (1975), Roll (1979), Meese and Singleton (1982), Adler and Lehmann (1983), Darby (1983), Hsieh (1988) and Baillie and Bollerslev (1989). However, Huizinga (1987) and Grilli and Kaminsky (1991) find evidence against the random walk hypothesis for exchange rates. It is to be noted that pricing derivative securities with a random walk or Brownian motion process may not be appropriate if the true underlying stochastic process exhibits the long memory property. Hence, it is important to consider the long memory property when exploring the characteristics of financial derivatives, which has implications for derivative market participants, risk managers and asset allocation decision makers.

### 1.2 Structure of the book

The book comprises six chapters. The outlines of the chapters are as follows:

Chapter 1 highlights the importance of volatility in various finance applications. It also provides background information about market efficiency and the long memory property in financial time series.

Chapter 2 highlights the development in the literature related to the long memory property of the financial time series and estimation of long memory parameters.

In Chapter 3, we provide various approaches to estimate the long memory parameter in both the time and the frequency domain.

In Chapter 4, we examine the long memory characteristics in the volatility of the Indian stock market. We apply both a semi-parametric method (the Local Whittle (LW) estimator) and a parametric method (GARCH class models) to accomplish our goal. Modeling long memory in the volatility of the Indian stock market has been a neglected area of research, as very few studies in the literature focus on the long memory property of volatility in the Indian stock market.

In Chapter 5, we examine the long memory characteristics in the volatility of the Indian exchange rates relative to the US dollar, GBP, Euro and Japanese yen. We apply the aggregated variance method (AVM) and the Geweke and Porter-Hudak (GPH) (1983) analysis (a semi-parametric technique) to accomplish our goals. Modeling long memory in the volatility of the Indian exchange rates has been a neglected area of research, as very few studies in the literature focus on the long memory property of volatility in the Indian exchange rates. Hence, our study can be considered a contribution on this topic, involving the analysis of the main proxies of volatility. This chapter also investigates the accuracy of the AVM and GPH approaches by means of Monte Carlo simulation experiments.

In Chapter 6, we examine the asymmetric long memory characteristics in the volatility of the Indian banking sector and, in particular, the CNX Bank Nifty index. We apply the Detrended Fluctuation Analysis (DFA) technique and the GARCH family of models (namely, GARCH (1,1), EGARCH (1,1), GJR-GARCH (1,1), FIGARCH (1,d,1) and FIAPARCH (1,d,1)) to accomplish our goal. Modeling asymmetric long memory in the volatility of the Indian stock market has been a neglected area of research, as very few studies in the literature focus on the long memory property of volatility in the Indian stock market. Hence, our study can be considered a contribution on this topic, involving the analysis of the main proxies of volatility and the asymmetric behavior of the conditional volatility.

## Chapter 2: Literature Review

### 2.1. Long range dependence in the financial time series

The study of long-range dependence in financial time series has a long history and has remained an active topic of research in economics and finance. See, for instance, Mandelbrot (1971), Greene and Fielitz (1977) and Cutland, Kopp, and Willinger (1995). Mandelbrot (1972) finds that R/S analysis shows superior properties over autocorrelation and variance analysis (because it can work with distributions of infinite variance) and over spectral analysis (because it can detect non-periodic cycles). Greene and Fielitz (1977) utilize the Hurst rescaled-range (R/S) method and provide evidence in support of long memory in daily stock return series. The development of the log periodogram regression estimator by Geweke and Porter-Hudak (1983), based on the order of integration parameter d in the ARFIMA model of Granger and Joyeux (1980) and Hosking (1981), triggered the literature on fractionally integrated models. Diebold and Rudebusch (1989) explore the long memory characteristics of US real GNP data. Lo (1991) finds that the classical R/S test used by Mandelbrot and by Greene and Fielitz suffers from the drawback that it is unable to distinguish between long memory and short-range dependence. Lo (1991) therefore proposes a modified R/S statistic which can distinguish between short-term dependence and long memory, and finds that daily stock returns do not show long-range dependence. Cheung and Lai (1995) analyze data from Austria, Italy, Japan and Spain and detect long memory in these markets; this finding is invariant to the choice of estimation method. In particular, results from both the modified rescaled range and the spectral regression method used to model an ARFIMA process indicate the presence of long memory dynamics in the data.
Willinger, Taqqu, and Teverovsky (1999) empirically find that Lo's modified R/S test, which leads to acceptance of the null hypothesis of no long-range dependence for CRSP (Center for Research in Security Prices) data, is less conclusive than it appears. This is because of the conservative nature of the test statistic in rejecting the null hypothesis of no long-range dependence, attributing what is found in the data to short-term dependence instead. Peters (1991) uses the R/S approach to study the long memory characteristics of daily exchange rate data for US dollars, Japanese yen, British pounds, Euros and Singapore dollars, and finds evidence supporting the presence of long memory properties in exchange rates. Baillie, Chung, and Tieslau (1996) investigate the long-range dependence properties of inflation time series and find positive results. Corazza and Malliaris (2002) carry out a study on foreign currency markets and find evidence of long memory. They also find that the Hurst exponent does not remain fixed but changes dynamically with time, and provide evidence that foreign currency returns follow either a fractional Brownian motion or a Pareto-Levy stable distribution. Cajueiro and Tabak (2004) use a rolling sample approach to calculate Hurst exponents over the period October 1992 to October 1996 and provide evidence of long-range dependence in Asian markets. Carbone, Castelli, and Stanley (2004) propose the detrending moving average (DMA) algorithm to estimate the Hurst exponent, which does not require any assumption regarding the underlying stochastic process or the probability distribution function of the random variable. Di Matteo, Aste, and Dacorogna (2005) study the scaling properties of daily foreign exchange rates, stock market indices and fixed income instruments using the generalized Hurst exponent approach and find that the scaling exponents can be used to differentiate markets by their stage of development. Cajueiro and Tabak (2005) study the possible sources of long-range dependence in returns of Brazilian stocks and find that firm-specific variables can partially explain long-range dependence measures such as the Hurst exponent.
Souza, Tabak, and Cajueiro (2008) study the evolution of long memory over time in the returns and volatilities of British pound futures contracts using the classic R/S approach, the detrended fluctuation analysis (DFA) approach and the generalized Hurst exponent (GHE) approach, and find a change in the long memory characteristics of the British pound around the time of the European financial crisis. Serletis and Rosenberg (2009) use the detrending moving average (DMA) approach to calculate the Hurst exponent and find evidence in support of anti-persistence (mean reversion) in the US stock market. They also estimate the local Hurst exponent (on non-overlapping windows of 50 observations) to examine the evolution of the efficiency characteristics of index returns over time. Kristoufek (2010) re-examines the results of Serletis and Rosenberg (2009) and finds no signs of anti-persistence in the US stock market.

After the Autoregressive Conditional Heteroskedasticity (ARCH) model and the Generalized ARCH (GARCH) model were introduced by Engle (1982) and Bollerslev (1986) respectively, numerous extensions of ARCH models have been proposed in the literature, specifying the conditional mean and conditional variance equations, which are potentially helpful in forecasting the future volatility of stock prices. Engle and Bollerslev (1986) propose the Integrated GARCH (IGARCH) model to capture the impact of a shock on future volatility over an infinite horizon. However, the GARCH and IGARCH models are not able to capture the long memory property of volatility satisfactorily. To deal with this shortcoming, Baillie et al. (1996) propose the fractionally integrated GARCH (FIGARCH) model to allow for fractional orders of integration I(d), where 0 < d < 1. This model estimates an intermediate process between GARCH and IGARCH. They apply the FIGARCH model to examine the persistence in Deutschmark-U.S. dollar exchange rate volatility. Vilasuso (2002) obtains exchange rate volatility forecasts using the FIGARCH model and finds that it produces significantly better volatility forecasts (for 1-day and 10-day horizons) than GARCH and IGARCH. Kang and Yoon (2006) investigate the asymmetric long memory features in the volatility of Asian stock markets. Cheong, Nor and Isa (2007) investigate the asymmetry and long memory volatility behavior of Malaysian Stock Exchange daily data over various sub-periods (pre-crisis, crisis and post-crisis) of the financial crisis between 1991 and 2006, and find mixed results.

Granger and Ding (1995) utilize the Geweke and Porter-Hudak (1983) test to examine the presence of long memory in absolute returns of the S&P 500 Index. The estimation of the long memory parameter d in a volatility series under the Geweke and Porter-Hudak test involves an ordinary linear regression of the log periodogram of the volatility series (with the proxy being the absolute return or the squared return) on the log frequency as the explanatory variable. Lobato and Velasco (2000) apply a two-step semi-parametric estimator to obtain the long memory parameter of stock market volatility and trading volume; they conduct their analysis in the frequency domain, which involves tapering the data. Assaf and Cavalcante (2005) use the modified rescaled range (R/S) statistic of Lo (1991), the rescaled variance measure of Giraitis et al. (2000), the semi-parametric estimator proposed by Robinson (1995) and the Fractionally Integrated Generalized Autoregressive Conditional Heteroskedasticity (FIGARCH) model of Baillie et al. (1996) to estimate the fractional parameter d for the Brazilian stock market. Kilic (2004) makes use of both parametric and nonparametric methods to examine the long memory characteristics in the volatility of the Istanbul Stock Exchange National 100 Index.
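The GPH regression just described can be sketched in a few lines. The following is a minimal numpy illustration (the function name is ours; no tapering or finite-sample bias correction is applied): the log periodogram at the first m ≈ T^0.5 Fourier frequencies is regressed on log(4 sin²(λ_j/2)), and d is minus the OLS slope.

```python
import numpy as np

def gph_estimate(x, power=0.5):
    """Geweke & Porter-Hudak (1983) log-periodogram estimate of d.

    Regresses the log periodogram at the first m = T**power Fourier
    frequencies on log(4 * sin^2(lambda_j / 2)); d is minus the OLS
    slope.  A bare-bones sketch: no tapering, no bias correction.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    T = len(x)
    m = int(T ** power)
    j = np.arange(1, m + 1)
    lam = 2.0 * np.pi * j / T
    # periodogram ordinates at the first m Fourier frequencies
    I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2.0 * np.pi * T)
    X = np.log(4.0 * np.sin(lam / 2.0) ** 2)
    slope = np.polyfit(X, np.log(I), 1)[0]
    return -slope

# for white noise (no long memory) the estimate should be near d = 0
rng = np.random.default_rng(42)
d_hat = gph_estimate(rng.standard_normal(4096))
```

In practice the estimate would be applied to a volatility proxy such as absolute or squared returns, and the bandwidth exponent (here T^0.5) is a tuning choice that trades bias against variance.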

Gu and Zhou (2007) apply Detrended Fluctuation Analysis (DFA), R/S analysis and modified R/S analysis to study the long memory property of the volatility of 500 stocks traded on the Shanghai Stock Exchange (SHSE) and Shenzhen Stock Exchange (SZSE) and find strong evidence in support of long memory in the volatility of the 500 stocks. Dionisio et al. (2007) analyze the behavior of volatility for various international stock market indices in the context of non-stationarity and prefer the FIGARCH model over the GARCH and IGARCH models for capturing the behavior of volatility. Bentes et al. (2008) use the FIGARCH model and entropy measures to study the long memory property of the volatility time series for the S&P 500, NASDAQ 100 and Stoxx 50 indices, comparing US and European markets, and find that both perspectives show nonlinear dynamics in the volatility time series. Oh et al. (2008) study long-term memory in the KOSPI 1-minute market index and in the exchange rates of six currencies relative to the US dollar (5-minute exchange rate data are used for the Euro, UK GBP, Japanese Yen, Singapore SGD, Switzerland CHF and Australia AUD) using DFA and the FIGARCH model. Their findings are supportive of long memory in the volatility series, which can be attributed to the volatility clustering observed in the series. Di Sario et al. (2008) utilize approaches based on wavelets and aggregated series to test for long memory in the volatility of the Istanbul Stock Exchange National 100 Index. They use absolute returns, squared returns and log squared returns as proxies of volatility and find that all volatility series display the long memory property. Kang et al. (2010) utilize two semi-parametric tests (the Geweke and Porter-Hudak (GPH) test and the Local Whittle (LW) test) and the FIGARCH model to examine the long memory property in the volatility of the Chinese stock market; they find evidence of long memory features in the volatility time series and suggest that the assumption of non-normality provides better specifications of long memory volatility processes. Fleming and Kirby (2011) apply fractionally integrated time series models to the realized volatility and trading volume of 20 firms to investigate the joint dynamics of stock trading volume and volatility, and find a strong degree of correlation between the innovations to volume and volatility. They suggest that trading volume can be used to obtain more precise estimates of daily volatility in cases where high-frequency returns are unavailable.

## Chapter 3: Long memory tests

### 3.1. Long memory in a financial time series

Both time domain and frequency domain measures are available to detect the presence of long memory in a time series. In the time domain, a hyperbolically decaying autocovariance function characterizes the presence of long memory. Suppose $x_t$ is a stationary process and $\gamma(\tau)$ is its autocovariance function at lag $\tau$. The asymptotic property of the autocovariance function is then given as:

$$\gamma(\tau) \sim C_\gamma \, \tau^{2H-2}, \qquad \tau \to \infty,$$

where $H \in (0,1)$ is a long memory parameter, called the Hurst exponent.

In the frequency domain, long memory is present when the spectral density function approaches infinity at low frequencies. Suppose $f(\lambda)$ is the spectral density function. The series $x_t$ is said to exhibit long memory if

$$f(\lambda) \sim C_f \, |\lambda|^{1-2H}, \qquad \lambda \to 0,$$

where $C_f > 0$ and $H \in (0,1)$.

There exist various approaches to test the long memory property of a time series. In this chapter, we briefly explain the most popular approaches, which include R/S analysis, modified R/S analysis, detrending moving average analysis, the generalized Hurst exponent approach, Lo's modified R/S analysis, detrended fluctuation analysis, the Local Whittle approach, the exact Local Whittle approach and the discrete wavelet transform approach.
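Among the estimators treated later in this chapter, the aggregated variance method (Section 3.11) is perhaps the simplest to illustrate: the variance of block means of a long memory series scales as m^(2H−2) in the block size m, so H can be read off an OLS fit in log-log coordinates. A minimal numpy sketch (the function name is ours), applied to Gaussian white noise, where H should come out near 0.5:

```python
import numpy as np

def hurst_avm(x, block_sizes):
    """Aggregated variance estimate of the Hurst exponent H.

    For a long memory series the variance of block means scales as
    m**(2H - 2) in the block size m, so H = 1 + slope / 2 from an OLS
    fit of log variance on log block size.
    """
    x = np.asarray(x, dtype=float)
    log_var = []
    for m in block_sizes:
        k = len(x) // m                        # number of complete blocks
        block_means = x[:k * m].reshape(k, m).mean(axis=1)
        log_var.append(np.log(block_means.var()))
    slope = np.polyfit(np.log(block_sizes), log_var, 1)[0]
    return 1.0 + slope / 2.0

# white noise: the variance of block means falls like 1/m, so H ~ 0.5
rng = np.random.default_rng(1)
H = hurst_avm(rng.standard_normal(100_000), [10, 20, 50, 100, 200])
```

The same template (compute a scale-dependent statistic, fit a power law) underlies most of the time-domain estimators listed above.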

### 3.2. R/S analysis

Mandelbrot and Wallis (1969) propose R/S analysis, building on Hurst (1951); it estimates the self-similarity and long-range dependence parameter $H$ of a time series. The procedure for the R/S analysis is as follows:

First, divide the time series (of returns) of length $L$ into $d$ subseries $Z_{j,m}$ of length $n$. For each subseries $m = 1, \ldots, d$:

1. Find the mean ($E_m$) and the standard deviation ($S_m$).

2. Next, normalize the data of the subseries by subtracting the sample mean:

$$X_{j,m} = Z_{j,m} - E_m, \qquad j = 1, \ldots, n$$

3. Find the cumulative time series:

$$Y_{j,m} = \sum_{i=1}^{j} X_{i,m}, \qquad j = 1, \ldots, n$$

4. Find the range:

$$R_m = \max_{1 \le j \le n} Y_{j,m} - \min_{1 \le j \le n} Y_{j,m}$$

5. Rescale the range:

$$R_m / S_m$$

6. Calculate the mean value of the rescaled range over all subseries of length $n$:

$$(R/S)_n = \frac{1}{d} \sum_{m=1}^{d} R_m / S_m$$

The R/S statistic asymptotically follows the relation:

$$(R/S)_n \sim c \, n^{H}$$

The value of the Hurst exponent $H$ can be estimated by running an ordinary least squares (OLS) linear regression over a sample of increasing time horizons:

$$\log (R/S)_n = \log c + H \log n + \varepsilon_n$$

Note that $H = 0.5$ for white noise. When the process is persistent (i.e., has long memory), $H > 0.5$; for an anti-persistent process (i.e., one with mean reversion), $H < 0.5$.
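The six steps above can be sketched in Python as follows (a minimal illustration; the function name `rs_hurst` and the choice of window sizes are our own):

```python
import numpy as np

def rs_hurst(x, window_sizes):
    """Classical R/S estimate of the Hurst exponent (steps 1-6 above)."""
    log_n, log_rs = [], []
    for n in window_sizes:
        d = len(x) // n                       # number of subseries of length n
        rs_vals = []
        for m in range(d):
            z = x[m * n:(m + 1) * n]
            dev = z - z.mean()                # step 2: subtract the sample mean
            y = np.cumsum(dev)                # step 3: cumulative time series
            r = y.max() - y.min()             # step 4: range
            s = z.std()                       # step 1: standard deviation S_m
            if s > 0:
                rs_vals.append(r / s)         # step 5: rescaled range
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))  # step 6: average over subseries
    # OLS regression of log (R/S)_n on log n; the slope estimates H
    H_hat, _ = np.polyfit(log_n, log_rs, 1)
    return H_hat

rng = np.random.default_rng(0)
H_wn = rs_hurst(rng.standard_normal(4096), [16, 32, 64, 128, 256])
```

For white noise the estimate comes out near 0.5, though in small samples it tends to sit somewhat above it, which is the bias the next section addresses.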

### 3.3. Modified R/S analysis (R/S-AL)

In small samples, R/S analysis can show significant deviations of the Hurst exponent estimates from 0.5 even for Gaussian white noise. To overcome this problem, Anis and Lloyd (1976) and Peters (1994) introduce a new formulation to improve the performance of R/S analysis for small $n$:

$$E(R/S)_n = \frac{n - 0.5}{n} \left( \frac{n\pi}{2} \right)^{-1/2} \sum_{i=1}^{n-1} \sqrt{\frac{n-i}{i}}$$

The Hurst exponent is calculated as 0.5 plus the slope of $(R/S)_n - E(R/S)_n$. The resulting statistic is known as R/S-AL.
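The corrected benchmark $E(R/S)_n$ can be sketched in Python as follows (our transcription of the standard small-sample formula as popularized by Peters (1994); the function name is our own):

```python
import numpy as np

def expected_rs(n):
    """Expected rescaled range of Gaussian white noise for sample size n
    (small-sample correction; our transcription of the standard formula)."""
    i = np.arange(1, n)
    return ((n - 0.5) / n) * (n * np.pi / 2) ** -0.5 * np.sum(np.sqrt((n - i) / i))

# E(R/S)_n grows roughly like sqrt(n), so its log-log slope is close to 0.5;
# subtracting it from the observed (R/S)_n removes the small-sample bias.
e64, e256 = expected_rs(64), expected_rs(256)
slope = (np.log(e256) - np.log(e64)) / (np.log(256) - np.log(64))
```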

### 3.4. Detrending moving average analysis (DMA)

In order to calculate the Hurst exponent, the detrending moving average (DMA) approach can be used. Suppose $x_t$ is a financial time series with $t = 1, \ldots, N$.

The $n$th order moving average of $x_t$ is given by:

$$\tilde{x}_n(t) = \frac{1}{n} \sum_{k=0}^{n-1} x_{t-k}$$

In computing $\tilde{x}_n(t)$, the last point of the time window of size $n$ is taken as the reference point. The series $x_t$ is detrended by subtracting $\tilde{x}_n(t)$, and the standard deviation of $x_t$ about the moving average is computed as follows:

$$\sigma_{DMA} = \sqrt{\frac{1}{N-n} \sum_{t=n}^{N} \left( x_t - \tilde{x}_n(t) \right)^2}$$

$\sigma_{DMA}$ is computed for different values of the moving average window $n$ over the interval $(n, N)$. The Hurst exponent is obtained as the slope of a log-log plot of $\sigma_{DMA}$ against $n$. Arianos and Carbone (2007) show the power law behaviour:

$$\sigma_{DMA} \sim n^{H}$$

where $H$ is the Hurst exponent and $0 < H < 1$.

Equation (6) can also be written as:

$$\log \sigma_{DMA} = \text{const} + H \log n$$

This linear relationship between $\sigma_{DMA}$ and $n$ on a log-log plot supports the presence of power law (fractal) scaling, which indicates self-similarity in the series. This means that fluctuations over small time scales are related to fluctuations over larger time scales. In particular, the Hurst exponent can be used to identify the long memory properties of the time series.
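A minimal Python sketch of the DMA estimator (the backward moving average uses the last point of each window as the reference point, as described above; function and variable names are our own):

```python
import numpy as np

def dma_hurst(x, windows):
    """Estimate H as the log-log slope of sigma_DMA against n."""
    log_n, log_sigma = [], []
    for n in windows:
        # Backward moving average: element k of `ma` averages x[k .. k+n-1],
        # so aligning it with x[n-1:] makes each window end at its reference point.
        ma = np.convolve(x, np.ones(n) / n, mode="valid")
        resid = x[n - 1:] - ma                 # detrended series
        log_n.append(np.log(n))
        log_sigma.append(0.5 * np.log(np.mean(resid ** 2)))
    H_hat, _ = np.polyfit(log_n, log_sigma, 1)
    return H_hat

# Sanity check on a random walk (an integrated white noise path), for
# which the scaling exponent should be close to 0.5.
rng = np.random.default_rng(1)
walk = np.cumsum(rng.standard_normal(8192))
H_dma = dma_hurst(walk, [8, 16, 32, 64, 128])
```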

### 3.5. Generalized Hurst exponent

Di Matteo and Aste (2002) propose the generalized Hurst exponent (GHE) approach for financial time series, which is based on the scaling of the $q$th order moments of the distribution of the increments of the series. The GHE is a generalization of the approach proposed by Hurst (1951). Suppose $x_t$ is the time series of logarithmic exchange rates. The $q$th order moment of the distribution of the increments (with $t = \nu, 2\nu, \ldots, T$) of the time series $x_t$ is given by:

$$K_q(\tau) = \frac{\left\langle \left| x_{t+\tau} - x_t \right|^q \right\rangle}{\left\langle \left| x_t \right|^q \right\rangle}$$

where $\nu$ is the time resolution, which is kept constant (here $\nu = 1$ day), and $\tau$ is the time interval, which varies between $\nu$ and $\tau_{max}$.

The generalized Hurst exponent $H(q)$ is defined from the scaling behaviour of $K_q(\tau)$ (Barabási & Vicsek, 1991), which can be assumed to follow the relation

$$K_q(\tau) \sim \left( \frac{\tau}{\nu} \right)^{q H(q)}$$

For $q = 1$, the generalized Hurst exponent approach yields the exponent $H(1)$, which describes the scaling behaviour of the absolute values of the increments and is similar to the Hurst exponent obtained from R/S analysis. The scaling exponent for $q = 2$ is associated with the scaling of the autocorrelation function and is related to the spectral density function (Flandrin, 1989).
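A minimal Python sketch of the GHE estimator (the denominator $\langle |x_t|^q \rangle$ is dropped here, since it does not depend on $\tau$ and therefore does not affect the log-log slope; names and the Brownian-motion sanity check are our own):

```python
import numpy as np

def generalized_hurst(x, q=1, tau_max=19):
    """H(q) from the scaling of the q-th moment of the increments:
    K_q(tau) ~ tau^(q H(q)), estimated by an OLS fit in log-log scale."""
    taus = np.arange(1, tau_max + 1)
    log_k = [np.log(np.mean(np.abs(x[tau:] - x[:-tau]) ** q)) for tau in taus]
    slope, _ = np.polyfit(np.log(taus), log_k, 1)
    return slope / q

# Sanity check: for a Brownian-motion proxy the increments scale with
# exponent 0.5, so H(1) should be close to 0.5.
rng = np.random.default_rng(2)
bm = np.cumsum(rng.standard_normal(20000))
H1 = generalized_hurst(bm, q=1)
```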

### 3.6. Lo’s modified R/S analysis

Lo (1989) proposes a modified R/S statistic which can be applied to distinguish between long-range and short-range dependence in a series. Suppose $y_t$ is the log difference time series with $t = 1, \ldots, n$; the range $R_n$ is then defined as:

$$R_n = \max_{1 \le k \le n} \sum_{j=1}^{k} \left( y_j - \bar{y} \right) - \min_{1 \le k \le n} \sum_{j=1}^{k} \left( y_j - \bar{y} \right)$$

where $\bar{y}$ is the sample estimator of the population mean. The range is usually rescaled by the sample standard deviation $\sigma$ to form the R/S statistic. Lo (1989) finds that the distributional properties of the rescaled range are affected by the presence of short-range dependence. He therefore proposes a modified R/S statistic $Q_n$ whose statistical behaviour is invariant over a general class of short memory processes but deviates for long memory processes. The modified R/S statistic $Q_n$ is defined as:

$$Q_n = \frac{R_n}{\hat{\sigma}_n(q)}, \qquad \hat{\sigma}_n^2(q) = \hat{\sigma}_y^2 + 2 \sum_{j=1}^{q} \omega_j(q) \, \hat{\gamma}_j, \qquad \omega_j(q) = 1 - \frac{j}{q+1}$$

where $\hat{\sigma}_y^2$ and $\hat{\gamma}_j$ are the sample variance and autocovariance estimators of $y$, and $\omega_j(q)$ are the weights suggested by Newey and West (1986).

Under the null hypothesis of no long-term memory, the distribution of the random variable

$$V_n(q) = \frac{Q_n}{\sqrt{n}}$$

converges to that of the range of a Brownian bridge. $V_n(q)$ makes explicit the dependence of the modified R/S statistic on the truncation lag $q$. We compute $V_n(q)$ for several different values of $q$ to check the sensitivity of the statistic to the lag length. The null hypothesis is accepted at the 99% confidence level if $V_n(q)$ falls in the interval (0.721, 2.098).
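The statistic can be sketched in Python as follows (our own implementation, using the standard Newey-West/Bartlett weights $\omega_j(q) = 1 - j/(q+1)$; function and variable names are our own):

```python
import numpy as np

def lo_v_statistic(y, q):
    """Lo's modified R/S statistic V_n(q) = Q_n / sqrt(n), with the
    short-run variance corrected by q weighted autocovariances."""
    n = len(y)
    dev = y - y.mean()
    partial = np.cumsum(dev)
    R = partial.max() - partial.min()                 # range of partial sums
    var = np.mean(dev ** 2)                           # sample variance
    for j in range(1, q + 1):
        w = 1.0 - j / (q + 1.0)                       # Newey-West (Bartlett) weight
        var += 2.0 * w * np.mean(dev[j:] * dev[:-j])  # lag-j autocovariance
    return R / np.sqrt(var) / np.sqrt(n)

# For white noise, V_n(q) should fall inside Lo's 99% acceptance
# interval (0.721, 2.098) with high probability.
rng = np.random.default_rng(3)
v = lo_v_statistic(rng.standard_normal(5000), q=5)
```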

### 3.7. Detrended fluctuation analysis (DFA)

Peng et al. (1994) propose detrended fluctuation analysis (DFA) to examine the long-range dependence property of a time series. Let $x(t)$ be the integrated financial time series of logarithmic returns, i.e. $x(t) = \ln P(t)$, with $t = 1, \ldots, N$. In this method, the integrated time series is divided into blocks of the same length $n$. Ordinary least squares is used to estimate the trend in each block, and the fitted OLS line in each block is denoted $x_n(t)$. The trend is removed by subtracting $x_n(t)$ from the integrated series $x(t)$ in each block. This procedure is applied to each block, and the fluctuation magnitude is defined as

$$\sigma_{DFA}(n) = \sqrt{\frac{1}{N} \sum_{t=1}^{N} \left( x(t) - x_n(t) \right)^2}$$

This step is repeated for every scale $n$, and to estimate the Hurst exponent the following scaling relationship is used:

$$\sigma_{DFA}(n) \sim n^{H}$$

where $H$ is the Hurst exponent and $0 < H < 1$.

Equation (12) can also be written as:

$$\log \sigma_{DFA}(n) = \text{const} + H \log n$$

This linear relationship between $\sigma_{DFA}$ and $n$ on a log-log plot supports the presence of power law (fractal) scaling, which indicates self-similarity in the series. This means that fluctuations over small time scales are related to fluctuations over larger time scales. The Hurst exponent can be used to identify the long memory properties of the time series.
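A minimal Python sketch of DFA (OLS detrending within non-overlapping blocks, as described above; names and the random-walk sanity check are our own):

```python
import numpy as np

def dfa_hurst(x, windows):
    """Estimate H as the log-log slope of the DFA fluctuation against n."""
    log_n, log_f = [], []
    for n in windows:
        d = len(x) // n
        t = np.arange(n)
        sq = []
        for m in range(d):
            block = x[m * n:(m + 1) * n]
            a, b = np.polyfit(t, block, 1)       # OLS trend line in the block
            sq.append(np.mean((block - (a * t + b)) ** 2))
        log_n.append(np.log(n))
        log_f.append(0.5 * np.log(np.mean(sq)))  # log of the fluctuation magnitude
    H_hat, _ = np.polyfit(log_n, log_f, 1)
    return H_hat

# Sanity check: for a random walk (integrated white noise) the DFA
# exponent should be close to 0.5.
rng = np.random.default_rng(4)
walk = np.cumsum(rng.standard_normal(8192))
H_dfa = dfa_hurst(walk, [16, 32, 64, 128, 256])
```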

We consider two extensions of detrended fluctuation analysis (DFA) to deal with any short-range dependence in the series:

1) Apply DFA to shuffled data: divide the series into non-overlapping blocks of 5 observations, shuffle the data within each block (a permutation of the data in each block), and apply the DFA approach. The goal of shuffling the series within each block is to destroy any autocorrelation structure inside these blocks (Cajueiro and Tabak, 2004, 2005).

2) Apply DFA to aggregated data: here also divide the series into non-overlapping blocks of 5 observations, take the average of each block, and apply the DFA approach to calculate the Hurst exponent (Cajueiro and Tabak, 2005).
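The two preprocessing steps can be sketched in Python as follows (block size 5 as in the text; function names are our own, and either output would then be passed to a DFA estimator):

```python
import numpy as np

def block_shuffle(x, block=5, seed=0):
    """Permute the data inside each non-overlapping block of `block`
    observations, destroying autocorrelation within blocks (extension 1)."""
    rng = np.random.default_rng(seed)
    y = np.array(x[:len(x) // block * block], dtype=float)
    for m in range(0, len(y), block):
        rng.shuffle(y[m:m + block])            # in-place shuffle of one block
    return y

def block_aggregate(x, block=5):
    """Average each non-overlapping block of `block` observations (extension 2)."""
    y = np.asarray(x[:len(x) // block * block], dtype=float)
    return y.reshape(-1, block).mean(axis=1)

data = np.arange(20, dtype=float)
shuffled = block_shuffle(data)
aggregated = block_aggregate(data)
```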

**[...]**