I want to perform variance ratio tests (Lo-MacKinlay, Chow-Denning), but I am having trouble running the commands.
I have a price index covering 1957 to 2007. Should I run the variance ratio tests on the level series or on the return series?
How do you set kvec? It is a vector of the lags at which you want to run the test, right?
So here is my output:
> rcorr
[1] 0.0000 -0.1077 0.4103 -0.0347 0.1136 0.0286 0.0104 0.0104 0.1915
[10] -0.0025 0.0665 0.2127 0.0116 -0.1288 0.1640 0.3089 0.2098 -0.1071
[19] -0.2079 -0.1082 0.0022 0.1419 0.0641 -0.0082 -0.1163 -0.1731 0.0260
[28] 0.0468 0.0882 0.2640 0.3946 0.2094 0.2754 0.0623 -0.3696 -0.1095
[37] -0.1463 0.0118 0.0152 -0.0103 0.0223 0.0379 0.0580 -0.0091 -0.0510
[46] 0.0765 0.0984 0.1250 0.0519 0.1623 0.2552
> kvec<--c(2,5,10)
> Lo.Mac(rcorr,kvec)
Error in y[index] : only 0's may be mixed with negative subscripts
Why do I get this error?
It is the same error as in your other question that I just answered:
kvec<--c(2,5,10)
is the same as
kvec <- -c(2,5,10)
ie
kvec <- -1 * c(2,5,10)
Remove the second dash.
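In R, negative subscripts mean "drop these elements", which is why a negated kvec triggers that subscript error inside the function. With the stray dash removed, a minimal working call looks like this (a sketch assuming rcorr is your return series and that Lo.Mac() and Chow.Denning() come from the vrtest package, as in your output):

```r
library(vrtest)            # provides Lo.Mac() and Chow.Denning()

kvec <- c(2, 5, 10)        # holding periods (lags) for the variance ratios
kvec                       # now positive, as the tests expect
# [1]  2  5 10

Lo.Mac(rcorr, kvec)        # Lo-MacKinlay test statistics, one per lag in kvec
Chow.Denning(rcorr, kvec)  # Chow-Denning joint test over the same set of lags
```

Note that `kvec<--c(2,5,10)` only parses the way it does because R reads `<--` as the assignment arrow `<-` followed by unary minus; writing the assignment with spaces, `kvec <- c(2, 5, 10)`, makes such slips easy to spot.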