Interview Questions IV

Another question about correlations today. This time I thought we could look at a simple type of random walk, in which the distance travelled at each step is either backwards or forwards with a random length, and see how to deal with the issues that come up.

Let’s say at each step we move a distance along the x-axis that is distributed uniformly between -1 and +1, and, importantly, that each step is independent of the others. So, after N steps the total distance travelled, L_N, is

L_N = \sum_{i=1}^{N} x_i\ ; \qquad x_i\sim {\mathbb U}[-1,+1]

where \inline x_i is the i-th step length.

Calculate:

i) the expected distance travelled after N steps

ii) the standard deviation of the distance travelled after N steps

iii) the autocorrelation between the distance travelled at N steps and the distance travelled at N+n steps

Since we’re dealing with uniform variables, it makes sense to start by calculating the expectation and variance of a single step. The expectation is trivially 0, while the variance (remembering that the density of \inline {\mathbb U}[-1,+1] is \inline {1\over 2}) is

\begin{align*} {\rm Var}[x_i] & = {\mathbb E}[x_i^2] - {\mathbb E}[x_i]^2\\ & = \int_{-1}^{+1} x^2 \cdot {1\over 2}\, dx - 0 \\ & = {1\over 3} \end{align*}
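As a quick sanity check, we can estimate these two numbers by Monte Carlo — a sketch in plain Python (the sample size of one million is an arbitrary choice):

```python
import random

# Monte Carlo estimate of the mean and variance of a single U[-1,1] step
random.seed(42)
n = 1_000_000
samples = [random.uniform(-1.0, 1.0) for _ in range(n)]
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

print(f"mean     ~ {mean:.4f}  (theory: 0)")
print(f"variance ~ {var:.4f}  (theory: 1/3 = 0.3333)")
```

This agrees with the general result that the variance of a uniform variable on [a, b] is (b-a)^2/12, which here gives 4/12 = 1/3.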

We’ll also make use of the independence of the individual steps at several points: recall that for independent variables x and y, \inline {\mathbb E}[xy] = {\mathbb E}[x]\, {\mathbb E}[y].

 

i) This one is fairly straightforward. Expectation is a linear operator, so we can take it inside the sum. We know the expectation of an individual variate is 0, so the expectation of the sum is just N times this

\begin{align*} {\mathbb E}\Big[\sum_{i=1}^N x_i \Big] & = \sum_{i=1}^N {\mathbb E}[ x_i ]\\ & = N\cdot 0\\ & = 0 \end{align*}

 

ii) The standard deviation is the square root of the variance, which is the expectation of the square minus the square of the expectation. We know the second of these is 0, so we only need to calculate the first,

\begin{align*} {\rm Var}\Big[\sum_{i=1}^N x_i \Big] & = {\mathbb E}\Big[\Big(\sum_{i=1}^N x_i \Big)^2\Big]\\ & = {\mathbb E}\Big[\sum_{i,j=1}^N x_i x_j\Big]\\ &=\sum_{i,j=1}^N {\mathbb E} [x_i x_j] \end{align*}

There are two types of term here. When i and j are not equal, we can use the independence criterion given above to express this as the product of the two individual expectations, which are both 0, so these terms don’t contribute. So we are left with

\begin{align*} {\rm Var}\Big[\sum_{i=1}^N x_i \Big] &=\sum_{i=1}^N {\mathbb E} [(x_i)^2] \\ &= N\cdot {1\over3} \end{align*}

and the standard deviation is simply the square root of this, \inline \sqrt{N/3}.
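We can check both of these results by simulating many independent walks — a sketch in plain Python (N = 100 and 20,000 walks are arbitrary choices for illustration):

```python
import random
import math

random.seed(1)
N = 100          # steps per walk
walks = 20_000   # number of independent walks

# simulate L_N = sum of N independent U[-1,1] steps, many times over
sums = [sum(random.uniform(-1.0, 1.0) for _ in range(N)) for _ in range(walks)]

mean = sum(sums) / walks
std = math.sqrt(sum((s - mean) ** 2 for s in sums) / walks)

print(f"sample mean ~ {mean:.3f}  (theory: 0)")
print(f"sample std  ~ {std:.3f}  (theory: sqrt(N/3) = {math.sqrt(N / 3):.3f})")
```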

 

iii) This is where things get more interesting – the autocorrelation is the correlation of the sum at one time with its value at a later time. This is a quantity that quants are frequently interested in, since the value of a derivative that depends on values of an underlying stock at several times will depend sensitively on the autocorrelation. We recall the expression for correlation

\rho(x,y) = {{\rm Cov}(x,y) \over \sqrt{{\rm Var}(x){\rm Var}(y) } }

So we are trying to calculate

\begin{align*} \rho(L_N, L_{N+n}) = {\rm Cov}\Big(\sum_{i=1}^N x_i\, ,\ \sum_{j=1}^{N+n} x_j \Big) \cdot {3 \over \sqrt{N (N+n)}} \end{align*}

where I’ve substituted in the already-calculated value of the variances of the two sums.

We can split the later sum into two pieces: the earlier sum, plus the sum of the additional n terms. Also, since the expectation of each sum is zero, the covariance of the sums is just the expectation of their product

\begin{align*} \rho(L_N, L_{N+n})&= {\rm Cov}\Big(\sum_{i=1}^N x_i\, ,\ \sum_{j=1}^{N} x_j + \sum_{j=N+1}^{N+n} x_j \Big) \cdot {3 \over \sqrt{N (N+n)}}\\&= {\mathbb E}\Big[\sum_{i=1}^N x_i \cdot \Big(\sum_{j=1}^{N} x_j + \sum_{j=N+1}^{N+n} x_j \Big) \Big] \cdot {3 \over \sqrt{N (N+n)}}\\&= {\mathbb E}\Big[\sum_{i,j=1}^N x_i x_j + \sum_{i=1}^N x_i \cdot\sum_{j=N+1}^{N+n} x_j \Big] \cdot {3 \over \sqrt{N (N+n)}} \end{align*}

Using the results above and the independence of the final two sums (they are sums over disjoint sets of terms, and each term is independent of all the others), we know

{\mathbb E}\Big[\sum_{i,j=1}^N x_i x_j \Big] = {N \over 3}

{\mathbb E}\Big[\sum_{i=1}^N x_i \cdot\sum_{j=N+1}^{N+n} x_j \Big] ={\mathbb E}\Big[\sum_{i=1}^N x_i \Big]\cdot {\mathbb E}\Big[\sum_{j=N+1}^{N+n} x_j \Big] = 0

so

\begin{align*}\rho(L_N, L_{N+n}) & = {N\over \sqrt{N(N+n)}}\\ &= \sqrt{N\over N+n} \end{align*}
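This result can also be checked numerically — a simulation sketch in plain Python, with N = 100 and n = 400 chosen for illustration (the theory predicts a correlation of sqrt(100/500) ≈ 0.447):

```python
import random
import math

random.seed(7)
N, n = 100, 400
trials = 5_000

# for each trial, record L_N and L_{N+n} computed from the SAME walk
l_n, l_nn = [], []
for _ in range(trials):
    steps = [random.uniform(-1.0, 1.0) for _ in range(N + n)]
    l_n.append(sum(steps[:N]))
    l_nn.append(sum(steps))

def corr(xs, ys):
    """Sample correlation coefficient of two equal-length lists."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / len(xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / len(ys))
    return cov / (sx * sy)

print(f"sample correlation ~ {corr(l_n, l_nn):.3f}")
print(f"theory: sqrt(N/(N+n)) = {math.sqrt(N / (N + n)):.3f}")
```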

What does this tell us? Roughly that the sum of the sequence up to N+n terms is correlated with its value at earlier points, but as n gets larger the correlation decreases, as the new random steps blur out the position reached after the initial N steps.

We can test our expressions using the RAND() function in Excel. Try generating sets of random numbers and summing them, then plotting the set of sums of 100 terms against the set of sums of 120 or 200 terms. (NB: in Excel, you probably want to turn automatic calculation off first, to stop the random numbers refreshing every time you make a change; instructions can be found here for Excel 2010. For Excel 2013 I found the option at the far end of the “FORMULAS” tab: set ‘Calculation Options’ to Manual.) I’ve done exactly that, and you can see the results below.

The sum of 100 terms vs. the sum of 120 terms. These are of course highly correlated, as the additional 20 terms usually don’t affect the overall sum to a significant extent.

The sum of the first 100 terms against the sum of 200 terms. We can see that the sums are slowly becoming less correlated.

This is the sum of the first 100 terms against the first 500. The correlation is much lower than in the graphs above, but note that from the formula we derived we still expect a correlation of around 45% despite the large number of extra terms in the second sum.

You can also try calculating the correlation of the sums you generate using Excel’s CORREL() function; these values should tend towards the expression above as the number of sums you compute gets large. (If you press F9, all of the random numbers in your sheet will be recomputed and you can see the sample correlation jump around, but the jumps get smaller as the number of sums grows.)