I have the following piece of code for calculating the cross-correlation between two signals. Visually, the signals correlate very well. The signals are of different lengths, but both have a sampling rate of 100 Hz. There is a lag between the signals (called timeDiff below).
[acor,lag] = xcorr(signal1,signal2);   % raw (unnormalized) cross-correlation
[cor,I] = max(abs(acor));              % peak magnitude and its index
lagDiff = lag(I);                      % lag in samples at the peak
timeDiff = lagDiff/100;                % convert samples to seconds (fs = 100 Hz)
fprintf('Correlation = %0.5f \n',cor);
I'm getting a correlation of 6239.06131. How can I normalize this to the range [-1, 1]? As it stands, the raw value is hard to interpret.
Following the documentation (https://it.mathworks.com/help/signal/ref/xcorr.html), the xcorr function accepts a normalization option:
[acor,lag] = xcorr(signal1,signal2,'coeff');
which "normalizes the sequence so that the autocorrelations at zero lag equal 1". With 'coeff', the peak only reaches 1 or -1 when some time shift (lag) produces a perfect positive or negative correlation between the two series.
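Here is a minimal, self-contained sketch of the 'coeff' pipeline; the signal names, lengths, and delay are made up for illustration. One caveat: per the xcorr documentation, any scale option other than 'none' requires the two inputs to have the same length, so since your signals differ in length, the shorter one is zero-padded first:

```matlab
% Synthetic 100 Hz test signals (illustrative values only)
fs = 100;
t1 = (0:999)/fs;
signal1 = sin(2*pi*2*t1);                     % 10 s reference signal
signal2 = [zeros(1,50) signal1(1:900)];       % delayed, shorter copy

% 'coeff' requires equal lengths: zero-pad the shorter signal
n  = max(numel(signal1), numel(signal2));
s1 = [signal1 zeros(1, n - numel(signal1))];
s2 = [signal2 zeros(1, n - numel(signal2))];

% Normalized cross-correlation: acor now lies in [-1, 1]
[acor,lag] = xcorr(s1, s2, 'coeff');
[cor,I]    = max(abs(acor));
lagDiff    = lag(I);
timeDiff   = lagDiff/fs;                      % lag in seconds
fprintf('Correlation = %0.5f at lag %0.2f s\n', cor, timeDiff);
```

Equivalently, you can scale the raw peak by hand: 'coeff' divides the whole sequence by sqrt(sum(s1.^2)*sum(s2.^2)), i.e. by the square root of the product of the two zero-lag autocorrelations.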