I have a 10 Hz time series measured by a fast instrument and a 1 minute time series measured by a slow reference instrument. The data consists of a fluctuating meteorological parameter. The slow reference instrument is used to calibrate the fast instrument measurements. Both time series are synchronised.
My idea:

1. Average the 10 Hz data into 1-minute blocks.
2. Take five 1-minute blocks from each time series and calculate the linear regression equation between them.
3. Use the regression equations to calibrate the 10 Hz data in 5-minute blocks (3000 data points per block).
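A minimal MATLAB sketch of this idea, assuming synchronised column vectors `hf` (10 Hz voltages) and `lf` (1-minute ppb means); all names here are illustrative:

```matlab
% Illustrative sketch of the block-averaging + windowed-regression idea.
% Assumed inputs: hf (10 Hz fast signal, volts), lf (1-minute slow
% signal, ppb), both column vectors and synchronised in time.
fs      = 10;                 % fast sampling rate [Hz]
blockN  = 60*fs;              % samples per 1-minute block (600)
nBlocks = floor(numel(hf)/blockN);

% 1) average the 10 Hz data into 1-minute blocks
hf_1min = mean(reshape(hf(1:nBlocks*blockN), blockN, nBlocks), 1)';

% 2)-3) regress lf on the block means over five blocks at a time,
% then apply that fit to the matching 3000 fast samples
hf_cal = nan(nBlocks*blockN, 1);
for k = 1:5:nBlocks-4
    idx = k:k+4;                              % five 1-minute blocks
    p   = polyfit(hf_1min(idx), lf(idx), 1);  % [gain, offset]
    smp = (k-1)*blockN + (1:5*blockN);        % the 3000 fast samples
    hf_cal(smp) = polyval(p, hf(smp));        % fast signal now in ppb
end
```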
What would be the best way to match (calibrate) the high-frequency data using the low-frequency data? I use MATLAB.
More background: the fast instrument outputs a fluctuating voltage signal, while the slow instrument outputs the true value of a trace gas concentration in ppb (parts per billion). The slow instrument samples every 10 seconds and outputs the average once per minute.
In short, I would like to have my fast signal in ppb as well, but without losing its integrity (I need the turbulent fluctuations to remain unfiltered), hence the need for a linear fit.
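(A linear calibration preserves the fluctuations by construction: if `y = a*x + b` with `a` and `b` constant over the window, then `y - mean(y) = a*(x - mean(x))`, so the turbulent departures are only rescaled, never filtered.)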
Here's my approach and the results I got...
I modelled the problem as there being three signals: `real` (the true underlying signal), `lf` (short for low frequency), and `hf` (short for high frequency). The task was to take the slow and fast signals and try to reconstruct the real signal, using least squares as the scoring metric.

From the fast signal I formed:

- `hf_lp`, a low-pass filtered version of `hf`;
- `hf_lp_pl`, a piecewise-linear approximation of `hf_lp` that keeps only its slow trend;
- `hf_diff = hf_lp - hf_lp_pl`, the fast fluctuations that remain.

`hf_diff` should be added to the low frequency signal (`lf`) such that the squared error between `real_estimated` and `real` is minimized.
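A sketch of how these could be built in MATLAB (the 2 s moving-average low-pass and the block-mean trend below are illustrative choices, not the only sensible ones):

```matlab
% Sketch of the decomposition. hf is the 10 Hz fast signal (column
% vector); the filter (2 s moving average) and the block-mean trend
% are illustrative choices.
fs     = 10;                          % fast sampling rate [Hz]
blockN = 60*fs;                       % samples per 1-minute block
nB     = floor(numel(hf)/blockN);
t_hf   = (0:numel(hf)-1)'/fs;         % fast time axis [s]

% hf_lp: low-pass filtered fast signal
hf_lp = movmean(hf, 2*fs);

% hf_lp_pl: piecewise-linear slow trend of hf_lp, interpolated
% through the 1-minute block means at the block centres
blkMean  = mean(reshape(hf_lp(1:nB*blockN), blockN, nB), 1)';
t_blk    = ((0:nB-1)' + 0.5)*blockN/fs;       % block-centre times [s]
hf_lp_pl = interp1(t_blk, blkMean, t_hf, 'linear', 'extrap');

% hf_diff: the fast fluctuations to graft onto the slow signal
hf_diff = hf_lp - hf_lp_pl;
```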
I fitted a function along the lines of `real_estimated = lf + hf_diff.*(a1*uncertainty + a2*uncertainty.^2 + a3*uncertainty.^3)` and used `fminsearch` (or other optimization techniques) to get `a1`, `a2`, `a3`. Here is a sample plot of my results: you can see that `real_estimated` is much closer to `real` than the slow signal `lf`.
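A minimal sketch of that fit, assuming `lf` has already been interpolated onto the fast time axis as `lf_i` (my shorthand, not a name from above), and that `uncertainty`, `hf_diff`, and `real` are column vectors on that same axis:

```matlab
% Fit a1..a3 by minimising the squared error against the reference.
% Assumes lf_i (lf interpolated to 10 Hz), hf_diff, uncertainty and
% real are column vectors on the fast time axis.
% (Note: 'real' shadows the MATLAB builtin real(); rename in practice.)
model = @(a) lf_i + hf_diff.*(a(1)*uncertainty + ...
                              a(2)*uncertainty.^2 + ...
                              a(3)*uncertainty.^3);
cost  = @(a) sum((model(a) - real).^2);

a0   = [1 0 0];               % initial guess for [a1 a2 a3]
aOpt = fminsearch(cost, a0);  % Nelder-Mead simplex search
real_estimated = model(aOpt);
```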