I have a time series of voltage values recorded in mV every 0.02 ms, stored as a NumPy array. If I do

    dv_dt = np.gradient(v)

what will the units of `dv_dt` be? Will it be some multiple of V/s, e.g. mV/s, mV/(0.02 ms), etc.?

My understanding is that `gradient` returns the derivative of the argument passed to it. Is that right?
Watch out for the sample spacing dt. As noted in the documentation, `np.gradient` assumes a unit spacing of 1 unless you pass the sample distance via its `varargs` argument. Your interpretation is only correct if dt = 1 for all data points.
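A quick sketch of the difference (with a made-up linear ramp, so the true slope is known):

```python
import numpy as np

# Hypothetical voltage trace in mV, sampled every 0.02 ms.
dt = 0.02                   # sample spacing in ms
t = np.arange(0, 1, dt)     # time axis in ms
v = 5.0 * t                 # linear ramp: true slope is 5 mV/ms

# Default: gradient assumes unit spacing, so this is mV per *sample*.
per_sample = np.gradient(v)       # each value ≈ 5 mV/ms * 0.02 ms = 0.1

# Passing the spacing rescales the result to mV per ms.
per_ms = np.gradient(v, dt)       # each value ≈ 5.0
```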
You have to keep track of the units yourself. It is fine to use milliseconds, but unless you have a good reason not to, I would use SI units (in this case seconds and volts). Note that mV/ms is numerically identical to V/s, since numerator and denominator are both scaled by 10^-3. Either way, the gradient of your signal has units of mV/ms, i.e. V/s.
In your case, `np.gradient(v, 0.02)` will give you the difference quotient of the voltage signal divided by your time-axis spacing (second-order central differences in the interior, one-sided differences at the edges), i.e. the derivative in mV per ms.
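A small check of the unit equivalence, using an arbitrary sine wave as a stand-in for your data: converting both the voltage and the spacing to SI units leaves the gradient values unchanged, because mV/ms and V/s are the same quantity.

```python
import numpy as np

dt_ms = 0.02
t_ms = np.arange(0, 2, dt_ms)
v_mv = np.sin(t_ms)                 # example voltage trace in mV

# Derivative in mV/ms:
dv_dt = np.gradient(v_mv, dt_ms)

# Same computation in SI units (V and s): 1 mV = 1e-3 V, 1 ms = 1e-3 s.
dv_dt_si = np.gradient(v_mv * 1e-3, dt_ms * 1e-3)   # V/s

# The two results are numerically identical.
assert np.allclose(dv_dt, dv_dt_si)
```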