
Sampling frequency for any Continuous Signal on a Time Interval


While learning about sampling frequency I came across two definitions.

First:

The sampling frequency in a given time interval represents the number of samples taken per unit of time during that interval.

Second:

It is the reciprocal of the sampling period (1 / sampling period), where the sampling period is the time between two consecutive samples.


So, for instance, suppose we have samples at times t = [0, 25, 50, 75, 100] in the time interval 0 to 100.

Going by the first definition:

Sampling Frequency = (No.of Samples)/ (Time Interval)
                   = 5/100
                   = 0.05 Hz

By the second definition:

Sampling Period = 25

Sampling Frequency = (1 / Sampling Period)
                   = 1/25
                   = 0.04 Hz
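
Here is a minimal Python sketch reproducing both calculations on the numbers above (NumPy assumed; I am also assuming the times are in seconds):

```python
import numpy as np

# Sample times from the example; units assumed to be seconds
t = np.array([0, 25, 50, 75, 100])

# First definition, taken naively: total samples / total interval
fs_naive = len(t) / (t[-1] - t[0])   # 5 / 100 = 0.05

# Second definition: reciprocal of the sampling period
fs_period = 1 / (t[1] - t[0])        # 1 / 25 = 0.04

print(fs_naive, fs_period)           # 0.05 vs 0.04: they disagree
```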

Now I am confused: which of the two is correct? And what actually is the sampling frequency over a time interval, and how does it change with the interval?


Solution

  • The second one is correct. When counting samples per interval, you cannot include both the first and the last sample: five samples delimit only four sampling periods, so drop either endpoint and the calculation works out (4/100 = 0.04 Hz). To see this another way, think of each data point as representative of an average time interval, e.g. what happens between t = 12.5 and t = 37.5 is encapsulated in the data point at t = 25. Then it becomes clear that your data actually covers t = -12.5 to t = 112.5, in which case the first definition also gives the correct answer (5/125 = 0.04 Hz).
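
A quick sanity check in Python (a sketch only; NumPy and seconds as the time unit are assumptions) showing that all three ways of counting agree once the fencepost is handled:

```python
import numpy as np

t = np.array([0, 25, 50, 75, 100])   # sample times, seconds assumed

# N samples delimit only N-1 sampling periods (the fencepost count)
periods = np.diff(t)                  # [25, 25, 25, 25]
fs_from_period = 1 / periods.mean()   # 1 / 25 = 0.04 Hz

# "Samples per unit time", counted correctly: N-1 intervals over the span
fs_from_count = (len(t) - 1) / (t[-1] - t[0])   # 4 / 100 = 0.04 Hz

# Alternative view: each sample covers +/- half a period, so the data
# really spans t = -12.5 .. 112.5  ->  5 samples / 125 s = 0.04 Hz
fs_extended = len(t) / (t[-1] - t[0] + periods.mean())

print(fs_from_period, fs_from_count, fs_extended)  # all 0.04
```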