
How to normalize samples of an ongoing cumulative sum?


For simplicity, let's assume we have the function sin(x) and have calculated 1000 samples of it, with values between -1 and 1. We can plot those samples. In the next step we want to plot the integral of sin(x), which would be -cos(x) + C. I can calculate the integral from my existing samples like this:

y[n] = x[n] + y[n-1]
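In code, the running sum could look like this (a minimal sketch in Python with NumPy, assuming 1000 samples of sin(x) over one full period):

    import numpy as np

    # 1000 samples of sin(x) over one full period
    x = np.linspace(0, 2 * np.pi, 1000)
    samples = np.sin(x)

    # cumulative sum: y[n] = x[n] + y[n-1]
    integral = np.cumsum(samples)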

Because it's a cumulative sum, we will need to normalize it to get samples between -1 and 1 on the y axis.

y = 2 * (x - min(x)) / (max(x) - min(x)) - 1

To normalize we need a maximum and a minimum.
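In code, the min-max normalization could look like this (a sketch; the helper name normalize is mine):

    import numpy as np

    def normalize(y):
        # linearly map the samples into [-1, 1] via their min and max
        return 2 * (y - y.min()) / (y.max() - y.min()) - 1

    # e.g. applied to the cumulative sum of the sine samples
    x = np.linspace(0, 2 * np.pi, 1000)
    normalized = normalize(np.cumsum(np.sin(x)))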

Now we want to calculate the next 1000 samples of sin(x) and calculate the integral again. Because it's a cumulative sum, we will have a new maximum, which means we will need to re-normalize all of our 2000 samples.

Now my question basically is:

How can I normalize samples in this context without knowing the maximum and minimum in advance? How can I avoid re-normalizing all previous samples whenever a new set of samples brings a new maximum/minimum?


Solution

  • I've found a solution :)

    I also want to mention: this is about periodic functions like sine, so the maximum and minimum should basically always be the same, right?

    In a special case this isn't true:

    If your samples don't contain a full period of the function (and therefore miss its global maximum and minimum). This can happen when you choose a very low frequency.

    What can you do:

    • Simply calculate the samples of a function like sin(x) with a frequency of 1. They will contain the global maximum and minimum of the function (it's important that y varies between -1 and 1, not between 0 and 1!).

    • Then you calculate the integral with the cumulative sum.

    • Get the maximum and minimum of those samples.

    • You can scale them up or down for other frequencies: maximum/frequency, minimum/frequency.

    • The scaled values can then be used to normalize samples that were calculated with any other frequency.

    This only needs to be calculated once, at the beginning.
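    A sketch of the whole procedure (Python with NumPy; the names reference_bounds and normalize_for_frequency are mine, and it assumes all signals use the same sample spacing and x range):

        import numpy as np

        def reference_bounds(n_samples=1000):
            # min/max of the cumulative sum of a frequency-1 sine over a
            # full period, so the global extrema are guaranteed to be hit
            x = np.linspace(0, 2 * np.pi, n_samples)
            y = np.cumsum(np.sin(x))
            return y.min(), y.max()

        # calculated once at the beginning
        ref_min, ref_max = reference_bounds()

        def normalize_for_frequency(cumsum_samples, frequency):
            # normalize with the scaled reference bounds instead of the
            # (possibly incomplete) min/max of the samples themselves
            lo, hi = ref_min / frequency, ref_max / frequency
            return 2 * (cumsum_samples - lo) / (hi - lo) - 1

        # usage: a low-frequency sine that doesn't cover a full period
        x = np.linspace(0, 2 * np.pi, 1000)
        y = np.cumsum(np.sin(0.5 * x))
        normalized = normalize_for_frequency(y, 0.5)

    Because the bounds come from the frequency-1 run, the min/max of the new samples is never needed, and previously normalized samples never have to be recomputed.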