I am trying to create a sine wave oscillator to play with audio sources.
At first I created a simple one like this:
using UnityEngine;

public class FirstOscillator : MonoBehaviour
{
    public double frequency = 400.0;
    private double increment;
    private double phase;
    private double sampleRate = 48000.0;
    public float gain;

    void OnAudioFilterRead(float[] data, int channels)
    {
        // Phase increment per sample for the requested frequency.
        increment = frequency * 2.0 * Mathf.PI / sampleRate;
        for (int i = 0; i < data.Length; i += channels)
        {
            phase += increment;
            data[i] = (float) (gain * Mathf.Sin((float) phase));
            if (channels == 2)
            {
                data[i + 1] = data[i];
            }
        }
    }
}
This works perfectly fine and generates a nice-sounding sine wave.
But I decided that, design-wise, it would be better for my oscillators to essentially be functions of the form amplitude(frequency, time), so I tried to modify the code to actually use time and to extract the oscillation into a method for now:
using UnityEngine;

public class SecondOscillator : MonoBehaviour, IAudioFilter
{
    public double frequency = 400.0;
    private double sampleRate = 48000.0;
    public float gain;

    public void OnAudioFilterRead(float[] data, int channels)
    {
        var time = AudioSettings.dspTime;
        for (int i = 0; i < data.Length; i += channels)
        {
            // Offset the buffer's start time by the current frame index (i / channels) sample periods.
            data[i] = gain * Amplitude((float)frequency, (float)(time + i / sampleRate / channels));
            if (channels == 2)
            {
                data[i + 1] = data[i];
            }
        }
    }

    private float Amplitude(float freq, float time)
    {
        return Mathf.Sin(freq * time * 2 * Mathf.PI);
    }
}
For some reason this produces a weird-sounding metallic noise, which does react to frequency changes, though. I wonder what the problem could be.
UPDATE
In the comments people suggest that AudioSettings.dspTime is a number of samples, not time in seconds. I don't think that's the case, so I wrote a quick script to test it:
using System.Text;
using UnityEngine;

class SampleRateTest : MonoBehaviour
{
    public int sampleRate;

    public void Awake()
    {
        sampleRate = AudioSettings.outputSampleRate;
    }

    private StringBuilder sb = new StringBuilder();
    private int samplesTakenStart = 5;
    private int samplesTakenEnd = 5;
    private int loggedFrames = 5;
    private int loggedFrameIndex = 0;

    public void OnAudioFilterRead(float[] data, int channels)
    {
        var time = AudioSettings.dspTime;
        for (var index = 0; index < data.Length; index++)
        {
            if (index < samplesTakenStart || index > data.Length - samplesTakenEnd)
                sb.AppendLine($"Sample {index} time is {time}");
            else if (index == samplesTakenStart)
                sb.AppendLine("...");
            // Advance by one interleaved-sample period (data is interleaved across channels).
            time += 1.0 / sampleRate / channels;
        }
        sb.AppendLine("End of frame " + loggedFrameIndex);
        if (loggedFrameIndex++ == loggedFrames)
            Debug.Log(sb.ToString());
    }
}
This produces the following output:
Sample 0 time is 1196.144
Sample 1 time is 1196.14401041667
Sample 2 time is 1196.14402083333
Sample 3 time is 1196.14403125
Sample 4 time is 1196.14404166667
...
Sample 508 time is 1196.14929166661
Sample 509 time is 1196.14930208328
Sample 510 time is 1196.14931249994
Sample 511 time is 1196.14932291661
End of frame 0
Sample 0 time is 1196.14933333333
Sample 1 time is 1196.14934375
Sample 2 time is 1196.14935416667
Sample 3 time is 1196.14936458333
Sample 4 time is 1196.149375
...
Sample 508 time is 1196.15462499994
Sample 509 time is 1196.15463541661
Sample 510 time is 1196.15464583328
Sample 511 time is 1196.15465624994
End of frame 1
Sample 0 time is 1196.15466666667
Sample 1 time is 1196.15467708333
Sample 2 time is 1196.1546875
Sample 3 time is 1196.15469791667
Sample 4 time is 1196.15470833333
...
Sample 508 time is 1196.15995833328
Sample 509 time is 1196.15996874994
Sample 510 time is 1196.15997916661
Sample 511 time is 1196.15998958328
End of frame 2
Sample 0 time is 1196.16
Sample 1 time is 1196.16001041667
Sample 2 time is 1196.16002083333
Sample 3 time is 1196.16003125
Sample 4 time is 1196.16004166667
...
Sample 508 time is 1196.16529166661
Sample 509 time is 1196.16530208328
Sample 510 time is 1196.16531249994
Sample 511 time is 1196.16532291661
End of frame 3
Sample 0 time is 1196.16533333333
Sample 1 time is 1196.16534375
Sample 2 time is 1196.16535416667
Sample 3 time is 1196.16536458333
Sample 4 time is 1196.165375
...
Sample 508 time is 1196.17062499994
Sample 509 time is 1196.17063541661
Sample 510 time is 1196.17064583328
Sample 511 time is 1196.17065624994
End of frame 4
Sample 0 time is 1196.17066666667
Sample 1 time is 1196.17067708333
Sample 2 time is 1196.1706875
Sample 3 time is 1196.17069791667
Sample 4 time is 1196.17070833333
...
Sample 508 time is 1196.17595833328
Sample 509 time is 1196.17596874994
Sample 510 time is 1196.17597916661
Sample 511 time is 1196.17598958328
End of frame 5
So to me it looks like AudioSettings.dspTime is indeed time in seconds, so it shouldn't be divided by sampleRate: the per-entry delta in the log is about 1.04e-5 s, which matches 1 / (48000 * 2), and each buffer's first timestamp continues seamlessly from where the previous buffer ended. The channels count is 2 on my system, if that matters.
UPDATE 2: I also tried removing the channel-specific code and setting the project audio settings to Mono (instead of Stereo), but it didn't help, so I assume the channels are not the issue here.
Also, I created a wav demo of the sound I get. To me it sounds more like a square wave than a sine.
Oh. My. God. The problem was that I was using float values instead of double, and it appears that single-precision floating-point calculations are just not precise enough for this sort of data. I changed all my floats to doubles and it works now.
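I think that makes sense numerically: at a dspTime of around 1196 seconds, casting the time to float leaves a resolution of only about 1.2e-4 s (roughly six sample periods at 48 kHz), so the argument passed to Mathf.Sin only changes once every several samples and the output becomes a stepped waveform, which is why it sounded square-ish and metallic. Below is a minimal sketch of what the fixed version looks like; the class name DoubleOscillator is just a placeholder, and it is essentially SecondOscillator with the float casts removed and System.Math.Sin used in place of Mathf.Sin, so everything stays in double precision until the final sample value:

using UnityEngine;

// Sketch of the double-precision fix: keep time, frequency and the sine argument as doubles,
// and cast to float only when writing the final amplitude into the buffer.
public class DoubleOscillator : MonoBehaviour
{
    public double frequency = 400.0;
    private double sampleRate = 48000.0;
    public float gain;

    public void OnAudioFilterRead(float[] data, int channels)
    {
        var time = AudioSettings.dspTime;
        for (int i = 0; i < data.Length; i += channels)
        {
            // Per-sample time stays in double precision; only the result is cast to float.
            data[i] = gain * (float)Amplitude(frequency, time + i / sampleRate / channels);
            if (channels == 2)
            {
                data[i + 1] = data[i];
            }
        }
    }

    private double Amplitude(double freq, double time)
    {
        return System.Math.Sin(freq * time * 2.0 * System.Math.PI);
    }
}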