I wrote a script in Python using SciPy to perform a short-time Fourier transform on a signal. When I ran it on a signal with a thousand timepoints, it ran fine. When I ran it on a signal with a million timepoints, it froze my computer (the computer stops responding, and any audio that was playing gets stuck in a skipping, looping buzz); this has happened consistently, all three times I tried it. I've written scripts that take hours to run, but I've never had one actually freeze my computer. Any idea why? The script is posted below:
import scipy as sp
from scipy import fftpack

def STFT(signal, seconds_per_sample, window_seconds, min_Hz):
    # Samples per window, and the FFT length needed to resolve min_Hz.
    window_samples = int(window_seconds/seconds_per_sample) + 1
    signal_samples = len(signal)
    if signal_samples <= window_samples:
        # Signal is no longer than one window: return a single FFT.
        length = max(signal_samples, int(1/(seconds_per_sample*min_Hz)) + 1)
        return sp.array([0]), fftpack.fftshift(fftpack.fftfreq(length, seconds_per_sample)), fftpack.fftshift(fftpack.fft(signal, n=length))
    else:
        length = max(window_samples, int(1/(seconds_per_sample*min_Hz)) + 1)
        frequency = fftpack.fftshift(fftpack.fftfreq(length, seconds_per_sample))
        time = []
        FTs = []
        # Slide the window forward one sample at a time, FFT each slice.
        for i in range(signal_samples - window_samples):
            time.append(seconds_per_sample*i)
            FTs.append(fftpack.fftshift(fftpack.fft(signal[i:i + window_samples], n=length)))
        return sp.array(time), frequency, sp.array(FTs)
The script consumes too much RAM when you run it over too large a number of points; see Why does a simple python script crash my system

The process your program runs in stores the arrays and variables for the calculations in process memory, which is RAM. Here the inner loop appends one FFT result per sample, so for a million-point signal FTs grows to roughly a million complex-valued arrays. Once that exceeds physical RAM, the machine starts thrashing swap, which matches the symptoms you describe: an unresponsive system and stuttering, looping audio.
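A back-of-the-envelope estimate shows the scale of the problem (the FFT length below is an assumed placeholder; the real value of length depends on your window_seconds, seconds_per_sample, and min_Hz):

n_windows = 10**6        # ~one FFT per sample for a million-point signal
fft_length = 1024        # assumed value of `length`; yours may differ
bytes_per_value = 16     # one complex double is 16 bytes

total_gb = n_windows * fft_length * bytes_per_value / 1e9
print(f"{total_gb:.0f} GB")  # ~16 GB, before sp.array(FTs) makes another copy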
You can work around this by forcing the program to keep its intermediate results on the hard disk instead of in RAM, for example with the shelve module from the standard library; a minimal sketch follows.
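Here is a rough sketch of that idea, assuming the same inputs as your STFT function (the function name STFT_to_disk and the file name are made up for illustration):

import shelve
from scipy import fftpack

def STFT_to_disk(signal, seconds_per_sample, window_seconds, min_Hz,
                 path="stft_results.db"):
    # Same setup as in the original script.
    window_samples = int(window_seconds/seconds_per_sample) + 1
    length = max(window_samples, int(1/(seconds_per_sample*min_Hz)) + 1)
    # Each window's FFT goes straight to disk instead of into a list.
    with shelve.open(path) as db:
        for i in range(len(signal) - window_samples):
            ft = fftpack.fftshift(fftpack.fft(signal[i:i + window_samples], n=length))
            db[str(i)] = ft  # shelve keys must be strings
    return path

Writing a million small entries this way trades speed for memory: it will be slow, but only one window's FFT lives in RAM at a time, so the machine stays responsive.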
For more on these workarounds (shelve and related tools), see the following links:
memory usage, how to free memory
Python large variable RAM usage
I need to free up RAM by storing a Python dictionary on the hard drive, not in RAM. Is it possible?
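Another disk-backed approach, offered as a sketch rather than a tested fix, is numpy.memmap: an array stored in a file on disk, which the operating system pages in and out as needed, so the full result never has to sit in RAM at once (the function and file names are again illustrative):

import numpy as np
from scipy import fftpack

def STFT_memmap(signal, seconds_per_sample, window_seconds, min_Hz,
                path="stft_results.dat"):
    window_samples = int(window_seconds/seconds_per_sample) + 1
    length = max(window_samples, int(1/(seconds_per_sample*min_Hz)) + 1)
    n_windows = len(signal) - window_samples
    # Disk-backed 2D array: one row per window, paged in and out by the OS.
    FTs = np.memmap(path, dtype=np.complex128, mode="w+",
                    shape=(n_windows, length))
    for i in range(n_windows):
        FTs[i] = fftpack.fftshift(fftpack.fft(signal[i:i + window_samples], n=length))
    FTs.flush()  # make sure everything is written to disk
    return FTs

The result still behaves like an ordinary array for slicing and plotting, which makes it a convenient middle ground between an in-memory array and a key-value store like shelve.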