
File load error: not enough storage available with 1.7 TB of storage free


I'm using the following code to load my NIfTI-format files in Python.

import nibabel as nib 

img_arr = []
for i in range(len(datadir)):  # datadir: a list of NIfTI file paths
    img = nib.load(datadir[i])
    img_data = img.get_fdata()
    img_arr.append(img_data)
    img.uncache()

Loading a small number of images works fine, but when I try to load more images, I get the following error:

OSError                                   Traceback (most recent call last)
<ipython-input-55-f982811019c9> in <module>()
     10     #img = nilearn.image.smooth_img(datadir[i],fwhm = 3) #Smoothing filter for preprocessing (necessary?)
     11     img = nib.load(datadir[i])
---> 12     img_data = img.get_fdata()
     13     img_arr.append(img_data)
     14     img.uncache()

~\AppData\Roaming\Python\Python36\site-packages\nibabel\dataobj_images.py in get_fdata(self, caching, dtype)
    346             if self._fdata_cache.dtype.type == dtype.type:
    347                 return self._fdata_cache
--> 348         data = np.asanyarray(self._dataobj).astype(dtype, copy=False)
    349         if caching == 'fill':
    350             self._fdata_cache = data

~\AppData\Roaming\Python\Python36\site-packages\numpy\core\_asarray.py in asanyarray(a, dtype, order)
    136 
    137     """
--> 138     return array(a, dtype, copy=False, order=order, subok=True)
    139 
    140 

~\AppData\Roaming\Python\Python36\site-packages\nibabel\arrayproxy.py in __array__(self)
    353     def __array__(self):
    354         # Read array and scale
--> 355         raw_data = self.get_unscaled()
    356         return apply_read_scaling(raw_data, self._slope, self._inter)
    357 

~\AppData\Roaming\Python\Python36\site-packages\nibabel\arrayproxy.py in get_unscaled(self)
    348                                        offset=self._offset,
    349                                        order=self.order,
--> 350                                        mmap=self._mmap)
    351         return raw_data
    352 

~\AppData\Roaming\Python\Python36\site-packages\nibabel\volumeutils.py in array_from_file(shape, in_dtype, infile, offset, order, mmap)
    507                              shape=shape,
    508                              order=order,
--> 509                              offset=offset)
    510             # The error raised by memmap, for different file types, has
    511             # changed in different incarnations of the numpy routine

~\AppData\Roaming\Python\Python36\site-packages\numpy\core\memmap.py in __new__(subtype, filename, dtype, mode, offset, shape, order)
    262             bytes -= start
    263             array_offset = offset - start
--> 264             mm = mmap.mmap(fid.fileno(), bytes, access=acc, offset=start)
    265 
    266             self = ndarray.__new__(subtype, shape, dtype=descr, buffer=mm,

OSError: [WinError 8] Not enough storage is available to process this command

I thought img.uncache() would remove the image from memory so it wouldn't take up so much space, while still letting me work with the image array. Adding it to the code didn't change anything, though.

Does anyone know how I can fix this? The computer I'm working on has a 24-core 2.6 GHz CPU, more than 52 GB of RAM, and over 1.7 TB of free storage in the working directory. I'm trying to load around 1500 MRI images from the ADNI database.


Solution

  • This error is not caused by the 1.7 TB hard drive filling up; it's caused by running out of memory, i.e. RAM. It's going to be important to understand how those two things differ.
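
    For a rough sense of scale, a back-of-the-envelope estimate shows why. Assuming a typical ADNI structural volume of about 256×256×170 voxels (the exact shape varies by scan), and given that get_fdata() returns float64 by default, 1500 volumes held in img_arr need far more than 52 GB:

    voxels_per_volume = 256 * 256 * 170          # assumed shape, for illustration only
    bytes_per_volume = voxels_per_volume * 8     # float64 = 8 bytes per voxel
    total_gb = bytes_per_volume * 1500 / 1e9     # all volumes kept in img_arr at once
    print(f"~{total_gb:.0f} GB")                 # ~134 GB, well above 52 GB of RAM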

    uncache() does not remove an item from memory completely, as documented in nibabel's "Images and memory" guide, which also contains more memory-saving tips.
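
    One of those tips applies directly here: get_fdata() returns float64 by default, and its dtype parameter lets you request float32 instead, roughly halving the memory each image needs. A minimal sketch of the question's loop with that change (datadir is the same list of file paths as above):

    import numpy as np
    import nibabel as nib

    img_arr = []
    for path in datadir:
        img = nib.load(path)
        # dtype=np.float32 halves the memory footprint vs. the float64 default
        img_data = img.get_fdata(dtype=np.float32)
        img_arr.append(img_data)
        img.uncache()  # drop nibabel's internal cache; img_data itself survives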

    If you want to remove an object from memory completely, you can use the Garbage Collector interface, like so:

    import nibabel as nib 
    import gc
    
    img_arr = []
    for i in range(len(datadir)):
        img = nib.load(datadir[i])
        img_data = img.get_fdata()
        img_arr.append(img_data)
        img.uncache()
        # Delete the img object and free the memory
        del img
        gc.collect()
    

    That should help reduce the amount of memory you are using.
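
    If the arrays still don't fit in RAM even as float32, another documented option is to avoid materializing whole volumes at all: slicing img.dataobj reads only the requested part from disk. A sketch of that pattern (how much data is actually read depends on the file; e.g. gzipped images may need more decompression):

    import nibabel as nib

    img = nib.load(datadir[0])
    # Proxy slicing: loads just this slab from disk instead of the full volume
    middle_slab = img.dataobj[..., img.shape[-1] // 2]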