python, numpy, convolution, ndimage, numpy-memmap

Performing ndimage.convolve on big numpy.memmap: Unable to allocate 56.0 GiB for an array


While trying to run ndimage.convolve on a big numpy.memmap, an exception occurs:

Exception has occurred: _ArrayMemoryError Unable to allocate 56.0 GiB for an array with shape (3710, 1056, 3838) and data type float32

It seems that convolve allocates a regular in-memory NumPy array for its output, which won't fit into memory.
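For reference, the failing pattern presumably looks something like this (the filename and kernel are illustrative guesses; the shape and dtype are taken from the error message):

```python
import numpy as np
from scipy import ndimage

shape = (3710, 1056, 3838)  # from the error message; 3710*1056*3838*4 bytes ≈ 56 GiB
data = np.memmap("data.dat", dtype=np.float32, mode="r", shape=shape)  # placeholder filename
kernel = np.ones((3, 3, 3), dtype=np.float32) / 27.0  # illustrative 3x3x3 box filter

# ndimage.convolve allocates the full dense output array in RAM, even when
# the input is a memmap, so this raises _ArrayMemoryError on the ~56 GiB output.
result = ndimage.convolve(data, kernel)
```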

Could you please tell me if there is a workaround?

Thank you for any input.


Solution

  • Instead of rolling your own implementation, another approach that might work is to use dask-image's ndfilters (Dask-wrapped versions of the scipy.ndimage filters) with a Dask array created from the memmap. That way, you delegate the chunking and out-of-core computation to Dask; see the sketch after this answer.

    I haven't actually done this myself, but I see no reason why it wouldn't work!
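
    A minimal sketch of what that might look like, assuming dask and dask-image are installed (the filename, chunk sizes, and kernel are placeholders; dask_image.ndfilters.convolve mirrors the scipy.ndimage.convolve interface):

    ```python
    import numpy as np
    import dask.array as da
    from dask_image.ndfilters import convolve as dask_convolve

    shape = (3710, 1056, 3838)  # from the error message
    data = np.memmap("data.dat", dtype=np.float32, mode="r", shape=shape)

    # Chunk sizes are a tuning knob: each chunk plus its kernel overlap must fit in RAM.
    darr = da.from_array(data, chunks=(256, 256, 256))

    kernel = np.ones((3, 3, 3), dtype=np.float32) / 27.0  # illustrative 3x3x3 box filter

    result = dask_convolve(darr, kernel)  # lazy: builds a task graph, computes nothing yet

    # Stream the result into a second memmap so the full ~56 GiB output
    # never has to exist in RAM at once.
    out = np.memmap("result.dat", dtype=np.float32, mode="w+", shape=shape)
    da.store(result, out)
    ```

    Dask processes one chunk (plus the halo the kernel needs) at a time, so peak memory is governed by the chunk size rather than the full array.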