python-3.x, memory, multiprocessing, shared-memory, allocation

Spurious out-of-memory error when allocating shared memory with multiprocessing


I'm trying to allocate a set of image buffers in shared memory using multiprocessing.RawArray. It works fine for smaller numbers of images, but once I get to a certain number of buffers I get an OSError indicating that I've run out of memory.

Obvious question: am I actually out of memory? By my count, the buffers I'm trying to allocate should come to a bit over 1 GB of memory, and according to the Windows Task Manager I have about 20 GB free. I don't see how I could actually be out of memory!
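
For reference, here's the arithmetic behind that estimate, using the buffer size and count from the example below:

bytesPerBuffer = 1024*1280*3              # 3,932,160 bytes per image buffer
numBuffers = 300
totalBytes = bytesPerBuffer * numBuffers  # 1,179,648,000 bytes
print(totalBytes / 1024**3)               # ~1.1 GiB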

Am I hitting some kind of artificial memory consumption limit that I can increase? If not, why is this happening, and how can I get around this?

I'm using Windows 10 and Python 3.7 on a 64-bit machine with 32 GB of RAM.

Here's a minimal reproducible example:

import multiprocessing as mp
import ctypes

imageDataType = ctypes.c_uint8
imageDataSize = 1024*1280*3   # 3,932,160 bytes
maxBufferSize = 300
buffers = []
for k in range(maxBufferSize):
    print("Creating buffer #", k)
    buffers.append(mp.RawArray(imageDataType, imageDataSize))

Output:

Creating buffer # 0
Creating buffer # 1
Creating buffer # 2
Creating buffer # 3
Creating buffer # 4
Creating buffer # 5

...etc...

Creating buffer # 278
Creating buffer # 279
Creating buffer # 280
Traceback (most recent call last):
  File ".\Cruft\memoryErrorTest.py", line 10, in <module>
    buffers.append(mp.RawArray(imageDataType, imageDataSize))
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\context.py", line 129, in RawArray
    return RawArray(typecode_or_type, size_or_initializer)
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\sharedctypes.py", line 61, in RawArray
    obj = _new_value(type_)
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\sharedctypes.py", line 41, in _new_value
    wrapper = heap.BufferWrapper(size)
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\heap.py", line 263, in __init__
    block = BufferWrapper._heap.malloc(size)
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\heap.py", line 242, in malloc
    (arena, start, stop) = self._malloc(size)
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\heap.py", line 134, in _malloc
    arena = Arena(length)
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\heap.py", line 38, in __init__
    buf = mmap.mmap(-1, size, tagname=name)
OSError: [WinError 8] Not enough memory resources are available to process this command

Solution

  • OK, the folks over at the Python bug tracker figured this out for me. For posterity:

    I was using 32-bit Python, which is limited to a 4 GB address space, much less than my total available system memory. Apparently enough of that space was already taken up by other things that the interpreter couldn't find a large enough contiguous block for all of my RawArrays.

    The error does not occur when using 64-bit Python, so that seems to be the easiest solution.
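
    If you're not sure which build of Python you're running, here's one quick way to check using only the standard library (this isn't from the bug-tracker discussion, just a convenience check):

    import platform
    import struct
    import sys

    print(platform.architecture()[0])   # '32bit' or '64bit'
    print(struct.calcsize("P") * 8)     # pointer size in bits: 32 or 64
    print(sys.maxsize > 2**32)          # True only on a 64-bit interpreter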