Spurious out-of-memory error when allocating shared memory with multiprocessing

I'm trying to allocate a set of image buffers in shared memory using multiprocessing.RawArray. It works fine for a small number of images, but once I reach a certain number of buffers I get an OSError indicating that I have run out of memory.

The obvious question: am I actually out of memory? By my calculation, the buffers I'm trying to allocate should total about 1 GB, and according to the Windows Task Manager I have about 20 GB free. I don't see how I could possibly be out of memory!

Am I hitting some kind of artificial memory limit that I can increase? If not, why is this happening, and how can I work around it?

I'm using Windows 10, Python 3.7, 64-bit architecture, with 32 GB of RAM in total.

Here is a minimal reproducible example:
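For concreteness, the back-of-the-envelope arithmetic (a quick sketch using the same numbers as the example below):

buffer_bytes = 1024 * 1280 * 3      # 3,932,160 bytes per image buffer
total_bytes = 300 * buffer_bytes    # 1,179,648,000 bytes for 300 buffers
print(total_bytes / 1024**3)        # ~1.1 GiB in total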

import multiprocessing as mp
import ctypes

imageDataType = ctypes.c_uint8
imageDataSize = 1024*1280*3   # 3,932,160 bytes
maxBufferSize = 300
buffers = []
for k in range(maxBufferSize):
    print("Creating buffer #", k)
    buffers.append(mp.RawArray(imageDataType, imageDataSize))

Output:

Creating buffer # 0
Creating buffer # 1
Creating buffer # 2
Creating buffer # 3
Creating buffer # 4
Creating buffer # 5

...etc...

Creating buffer # 278
Creating buffer # 279
Creating buffer # 280
Traceback (most recent call last):
  File ".\Cruft\memoryErrorTest.py", line 10, in <module>
    buffers.append(mp.RawArray(imageDataType, imageDataSize))
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\context.py", line 129, in RawArray
    return RawArray(typecode_or_type, size_or_initializer)
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\sharedctypes.py", line 61, in RawArray
    obj = _new_value(type_)
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\sharedctypes.py", line 41, in _new_value
    wrapper = heap.BufferWrapper(size)
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\heap.py", line 263, in __init__
    block = BufferWrapper._heap.malloc(size)
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\heap.py", line 242, in malloc
    (arena, start, stop) = self._malloc(size)
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\heap.py", line 134, in _malloc
    arena = Arena(length)
  File "C:\Users\Brian Kardon\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\heap.py", line 38, in __init__
    buf = mmap.mmap(-1, size, tagname=name)
OSError: [WinError 8] Not enough memory resources are available to process this command

OK, the folks over at the Python bug tracker figured this out for me. For posterity:

I was using 32-bit Python, which is limited to a 4 GB address space, far less than my total available system memory. Apparently enough of that space was occupied by other things that the interpreter could not find a contiguous block large enough for all of my RawArrays.

The error does not occur with 64-bit Python, so switching to that seems to be the simplest solution.
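If you're not sure which interpreter you're actually running (the traceback above shows the Python37-32 install path, i.e. the 32-bit build), here is a minimal sanity-check sketch using only the standard library:

import platform
import sys

# Reported architecture of the running interpreter, e.g. '32bit' or '64bit'
print(platform.architecture()[0])

# Pointer-size check: sys.maxsize fits in 32 bits only on a 32-bit build
print("64-bit" if sys.maxsize > 2**32 else "32-bit")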