Memory leak in Python for loop even if I delete all variables at the end of each iteration
The two loops below eat up memory until I run out, and I can't figure out why. I delete every variable I create at the end of each iteration, yet the memory keeps growing.
!pip3 install cupy-cuda101
import cupy as cp
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

xtrain = cp.asnumpy(cp.random.uniform(-1, 1, size=(150000, 50)))

for i in range(0, 1000):
    weights = cp.random.uniform(-1, 1, size=(1275, 1000))
    for chunk in range(0, xtrain.shape[0], 5000):
        xchunk = xtrain[chunk:chunk + 5000, :]
        poly = PolynomialFeatures(interaction_only=True, include_bias=False)
        xchunk = cp.array(poly.fit_transform(xchunk))
        ranks = cp.matmul(xchunk, weights)
        del ranks, xchunk, poly
    del weights
xtrain is also just float data between -1 and 1.
The `del` statements only drop the Python references; CuPy returns the freed GPU memory to its memory pool, which caches the blocks for reuse instead of handing them back to the device. Inserting these lines at the end of each iteration, which release the cached blocks, fixed it:
cp.get_default_memory_pool().free_all_blocks()
cp.get_default_pinned_memory_pool().free_all_blocks()
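
For context, a minimal sketch of the whole fixed loop. The `mempool`/`pinned_mempool` handles and the commented-out `used_bytes()`/`total_bytes()` check are illustrative additions (both are standard `cupy.cuda.MemoryPool` methods); everything else mirrors the code above:

import cupy as cp
from sklearn.preprocessing import PolynomialFeatures

mempool = cp.get_default_memory_pool()
pinned_mempool = cp.get_default_pinned_memory_pool()

xtrain = cp.asnumpy(cp.random.uniform(-1, 1, size=(150000, 50)))

for i in range(0, 1000):
    weights = cp.random.uniform(-1, 1, size=(1275, 1000))
    for chunk in range(0, xtrain.shape[0], 5000):
        xchunk = xtrain[chunk:chunk + 5000, :]
        # 50 inputs -> 50 + C(50, 2) = 1275 interaction features,
        # matching the (1275, 1000) weights matrix
        poly = PolynomialFeatures(interaction_only=True, include_bias=False)
        xchunk = cp.array(poly.fit_transform(xchunk))
        ranks = cp.matmul(xchunk, weights)
        del ranks, xchunk, poly
    del weights
    # del only removes the Python references; the device blocks stay
    # cached in CuPy's memory pool. Releasing the now-unused blocks
    # back to the GPU stops the apparent leak.
    mempool.free_all_blocks()
    pinned_mempool.free_all_blocks()
    # Optional sanity check: used_bytes() should be near zero here,
    # and total_bytes() should drop after free_all_blocks().
    # print(i, mempool.used_bytes(), mempool.total_bytes())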