Why does my training function throw the NameError "name 'decaying' is not defined"?
So, I'm new to NLP and I'm trying to train a text classifier with spacy_transformers. This code has been shown to run, but it raises an error on my machine. As a side note, could this be caused by the fact that I'm running it on a CPU?
def train_classifier(n_epoch: int = 5,
                     train_data: list = None,
                     val_text: tuple = None,
                     val_label: list = None,
                     batch_size: int = 32,
                     lr: float = 1e-3):
    train_stats = []
    dropout = decaying(0.2, 0.1, 0.3)  # Gradually decrease dropout rate from 0.2 to 0.1
    # Cyclic triangular rate (https://arxiv.org/abs/1506.01186)
    learn_rates = cyclic_triangular_rate(
        lr / 3, lr * 3, 2 * len(train_data) // batch_size
    )
    for epoch in range(n_epoch):
        random.shuffle(train_data)
        batches = minibatch(train_data, size=batch_size)
        losses = {}
        for batch in batches:
            optimizer.trf_lr = next(learn_rates)
            texts, cats = zip(*batch)
            nlp.update(
                texts,
                cats,
                drop=next(dropout),
                sgd=optimizer,
                losses=losses)
Then, when I call the function
train_classifier(n_epoch=10, train_data=train_data, val_text=val_text, val_label=val_label, batch_size=32, lr=2e-6)
I get the following error:
<ipython-input-55-5bb071ef310c> in train_classifier(n_epoch, train_data, val_text, val_label, batch_size, lr)
6 lr:float=1e-3):
7 train_stats = []
----> 8 dropout = decaying(0.2, 0.1, 0.3) # Gradually decrease dropout rate from 0.2 to 0.1
9 # Cyclic triangular rate (https://arxiv.org/abs/1506.01186)
10 learn_rates = cyclic_triangular_rate(
NameError: name 'decaying' is not defined
I'm not familiar with spaCy at all, but after a few Google searches it looks like this function probably relies on spaCy's util.decaying function. Without the decaying function loaded into memory, this train_classifier function will throw a NameError.
The code for the decaying function is below and can be found here.
def decaying(start, stop, decay):
    """Yield an infinite series of linearly decaying values."""
    curr = float(start)
    while True:
        yield max(curr, stop)
        curr -= decay
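Putting it together: before calling train_classifier, make sure decaying is actually in scope, either by importing it or by pasting the definition above into your notebook. A minimal sanity check, assuming a spaCy v2.x environment where decaying is still exposed in spacy.util (it was removed in v3, in which case the local copy above is the fallback):

from spacy.util import minibatch  # available in both spaCy v2 and v3

# spaCy v2.x exposes decaying in spacy.util; in v3 it was removed,
# so fall back to the decaying() generator defined above.
try:
    from spacy.util import decaying
except ImportError:
    pass  # use the local definition shown above

# Once decaying is importable or defined, the NameError disappears.
dropout = decaying(0.2, 0.1, 0.3)
print(next(dropout), next(dropout), next(dropout))  # 0.2, then clamped at the stop value 0.1

Note that the other names the snippet relies on (cyclic_triangular_rate, nlp, optimizer, random) also need to be imported or defined in the same way; otherwise the NameError will simply move on to the next missing name.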