PyTorch 1.6: What is the actual learning rate during training?
I want to know the actual learning rate during training. Here is my code.
import torch

learning_rate = 0.001
optimizer = torch.optim.Adam(net.parameters(), lr=learning_rate)
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[1, 2], gamma=0.1)

def train(epoch):
    train_loss = 0
    for batch_idx, (input, target) in enumerate(train_loader):
        predict_label = net(input)
        loss = criterion(predict_label, target)
        train_loss += loss.item()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(optimizer.param_groups[0]['lr'])
    scheduler.step()
    print(scheduler.state_dict()['_last_lr'])
    print(optimizer.param_groups[0]['lr'])
The output is 0.001, 0.0001, 0.0001. So what is the actual lr during optimizer.step(): 0.001 or 0.0001? Thanks.
The important part is here:
for batch_idx, (input, target) in enumerate(train_loader):
    ...
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
print(optimizer.param_groups[0]['lr'])      #### CURRENT LEARNING RATE
scheduler.step()                            # step the learning rate schedule
print(scheduler.state_dict()['_last_lr'])   #### NEW LEARNING RATE
print(optimizer.param_groups[0]['lr'])      #### NEW LEARNING RATE
Because you step the scheduler only after the epoch, the whole first epoch trains with the initial value of 0.001: optimizer.step() uses whatever value is stored in optimizer.param_groups[0]['lr'] at the moment it is called. If you run multiple epochs, the learning rate will keep annealing at each milestone.
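Here is a minimal, self-contained sketch (a hypothetical dummy parameter stands in for the asker's net, criterion, and train_loader) that verifies which lr the optimizer.step() calls in each epoch actually use:

import torch

param = torch.nn.Parameter(torch.zeros(1))  # stand-in for net.parameters()
optimizer = torch.optim.Adam([param], lr=0.001)
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[1, 2], gamma=0.1)

for epoch in range(3):
    # Any optimizer.step() call inside this epoch's batch loop would read this value:
    print(f"epoch {epoch}: lr during optimizer.step() =", optimizer.param_groups[0]['lr'])
    scheduler.step()  # anneal after the epoch, exactly as in the question

# Prints 0.001, then 0.0001, then ~1e-05: the first epoch trains at 0.001.

Side note: in PyTorch 1.6 you can call scheduler.get_last_lr() to read the schedule's current value instead of reaching into scheduler.state_dict()['_last_lr'].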