How to disable progress bar in Pytorch Lightning

I am having a lot of issues with the tqdm progress bar in Pytorch Lightning:

INFO:root:  Name    Type Params
0   l1  Linear    7 K
Epoch 2:  56%|████████████▊          | 2093/3750 [00:05<00:03, 525.47batch/s, batch_nb=1874, loss=0.714, training_loss=0.4, v_nb=51]
INFO:root:  Name    Type Params
0   l1  Linear    7 K
Epoch 1:  50%|█████     | 1875/3750 [00:05<00:05, 322.34batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  50%|█████     | 1879/3750 [00:05<00:05, 319.41batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  52%|█████▏    | 1942/3750 [00:05<00:04, 374.05batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  53%|█████▎    | 2005/3750 [00:05<00:04, 425.01batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  55%|█████▌    | 2068/3750 [00:05<00:03, 470.56batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  57%|█████▋    | 2131/3750 [00:05<00:03, 507.69batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  59%|█████▊    | 2194/3750 [00:06<00:02, 538.19batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  60%|██████    | 2257/3750 [00:06<00:02, 561.20batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  62%|██████▏   | 2320/3750 [00:06<00:02, 579.22batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  64%|██████▎   | 2383/3750 [00:06<00:02, 591.58batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  65%|██████▌   | 2445/3750 [00:06<00:02, 599.77batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  67%|██████▋   | 2507/3750 [00:06<00:02, 605.00batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  69%|██████▊   | 2569/3750 [00:06<00:01, 607.04batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  70%|███████   | 2633/3750 [00:06<00:01, 613.98batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]

I would like to know whether these issues can be fixed, and if not, how I can disable the progress bar and instead just print some log details to the screen.

Pass show_progress_bar=False to the Trainer.

FYI: show_progress_bar=False has been deprecated since v0.7.2, but you can use progress_bar_refresh_rate=0 instead.


Update:

progress_bar_refresh_rate was deprecated in v1.5 and will be removed in v1.7. To disable the progress bar, set enable_progress_bar to False.
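To see why a refresh rate of 0 amounts to disabling the bar, here is a minimal plain-Python sketch of a refresh-rate gate (an illustration of the idea, not Lightning's actual implementation; the function name is made up):

```python
def should_refresh(step, refresh_rate):
    """Decide whether a progress bar should redraw at this step.

    refresh_rate == 0 means "never refresh", i.e. the bar is
    effectively disabled; otherwise redraw every `refresh_rate` steps.
    """
    if refresh_rate == 0:
        return False
    return step % refresh_rate == 0

# refresh_rate=0 suppresses every update
print([should_refresh(s, 0) for s in range(4)])  # [False, False, False, False]
# refresh_rate=2 redraws every other step
print([should_refresh(s, 2) for s in range(4)])  # [True, False, True, False]
```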

progress_bar_refresh_rate: How often to refresh progress bar (in steps). Value ``0`` disables progress bar.
    Ignored when a custom progress bar is passed to :paramref:`~Trainer.callbacks`. Default: None, means
    a suitable value will be chosen based on the environment (terminal, Google COLAB, etc.).

    .. deprecated:: v1.5
        ``progress_bar_refresh_rate`` has been deprecated in v1.5 and will be removed in v1.7.
        Please pass :class:`~pytorch_lightning.callbacks.progress.TQDMProgressBar` with ``refresh_rate``
        directly to the Trainer's ``callbacks`` argument instead. To disable the progress bar,
        pass ``enable_progress_bar = False`` to the Trainer.

enable_progress_bar: Whether to enable the progress bar by default.


As far as I know, this issue has not been solved. The pl team pointed out that it is "a TQDM-related thing" and there is nothing they can do about it. You may want to read this issue.

My temporary fix is:

from tqdm import tqdm

from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import ProgressBar

class LitProgressBar(ProgressBar):

    def init_validation_tqdm(self):
        # Return a disabled bar so validation prints nothing
        bar = tqdm(
            disable=True,
        )
        return bar

bar = LitProgressBar()
trainer = Trainer(callbacks=[bar])

This approach only disables the validation progress bar and lets you keep the normal training bar [refer 1 and 2]. Note that using progress_bar_refresh_rate=0 disables updates for all progress bars.
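The effect of disable=True can be checked directly with tqdm itself, independent of Lightning: a disabled bar writes nothing at all, even when given an explicit output stream. A small sketch (the helper function is made up for the demonstration):

```python
import io

from tqdm import tqdm

def run_bar(disabled):
    # Capture whatever the bar writes to its output stream
    buf = io.StringIO()
    for _ in tqdm(range(3), disable=disabled, file=buf):
        pass
    return buf.getvalue()

print(run_bar(disabled=True) == "")       # True -- a disabled bar produces no output
print(len(run_bar(disabled=False)) > 0)   # True -- an enabled bar draws itself
```

The iteration itself still runs normally either way; disable=True only turns tqdm into a pass-through wrapper.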

Further solution: (Update 1, 2021-07-22)

According to this answer, tqdm only seems to glitch in the PyCharm console, so a possible fix is to adjust PyCharm's settings. Fortunately, I found the following works:

  1. Go to "Edit configurations", click on the run/debug configuration that is being used, and check the option "Emulate terminal in output console".

  2. Along with the position argument, also set the leave argument. The code should look like this; I have added ncols so that the progress bar doesn't take up the whole console.

import time

from tqdm import tqdm

for i in tqdm(range(5), position=0, desc="i", leave=False, colour='green', ncols=80):
    for j in tqdm(range(10), position=1, desc="j", leave=False, colour='red', ncols=80):
        time.sleep(0.5)

When the code is now run, the output of the console is as shown below.

i:  20%|████████▍                                 | 1/5 [00:05<00:20,  5.10s/it] 
j:  60%|████████████████████████▌            | 6/10 [00:03<00:02,  1.95it/s] 

After the two steps above, the progress bar displays normally in PyCharm. To apply step 2 in Pytorch-lightning, we need to override the functions init_train_tqdm(), init_validation_tqdm(), and init_test_tqdm() to change ncols. Some code like this (hope it helps):

import sys

from tqdm import tqdm
from pytorch_lightning.callbacks import ProgressBar

class LitProgressBar(ProgressBar):

    def init_train_tqdm(self) -> tqdm:
        """ Override this to customize the tqdm bar for training. """
        bar = tqdm(
            desc='Training',
            initial=self.train_batch_idx,
            position=(2 * self.process_position),
            disable=self.is_disabled,
            leave=True,
            dynamic_ncols=False,  # These two lines are only for PyCharm
            ncols=100,
            file=sys.stdout,
            smoothing=0,
        )
        return bar

    def init_validation_tqdm(self) -> tqdm:
        """ Override this to customize the tqdm bar for validation. """
        # The main progress bar doesn't exist in `trainer.validate()`
        has_main_bar = self.main_progress_bar is not None
        bar = tqdm(
            desc='Validating',
            position=(2 * self.process_position + has_main_bar),
            disable=self.is_disabled,
            leave=False,
            dynamic_ncols=False,
            ncols=100,
            file=sys.stdout
        )
        return bar

    def init_test_tqdm(self) -> tqdm:
        """ Override this to customize the tqdm bar for testing. """
        bar = tqdm(
            desc="Testing",
            position=(2 * self.process_position),
            disable=self.is_disabled,
            leave=True,
            dynamic_ncols=False,
            ncols=100,
            file=sys.stdout
        )
        return bar

If it doesn't work for you, update Pytorch-lightning to the latest version.

If you want to turn the progress bar off, you can configure this in the Trainer. Setting the progress_bar_refresh_rate argument to 0 disables the progress bar, but this setting is ignored if you pass your own progress bar in the callbacks. Note that pl is the pytorch lightning module (import pytorch_lightning as pl), which may differ from your style.

trainer = pl.Trainer(..., progress_bar_refresh_rate=0)

To disable the progress bar, pass enable_progress_bar=False to the Trainer.