flake8 ignore list isn't understood by Azure DevOps pipeline

I want to capture the logs that ontology_tagger.ipynb sends to the stream.

The file in question lives 2 folders up/back and 1 folder over from the test (see the layout sketch below the import):

from ontology_tagger.notebooks.ontology_tagger import main
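For context, the relevant layout (paths as they appear in the build log further down); the notebook is the file that defines main():

/home/worker/python/ontology_tagger/          <- pytest/flake8 rootdir
└── ontology_tagger/
    ├── notebooks/
    │   └── ontology_tagger.ipynb             <- defines main()
    └── tests/
        └── test_ontology_tagger.py           <- the testing.py shown below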

I have tried various solutions from this post, but without success:

ModuleNotFoundError: No module named 'ontology_tagger.notebooks.ontology_tagger'

How can I import the notebook MyCode.ipynb into testing.py and call the function by name?

Note: the unit test I want to write is specifically for testing the logging in my .ipynb file. Not to be confused with the debug log at the bottom.
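For reference, a minimal, self-contained example of the kind of log capture I am after (no notebook import involved), using unittest's assertLogs:

import logging
import unittest


class MinimalLoggingTest(unittest.TestCase):
    def test_captures_error_log(self):
        # assertLogs captures records at level INFO and above by default
        with self.assertLogs() as captured:
            logging.getLogger().error('Started')
        self.assertEqual(captured.records[0].getMessage(), 'Started')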


Attempted solution

pip install import_ipynb

import import_ipynb  # imported for its side effect of allowing .ipynb files to be imported
from ontology_tagger.notebooks.ontology_tagger import main

However, because I never reference import_ipynb directly, Azure DevOps raises a linting error:

#17 2.530 /home/worker/python/ontology_tagger/ontology_tagger/tests/test_ontology_tagger.py:19:1: F401 'import_ipynb' imported but unused

So, in the terminal:

flake8 --per-file-ignores="test_ontology_tagger.py:F401"

However, the error above still shows up :(


testing.py:

import unittest
from unittest import TestCase
import sys
import logging

from ontology_tagger.notebooks.ontology_tagger import main


class TestExample(TestCase):
    def test_logging(self):
        with self.assertLogs() as captured:
            main()  ########## HERE !
            print('captured.records: ', captured.records)
            print('list(captured.records): ', list(captured.records))
            self.assertTrue(len(captured.records) > 2)
            self.assertTrue("Started" in captured.records[0].getMessage())

            success_log_msgs = ['self.tokenizer.model_max_length >= DEFAULT_MODEL_MAX_LEN:', 'self._num_classes:']  # successful runtime logs in 'validation()' methods

            for slm in success_log_msgs:
                # True if the message occurs (fully or as a substring) in any captured record
                self.assertTrue(any(slm in log.getMessage() for log in captured.records))
            # check that there are more records than just the expected success messages
            self.assertTrue(len(captured.records) > len(success_log_msgs))

if __name__ == '__main__':
    unittest.main()

MyCode.ipynb:

import logging
import sys


def main():
    logger = logging.getLogger()
    streamHandler = logging.StreamHandler(sys.stdout)
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    streamHandler.setFormatter(formatter)
    logger.addHandler(streamHandler)
    logger.error('Started')
    # ...
    logger.error('Finished')

Raw build log from Azure DevOps:

#17 [test 5/5] RUN cd ontology_tagger && poetry run invoke deploy
#17 sha256:80a75555cf9574e61fc67c0cfb698e36af3784a913f0e5be863aa61acc4d7fc8
#17 2.600 ============================= test session starts ==============================
#17 2.600 platform linux -- Python 3.7.12, pytest-3.10.1, py-1.10.0, pluggy-1.0.0
#17 2.600 rootdir: /home/worker/python/ontology_tagger, inifile: pytest.ini
#17 2.600 collecting ... 
collecting 0 items / 1 errors                                                  
collected 0 items / 1 errors                                                   
#17 3.558 
#17 3.558 ==================================== ERRORS ====================================
#17 3.559 ________ ERROR collecting ontology_tagger/tests/test_ontology_tagger.py ________
#17 3.559 ImportError while importing test module '/home/worker/python/ontology_tagger/ontology_tagger/tests/test_ontology_tagger.py'.
#17 3.559 Hint: make sure your test modules/packages have valid Python names.
#17 3.559 Traceback:
#17 3.559 tests/test_ontology_tagger.py:19: in <module>
#17 3.559     from ontology_tagger.notebooks.ontology_tagger import main
#17 3.559 E   ModuleNotFoundError: No module named 'ontology_tagger.notebooks.ontology_tagger'
#17 3.559 !!!!!!!!!!!!!!!!!!! Interrupted: 1 errors during collection !!!!!!!!!!!!!!!!!!!!
#17 3.559 =========================== 1 error in 0.96 seconds ============================
#17 ERROR: executor failed running [/bin/sh -c cd ontology_tagger && poetry run invoke deploy]: exit code: 2
------
 > [test 5/5] RUN cd ontology_tagger && poetry run invoke deploy:
------
executor failed running [/bin/sh -c cd ontology_tagger && poetry run invoke deploy]: exit code: 2
##[error]Bash exited with code '1'.
Finishing: Test worker

If there is anything else I should add to the post, please let me know.

For per-file-ignores to work, you need a glob which matches the path of your file.

So if you use --per-file-ignores tests/test_ontology_tagger.py:F401 then it should work as you intend.
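The same mapping can also live in your flake8 configuration rather than on the command line, in a .flake8 file or the [flake8] section of setup.cfg / tox.ini. A sketch, assuming flake8 is run from the repository root shown in your build log (adjust the path/glob otherwise):

[flake8]
per-file-ignores =
    ontology_tagger/tests/test_ontology_tagger.py:F401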

However, a better solution would be to use an inline ignore on the offending import:

import import_ipynb  # noqa: F401
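Applied to your test module, that looks something like the following; the noqa: F401 tells flake8 that the unused import is intentional, since import_ipynb is needed only for its import-hook side effect:

import import_ipynb  # noqa: F401
from ontology_tagger.notebooks.ontology_tagger import main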

Disclaimer: I'm the current flake8 maintainer.