Airflow worker process not getting triggered - Scheduler throws an error message (Docker compose - Celery Execution Mode)

The same setup works fine on a Mac; I then moved the images and the compose file to a CentOS server. Initially I ran into routing issues, which I resolved by updating the rules with firewalld.
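For reference, the firewalld changes looked roughly like the sketch below. The ports are assumptions (6379 for a Redis broker, 8080 for the Airflow webserver); adjust them to whatever your compose file exposes.

    # Sketch only - open the broker and webserver ports, then reload the rules.
    # 6379 (Redis) and 8080 (webserver) are assumed values, not from the original post.
    firewall-cmd --permanent --zone=public --add-port=6379/tcp
    firewall-cmd --permanent --zone=public --add-port=8080/tcp
    firewall-cmd --reload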

scheduler_1  | Celery Task ID: ('parallel_dag', 'task_4', datetime.datetime(2020, 3, 17, 0, 0, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1)
scheduler_1  | Traceback (most recent call last):
scheduler_1  |   File "/usr/local/lib/python3.7/site-packages/kombu/utils/functional.py", line 42, in __call__
scheduler_1  |     return self.__value__
scheduler_1  | AttributeError: 'ChannelPromise' object has no attribute '__value__'
scheduler_1  |
scheduler_1  | During handling of the above exception, another exception occurred:
scheduler_1  |
scheduler_1  | Traceback (most recent call last):
scheduler_1  |   File "/usr/local/lib/python3.7/site-packages/kombu/transport/virtual/base.py", line 921, in create_channel
scheduler_1  |     return self._avail_channels.pop()
scheduler_1  | IndexError: pop from empty list
scheduler_1  |
scheduler_1  | During handling of the above exception, another exception occurred:
scheduler_1  |
scheduler_1  | Traceback (most recent call last):
scheduler_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/executors/celery_executor.py", line 118, in send_task_to_executor
scheduler_1  |     result = task.apply_async(args=[command], queue=queue)
scheduler_1  |   File "/usr/local/lib/python3.7/site-packages/celery/app/task.py", line 570, in apply_async
scheduler_1  |     **options
scheduler_1  |   File "/usr/local/lib/python3.7/site-packages/celery/app/base.py", line 756, in send_task
scheduler_1  |     amqp.send_task_message(P, name, message, **options)
scheduler_1  |   File "/usr/local/lib/python3.7/site-packages/celery/app/amqp.py", line 552, in send_task_message
scheduler_1  |     **properties
scheduler_1  |   File "/usr/local/lib/python3.7/site-packages/kombu/messaging.py", line 181, in publish
scheduler_1  |     exchange_name, declare,
scheduler_1  |   File "/usr/local/lib/python3.7/site-packages/kombu/connection.py", line 510, in _ensured
scheduler_1  |     return fun(*args, **kwargs)
scheduler_1  |   File "/usr/local/lib/python3.7/site-packages/kombu/messaging.py", line 187, in _publish
scheduler_1  |     channel = self.channel
scheduler_1  |   File "/usr/local/lib/python3.7/site-packages/kombu/messaging.py", line 209, in _get_channel
scheduler_1  |     channel = self._channel = channel()
scheduler_1  |   File "/usr/local/lib/python3.7/site-packages/kombu/utils/functional.py", line 44, in __call__
scheduler_1  |     value = self.__value__ = self.__contract__()
scheduler_1  |   File "/usr/local/lib/python3.7/site-packages/kombu/messaging.py", line 224, in <lambda>
scheduler_1  |     channel = ChannelPromise(lambda: connection.default_channel)
scheduler_1  |   File "/usr/local/lib/python3.7/site-packages/kombu/connection.py", line 852, in default_channel
scheduler_1  |     self.ensure_connection(**conn_opts)

This looks similar to the bug tracked in https://issues.apache.org/jira/browse/AIRFLOW-6527.

To fix it, I had to set the operation_timeout / send_task_timeout values to something higher; by default they are set to 2 (seconds).

In my case I start the processes with Docker, so set your value to something greater than 2 in entrypoint.sh.
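A minimal sketch of what that can look like in entrypoint.sh, assuming you configure Airflow through environment variables (AIRFLOW__CELERY__OPERATION_TIMEOUT maps to operation_timeout in the [celery] section of airflow.cfg; the value 10 is an arbitrary choice, anything comfortably above the default of 2 should do):

    # entrypoint.sh (sketch) - raise the Celery publish timeout before starting Airflow.
    # 10 is an assumed value, not the only correct one.
    export AIRFLOW__CELERY__OPERATION_TIMEOUT=10
    exec airflow "$@"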