Airflow scheduler crashes when we trigger a DAG from the Airflow webserver
The Airflow scheduler process crashes as soon as we turn a DAG on and trigger it from the Airflow webserver.

Airflow version - v1.10.4
Redis server - v5.0.7
Executor - CeleryExecutor

broker_url = 'redis://:password@redis-host:2287/0'
sql_alchemy_conn = postgresql+psycopg2://user:password@host/dbname
result_backend = 'db+postgresql://user:password@host/dbname'
It crashes with the following error message.
scheduler_job.py:1325} ERROR - Exception when executing execute_helper
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/airflow/jobs/scheduler_job.py", line 1323, in _execute
self._execute_helper()
File "/usr/lib/python2.7/site-packages/airflow/jobs/scheduler_job.py", line 1412, in _execute_helper
self.executor.heartbeat()
File "/usr/lib/python2.7/site-packages/airflow/executors/base_executor.py", line 132, in heartbeat
self.trigger_tasks(open_slots)
File "/usr/lib/python2.7/site-packages/airflow/executors/celery_executor.py", line 203, in trigger_tasks
cached_celery_backend = tasks[0].backend
File "/usr/lib/python2.7/site-packages/celery/local.py", line 146, in __getattr__
return getattr(self._get_current_object(), name)
File "/usr/lib/python2.7/site-packages/celery/app/task.py", line 1037, in backend
return self.app.backend
File "/usr/lib/python2.7/site-packages/kombu/utils/objects.py", line 44, in __get__
value = obj.__dict__[self.__name__] = self.__get(obj)
File "/usr/lib/python2.7/site-packages/celery/app/base.py", line 1223, in backend
return self._get_backend()
File "/usr/lib/python2.7/site-packages/celery/app/base.py", line 940, in _get_backend
self.loader)
File "/usr/lib/python2.7/site-packages/celery/app/backends.py", line 74, in by_url
return by_name(backend, loader), url
File "/usr/lib/python2.7/site-packages/celery/app/backends.py", line 54, in by_name
cls = symbol_by_name(backend, aliases)
File "/usr/lib/python2.7/site-packages/kombu/utils/imports.py", line 57, in symbol_by_name
module = imp(module_name, package=package, **kwargs)
File "/usr/lib64/python2.7/importlib/__init__.py", line 37, in import_module
__import__(name)
ImportError: No module named 'db
Why does the scheduler crash when a DAG is triggered?
I tried running pip install DB, but it did not fix the problem.
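One detail worth checking (an observation from the traceback, not confirmed by the asker): the ImportError reports the module name as 'db with a leading quote, which suggests the quotes around result_backend in airflow.cfg are being read as part of the value, so Celery tries to import a backend literally named 'db. Values in airflow.cfg are not quoted, so the [celery] section would be written like:

```
[celery]
broker_url = redis://:password@redis-host:2287/0
result_backend = db+postgresql://user:password@host/dbname
```

If that is the cause, remove the quotes and restart the scheduler and webserver so the new configuration is picked up.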
As the error says, you must not have set up the database correctly. Did you run

$ airflow initdb

before trying to start the webserver? Also, you appear to be using Python 2.7; are you sure it is compatible with the latest version of airflow you are using?
I was using Python 3.5.2 with the latest airflow and it did not work for me, so I had to downgrade my airflow version a bit.
Airflow is not compatible with Python 2.7. Run airflow with Python 3.6, then create the database user and grant it privileges, then run the command "airflow initdb". This initializes Airflow's metadata database.
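A minimal sketch of the database-setup steps described above, assuming PostgreSQL; the user, password, and database names (airflow_user, airflow_pass, airflow_db) are placeholders, not values from the question:

```
# Create a database and user for Airflow (names are placeholders).
sudo -u postgres psql <<'SQL'
CREATE USER airflow_user WITH PASSWORD 'airflow_pass';
CREATE DATABASE airflow_db OWNER airflow_user;
GRANT ALL PRIVILEGES ON DATABASE airflow_db TO airflow_user;
SQL

# Point airflow.cfg at the new database, e.g.:
#   sql_alchemy_conn = postgresql+psycopg2://airflow_user:airflow_pass@localhost/airflow_db

# Initialize the Airflow metadata database (command name in 1.10.x).
airflow initdb
```

In Airflow 2.x the last command was renamed to `airflow db init`, but for the 1.10.4 version in the question `airflow initdb` is correct.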