Celery tasks from different applications in different log files

I am configuring Celery on a FreeBSD server and I have run into a problem with the log files.

My setup:

My Celery configuration files:

In /etc/default/celeryd_app1 I have:

# Names of nodes to start
CELERYD_NODES="worker"

# Absolute or relative path to the 'celery' command:
CELERY_BIN="/usr/local/www/app1/venv/bin/celery"

# App instance to use
CELERY_APP="main"

# Where to chdir at start.
CELERYD_CHDIR="/usr/local/www/app1/src/"

# Extra command-line arguments to the worker
CELERYD_OPTS="--time-limit=300 --concurrency=8"

# Set logging level to DEBUG
#CELERYD_LOG_LEVEL="DEBUG"

# %n will be replaced with the first part of the nodename.
CELERYD_LOG_FILE="/var/log/celery/app1/%n%I.log"
CELERYD_PID_FILE="/var/run/celery/app1/%n.pid"

# Workers should run as an unprivileged user.
CELERYD_USER="celery"
CELERYD_GROUP="celery"

# If enabled pid and log directories will be created if missing,
# and owned by the userid/group configured.
CELERY_CREATE_DIRS=1

I have a celeryd_app2 with exactly the same content.

Django settings file with the Celery settings:

CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_IGNORE_RESULT = False
CELERY_TASK_TRACK_STARTED = True
# Add a one-minute timeout to all Celery tasks.
CELERYD_TASK_SOFT_TIME_LIMIT = 60
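For reference, these CELERY_* settings are normally read by a celery.py module inside the Django project (here called main, matching CELERY_APP="main" above). A minimal sketch of such a module, following the standard Django/Celery layout from the Celery docs; the main.settings path is an assumption about your project:

# main/celery.py -- minimal sketch, assuming the standard Django/Celery layout
import os

from celery import Celery

# Point Celery at the Django settings module (the exact path is an assumption).
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'main.settings')

app = Celery('main')

# Load every CELERY_*-prefixed setting shown above from the Django settings file.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Discover tasks.py modules in the installed Django apps.
app.autodiscover_tasks()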

Both settings files use the same Redis port.

My problem:

When I execute a Celery task for app1, the log for that task ends up in app2's log file, with errors like:

Received unregistered task of type 'app1.task.my_task_for_app1'
...
KeyError: 'app1.task.my_task_for_app1'

Is there a problem with my Celery configuration files? Do I have to use different Redis ports? If so, how do I do that?

Thanks a lot.

I think the problem is that you are using the same Redis database for both applications:

CELERY_BROKER_URL = 'redis://localhost:6379'

See the guide on using Redis as a broker. Because both workers consume the default queue from the same database, each worker can pick up the other application's tasks, which it has never registered. Just use a different database for each application, e.g.

CELERY_BROKER_URL = 'redis://localhost:6379/0'  # app1

CELERY_BROKER_URL = 'redis://localhost:6379/1'  # app2
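Since CELERY_RESULT_BACKEND in your settings points at the same Redis instance as well, you will probably want to split it the same way. A minimal sketch, assuming app1 uses database 0 and app2 uses database 1:

# app1 settings
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'

# app2 settings
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'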