Celery connecting to rabbitmq-server instead of redis-server

I have a Django application that I want to configure to run background tasks.

Packages:

  1. Celery==4.2.1

  2. Django==2.1.3

  3. Python==3.5

  4. Redis-server==3.0.6

The Celery configuration in the settings.py file is:

from celery.schedules import crontab  # needed for crontab() in the beat schedule

CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERY_BEAT_SCHEDULE = {
    'task-number-one': {
            'task': 'app.tasks.task_number_one',
            'schedule': crontab(minute='*/1'),
    },
}
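The crontab(minute='*/1') entry above fires every minute. As a quick illustration of how a '*/n' minute field expands, here is a toy parser for just that pattern (illustrative only, not Celery's actual crontab implementation):

```python
def expand_step(expr, upper=60):
    """Toy expansion of a crontab-style minute field (illustration only)."""
    if expr == '*':
        return list(range(upper))
    if expr.startswith('*/'):
        return list(range(0, upper, int(expr[2:])))
    return [int(expr)]

print(expand_step('*/1'))   # every minute of the hour: [0, 1, ..., 59]
print(expand_step('*/15'))  # [0, 15, 30, 45]
```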

The celery.py file:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings.prod')

app = Celery('project')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

When I run: celery -A project worker -l info -B -E

it points to the rabbitmq server when it should point to the redis-server, as shown below:

 -------------- celery@user-desktop v4.2.1 (windowlicker)
---- **** ----- 
--- * ***  * -- Linux-4.15.0-39-generic-x86_64-with-Ubuntu-18.04-bionic 2018-11-21 12:04:51
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         project:0x7f8b80f78d30
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     redis://localhost:6379/
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: ON
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . app.tasks.task_number_one
  . project.celery.debug_task

[2018-11-21 12:04:51,741: INFO/Beat] beat: Starting...

The same thing happens in the production environment. In production I have deployed the Django application with Gunicorn and Nginx, and now I want to implement some way to run periodic background tasks, since the django-crontab package does not work.

Problem:

  1. What is the problem with celery configuration?

  2. Could anyone please recommend a method to run periodic background tasks?

**Note: I tried implementing supervisor, but it seems supervisor is not compatible with python3, and therefore I could not configure it.**

The broker URL setting changed in v4: since the celery.py in the question calls config_from_object without a namespace, Celery expects the old-style setting names. It should be BROKER_URL instead of CELERY_BROKER_URL.

Replace CELERY_BROKER_URL = 'redis://localhost:6379' with BROKER_URL = 'redis://localhost:6379'. This worked for me.
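Applied to the question's settings.py, the fix would look something like this (a sketch reusing the values from the question; the old-style names assume the namespace-less config_from_object call in the question's celery.py):

```python
# settings.py -- old-style Celery setting names, matching
# app.config_from_object('django.conf:settings') with no namespace argument.
from celery.schedules import crontab

BROKER_URL = 'redis://localhost:6379'             # was CELERY_BROKER_URL
CELERY_RESULT_BACKEND = 'redis://localhost:6379'  # result backend keeps this name
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERYBEAT_SCHEDULE = {                           # old-style name for the beat schedule
    'task-number-one': {
        'task': 'app.tasks.task_number_one',
        'schedule': crontab(minute='*/1'),
    },
}
```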

If you copied the contents of celery.py from the official Celery docs, https://docs.celeryproject.org/en/latest/django/first-steps-with-django.html

try changing the following line from

app.config_from_object('django.conf:settings', namespace='CELERY')

to

app.config_from_object('django.conf:settings', namespace='')
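What the namespace argument does, roughly, is filter the settings by a prefix and strip it off before the values are applied. A simplified re-implementation for illustration (this is not Celery's actual code):

```python
def load_settings(settings, namespace=None):
    """Simplified sketch of namespaced settings loading."""
    if not namespace:
        # Keys are used as-is, so old-style names like BROKER_URL are expected.
        return dict(settings)
    prefix = namespace + '_'
    # Only prefixed keys are consumed; the prefix is stripped off.
    return {k[len(prefix):]: v for k, v in settings.items() if k.startswith(prefix)}

django_settings = {'CELERY_BROKER_URL': 'redis://localhost:6379'}

print(load_settings(django_settings, namespace='CELERY'))
# with namespace='CELERY' the broker setting is found
print(load_settings(django_settings))
# without a namespace there is no BROKER_URL key, so Celery
# falls back to its default amqp:// transport -- the symptom in the question
```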