How to set the celeryconfig file in Django

I have a Django project on an Ubuntu EC2 node, which I have been using to set up asynchronous tasks with Celery. I have been following http://michal.karzynski.pl/blog/2014/05/18/setting-up-an-asynchronous-task-queue-for-django-using-celery-redis/

I have been able to get a basic task working at the command line, using:

(env1)ubuntu@ip-172-31-22-65:~/projects/tp$ celery --app=tp.celery:app worker --loglevel=INFO

 -------------- celery@ip-172-31-22-65 v3.1.17 (Cipater)
---- **** -----
--- * ***  * -- Linux-3.13.0-44-generic-x86_64-with-Ubuntu-14.04-trusty
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         tp:0x7f66a89c0470
- ** ---------- .> transport:   redis://localhost:6379/0
- ** ---------- .> results:     disabled
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ----
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery

However, if I run other celery commands like the one below, I get the following:

(env1)ubuntu@ip-172-31-22-65:~/projects/tp$ celery worker

[2015-04-03 13:17:21,553: WARNING/MainProcess] /home/ubuntu/.virtualenvs/env1/lib/python3.4/site-packages/celery/apps/worker.py:161:


 -------------- celery@ip-172-31-22-65 v3.1.17 (Cipater)
---- **** -----
--- * ***  * -- Linux-3.13.0-44-generic-x86_64-with-Ubuntu-14.04-trusty
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         default:0x7f1653eae7b8 (.default.Loader)
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     disabled
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ----
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery


[2015-04-03 13:17:21,571: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.

It seems celery thinks I am using amqp as the broker, but I am actually using redis!

Based on Celery tries to connect to the wrong broker, it appears that celery cannot find the configuration file and is falling back to the default one.

In that question they recommend:

import your celery and add your broker like that : 

celery = Celery('task', broker='redis://127.0.0.1:6379')
celery.config_from_object(celeryconfig)
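In that pattern, `celeryconfig` is just a plain Python module importable from the worker's working directory. A minimal sketch (the setting names are standard Celery 3.x; the values here are assumptions):

```python
# celeryconfig.py -- standalone Celery config module (hypothetical values)
BROKER_URL = 'redis://127.0.0.1:6379/0'            # point the worker at redis, not amqp
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0' # optional; omit to keep results disabled
CELERY_TASK_SERIALIZER = 'json'
```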

Where should I do this? Is my celery.py file (below) the same thing as the celery config?

/projects/tp/tp/celery.py

from __future__ import absolute_import

import os
import django
from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'tp.settings')
django.setup()

app = Celery('hello_django')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
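Because this celery.py calls `app.config_from_object('django.conf:settings')`, the broker has to be declared in Django's settings module itself. A minimal fragment, with values assumed to match the working command-line run above:

```python
# tp/settings.py (fragment) -- Celery 3.x reads these uppercase settings
# because celery.py does app.config_from_object('django.conf:settings')
BROKER_URL = 'redis://localhost:6379/0'  # matches the redis transport in the banner
CELERY_ACCEPT_CONTENT = ['json']         # optional hardening; an assumption
CELERY_TASK_SERIALIZER = 'json'
```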

tasks.py:

from __future__ import absolute_import
from celery import shared_task
from django.core.cache import cache


@shared_task
def tester1(param):
    return 'The test task executed with argument "%s" ' % param

tp/tp1/views.py

from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt

from .tasks import tester1  # assumes tasks.py lives in the same app


@csrf_exempt
def tester(request):
    # queue the task asynchronously; the response does not wait for it
    tester1.delay('hi')

    return HttpResponse('HTML')
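For completeness, the view needs a URL route. A minimal sketch for Django 1.x of that era (the route path and module location are assumptions):

```python
# tp/tp1/urls.py -- hypothetical wiring for the tester view above
from django.conf.urls import url

from . import views

urlpatterns = [
    url(r'^tester/$', views.tester),  # hitting this URL queues tester1.delay('hi')
]
```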

/etc/supervisor/conf.d/tp-celery.conf

[program:tp-celery]
command=/home/ubuntu/.virtualenvs/env1/bin/celery --app=tp.celery:app worker --loglevel=INFO
directory=/home/ubuntu/projects/tp
user=ubuntu
numprocs=1
stdout_logfile=/var/log/celery-worker-out.log
stderr_logfile=/var/log/celery-worker-err.log

autostart=true
autorestart=true
startsecs=10

; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 600

; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true

/var/log/celery-worker-out.log

 -------------- celery@ip-172-31-22-65 v3.1.17 (Cipater)
---- **** ----- 
--- * ***  * -- Linux-3.13.0-44-generic-x86_64-with-Ubuntu-14.04-trusty
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         tp:0x7fa33e424cf8
- ** ---------- .> transport:   redis://localhost:6379/0
- ** ---------- .> results:     disabled
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ---- 
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery


[tasks]
  . testapp.tasks.tester1

Don't run just `celery worker` on its own. Run it like `celery -A tp worker -l info`; a bare `celery worker` cannot find your app and picks up the default config (hence the amqp broker).

For `celery inspect`:

celery --app=tp.celery:app inspect active_queues

or simply:

celery -A tp inspect active_queues