Celery + Django not working at the same time
I have a Django 2.0 project that runs fine and is integrated with Celery 4.1.0. I'm using jQuery to send AJAX requests to the backend, but I just realized that they load endlessly because of some problem with Celery.
Celery settings (celery.py)
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'converter.settings')
app = Celery('converter', backend='amqp', broker='amqp://guest@localhost//')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
Celery task (tasks.py)
from __future__ import absolute_import, unicode_literals
from celery import shared_task
@shared_task(time_limit=300)
def add(number1, number2):
    return number1 + number2
Django view (views.py)
class AddAjaxView(JSONResponseMixin, AjaxResponseMixin, View):
    def post_ajax(self, request, *args, **kwargs):
        url = request.POST.get('number', '')
        task = tasks.convert.delay(url, client_ip)
        result = AsyncResult(task.id)
        data = {
            'result': result.get(),
            'is_ready': True,
        }
        if result.successful():
            return self.render_json_response(data, status=200)
When I send an AJAX request to the Django app it loads endlessly, but when I kill the Django server and run celery -A demoproject worker --loglevel=info, that is when my tasks run.
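The hang is consistent with the result.get() call in the view: it blocks the HTTP request until a worker has produced a result, so with no worker running the request waits forever. The non-blocking shape such views usually aim for (enqueue, return a task id, let the client poll) can be illustrated with a stdlib-only toy, independent of Celery, where `enqueue` and `poll` are hypothetical stand-ins for `.delay()` and `AsyncResult.ready()`:

```python
import threading
import time
import uuid

results = {}  # toy stand-in for a Celery result backend

def enqueue(func, *args):
    """Start work in the background and return a task id immediately."""
    task_id = str(uuid.uuid4())

    def run():
        results[task_id] = func(*args)

    threading.Thread(target=run).start()
    return task_id

def poll(task_id):
    """Non-blocking status check, analogous to AsyncResult.ready()."""
    if task_id in results:
        return {'is_ready': True, 'result': results[task_id]}
    return {'is_ready': False, 'result': None}

task_id = enqueue(lambda a, b: a + b, 2, 3)
status = poll(task_id)          # may not be ready yet; a view would return anyway
while not status['is_ready']:   # the client polls again later
    time.sleep(0.05)
    status = poll(task_id)
print(status)  # → {'is_ready': True, 'result': 5}
```

The key point: the request that enqueues the work never waits on the result, so it cannot hang even if no worker is alive.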
Question
How can I automate this, so that when I run the Django project my Celery tasks run automatically whenever I send an AJAX request?
In a development environment you have to run the Celery worker manually, because it does not start automatically in the background to process the jobs in the queue. So for a working setup you need both the default Django server and a Celery worker running. As stated in the docs:
In a production environment you’ll want to run the worker in the background as a daemon - see Daemonization - but for testing and development it is useful to be able to start a worker instance by using the celery worker manage command, much as you’d use Django’s manage.py runserver:
celery -A proj worker -l info
You can read their daemonization documentation:
http://docs.celeryproject.org/en/latest/userguide/daemonizing.html
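In practice that means two processes side by side during development. Assuming the project is named converter, as in the celery.py above (the quoted docs use proj, and the question used demoproject), a typical two-terminal workflow looks like:

```shell
# Terminal 1: the Django development server
python manage.py runserver

# Terminal 2: a Celery worker to process queued tasks
celery -A converter worker --loglevel=info
```

Once both are up, .delay() calls from the view are picked up by the worker instead of sitting unprocessed in the queue.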