Troubleshooting running a Celery worker with supervisor for a Django app
I have a Django app, and my goal is to run a task through Celery, using Redis as the broker.
The project folder structure is as follows:
/mhb11/myfolder/myproject
├── myproject
│ ├── celery.py # The Celery app file
│ ├── __init__.py # The project module file (modified)
│ ├── settings.py # Including Celery settings
│ ├── urls.py
│ └── wsgi.py
├── manage.py
├── celerybeat-schedule
└── myapp
├── __init__.py
├── models.py
├── tasks.py # File containing tasks for this app
├── tests.py
└── views.py
In /etc/supervisor/conf.d I have a celery.conf containing:
[program:celery]
command=/home/mhb11/.virtualenvs/myenv/local/lib/python2.7/site-packages/celery/bin/celery --app=myproject.celery:app worker -l info
command=/home/mhb11/.virtualenvs/myenv/local/lib/python2.7/site-packages/celery/bin/celery --app=myproject.celery:app beat -l info
directory = /home/mhb11/myfolder/myproject
user=mhb11
numprocs=1
stdout_logfile = /etc/supervisor/logs/celery-worker.log
stderr_logfile = /etc/supervisor/logs/celery-worker.log
autostart = true
autorestart = true
startsecs=10
stopwaitsecs = 600
killasgroup = true
priority = 998
And in /etc/supervisor/logs I have an empty file named celery-worker.log. With that set up, I run the following commands:
sudo supervisorctl reread
sudo supervisorctl update
After doing this my Celery workers should start, but they don't; i.e., nothing shows up in the celery-worker.log
file I set up. I have no idea what I'm missing, since this is my first time setting all of this up. Can you help me troubleshoot it?
djcelery
is part of INSTALLED_APPS. The other relevant settings in settings.py
are:
import djcelery
djcelery.setup_loader()
BROKER_URL = 'redis://localhost:6379/0'
BROKER_TRANSPORT = 'redis'
CELERY_IMPORTS = ('myapp.tasks', )
CELERY_ALWAYS_EAGER = False
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_IGNORE_RESULT=True
from datetime import timedelta
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
# CELERYBEAT_SCHEDULE = {
# 'tasks.rank_all_photos': {
# 'task': 'tasks.rank_all_photos',
# 'schedule': timedelta(seconds=30),
# },
# }
CELERY_TIMEZONE = 'UTC'
My celery.py
contains:
# This is the Celery application module
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
app = Celery('myapp', broker='redis://localhost:6379/0', backend='redis://localhost:6379/0',include=['myfolder.myapp.tasks'])
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
app.conf.update(
    CELERY_TASK_RESULT_EXPIRES=3600,
)
if __name__ == '__main__':
    app.start()
__init__.py
contains:
from __future__ import absolute_import
from .celery import app as celery_app1
tasks.py
contains:
import os
from myproject import celery_app1
import time
from myapp.models import Photo
@celery_app1.task(name='tasks.rank_all_photos')
def rank_all_photos():
    for photo in Photo.objects.order_by('-id')[:400]:
        photo.set_rank()
Finally, in my Django admin panel I have also set up a crontab
and a periodic task
.
What should I do to get everything running?
You are running your worker, but a worker only executes tasks; tasks still have to be put on the queue for the worker to find them. Celery beat puts tasks on the queue according to a schedule configured through Django Admin or a schedule file. After beat queues a task, a worker finds it and executes it.
So you need to run a separate celery beat process, as a separate supervisor program:
command=/home/mhb11/.virtualenvs/myenv/local/lib/python2.7/site-packages/celery/bin/celery --app=myproject.celery:app beat -l info
Celery beat is needed when you use periodic/scheduled tasks. If you only queue tasks manually by calling a task's .delay()
method, then you don't need Celery beat running.
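As a sketch of that manual alternative (assuming the worker above is running against the same Redis broker, and the task module is importable), queuing a task yourself looks like this:

```python
# Run inside the Django project, e.g. via `python manage.py shell`.
from myapp.tasks import rank_all_photos

# .delay() serializes the call and pushes it onto the Redis queue;
# a running worker picks it up and executes it asynchronously.
result = rank_all_photos.delay()
print(result.id)  # the task id assigned when it was queued
```

This snippet needs a live broker and worker, so it is illustrative rather than standalone.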
So your 2 supervisor files would be:
Beat
[program:celerybeat]
command=/home/mhb11/.virtualenvs/myenv/local/lib/python2.7/site-packages/celery/bin/celery --app=myproject.celery:app beat -l info
directory = /home/mhb11/myfolder/myproject
user=mhb11
numprocs=1
stdout_logfile = /etc/supervisor/logs/celery-beat.log
stderr_logfile = /etc/supervisor/logs/celery-beat.log
autostart = true
autorestart = true
startsecs=10
stopwaitsecs = 600
killasgroup = true
priority = 998
Worker
[program:celeryworker]
command=/home/mhb11/.virtualenvs/myenv/local/lib/python2.7/site-packages/celery/bin/celery --app=myproject.celery:app worker -l info
directory = /home/mhb11/myfolder/myproject
user=mhb11
numprocs=1
stdout_logfile = /etc/supervisor/logs/celery-worker.log
stderr_logfile = /etc/supervisor/logs/celery-worker.log
autostart = true
autorestart = true
startsecs=10
stopwaitsecs = 600
killasgroup = true
priority = 998
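After saving both files under /etc/supervisor/conf.d, reload supervisor so it picks them up (the same reread/update commands from the question, plus a status check to verify both programs started):

```shell
sudo supervisorctl reread   # parse the new/changed config files
sudo supervisorctl update   # start programs whose config changed
sudo supervisorctl status   # should list celerybeat and celeryworker as RUNNING
tail -f /etc/supervisor/logs/celery-worker.log  # watch worker output
```

If a program shows FATAL or BACKOFF instead of RUNNING, the log files configured above are the first place to look.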