Celery worker with Redis broker can't execute Django task
These days I'm learning Python (2.7)/Django (1.5) by building my own Reddit clone (on Ubuntu 14.04 LTS). I'm trying to hook up Celery (3.1) with Redis, using it to periodically run a ranking algorithm as a task (on my local setup). Unfortunately, I can't get this simple task to execute even once! Can you help me spot what I'm doing wrong?
Here's my directory structure:
-unconnectedreddit (manage.py is here)
    -links (tasks.py, models.py, views.py, admin.py)
    -unconnectedreddit (celery.py, __init__.py, settings.py, urls.py)
    -static
    -templates
celery.py:
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'unconnectedreddit.settings')

app = Celery('unconnectedreddit', broker='redis://localhost:6379/0', backend='redis://localhost:6379/0', include=['unconnectedreddit.links.tasks'])
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
app.conf.update(
    CELERY_TASK_RESULT_EXPIRES=3600,
)

if __name__ == '__main__':
    app.start()
The additions to settings.py are as follows. Note that I ran migrate after adding 'djcelery' to installed apps:
INSTALLED_APPS = ('djcelery',)

import djcelery
djcelery.setup_loader()

BROKER_URL = 'redis://localhost:6379/0'
CELERY_IMPORTS = ('links.tasks', )
CELERY_ALWAYS_EAGER = False
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_IGNORE_RESULT = True

from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'rank_all-every-30-seconds': {
        'task': 'tasks.rank_all',
        'schedule': timedelta(seconds=30),
    },
}

CELERY_TIMEZONE = 'UTC'
__init__.py:
from __future__ import absolute_import
from .celery import app as celery_app1
tasks.py:
import os
from unconnectedreddit import celery_app1
import time
from links.models import Link

@celery_app1.task
def rank_all():
    for link in Link.with_votes.all():
        link.set_rank()  # ranks interesting 'links' submitted by users
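As a sanity check, the task can also be enqueued by hand from python manage.py shell to see whether the worker picks it up at all (just a debugging aid, not part of the app):

from links.tasks import rank_all
rank_all.delay()  # should appear in the worker log if routing works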
I'm running this command in the terminal to start the worker: celery -A unconnectedreddit worker -l info
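Since the schedule lives in CELERYBEAT_SCHEDULE, a beat process also has to be running to dispatch the periodic task. One way to start it alongside the worker during development is Celery 3.1's embedded scheduler:

celery -A unconnectedreddit worker -B -l info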
The output I get from the worker is as follows:
-------------- celery@has-VirtualBox v3.1.18 (Cipater)
---- **** -----
--- * *** * -- Linux-3.16.0-30-generic-x86_64-with-Ubuntu-14.04-trusty
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: unconnectedreddit:0x7f938b838910
- ** ---------- .> transport: redis://localhost:6379/0
- ** ---------- .> results: redis://localhost:6379/0
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ----
--- ***** ----- [queues]
-------------- .> celery exchange=celery(direct) key=celery
[tasks]
. links.tasks.rank_all
[2015-06-04 12:01:17,083: INFO/MainProcess] Connected to redis://localhost:6379/0
[2015-06-04 12:01:17,098: INFO/MainProcess] mingle: searching for neighbors
[2015-06-04 12:01:18,107: INFO/MainProcess] mingle: all alone
/home/has/.virtualenvs/unconnectedreddit/local/lib/python2.7/site-packages/celery/fixups/django.py:265: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2015-06-04 12:01:18,136: WARNING/MainProcess] /home/has/.virtualenvs/unconnectedreddit/local/lib/python2.7/site-packages/celery/fixups/django.py:265: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2015-06-04 12:01:18,137: WARNING/MainProcess] celery@has-VirtualBox ready.
That's it. I've set this task to run periodically every 30 seconds (see my CELERYBEAT_SCHEDULE). But my code doesn't get it to execute even once -- the rankings on my reddit clone aren't changing at all. Can any expert point out what I'm missing in this setup?
Wrong task name. The worker had registered the task as links.tasks.rank_all (see the [tasks] section of the worker output above), while the beat schedule referred to it as tasks.rank_all, so the two never matched. Solved this by changing the task decorator to @celery_app1.task(name='tasks.rank_all') and adjusting my beat schedule to carry the matching name:
CELERYBEAT_SCHEDULE = {
    'tasks.rank_all': {
        'task': 'tasks.rank_all',
        'schedule': timedelta(seconds=30),
    },
}
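For reference, the decorator in tasks.py now looks like this (rest of the file unchanged):

@celery_app1.task(name='tasks.rank_all')
def rank_all():
    for link in Link.with_votes.all():
        link.set_rank()  # ranks interesting 'links' submitted by users

The reverse fix should work just as well: leave the decorator bare and point the schedule's 'task' key at 'links.tasks.rank_all', the name shown in the worker's [tasks] list.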