Custom Django Management Command not being executed by Celery Task
I created this custom management command in my Django application, which deletes all but one record from a database table:
from django.core.management.base import BaseCommand

from tokenizer.models import OauthToken


class Command(BaseCommand):
    help = 'Deletes all but the most recent oauthtoken'

    def handle(self, *args, **options):
        latest_token_id = OauthToken.objects.latest("gen_time").id
        OauthToken.objects.exclude(id=latest_token_id).delete()
and it works as expected when run manually, like this:
python manage.py oauth_table_clearout
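The handle() logic boils down to: keep the row with the greatest gen_time, delete everything else. As a plain-Python illustration of that selection step (the sample data is made up; no Django involved):

```python
from datetime import datetime

# Hypothetical stand-ins for OauthToken rows
tokens = [
    {"id": 1, "gen_time": datetime(2020, 11, 26, 9, 0)},
    {"id": 2, "gen_time": datetime(2020, 11, 26, 12, 0)},
    {"id": 3, "gen_time": datetime(2020, 11, 26, 10, 30)},
]

# Equivalent of OauthToken.objects.latest("gen_time").id
latest_id = max(tokens, key=lambda t: t["gen_time"])["id"]

# Equivalent of .exclude(id=latest_id).delete()
tokens = [t for t in tokens if t["id"] == latest_id]
# tokens now holds only the most recent token
```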
However, when I try to have a Celery task execute it, the task appears to be picked up and to succeed, but the records are not deleted from the database, and no obvious error is given.
I'm running this with docker-compose, like so:
version: '3.7'
services:
  redis:
    image: redis:alpine
  django:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    env_file:
      - ./.env
    depends_on:
      - redis
  celery:
    build: .
    command: celery -A token_generator worker -l debug --without-gossip --without-mingle --without-heartbeat -Ofair --pool=solo
    volumes:
      - .:/usr/src/app/
    depends_on:
      - redis
    env_file:
      - ./.env
  celery-beat:
    build: .
    command: celery -A token_generator beat -l debug
    volumes:
      - .:/usr/src/app/
    depends_on:
      - redis
    env_file:
      - ./.env
Note that I have already tried appending --without-gossip --without-mingle --without-heartbeat -Ofair to the worker command (which seems to have fixed this particular problem for everyone else!).
The logs look like this:
celery-beat_1 | [2020-11-26 21:51:00,049: DEBUG/MainProcess] beat: Synchronizing schedule...
celery-beat_1 | [2020-11-26 21:51:00,056: INFO/MainProcess] Scheduler: Sending due task oauth_task (token_generator.tasks.oauth_db_clearout_task)
celery-beat_1 | [2020-11-26 21:51:00,065: DEBUG/MainProcess] token_generator.tasks.oauth_db_clearout_task sent. id->ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f
celery-beat_1 | [2020-11-26 21:51:00,067: DEBUG/MainProcess] beat: Waking up in 59.92 seconds.
celery_1 | [2020-11-26 21:51:00,070: INFO/MainProcess] Received task: token_generator.tasks.oauth_db_clearout_task[ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f]
celery_1 | [2020-11-26 21:51:00,076: DEBUG/MainProcess] TaskPool: Apply <function _fast_trace_task at 0x7f32013b3c10> (args:('token_generator.tasks.oauth_db_clearout_task', 'ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f', {'lang': 'py', 'task': 'token_generator.tasks.oauth_db_clearout_task', 'id': 'ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'retries': 0, 'timelimit': [None, None], 'root_id': 'ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f', 'parent_id': None, 'argsrepr': '()', 'kwargsrepr': '{}', 'origin': 'gen1@328b6b324d84', 'reply_to': '0513ed80-806d-33c4-aa3f-83f942c27d0d', 'correlation_id': 'ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f', 'hostname': 'celery@6735220ff248', 'delivery_info': {'exchange': '', 'routing_key': 'celery', 'priority': 0, 'redelivered': None}, 'args': [], 'kwargs': {}}, b'[[], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]', 'application/json', 'utf-8') kwargs:{})
celery_1 | [2020-11-26 21:51:00,077: DEBUG/MainProcess] Task accepted: token_generator.tasks.oauth_db_clearout_task[ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f] pid:1
celery_1 | [2020-11-26 21:51:00,106: INFO/MainProcess] Task token_generator.tasks.oauth_db_clearout_task[ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f] succeeded in 0.028364189998683287s: None
The celery.py file in my application:
import os
from celery import Celery
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "token_generator.settings")
app = Celery("token_generator")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
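The tasks.py itself is not shown above; presumably it wraps the management command with call_command. A minimal sketch of what it might contain (the body here is an assumption — only the task path token_generator.tasks.oauth_db_clearout_task appears in the question):

```python
# token_generator/tasks.py -- hypothetical reconstruction
from celery import shared_task
from django.core.management import call_command


@shared_task
def oauth_db_clearout_task():
    # Invoke the custom management command shown above
    call_command("oauth_table_clearout")
```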
and the Celery-related parts of settings.py (note that crontab must be imported for the schedule to load):

from celery.schedules import crontab

CELERY_BROKER_URL = "redis://redis:6379"
CELERY_RESULT_BACKEND = "redis://redis:6379"

CELERY_BEAT_SCHEDULE = {
    "oauth_task": {
        "task": "token_generator.tasks.oauth_db_clearout_task",
        "schedule": crontab(minute="*/1"),
    },
}
Output of celery report:
software -> celery:5.0.2 (singularity) kombu:5.0.2 py:3.8.2
billiard:3.6.3.0 py-amqp:5.0.2
platform -> system:Linux arch:64bit
kernel version:5.4.0-53-generic imp:CPython
loader -> celery.loaders.default.Loader
settings -> transport:amqp results:disabled
deprecated_settings: None
Django is 3.1.3.
I found the answer myself. Each container had its own copy of the sqlite database. The command was in fact executing, but only against the copy of the database inside the celery container. The database my IDE was inspecting lived in a different container, so it was untouched.
I added an extra postgres service to the docker-compose config, plus a .dockerignore file so the sqlite database is not copied into the images.
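For reference, a minimal sketch of what that extra service might look like — the service name, credentials, and volume name here are illustrative, not from the original project, and Django's DATABASES setting must be pointed at this service for every container to share it:

```yaml
services:
  db:
    image: postgres:13-alpine
    environment:
      POSTGRES_DB: token_generator
      POSTGRES_USER: token_generator
      POSTGRES_PASSWORD: changeme
    volumes:
      - postgres_data:/var/lib/postgresql/data

volumes:
  postgres_data:
```

and a .dockerignore entry so the stray sqlite file never ends up in an image (assuming the default filename):

```
db.sqlite3
```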