Django Model.objects.all() returning empty QuerySet in celery task
project/project/settings.py
...
CELERY_BEAT_SCHEDULE = {
    'find-subdomains': {
        'task': 'subdiscovery.tasks.mytask',
        'schedule': 10.0
    }
}
project/subdiscovery/tasks.py
from __future__ import absolute_import, unicode_literals

from celery import shared_task

from subdiscovery.models import Domain


@shared_task
def mytask():
    print(Domain.objects.all())
    return 99
The celery worker shows an empty QuerySet:
celery_worker_1 | [2019-08-12 07:07:44,229: WARNING/ForkPoolWorker-2] <QuerySet []>
celery_worker_1 | [2019-08-12 07:07:44,229: INFO/ForkPoolWorker-2] Task subdiscovery.tasks.mytask[60c59024-cd19-4ce9-ae69-782a3a81351b] succeeded in 0.004897953000181587s: 99
However, importing the same model in a Python shell works:
./manage.py shell
>>> from subdiscovery.models import Domain
>>> Domain.objects.all()
<QuerySet [<Domain: example1.com>, <Domain: example2.com>, <Domain: example3.com>]>
I should mention this is running in a Docker stack.
EDIT:
OK, if I exec into the running Docker container
docker exec -it <web service container id> /bin/sh
and run
$ celery -A project worker -l info
it works as expected:
[2019-08-13 05:12:28,945: INFO/MainProcess] Received task: subdiscovery.tasks.mytask[7b2760cf-1e7f-41f8-bc13-fa4042eedf33]
[2019-08-13 05:12:28,957: WARNING/ForkPoolWorker-8] <QuerySet [<Domain: uber.com>, <Domain: example1.com>, <Domain: example2.com>, <Domain: example3.com>]>
Here's what the docker-compose.yml looks like:
version: '3'

services:
  web:
    build: .
    image: app-image
    ports:
      - 80:8000
    volumes:
      - .:/app
    command: gunicorn -b 0.0.0.0:8000 project.wsgi

  redis:
    image: "redis:alpine"
    ports:
      - 6379:6379

  celery_worker:
    working_dir: /app
    command: sh -c './wait-for web:8000 && ./wait-for redis:6379 -- celery -A project worker -l info'
    image: app-image
    depends_on:
      - web
      - redis

  celery_beat:
    working_dir: /app
    command: sh -c 'celery -A project beat -l info'
    image: app-image
    depends_on:
      - celery_worker
Any idea why the worker doesn't work when started via docker-compose, but does work when started manually inside the running container?
The problem is that your celery worker doesn't see the sqlite database: the web service bind-mounts the project directory (and the SQLite file in it), but celery_worker has no volumes, so it only sees the stale copy of the database that was baked into the image at build time. You need to either switch to a different database or make your ./app volume visible to the worker:
version: '3'
services:
  ...
  celery_worker:
    working_dir: /app
    command: sh -c './wait-for web:8000 && ./wait-for redis:6379 -- celery -A project worker -l info'
    image: app-image
    volumes:       # <- here
      - .:/app
    depends_on:
      - web
      - redis
  ...
I'd suggest switching to a more production-ready database like Postgres.
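If you do switch, the DATABASES entry in project/project/settings.py could be sketched roughly like this. The host name, credentials, and database name below are placeholder assumptions; the HOST would be the name of a Postgres service you'd add to the compose file, so both the web and worker containers reach the same shared database over the network instead of a per-container file:

```python
# Sketch of a Postgres configuration for Django's settings.py.
# 'db', 'project', and the credentials are hypothetical placeholders;
# they must match the Postgres service you define in docker-compose.yml.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'project',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'db',    # compose service name, resolvable on the compose network
        'PORT': 5432,
    }
}
```

Unlike SQLite, every container that can reach the `db` service sees the same data, so the worker/web mismatch above can't happen.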