Celery periodic tasks are only processed if a POST and GET request is sent to the routes containing .delay() and AsyncResult()

I want to run this task every three minutes. This is what I have.

tasks.py

import asyncio
import logging

from celery import shared_task

logger = logging.getLogger(__name__)

@shared_task
def save_hackathon_to_db():
    logger.info('ran')
    loop = asyncio.get_event_loop()
    statuses = ['ended', 'open', 'upcoming']
    loop.run_until_complete(send_data(statuses))  # send_data: async coroutine defined elsewhere
    logger.info('ended')
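
As an aside, on Python 3.7+ the event-loop boilerplate can be replaced with asyncio.run, which creates a fresh loop, runs the coroutine, and closes the loop again; this is generally safer inside prefork worker processes than reusing asyncio.get_event_loop(). A sketch of the same task (send_data is still assumed to be defined elsewhere):

@shared_task
def save_hackathon_to_db():
    logger.info('ran')
    # asyncio.run manages loop creation and teardown on every call.
    asyncio.run(send_data(['ended', 'open', 'upcoming']))
    logger.info('ended')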

settings.py

from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "devpost_api_task": {
        "task": "backend.tasks.save_hackathon_to_db",
        "schedule": crontab(minute="*/3"),  # every 3 minutes
    },
}
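
For this schedule to load, the project's Celery app must read the CELERY_* settings from Django. backend/celery.py is not shown in the question, so the module path and names below are assumptions; a minimal sketch of the usual wiring:

import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'backend.settings')

app = Celery('backend')
# Pick up CELERY_BEAT_SCHEDULE and the other CELERY_* keys from settings.py.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Discover @shared_task functions in each installed app's tasks.py.
app.autodiscover_tasks()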

However, the task does not run every 3 minutes. It only runs when I send a POST request to http://0.0.0.0:8000/hackathon/task followed by a GET request to http://0.0.0.0:8000/hackathon/<task_id>.

Here is the view code for each of those routes:

from celery.result import AsyncResult
from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt

from backend.tasks import save_hackathon_to_db


@csrf_exempt
def run_task(request):
    if request.method == 'POST':
        # Queue the task on the broker and return its id right away.
        task = save_hackathon_to_db.delay()
        return JsonResponse({"task_id": task.id}, status=202)


@csrf_exempt
def get_status(request, task_id):
    # Look up the task's state and result in the result backend.
    task_result = AsyncResult(task_id)
    result = {
        "task_id": task_id,
        "task_status": task_result.status,
        "task_result": task_result.result,
    }
    return JsonResponse(result, status=200)
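
The URLs mentioned above suggest routing along these lines; urls.py is not shown in the question, so the exact patterns and names are an assumption:

# urls.py (sketch)
from django.urls import path

from . import views

urlpatterns = [
    path('hackathon/task', views.run_task, name='run_task'),
    path('hackathon/<str:task_id>', views.get_status, name='get_status'),
]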

docker-compose.yml

version: "3.9"

services:
  backend:
    build: ./backend
    ports:
      - "8000:8000"
    command: python3 manage.py runserver 0.0.0.0:8000
    depends_on:
      - db
      - redis
  celery:
    build: ./backend
    command: celery -A backend worker -l INFO --logfile=logs/celery.log
    volumes:
      - ./backend:/usr/src/app
    depends_on:
      - backend
      - redis
  celery-beat:
    build: ./backend
    command: celery -A backend beat -l info
    volumes:
      - ./backend/:/usr/src/app/
    depends_on:
      - redis
  dashboard:
    build: ./backend
    command: flower -A backend --port=5555 --broker=redis://redis:6379/0
    ports:
      - "5555:5555"
    depends_on:
      - backend
      - redis
      - celery
  db:
    image: postgres
    volumes:
      - postgres_data:/var/lib/postgresql/data
    environment:
      POSTGRES_DB: ${DB_NAME}
      POSTGRES_USER: ${DB_USER}
      POSTGRES_PASSWORD: ${DB_PASSWORD}
  redis:
    image: redis:6-alpine
volumes:
  postgres_data:

Edit #1: I doubt possible solution #1 (below) applies, because I do have the producer running.

These are the initial logs I get from celery-beat:

celery-beat_1  | celery beat v4.4.7 (cliffs) is starting.
celery-beat_1  | ERROR: Pidfile (celerybeat.pid) already exists.
celery-beat_1  | Seems we're already running? (pid: 1)

celery logs

celery_1       | /usr/local/lib/python3.9/site-packages/celery/platforms.py:800: RuntimeWarning: You're running the worker with superuser privileges: this is
celery_1       | absolutely not recommended!
celery_1       | 
celery_1       | Please specify a different user using the --uid option.
celery_1       | 
celery_1       | User information: uid=0 euid=0 gid=0 egid=0
celery_1       | 
celery_1       |   warnings.warn(RuntimeWarning(ROOT_DISCOURAGED.format(
celery_1       |  
celery_1       |  -------------- celery@948692de8c79 v4.4.7 (cliffs)
celery_1       | --- ***** ----- 
celery_1       | -- ******* ---- Linux-5.10.25-linuxkit-x86_64-with 2021-09-10 19:59:53
celery_1       | - *** --- * --- 
celery_1       | - ** ---------- [config]
celery_1       | - ** ---------- .> app:         backend:0x7f6af44a8130
celery_1       | - ** ---------- .> transport:   redis://redis:6379/0
celery_1       | - ** ---------- .> results:     redis://redis:6379/0
celery_1       | - *** --- * --- .> concurrency: 4 (prefork)
celery_1       | -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
celery_1       | --- ***** ----- 
celery_1       |  -------------- [queues]
celery_1       |                 .> celery           exchange=celery(direct) key=celery
celery_1       |                 
celery_1       | 
celery_1       | [tasks]
celery_1       |   . backend.tasks.save_hackathon_to_db
celery_1       | 
[2021-09-10 19:59:53,949: INFO/MainProcess] Connected to redis://redis:6379/0
[2021-09-10 19:59:53,960: INFO/MainProcess] mingle: searching for neighbors
[2021-09-10 19:59:54,988: INFO/MainProcess] mingle: all alone
[2021-09-10 19:59:55,019: WARNING/MainProcess] /usr/local/lib/python3.9/site-packages/celery/fixups/django.py:205: UserWarning: Using settings.DEBUG leads to a memory
            leak, never use this setting in production environments!
  warnings.warn('''Using settings.DEBUG leads to a memory
[2021-09-10 19:59:55,020: INFO/MainProcess] celery@948692de8c79 ready.
[2021-09-10 19:59:59,464: INFO/MainProcess] Events of group {task} enabled by remote.

dashboard logs

dashboard_1    | [I 210910 19:59:54 command:135] Visit me at http://localhost:5555
dashboard_1    | [I 210910 19:59:54 command:142] Broker: redis://redis:6379/0
dashboard_1    | [I 210910 19:59:54 command:143] Registered tasks: 
dashboard_1    |     ['backend.tasks.save_hackathon_to_db',
dashboard_1    |      'celery.accumulate',
dashboard_1    |      'celery.backend_cleanup',
dashboard_1    |      'celery.chain',
dashboard_1    |      'celery.chord',
dashboard_1    |      'celery.chord_unlock',
dashboard_1    |      'celery.chunks',
dashboard_1    |      'celery.group',
dashboard_1    |      'celery.map',
dashboard_1    |      'celery.starmap']
dashboard_1    | [I 210910 19:59:54 mixins:229] Connected to redis://redis:6379/0
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method reserved failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method scheduled failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method conf failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method active_queues failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method stats failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method registered failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method active failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method revoked failed

What we know:

  • Your Django app is running, since it accepts GET/POST requests
    • e.g. python manage.py runserver
  • Your Celery worker (the consumer) is running, since it accepts the async .delay() requests from the Django app
    • e.g. celery --app=my_proj worker --loglevel=INFO

Possible solution 1

Check whether the Celery scheduler (the producer) is also running.

  • e.g. celery --app=my_proj beat --loglevel=INFO

Unless you run the scheduler embedded inside the worker, be aware that they are different entities. The worker, a.k.a. the consumer, is the one that receives and processes requests, while the scheduler, a.k.a. the producer, is the one that sends/triggers them. Without a running scheduler, the scheduled tasks are of course never queued, so no processing ever happens.
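
If you prefer a single process during local development, the worker can embed the scheduler via the -B flag (not recommended for production):

$ celery --app=my_proj worker -B --loglevel=INFO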

More on running the scheduler here.
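
Worth noting: the celery-beat logs above end with ERROR: Pidfile (celerybeat.pid) already exists., which means beat exited instead of starting, so the scheduler was likely not running even though the container was up. A common workaround, assuming the pidfile is a stale leftover in the image or a mounted volume, is to disable it in the compose command:

command: celery -A backend beat -l info --pidfile=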

Possible solution 2

Check the error logs of the worker instance (the consumer). It may be that the scheduler (the producer) is running and enqueuing the tasks, but the worker cannot process them, possibly because the task implementation cannot be found. In that case, make sure the location of the tasks is included in the Celery app's configuration, e.g. celery_app.conf.update(imports=['backend.tasks']).
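
A sketch of that configuration, with the broker URL taken from the logs above and the app name assumed:

from celery import Celery

celery_app = Celery('backend', broker='redis://redis:6379/0')
# Explicitly list the modules that define tasks so the worker can resolve
# 'backend.tasks.save_hackathon_to_db' when the scheduler enqueues it.
celery_app.conf.update(imports=['backend.tasks'])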


To verify that the scheduler is running correctly, its logs must show that it enqueues the task every 3 minutes:

$ celery --app=my_proj beat --loglevel=INFO
celery beat v5.1.2 (sun-harmonics) is starting.
__    -    ... __   -        _
LocalTime -> 2021-09-11 01:38:05
Configuration ->
    . broker -> amqp://guest:**@127.0.0.1:5672//
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr]@%INFO
    . maxinterval -> 5.00 minutes (300s)
[2021-09-11 01:38:05,523: INFO/MainProcess] beat: Starting...
[2021-09-11 01:38:05,544: INFO/MainProcess] Scheduler: Sending due task backend.tasks.save_hackathon_to_db (devpost_api_task)
[2021-09-11 01:41:05,613: INFO/MainProcess] Scheduler: Sending due task backend.tasks.save_hackathon_to_db (devpost_api_task)
[2021-09-11 01:44:05,638: INFO/MainProcess] Scheduler: Sending due task backend.tasks.save_hackathon_to_db (devpost_api_task)
...