Send tasks to two separate workers in celery
I have two tasks in Celery and I want to send them to different workers, but I'm not sure how to do this. I've looked at the task_routes section of the Celery documentation and tried a few things from Stack Overflow, but without success.
tasks.py
@app.task
def task1():
    ...  # does something

@app.task
def task2():
    ...  # does something else
I have two Celery workers, and I want each of them to focus on one task: worker1 handles task1, and worker2 handles task2.
celery.py
import os
import ssl

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery('project')
app.conf.timezone = 'Europe/London'
app.config_from_object('django.conf:settings')
app.conf.update(
    BROKER_URL=str(os.getenv('REDIS_URL')),
    CELERY_RESULT_BACKEND=str(os.getenv('REDIS_URL')),
    broker_use_ssl={
        'ssl_cert_reqs': ssl.CERT_NONE
    },
    redis_backend_use_ssl={
        'ssl_cert_reqs': ssl.CERT_NONE
    },
)
app.autodiscover_tasks()
And then the Procfile:
web: gunicorn project.wsgi --log-file -
worker1: celery -A project worker -l INFO --concurrency=1 -Ofair -n worker1.%h
worker2: celery -A project worker -l INFO --concurrency=1 -Ofair -n worker2.%h
How do I set up queues so that worker1 = task1 and worker2 = task2?
You can set up two separate queues for the different tasks:
# 'your.project.tasks.task2' is the task's import path
app.conf.task_routes = {'your.project.tasks.task2': {'queue': 'some_special_queue'}}
And run Celery as:
# you may add -Q celery to the first command (celery is the default queue name if you didn't specify one)
celery -A project worker -l INFO --concurrency=1 -Ofair -n worker1.%h
celery -A project worker -l INFO --concurrency=1 -Ofair -n worker2.%h -Q some_special_queue
task2 will put its messages on some_special_queue, and only worker2 listens to that queue. All other tasks will be handled by the default celery queue.
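If you want each worker pinned to exactly one task rather than leaving task1 on the default queue, you can route both tasks explicitly in celery.py. This is a sketch; the queue names queue_task1 and queue_task2 are illustrative, not required:

```python
# celery.py — route each task to its own queue (queue names are examples)
app.conf.task_routes = {
    'your.project.tasks.task1': {'queue': 'queue_task1'},
    'your.project.tasks.task2': {'queue': 'queue_task2'},
}
```

Then each worker subscribes only to its own queue via -Q:

```
worker1: celery -A project worker -l INFO --concurrency=1 -Ofair -n worker1.%h -Q queue_task1
worker2: celery -A project worker -l INFO --concurrency=1 -Ofair -n worker2.%h -Q queue_task2
```

With this setup, a worker never picks up messages from a queue it doesn't consume, so worker1 only ever runs task1 and worker2 only task2.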