Python Import Error with Supervisord

I am using supervisord to run celery as a daemon, but it gives me the error ImportError: No module named tasks worker. I can run celery from the shell without supervisord just fine. My supervisord config file looks like this:

[program:celery]
directory=/home/yongfengzhang/videomaker
environment=PYTHONPATH="/home/yongfengzhang/videomaker:/home/yongfengzhang/videomaker/videomaker:$PYTHONPATH",DJANGO_SETTINGS_MODULE="videomaker.settings"
;command=/home/yongfengzhang/Envs/videomake/bin/celery -A "tasks worker" --loglevel=INFO --concurrency=4
command=celery -A "tasks worker" --loglevel=INFO --concurrency=4
process_name=%(program_name)s ; process_name expr (default %(program_name)s)
numprocs=1
user=yongfengzhang                  ; setuid to this UNIX account to run the program
stdout_logfile=/home/yongfengzhang/logs/celery/celery.log
stderr_logfile=/home/yongfengzhang/logs/celery/celery.err
autostart=true
autorestart=true

Under /home/yongfengzhang/videomaker I have a tasks.py that defines the celery instance and the task functions (not shown here):

from celery import Celery

app = Celery('tasks', backend='redis://localhost', broker='amqp://myuser:bloomsky@localhost:5672/myvhost')

When I run celery directly from that folder (~/videomaker), everything works fine. Envs/videomake is where my virtualenv lives, and the folder has an __init__.py. Any input would be appreciated. Thanks a lot.
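For reference, a minimal sketch of what such a tasks.py might look like; the add task below is a hypothetical placeholder, since the question does not show the real task functions:

from celery import Celery

# Celery instance as shown in the question.
app = Celery('tasks',
             backend='redis://localhost',
             broker='amqp://myuser:bloomsky@localhost:5672/myvhost')

@app.task
def add(x, y):
    # Hypothetical placeholder task; the real project defines its own.
    return x + y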

I just found out what went wrong here. You cannot group arguments with double quotes. Doing the following should work:

command=/home/yongfengzhang/Envs/videomake/bin/celery -A tasks worker --loglevel=INFO --concurrency=4

The arguments tasks worker should not be wrapped in double quotes. Thanks.
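To see why the quotes break the import, here is a minimal sketch using Python's shlex, whose splitting rules mirror how supervisord tokenizes the command option. With the quotes, celery receives the single argv element tasks worker and tries to import a module by that literal name:

import shlex

# With quotes: "tasks worker" arrives as one argv element,
# so celery looks for a module literally named "tasks worker".
print(shlex.split('celery -A "tasks worker" --loglevel=INFO'))
# ['celery', '-A', 'tasks worker', '--loglevel=INFO']

# Without quotes: tasks is the app module and worker is the subcommand.
print(shlex.split('celery -A tasks worker --loglevel=INFO'))
# ['celery', '-A', 'tasks', 'worker', '--loglevel=INFO']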