
scrapyd running as daemon cannot find spider or project

The spider's name is quotes14, and it works fine from the command line.

That is, if I run scrapy crawl quotes14 from the directory /var/www/html/sprojects/tutorial/, it works correctly on the command line.

I have scrapyd running as a daemon.

My scrapy spider files are here: /var/www/html/sprojects/tutorial/tutorial/spiders

I have many spiders and other files under the directory above, and the project is /var/www/html/sprojects/tutorial/tutorial/

I have tried:

curl http://localhost:6800/schedule.json -d project=tutorial -d spider=spiders/quotes14

curl http://localhost:6800/schedule.json -d project=/var/www/html/sprojects/tutorial/tutorial/tutorial -d spider=quotes14

curl http://localhost:6800/schedule.json -d project=/var/www/html/sprojects/tutorial/tutorial/ -d spider=quotes14

curl http://localhost:6800/schedule.json -d project=/var/www/html/sprojects/tutorial/tutorial/tutorial -d spider=spiders/quotes14

It says project not found or spider not found.

Please help.

In order to use the schedule endpoint you must first deploy the spider to the daemon. The docs tell you how to do this.

Deploying your project involves eggifying it and uploading the egg to Scrapyd via the addversion.json endpoint. You can do this manually, but the easiest way is to use the scrapyd-deploy tool provided by scrapyd-client which will do it all for you.
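A minimal sketch of that flow, assuming your project root is /var/www/html/sprojects/tutorial/ (the directory containing scrapy.cfg), scrapyd is listening on localhost:6800, and a deploy target named local (the target name is arbitrary and chosen here for illustration):

First add a deploy target to scrapy.cfg in the project root:

[deploy:local]
url = http://localhost:6800/
project = tutorial

Then install scrapyd-client and deploy the project:

pip install scrapyd-client
cd /var/www/html/sprojects/tutorial/
scrapyd-deploy local -p tutorial

Once the egg is uploaded, schedule.json takes the bare project and spider names, not filesystem paths:

curl http://localhost:6800/schedule.json -d project=tutorial -d spider=quotes14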