How to best handle Scrapy cache at 'OSError: [Errno 28] No space left on device' failure?

What is the recommended way to handle the situation when Scrapy fails with:

OSError: [Errno 28] No space left on device

Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/twisted/internet/defer.py", line 1386, in _inlineCallbacks
    result = g.send(result)
  File "/usr/lib/python3.6/site-packages/scrapy/core/downloader/middleware.py", line 53, in process_response
    spider=spider)
  File "/usr/lib/python3.6/site-packages/scrapy/downloadermiddlewares/httpcache.py", line 86, in process_response
    self._cache_response(spider, response, request, cachedresponse)
  File "/usr/lib/python3.6/site-packages/scrapy/downloadermiddlewares/httpcache.py", line 106, in _cache_response
    self.storage.store_response(spider, request, response)
  File "/usr/lib/python3.6/site-packages/scrapy/extensions/httpcache.py", line 317, in store_response
    f.write(to_bytes(repr(metadata)))
OSError: [Errno 28] No space left on device

In this case, a ramdisk/tmpfs limited to 128 MB is used as the cache disk, and Scrapy is configured with HTTPCACHE_EXPIRATION_SECS = 300 on httpcache.FilesystemCacheStorage:

HTTPCACHE_ENABLED = True
HTTPCACHE_EXPIRATION_SECS = 300
HTTPCACHE_DIR = '/tmp/ramdisk/scrapycache' # (tmpfs on /tmp/ramdisk type tmpfs (rw,relatime,size=131072k))
HTTPCACHE_IGNORE_HTTP_CODES = ['400','401','403','404','500','504']
HTTPCACHE_STORAGE = 'scrapy.extensions.httpcache.FilesystemCacheStorage'

I may be wrong, but my impression is that Scrapy's FilesystemCacheStorage does not really manage its cache size (i.e. enforce any storage limit) at all (?).

Would I be better off using LevelDB?

You are right: nothing is deleted when the cache expires. HTTPCACHE_EXPIRATION_SECS only decides whether to use the cached response or re-download the page, and that holds for every HTTPCACHE_STORAGE backend.
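For illustration, the check boils down to comparing the entry's age against HTTPCACHE_EXPIRATION_SECS; an entry that is too old is simply skipped and re-downloaded, while the files stay on disk. The snippet below is a simplified sketch of that logic, not the actual Scrapy source:

import os
import time

def is_fresh(entry_path, expiration_secs):
    # Sketch of a filesystem cache freshness check (illustrative, not Scrapy's code).
    if not os.path.exists(entry_path):
        return False                      # never cached
    age = time.time() - os.stat(entry_path).st_mtime
    if 0 < expiration_secs < age:
        return False                      # expired: re-download, but the entry is NOT deleted
    return True                           # fresh: serve from the cache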

If your cached data is very large, you should consider using a database-backed storage instead of the local filesystem (an example configuration follows below). Alternatively, you can extend the backend storage and add a LoopingCall task that continuously deletes expired cache entries (a sketch appears at the end of this answer).
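As for the LevelDB question: switching the backend is a one-line settings change. Scrapy ships a DBM backend, and versions before 2.0 also shipped a LevelDB backend (it requires the leveldb Python package). Note that both still write into whatever filesystem HTTPCACHE_DIR points at, so on their own they do not enforce a size limit either:

# settings.py -- pick one of the built-in storage backends
HTTPCACHE_STORAGE = 'scrapy.extensions.httpcache.DbmCacheStorage'

# LevelDB storage (Scrapy < 2.0 only, requires the leveldb package):
# HTTPCACHE_STORAGE = 'scrapy.extensions.httpcache.LeveldbCacheStorage'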

Why does Scrapy keep the expired data that it ignores?

I think there are two reasons:

  • HTTPCACHE_EXPIRATION_SECS controls whether to use the cached response or re-download; it only guarantees that you never use an expired cache entry. Different spiders may set different expiration_secs values, so deleting entries would leave the cache in an inconsistent state for the others.

  • Deleting expired cache entries would require a LoopingCall task that keeps checking for expired entries, which makes the Scrapy extension considerably more complex, and that is not what Scrapy wants.
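If you do want expired entries removed from disk, one option is to subclass FilesystemCacheStorage and start such a LoopingCall yourself. The following is only a sketch under assumptions: it relies on the default two-level directory layout of FilesystemCacheStorage, HTTPCACHE_PURGE_INTERVAL is a made-up setting name, and the cleanup runs in the reactor thread (a very large cache would need deferToThread):

import glob
import os
import shutil
import time

from twisted.internet import task
from scrapy.extensions.httpcache import FilesystemCacheStorage


class PruningFilesystemCacheStorage(FilesystemCacheStorage):
    """FilesystemCacheStorage plus a LoopingCall that deletes expired entries (sketch)."""

    def __init__(self, settings):
        super().__init__(settings)
        # Hypothetical setting: how often (seconds) to scan for expired entries.
        self.purge_interval = settings.getfloat('HTTPCACHE_PURGE_INTERVAL', 60.0)
        self._purge_loop = None

    def open_spider(self, spider):
        super().open_spider(spider)
        self._spider = spider
        self._purge_loop = task.LoopingCall(self._purge_expired)
        self._purge_loop.start(self.purge_interval, now=False)

    def close_spider(self, spider):
        if self._purge_loop and self._purge_loop.running:
            self._purge_loop.stop()
        super().close_spider(spider)

    def _purge_expired(self):
        # Delete entry directories older than HTTPCACHE_EXPIRATION_SECS.
        if self.expiration_secs <= 0:
            return  # 0 means "never expire", nothing to purge
        root = os.path.join(self.cachedir, self._spider.name)
        now = time.time()
        # Entries live at <cachedir>/<spider>/<2-char hash prefix>/<fingerprint>/
        for entry in glob.glob(os.path.join(root, '*', '*')):
            try:
                if now - os.stat(entry).st_mtime > self.expiration_secs:
                    shutil.rmtree(entry, ignore_errors=True)
            except OSError:
                pass  # entry disappeared between glob() and stat()

Then point the setting at it, e.g. HTTPCACHE_STORAGE = 'myproject.httpcache.PruningFilesystemCacheStorage' (the module path is hypothetical).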