Pillow Python: Improve script performance

I have a simple script that gets the size of each image in a list of image URLs, but it becomes too slow when the list is large (for example, with 120 URLs it can take around 10 seconds to run):

from io import BytesIO

import requests
from PIL import Image


def get_image_size(url):
    # Download the full image and read its (width, height) with Pillow
    data = requests.get(url).content
    try:
        im = Image.open(BytesIO(data))
        size = im.size
    except OSError:
        size = False
    return size

list_images = ['https://example.com/img.png', ...]
for img in list_images:
    get_image_size(img)

I have already tried Gevent, which saves me about 50% of the processing time, but that is still not enough. Are there any other options to make this script run faster?

The final goal is to get the 5 largest images in the dataset.

Instead of using Pillow to open each image, you can use grequests (requests + gevent) and read the file size in bytes from the Content-Length HTTP header:

In general, performance depends on the network connection / server speed and on the size of the images:

import pprint

import grequests


def downloadImages(images):
    result = {}
    # Issue all requests concurrently with gevent
    rs = (grequests.get(t) for t in images)
    downloads = grequests.map(rs, size=len(images))

    for download in downloads:
        # grequests.map() yields None for requests that failed entirely
        if download is None:
            continue
        _status = download.status_code == 200
        _url = download.url

        if _status:
            # Read the file size in bytes from the Content-Length header
            for k, v in download.headers.items():
                if k.lower() == 'content-length':
                    result[_url] = v
                    break
        else:
            result[_url] = -1
    return result


if __name__ == '__main__':
    urls = [
        'https://b.tile.openstreetmap.org/12/2075/1409.png',
        'https://b.tile.openstreetmap.org/12/2075/1410.png',
        'https://b.tile.openstreetmap.org/12/2075/1411.png',
        'https://b.tile.openstreetmap.org/12/2075/1412.png'
    ]

    sizes = downloadImages(urls)
    pprint.pprint(sizes)

Returns:

{'https://b.tile.openstreetmap.org/12/2075/1409.png': '40472',
 'https://b.tile.openstreetmap.org/12/2075/1410.png': '38267',
 'https://b.tile.openstreetmap.org/12/2075/1411.png': '36338',
 'https://b.tile.openstreetmap.org/12/2075/1412.png': '30467'}
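
To cover the final goal of finding the 5 largest images, you can rank the dictionary returned by downloadImages by its Content-Length values. A minimal sketch, assuming the sizes dict from the example above (values are byte-count strings, with -1 marking failed downloads):

import heapq

sizes = downloadImages(urls)

# Keep only successful downloads and convert the byte counts to int
largest = heapq.nlargest(
    5,
    ((url, int(size)) for url, size in sizes.items() if size != -1),
    key=lambda pair: pair[1],
)

for url, size in largest:
    print(f'{size:>8} bytes  {url}')

Note that Content-Length gives the file size in bytes, not the pixel dimensions, which is usually a good enough proxy for "largest" and avoids downloading and decoding every image with Pillow.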