Using several workers in a background task - FastAPI

I am trying to process a file uploaded by a user. However, I want the user to receive a response as soon as the upload completes and that connection to be closed, while the file continues to be processed. So I use BackgroundTasks.add_task, and my code looks like this:

class Line(BaseModel):
    line: str

@app.post("/foo")
async def foo(line: Line):
""" Processing line generate results"""

    ...

    result = ... # processing line.line
    print(results)
    return results

@app.post("/upload")
async def upload(background_tasks: BackgroundTasks, csv: UploadFile = File(...)):

    background_tasks.add_task(process, csv)
    return {"result": "CSV has been uploaded successfully"}


async def process(csv):
    """ Processing CSV and generate data"""

    tasks = [foo(line) for line in csv]
    result = await asyncio.gather(*tasks)

Unfortunately, the code above processes the lines only one by one. On top of that, I have to wait until all the lines have been processed before the print statement in foo produces any output; that is, if my CSV has n lines, I see the print output for all of them only after all n lines have been processed. My application runs with 20 workers, but while this background process is running it uses only about 1% of the CPU (foo is not a compute-heavy task; it is more of an IO/network-bound task). This makes me think the background process runs on only one worker. I did try a ProcessPoolExecutor, as follows:

from concurrent.futures import ProcessPoolExecutor

loop = asyncio.get_event_loop()
lines = [line_0, line_1, ..., line_n] # Extracted all lines from CSV
with ProcessPoolExecutor() as executor:
    results = [loop.run_in_executor(executor, lambda: foo(line)) for line in lines]
    results = loop.run_until_complete(asyncio.gather(*results))

However, I got the following error:

processpoolexecutor can't pickle local object
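For context, ProcessPoolExecutor has to pickle the callable it ships to the worker processes, and a lambda (like any locally defined function) cannot be pickled. On top of that, run_in_executor expects a plain synchronous callable, while foo is a coroutine function, so the lambda would only have returned an un-awaited coroutine anyway. A minimal sketch of the usual workaround, assuming the per-line work can be expressed as a synchronous, module-level function (process_line is a hypothetical stand-in for the body of foo):

import asyncio
from concurrent.futures import ProcessPoolExecutor


def process_line(line: str) -> str:
    # Hypothetical synchronous stand-in for the work done in foo.
    # Module-level functions can be pickled, unlike lambdas.
    return f"processed: {line}"


async def process_all(lines):
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as executor:
        # run_in_executor wraps each executor future in an awaitable,
        # so all results can be collected with asyncio.gather.
        futures = [loop.run_in_executor(executor, process_line, line) for line in lines]
        return await asyncio.gather(*futures)

That said, for IO/network-bound work like yours, processes are overkill; plain asyncio tasks (or threads) avoid the pickling constraint entirely.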

I did get past that error by changing the approach from:

results = [loop.run_in_executor(executor, lambda: foo(line)) for line in lines]

to:

results = [asyncio.ensure_future(foo(line=Line(line=line))) for line in lines]

But then I got this error:

File "uvloop/loop.pyx", line 2658, in uvloop.loop.Loop.run_in_executor AttributeError: 'Loop' object has no attribute 'submit'

To summarize: to process a single line, I can hit the "/foo" endpoint. Now I want to process a CSV of 200 lines. So first I accept the file from the user, return a success message, and terminate that connection. The CSV is then added to a background task, which should map every line to the "/foo" endpoint and give me the result for each line. However, everything I have tried so far seems to use only one thread and processes the lines one at a time. I want an approach that can process multiple lines together, as if I were hitting the "/foo" endpoint concurrently many times, the way a tool like Apache JMeter can.
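If you really do want to go through the HTTP endpoint, to reproduce the JMeter-style concurrent hits, one option is an async HTTP client such as httpx. A rough sketch, where the base URL and the JSON payload shape are assumptions based on your Line model:

import asyncio

import httpx


async def hit_foo(client: httpx.AsyncClient, line: str):
    # Each request maps one CSV line onto the existing "/foo" endpoint.
    response = await client.post("http://localhost:8001/foo", json={"line": line})
    return response.json()


async def process(lines):
    async with httpx.AsyncClient() as client:
        # gather schedules all requests concurrently on the event loop,
        # much like firing simultaneous requests from JMeter.
        return await asyncio.gather(*(hit_foo(client, line) for line in lines))

Note, though, that each round-trip adds HTTP overhead per line; calling the processing coroutine directly, as in the answer below, avoids it.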

You can run the processing in parallel without going through an endpoint. Below is a simplified example based on your code (without using the foo endpoint):

import asyncio
import sys
import uvicorn
from fastapi import FastAPI, BackgroundTasks, UploadFile, File
from loguru import logger


logger.remove()
logger.add(sys.stdout, colorize=True, format="<green>{time:HH:mm:ss}</green> | {level} | <level>{message}</level>")

app = FastAPI()


async def async_io_bound(line: str):
    await asyncio.sleep(3)  # pretend this is an IO-bound operation
    return f"Line '{line}' processed"


async def process(csv):
    """ Processing CSV and generate data"""
    tasks = [async_io_bound(line) for line in csv]
    logger.info("start processing")
    result = await asyncio.gather(*tasks)
    for i in result:
        logger.info(i)


@app.post("/upload-to-process")
async def upload(background_tasks: BackgroundTasks, csv: UploadFile = File(...)):
    background_tasks.add_task(process, csv.file)
    return {"result": "CSV has been uploaded successfully"}

if __name__ == "__main__":
    uvicorn.run("app3:app", host="localhost", port=8001)

Example output (all lines were processed in parallel):

INFO:     ::1:52358 - "POST /upload-to-process HTTP/1.1" 200 OK
13:21:31 | INFO | start processing
13:21:34 | INFO | Line 'b'one, two\n'' processed
13:21:34 | INFO | Line 'b'0, 1\n'' processed
13:21:34 | INFO | Line 'b'1, 1\n'' processed
13:21:34 | INFO | Line 'b'2, 1\n'' processed
13:21:34 | INFO | Line 'b'3, 1\n'' processed
13:21:34 | INFO | Line 'b'4, 1\n'' processed
13:21:34 | INFO | Line 'b'5, 1\n'' processed
13:21:34 | INFO | Line 'b'6, 1\n'' processed
13:21:34 | INFO | Line 'b'7, 1\n'' processed
13:21:34 | INFO | Line 'b'8, 1\n'' processed
13:21:34 | INFO | Line 'b'9, 1\n'' processed
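Two practical notes on this pattern. First, recent FastAPI versions close the UploadFile once the request finishes, so it is safer to read the contents inside the endpoint and hand plain bytes to the background task. Second, gathering one task per line is fine for 200 lines, but for very large files you may want to bound the concurrency. A sketch combining both, where the limit of 50 is an arbitrary assumption:

import asyncio

from fastapi import BackgroundTasks, FastAPI, File, UploadFile

app = FastAPI()


async def async_io_bound(semaphore: asyncio.Semaphore, line: bytes):
    async with semaphore:  # at most 50 lines are in flight at once
        await asyncio.sleep(3)  # pretend this is an IO-bound operation
        return f"Line {line!r} processed"


async def process(lines):
    semaphore = asyncio.Semaphore(50)  # arbitrary concurrency limit
    tasks = [async_io_bound(semaphore, line) for line in lines]
    return await asyncio.gather(*tasks)


@app.post("/upload-to-process")
async def upload(background_tasks: BackgroundTasks, csv: UploadFile = File(...)):
    # Read the file while the request is still open: recent FastAPI
    # versions close the UploadFile once the response has been sent.
    contents = await csv.read()
    background_tasks.add_task(process, contents.splitlines())
    return {"result": "CSV has been uploaded successfully"}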