Python - Run multiple async functions simultaneously
I'm essentially making a pinger: it has a two-dimensional list of key/webhook pairs, and after pinging a key it sends the response to the webhook.
The two-dimensional list looks like this:
some_list = [["key1", "webhook1"], ["key2", "webhook2"]]
My program is essentially a loop, and I'm not too sure how to rotate the some_list data in the function.
Here's a small demo of my script:
async def do_ping(some_pair):
    async with aiohttp.ClientSession() as s:
        tasks = await gen_tasks(s, some_pair)
        results = await asyncio.gather(*tasks)
        await asyncio.sleep(10)
        await do_ping(some_pair)
I have tried:
async def main():
    for entry in some_list:
        asyncio.run(do_ping(entry))
But since the do_ping function is a self-calling loop, it just calls the first entry over and over again, and never gets to the ones after it. Hoping to find a way around this, whether it be threading or similar, and if you have a better way of structuring the some_list values (I'm assuming a dict), feel free to give that feedback as well.
You made the method recursive (await do_ping(some_pair)): it never finishes, so the loop in main never moves on to the next entry. I would restructure the application like this:
async def do_ping(some_pair):
    async with aiohttp.ClientSession() as s:
        while True:
            tasks = await gen_tasks(s, some_pair)
            results = await asyncio.gather(*tasks)
            await asyncio.sleep(10)
async def main():
    tasks = [do_ping(entry) for entry in some_list]
    await asyncio.gather(*tasks)

if __name__ == "__main__":
    asyncio.run(main())
Or you could move the repetition and sleep logic into main:
async def do_ping(some_pair):
    async with aiohttp.ClientSession() as s:
        tasks = await gen_tasks(s, some_pair)
        results = await asyncio.gather(*tasks)

async def main():
    while True:
        tasks = [do_ping(entry) for entry in some_list]
        await asyncio.gather(*tasks)
        await asyncio.sleep(10)

if __name__ == "__main__":
    asyncio.run(main())
You could also start the tasks before sleeping, and only gather them afterwards. That way the pings start at consistent 10-second intervals, rather than every 10 seconds plus however long it takes to gather the results:
async def main():
    while True:
        tasks = [
            asyncio.create_task(do_ping(entry))
            for entry in some_list
        ]
        await asyncio.sleep(10)
        await asyncio.wait(tasks)
Edit: As creolo pointed out, you should only create a single ClientSession object. See https://docs.aiohttp.org/en/stable/client_reference.html:
Session encapsulates a connection pool (connector instance) and supports keepalives by default. Unless you are connecting to a large, unknown number of different servers over the lifetime of your application, it is suggested you use a single session for the lifetime of your application to benefit from connection pooling.
async def do_ping(session, some_pair):
    tasks = await gen_tasks(session, some_pair)
    results = await asyncio.gather(*tasks)

async def main():
    async with aiohttp.ClientSession() as session:
        while True:
            tasks = [
                asyncio.create_task(do_ping(session, entry))
                for entry in some_list
            ]
            await asyncio.sleep(10)
            await asyncio.wait(tasks)
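On the side question about structuring some_list: a dict mapping each key to its webhook reads more naturally than a list of pairs. Here is a minimal, runnable sketch of the same concurrent-loop pattern using a dict; fake_ping and ping_forever are hypothetical stand-ins for the real gen_tasks/aiohttp calls, and the loop is bounded so the demo terminates:

```python
import asyncio

async def fake_ping(key, webhook, results):
    # Stand-in for the real HTTP ping: just record which
    # webhook would receive this key's response.
    await asyncio.sleep(0.01)  # simulate network latency
    results.append((key, webhook))

async def ping_forever(key, webhook, results, rounds):
    # Each key/webhook pair gets its own loop; all loops
    # run concurrently under asyncio.gather() in main().
    for _ in range(rounds):  # bounded here instead of `while True`
        await fake_ping(key, webhook, results)
        await asyncio.sleep(0.01)  # stands in for the 10-second interval

async def main():
    # A dict maps each key to its webhook, replacing the 2D list.
    pairs = {"key1": "webhook1", "key2": "webhook2"}
    results = []
    await asyncio.gather(
        *(ping_forever(k, w, results, rounds=3) for k, w in pairs.items())
    )
    return results

results = asyncio.run(main())
print(len(results))  # 2 pairs x 3 rounds = 6 pings
```

Because both loops run concurrently, the total runtime is roughly one loop's worth of sleeps, not two.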