TypeError: An asyncio.Future, a coroutine or an awaitable is required
I am trying to make an asynchronous web scraper using beautifulsoup and aiohttp. This is my initial code to get things started. I am getting a [TypeError: An asyncio.Future, a coroutine or an awaitable is required] and am having a hard time figuring out what is wrong with my code. I am new to python and would appreciate any help with this.
import bs4
import asyncio
import aiohttp

async def parse(page):
    soup = bs4.BeautifulSoup(page, 'html.parser')
    soup.prettify()
    print(soup.title)

async def request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://google.com") as resp:
            await parse(resp)

loop = asyncio.get_event_loop()
loop.run_until_complete(request)
Traceback:-
Traceback (most recent call last):
  File "C:\Users\User\Desktop\Bot\aio-req\parser.py", line 21, in <module>
    loop.run_until_complete(request)
  File "C:\Users\User\AppData\Local\Programs\Python\Python38-32\lib\asyncio\base_events.py", line 591, in run_until_complete
    future = tasks.ensure_future(future, loop=self)
  File "C:\Users\User\AppData\Local\Programs\Python\Python38-32\lib\asyncio\tasks.py", line 673, in ensure_future
    raise TypeError('An asyncio.Future, a coroutine or an awaitable is '
TypeError: An asyncio.Future, a coroutine or an awaitable is required
One problem is that loop.run_until_complete(request) should be loop.run_until_complete(request()) - you actually have to call it for it to return a coroutine.
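To see the difference, here is a minimal sketch (with a stand-in coroutine, not your actual scraper): the bare name `request` is just a function object, while calling it with `request()` creates the coroutine object that the event loop needs.

```python
import asyncio

async def request():
    return "done"

# request is just a function; calling it produces a coroutine object
print(asyncio.iscoroutine(request))    # False
coro = request()
print(asyncio.iscoroutine(coro))       # True
print(asyncio.run(coro))               # the loop can run the coroutine
```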
There are other problems as well - such as passing the aiohttp.ClientResponse object to parse and treating it as text/html. I got it working with the following, but I don't know if it fits your needs since parse is no longer a coroutine.
def parse(page):
    soup = bs4.BeautifulSoup(page, 'html.parser')
    soup.prettify()
    return soup.title

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def request():
    async with aiohttp.ClientSession() as session:
        html = await fetch(session, "https://google.com")
        print(parse(html))

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(request())
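As a side note, on Python 3.7+ the explicit get_event_loop/run_until_complete pair can be replaced with asyncio.run, which creates, runs, and closes the loop for you. A minimal sketch of the pattern - the fetch/parse bodies here are stand-ins, not real network calls:

```python
import asyncio

async def fetch(url):
    # stand-in for the aiohttp request; just echoes the URL here
    await asyncio.sleep(0)
    return f"<title>{url}</title>"

def parse(page):
    return page

async def request():
    html = await fetch("https://google.com")
    return parse(html)

if __name__ == '__main__':
    # asyncio.run creates the event loop, runs the coroutine, and closes the loop
    print(asyncio.run(request()))
```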
This also works:
def parse(page):
    soup = bs4.BeautifulSoup(page, 'html.parser')
    soup.prettify()
    print(soup.title)

async def request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://google.com") as resp:
            parse(await resp.text())
And finally, your original code, passing the awaitable response object to parse and then awaiting page.text() inside it:
async def parse(page):
    soup = bs4.BeautifulSoup(await page.text(), 'html.parser')
    soup.prettify()
    print(soup.title)

async def request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://google.com") as resp:
            await parse(resp)
I changed my code to this and it works now.
import bs4
import asyncio
import aiohttp

async def parse(page):
    soup = bs4.BeautifulSoup(page, 'html.parser')
    soup.prettify()
    print(soup.title)

async def request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://google.com") as resp:
            html = await resp.text()
            await parse(html)

loop = asyncio.get_event_loop()
loop.run_until_complete(request())
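Once the single-page version works, the usual next step for an async scraper is fetching several pages concurrently with asyncio.gather. A sketch of the idea - fetch here is a stand-in that echoes the URL instead of making a real aiohttp request, so the concurrency pattern is what matters:

```python
import asyncio

async def fetch(url):
    # in the real scraper this body would be:
    #   async with session.get(url) as resp: return await resp.text()
    await asyncio.sleep(0)
    return f"<html><title>{url}</title></html>"

async def scrape_all(urls):
    # gather schedules all the fetch coroutines at once and
    # returns their results in the same order as the input URLs
    return await asyncio.gather(*(fetch(u) for u in urls))

if __name__ == '__main__':
    pages = asyncio.run(scrape_all(["https://google.com", "https://example.com"]))
    for page in pages:
        print(page)
```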