Where should I add another await to make asyncio work properly?

Problem description (votes: 0, answers: 1)

I am downloading all the images from pexels.com for a keyword supplied by the user. The program gives me the following error.

Traceback (most recent call last):
  File "./asyncioPexels.py", line 73, in <module>
    asyncio.run(forming_all_pages(numberOfPages, mainurl))
  File "/usr/lib/python3.7/asyncio/base_events.py", line 573, in run_until_complete
    return future.result()
  File "./asyncioPexels.py", line 50, in forming_all_pages
    await download_all_pages(urls)
  File "./asyncioPexels.py", line 38, in download_all_pages
    async with aiohttp.ClientSession as session:
AttributeError: __aexit__

I think the problem is that I am using the function download_all_pages as a context manager! If that is the problem, how can I fix it? I have a rough idea of how to make it work as a context manager, but is there a simpler solution? Here is my entire code:

async def download_single_image(subsession, imgurl):
    print(f'Downloading img {imgurl}')
    async with session.get(imgurl) as res:
        imgFile = open(os.path.join(str(keyword), os.path.basename(imgurl)), 'wb')
        for chunk in res.iter_content(100000):
            imgFile.write(chunk)
        imgFile.close()

async def download_all_images(imgurls):
    async with aiohttp.ClientSession as subsession:
        subtasks = []
        for imgurl in imgurls:
            subtask = asyncio.ensure_future(download_single_image(subsession, imgurl))
            subtasks.append(subtask)
        await asyncio.gather(*subtasks, return_exception=True)

async def download_single_page(session, url):
    print(f'Downloading page {url}...')
    imgurls = []
    async with session.get(url) as response:
        imgs = response.text.split('infiniteScrollingAppender.append')[1:]
        for img in imgs:
            soup = BeautifulSoup(img[2:-5].replace("\\'", "'").replace('\\"', '"'), 'html.parser')
            imgurls.append(soup.select('.photo-item__img')[0].get('srcset'))
        await download_all_images(imgurls)

async def download_all_pages(urls):
    async with aiohttp.ClientSession as session:
        tasks = []
        for url in urls:
            task = asyncio.ensure_future(download_single_page(session, url))
            tasks.append(task)
        await asyncio.gather(*tasks, return_exception=True)

async def forming_all_pages(numberOfPages, mainurl):
    urls = []
    for _ in range(1, numberOfPages + 1):
        page = mainurl + str(_)
        urls.append(page)
    await download_all_pages(urls)    

if __name__ == "__main__":
    asyncio.run(forming_all_pages(numberOfPages, mainurl))

How can I fix this so the code runs?

python-3.x async-await python-asyncio
1 Answer (0 votes)

In forming_all_pages you have

download_all_pages(urls)

but as the warning tells you

./asyncioPexels.py:50: RuntimeWarning: coroutine 'download_all_pages' was never awaited

Change this to

await download_all_pages(urls)

You also need to change download_single_page to use

await download_all_images(imgurls)

Finally, forming_all_pages needs to be awaitable. You need to change it to

async def forming_all_pages(numberOfPages, mainurl):
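Putting the steps together, here is a minimal runnable sketch of the corrected structure. It uses a hypothetical FakeSession stand-in (not part of any real library) so it runs without aiohttp or network access. Two further details are worth noting from the traceback: aiohttp.ClientSession must be instantiated with parentheses (`async with aiohttp.ClientSession() as session:`), since using the bare class is exactly what raises the `AttributeError: __aexit__` shown above, and the keyword argument to asyncio.gather is `return_exceptions` (plural), not `return_exception`.

```python
import asyncio

class FakeSession:
    """Hypothetical stand-in for aiohttp.ClientSession, for illustration only."""
    async def __aenter__(self):
        return self
    async def __aexit__(self, *exc):
        return False
    async def get(self, url):
        await asyncio.sleep(0)          # pretend network I/O
        return f"<page {url}>"

async def download_single_page(session, url):
    # Every coroutine call along the chain is awaited.
    return await session.get(url)

async def download_all_pages(urls):
    # Enter the context manager on an *instance* (note the parentheses),
    # not on the class itself.
    async with FakeSession() as session:
        tasks = [asyncio.ensure_future(download_single_page(session, u))
                 for u in urls]
        return await asyncio.gather(*tasks)

async def forming_all_pages(numberOfPages, mainurl):
    urls = [mainurl + str(i) for i in range(1, numberOfPages + 1)]
    return await download_all_pages(urls)   # awaited, as described above

pages = asyncio.run(forming_all_pages(3, "https://example.com/page/"))
print(pages)
```

In the real script, FakeSession() becomes aiohttp.ClientSession(), and download_single_page would read the response body instead of returning a placeholder string.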