Python RuntimeWarning: Enable tracemalloc to get the object allocation traceback with asyncio.sleep

Disclaimer: this page is a translation of a popular StackOverflow question and answer, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must follow the same license and attribute it to the original authors (not me), citing the original: http://stackoverflow.com/questions/54088263/

RuntimeWarning: Enable tracemalloc to get the object allocation traceback with asyncio.sleep

Tags: python, semaphore, python-asyncio, aiohttp

Asked by Liondancer

I am trying to use a semaphore to throttle the asynchronous requests to my target host, but I am getting the following error, which I assume means that my asyncio.sleep() is not actually sleeping. How can I fix this? I want to add a delay to my requests for each targeted URL.

Error:

RuntimeWarning: coroutine 'sleep' was never awaited
Coroutine created at (most recent call last)
  File "sephora_scraper.py", line 71, in <module>
    loop.run_until_complete(main())
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/base_events.py", line 571, in run_until_complete
    self.run_forever()
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/base_events.py", line 539, in run_forever
    self._run_once()
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/base_events.py", line 1767, in _run_once
    handle._run()
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/events.py", line 88, in _run
    self._context.run(self._callback, *self._args)
  File "makeup.py", line 26, in get_html
    asyncio.sleep(delay)
  asyncio.sleep(delay)
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
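
A side note on the last line of that output: the object allocation traceback it mentions is only shown when tracemalloc is already running. A minimal sketch, assuming you can edit the script's entry point, is to start it before any coroutine is created (launching Python with -X tracemalloc has the same effect):

import tracemalloc

# Track allocations so the warning can report where the coroutine object
# was created; keeping 25 stack frames per traceback is an arbitrary choice.
tracemalloc.start(25)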

Code:

import sys
import time
import asyncio
import aiohttp

async def get_html(semaphore, session, url, delay=6):
    await semaphore.acquire()
    async with session.get(url) as res:
        html = await res.text()
        asyncio.sleep(delay)
        semaphore.release()
        return html

async def main():
    categories = {
        "makeup": "https://www.sephora.com/shop/"
    }
    semaphore = asyncio.Semaphore(value=1)
    tasks = []
    async with aiohttp.ClientSession(loop=loop, connector=aiohttp.TCPConnector(ssl=False)) as session:
        for category, url in categories.items():
            # Get HTML of all pages
            tasks.append(get_html(semaphore, session, url))
        res = await asyncio.gather(*tasks)

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())

Answered by Mikhail Gerasimov

asyncio.sleep(delay)

Change it to:

await asyncio.sleep(delay)

asyncio.sleep is a coroutine and should be awaited.
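
For reference (not part of the original answer), a minimal sketch of the corrected coroutine from the question, which also swaps the explicit acquire()/release() for async with so the semaphore is released even if the request raises:

import asyncio

async def get_html(semaphore, session, url, delay=6):
    # Hold the semaphore for the request plus the delay, so with
    # Semaphore(value=1) at most one request runs at a time.
    async with semaphore:
        async with session.get(url) as res:
            html = await res.text()
        # Awaiting the coroutine is what actually suspends this task.
        await asyncio.sleep(delay)
        return html

The rest of the question's code (main() and the event-loop setup) is unchanged.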