How could I use requests in asyncio?


To use requests (or any other blocking library) with asyncio, you can use BaseEventLoop.run_in_executor to run the blocking function in another thread and yield from the resulting future to get the result. For example:

import asyncio
import requests

@asyncio.coroutine
def main():
    loop = asyncio.get_event_loop()
    future1 = loop.run_in_executor(None, requests.get, 'http://www.google.com')
    future2 = loop.run_in_executor(None, requests.get, 'http://www.google.co.uk')
    response1 = yield from future1
    response2 = yield from future2
    print(response1.text)
    print(response2.text)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())

This will get both responses in parallel.
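If you have many URLs, you can also pass an explicit executor instead of None to bound the number of worker threads, and collect the futures with asyncio.gather. A minimal sketch in the same 3.4-style syntax (the max_workers value and URL list are just illustrative):

import asyncio
import concurrent.futures
import requests

@asyncio.coroutine
def fetch_all():
    loop = asyncio.get_event_loop()
    # An explicit executor bounds the number of concurrent threads
    executor = concurrent.futures.ThreadPoolExecutor(max_workers=10)
    urls = ['http://www.google.com', 'http://www.google.co.uk']  # illustrative
    futures = [loop.run_in_executor(executor, requests.get, url) for url in urls]
    # gather waits for all futures and returns the results in input order
    responses = yield from asyncio.gather(*futures)
    for response in responses:
        print(response.status_code)

loop = asyncio.get_event_loop()
loop.run_until_complete(fetch_all())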

With Python 3.5 you can use the new async/await syntax:

import asyncio
import requests

async def main():
    loop = asyncio.get_event_loop()
    future1 = loop.run_in_executor(None, requests.get, 'http://www.google.com')
    future2 = loop.run_in_executor(None, requests.get, 'http://www.google.co.uk')
    response1 = await future1
    response2 = await future2
    print(response1.text)
    print(response2.text)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())

See PEP 492 for more details.
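On Python 3.9+ there is also asyncio.to_thread, which wraps the same executor machinery and, unlike run_in_executor, forwards keyword arguments. A short sketch (the URL and timeout value are just examples):

import asyncio
import requests

async def main():
    # Runs requests.get in the default thread pool; keyword
    # arguments like timeout pass straight through to the function
    response = await asyncio.to_thread(requests.get, 'http://www.google.com', timeout=10)
    print(response.status_code)

asyncio.run(main())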


aiohttp can already be used with an HTTP proxy:

import asyncio
import aiohttp

@asyncio.coroutine
def do_request():
    proxy_url = 'http://localhost:8118'  # your proxy address
    response = yield from aiohttp.request(
        'GET', 'http://google.com',
        proxy=proxy_url,
    )
    return response

loop = asyncio.get_event_loop()
loop.run_until_complete(do_request())
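With a ClientSession (the recommended interface in current aiohttp versions), the proxy is passed per request in the same way; proxy_auth accepts credentials if the proxy needs them. A sketch assuming the same local proxy address:

import asyncio
import aiohttp

async def do_request():
    proxy_url = 'http://localhost:8118'  # your proxy address
    async with aiohttp.ClientSession() as session:
        # proxy= is given per request; add proxy_auth=aiohttp.BasicAuth(...)
        # if the proxy requires credentials
        async with session.get('http://google.com', proxy=proxy_url) as response:
            return await response.text()

loop = asyncio.get_event_loop()
body = loop.run_until_complete(do_request())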


The answers above are still using the old Python 3.4-style coroutines. Here is what you would write on Python 3.5+.

aiohttp supports HTTP proxies now:

import aiohttp
import asyncio

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def main():
    urls = [
        'http://python.org',
        'https://google.com',
        'http://yifei.me',
    ]
    tasks = []
    async with aiohttp.ClientSession() as session:
        for url in urls:
            tasks.append(fetch(session, url))
        htmls = await asyncio.gather(*tasks)
        for html in htmls:
            print(html[:100])

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
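On Python 3.7+, the two lines of loop management at the bottom are more idiomatically written with asyncio.run, which creates the event loop, runs the coroutine, and closes the loop for you (reusing main() from the block above):

if __name__ == '__main__':
    asyncio.run(main())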