How to properly create and run concurrent tasks using python's asyncio module?


Yes, any coroutine that's running inside your event loop will block other coroutines and tasks from running, unless it

  1. Calls another coroutine using yield from or await (if using Python 3.5+).
  2. Returns.

This is because asyncio is single-threaded; the only way for the event loop to run is for no other coroutine to be actively executing. Using yield from/await suspends the coroutine temporarily, giving the event loop a chance to work.
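For illustration (this is not from the original question), here is a minimal async/await sketch, assuming Python 3.7+ for asyncio.run: each task awaits asyncio.sleep(0), which suspends it and gives the event loop a chance to run the other task, so their output interleaves.

import asyncio

async def ticker(name, count):
    for i in range(count):
        print('...{0} {1}'.format(name, i))
        await asyncio.sleep(0)   # suspend here; the event loop can now run the other task

async def main():
    # Run both coroutines concurrently; the output interleaves because each one awaits.
    await asyncio.gather(ticker('boo', 3), ticker('baa', 3))

asyncio.run(main())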

Your example code is fine, but in many cases you probably don't want long-running code that isn't doing asynchronous I/O running inside the event loop to begin with. In those cases, it often makes more sense to use loop.run_in_executor to run the code in a background thread or process. ProcessPoolExecutor is the better choice if your task is CPU-bound; ThreadPoolExecutor is what you'd use if you need to do some I/O that isn't asyncio-friendly.
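As a sketch of the ThreadPoolExecutor case (assuming Python 3.7+; blocking_fetch below is a hypothetical stand-in for a blocking call from a library with no asyncio support):

import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

def blocking_fetch(n):
    # Stand-in for blocking network or disk I/O that isn't asyncio-friendly.
    time.sleep(1)
    return 'result {0}'.format(n)

async def main():
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor() as pool:
        # Each blocking call runs in a worker thread; the event loop itself stays free.
        results = await asyncio.gather(
            loop.run_in_executor(pool, blocking_fetch, 1),
            loop.run_in_executor(pool, blocking_fetch, 2),
        )
    print(results)

asyncio.run(main())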

Your two loops, for example, are completely CPU-bound and don't share any state, so the best performance would come from using ProcessPoolExecutor to run each loop in parallel across CPUs:

import asyncio
from concurrent.futures import ProcessPoolExecutor

print('running async test')

def say_boo():
    i = 0
    while True:
        print('...boo {0}'.format(i))
        i += 1

def say_baa():
    i = 0
    while True:
        print('...baa {0}'.format(i))
        i += 1

if __name__ == "__main__":
    executor = ProcessPoolExecutor(2)
    loop = asyncio.get_event_loop()
    boo = loop.run_in_executor(executor, say_boo)
    baa = loop.run_in_executor(executor, say_baa)
    loop.run_forever()
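Note that run_in_executor returns an asyncio Future, so if say_boo and say_baa eventually returned instead of looping forever, you could wait for their results with loop.run_until_complete(asyncio.gather(boo, baa)) rather than calling run_forever.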


You don't necessarily need a yield from x to give control over to the event loop.

In your example, I think the proper way would be to do a yield None or equivalently a simple yield, rather than a yield from asyncio.sleep(0.001):

import asyncio

# Note: generator-based coroutines (@asyncio.coroutine + yield) only work on Python < 3.11.

@asyncio.coroutine
def say_boo():
    i = 0
    while True:
        yield None
        print("...boo {0}".format(i))
        i += 1

@asyncio.coroutine
def say_baa():
    i = 0
    while True:
        yield
        print("...baa {0}".format(i))
        i += 1

# asyncio.async() was renamed asyncio.ensure_future() in Python 3.4.4
boo_task = asyncio.ensure_future(say_boo())
baa_task = asyncio.ensure_future(say_baa())

loop = asyncio.get_event_loop()
loop.run_forever()

Coroutines are just plain old Python generators. Internally, the asyncio event loop keeps a record of these generators and calls gen.send() on each of them one by one in a never-ending loop. Whenever you yield, the call to gen.send() completes and the loop can move on. (I'm simplifying it; take a look around https://hg.python.org/cpython/file/3.4/Lib/asyncio/tasks.py#l265 for the actual code.)
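To make that concrete, here is a toy round-robin driver (a deliberately simplified sketch, not asyncio's actual scheduler; chatter and toy_loop are made-up names):

def chatter(word):
    i = 0
    while True:
        yield                      # hand control back to the "loop"
        print('...{0} {1}'.format(word, i))
        i += 1

def toy_loop(gens, steps):
    for gen in gens:
        gen.send(None)             # prime each generator: run it up to its first yield
    for _ in range(steps):
        for gen in gens:
            gen.send(None)         # resume; each send() runs until the next yield

toy_loop([chatter('boo'), chatter('baa')], steps=3)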

That said, I would still go the run_in_executor route if you need to do CPU-intensive computation without sharing data.