multiprocessing.Pool: When to use apply, apply_async or map?

Back in the old days of Python, to call a function with arbitrary arguments, you would use apply:

    apply(f, args, kwargs)

apply still exists in Python 2.7, though not in Python 3, and is generally not used anymore. Nowadays,

    f(*args, **kwargs)

is preferred. The multiprocessing.Pool module tries to provide a similar interface.

Pool.apply is like Python apply, except that the function call is performed in a separate process. Pool.apply blocks until the function is completed.

Pool.apply_async is also like Python's built-in apply, except that the call returns immediately instead of waiting for the result. An AsyncResult object is returned. You call its get() method to retrieve the result of the function call. The get() method blocks until the function is completed. Thus, pool.apply(func, args, kwargs) is equivalent to pool.apply_async(func, args, kwargs).get().
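That equivalence can be sketched directly (using a hypothetical square worker, not from the original answer):

```python
import multiprocessing as mp

def square(x):
    # Hypothetical worker; defined at module level so it can be pickled.
    return x * x

if __name__ == '__main__':
    with mp.Pool(2) as pool:
        blocking = pool.apply(square, (3,))               # blocks until done
        async_res = pool.apply_async(square, (3,)).get()  # same result via AsyncResult
        print(blocking, async_res)  # 9 9
```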

In contrast to Pool.apply, the Pool.apply_async method also has a callback which, if supplied, is called when the function is complete. This can be used instead of calling get().

For example:

    import multiprocessing as mp
    import time

    def foo_pool(x):
        time.sleep(2)
        return x*x

    result_list = []
    def log_result(result):
        # This is called whenever foo_pool(i) returns a result.
        # result_list is modified only by the main process, not the pool workers.
        result_list.append(result)

    def apply_async_with_callback():
        pool = mp.Pool()
        for i in range(10):
            pool.apply_async(foo_pool, args = (i, ), callback = log_result)
        pool.close()
        pool.join()
        print(result_list)

    if __name__ == '__main__':
        apply_async_with_callback()

may yield a result such as

[1, 0, 4, 9, 25, 16, 49, 36, 81, 64]

Notice that, unlike pool.map, the order of the results may not correspond to the order in which the pool.apply_async calls were made.

So, if you need to run a function in a separate process, but want the current process to block until that function returns, use Pool.apply. Like Pool.apply, Pool.map blocks until the complete result is returned.

If you want the Pool of worker processes to perform many function calls asynchronously, use Pool.apply_async. The order of the results is not guaranteed to be the same as the order of the calls to Pool.apply_async.

Notice also that you could call a number of different functions with Pool.apply_async (not all calls need to use the same function).
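For instance, a single pool can run different functions through apply_async (a sketch with two hypothetical workers):

```python
import multiprocessing as mp

def double(x):
    return 2 * x

def negate(x):
    return -x

if __name__ == '__main__':
    with mp.Pool(2) as pool:
        # Each apply_async call may use a different function.
        r1 = pool.apply_async(double, (5,))
        r2 = pool.apply_async(negate, (5,))
        print(r1.get(), r2.get())  # 10 -5
```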

In contrast, Pool.map applies the same function to many arguments. However, unlike Pool.apply_async, the results are returned in an order corresponding to the order of the arguments.
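A quick sketch of that ordering guarantee (slow_square is a hypothetical worker; smaller inputs are made slower so completion order differs from call order):

```python
import multiprocessing as mp
import time

def slow_square(x):
    # Smaller inputs sleep longer, so they finish last...
    time.sleep((4 - x) * 0.1)
    return x * x

if __name__ == '__main__':
    with mp.Pool(4) as pool:
        # ...yet map still returns results in argument order.
        print(pool.map(slow_square, range(4)))  # [0, 1, 4, 9]
```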

Here is an overview in table form showing the differences between Pool.apply, Pool.apply_async, Pool.map, and Pool.map_async (plus the starmap variants). When choosing one, you have to take multi-args, concurrency, blocking, and ordering into account:

                      | Multi-args   Concurrence    Blocking     Ordered-results
    ----------------------------------------------------------------------------
    Pool.map          | no           yes            yes          yes
    Pool.map_async    | no           yes            no           yes
    Pool.apply        | yes          no             yes          no
    Pool.apply_async  | yes          yes            no           no
    Pool.starmap      | yes          yes            yes          yes
    Pool.starmap_async| yes          yes            no           no


  • Pool.imap and Pool.imap_unordered – lazier versions of Pool.map: imap yields results one at a time in input order, while imap_unordered yields them in whatever order they finish.

  • Pool.starmap is very much like map, except that it accepts multiple arguments: each element of the iterable is unpacked as the argument tuple.

  • The async methods submit all the jobs at once and return immediately; use the get() method on the returned AsyncResult to obtain the results once they are finished.

  • Pool.map (or Pool.apply) is very much like Python's built-in map (or apply). It blocks the main process until all submitted jobs complete, then returns the result.
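To illustrate the lazy imap variant mentioned above (cube is a hypothetical worker; results arrive one at a time, in input order, instead of as one finished list):

```python
import multiprocessing as mp

def cube(x):
    return x ** 3

if __name__ == '__main__':
    with mp.Pool(2) as pool:
        # imap returns an iterator: each result can be consumed as soon
        # as it is ready, instead of waiting for the whole list as map does.
        for value in pool.imap(cube, range(5)):
            print(value)
```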



pool.map: is called for a whole list of jobs at one time

    results = pool.map(func, [1, 2, 3])


pool.apply: can only be called for one job at a time, blocking on each call

    for x, y in [[1, 1], [2, 2]]:
        results.append(pool.apply(func, (x, y)))


pool.map_async: is called for a whole list of jobs at one time, without blocking

    def collect_result(result):
        results.append(result)

    pool.map_async(func, jobs, callback=collect_result)


pool.apply_async: can only be called for one job, and executes that job in the background in parallel

    for x, y in [[1, 1], [2, 2]]:
        pool.apply_async(worker, (x, y), callback=collect_result)


pool.starmap: a variant of pool.map which supports multiple arguments

pool.starmap(func, [(1, 1), (2, 1), (3, 1)])


pool.starmap_async: a combination of starmap() and map_async() that iterates over an iterable of iterables and calls func with the iterables unpacked; returns a result object.

pool.starmap_async(calculate_worker, [(1, 1), (2, 1), (3, 1)], callback=collect_result)
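The snippets above assume that pool, func, worker, and a results list already exist; a self-contained sketch (with a hypothetical two-argument add worker) could look like:

```python
import multiprocessing as mp

def add(x, y):
    # Hypothetical two-argument worker.
    return x + y

results = []

def collect_result(result):
    results.append(result)

if __name__ == '__main__':
    with mp.Pool(2) as pool:
        # starmap unpacks each tuple into add(x, y) and keeps input order.
        print(pool.starmap(add, [(1, 1), (2, 1), (3, 1)]))  # [2, 3, 4]
        # starmap_async returns immediately; the callback receives the
        # whole result list in one call once every job has finished.
        r = pool.starmap_async(add, [(1, 1), (2, 1)], callback=collect_result)
        r.wait()
    print(results)  # [[2, 3]]
```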


Find complete documentation in the Python multiprocessing module reference.

Regarding apply vs map:

pool.apply(f, args): f is only executed in ONE of the workers of the pool. So ONE of the processes in the pool will run f(args).

pool.map(f, iterable): this method chops the iterable into a number of chunks which it submits to the process pool as separate tasks. So you take advantage of all the processes in the pool.
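That chopping is controlled by map's optional chunksize parameter; a small sketch (identity is a hypothetical worker):

```python
import multiprocessing as mp

def identity(x):
    return x

if __name__ == '__main__':
    with mp.Pool(4) as pool:
        # chunksize controls how the iterable is chopped into tasks:
        # here 100 items are submitted as 25 chunks of 4, and the chunks
        # are distributed across all 4 worker processes.
        out = pool.map(identity, range(100), chunksize=4)
    print(out == list(range(100)))  # True -- order is still preserved
```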