How to make worker threads quit after work is finished in a multithreaded producer-consumer pattern?



Don't signal the stop by treating one task value as a special case.

Use a threading.Event instead, and give the workers' get a timeout so they don't block indefinitely and can keep checking the event:

    stopping = threading.Event()

    def worker(n, q, timeout=1):
        # run until the master thread indicates we're done
        while not stopping.is_set():
            try:
                # don't block indefinitely so we can return to the top
                # of the loop and check the stopping event
                data = q.get(True, timeout)
            # raised by q.get if we reach the timeout on an empty queue
            except queue.Empty:
                continue
            q.task_done()

    def master():
        ...
        print 'waiting for workers to finish'
        q.join()
        stopping.set()
        print 'done'
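For reference, here is a minimal, runnable Python 3 sketch of the same pattern (lowercase queue module, print() as a function); the body of master(), the worker count and the workload are illustrative assumptions, since the answer above elides them:

    import queue
    import threading

    stopping = threading.Event()

    def worker(n, q, timeout=1):
        # Run until the master thread signals that all work has been handed out.
        while not stopping.is_set():
            try:
                # Use a timeout so we regularly return to the top of the loop
                # and re-check the stopping event instead of blocking forever.
                data = q.get(True, timeout)
            except queue.Empty:
                continue
            print('worker', n, 'got', data)
            q.task_done()

    def master():
        q = queue.Queue()
        # Illustrative: three workers, ten items of work.
        threads = [threading.Thread(target=worker, args=(i, q)) for i in range(3)]
        for t in threads:
            t.start()
        for i in range(10):
            q.put(i)
        print('waiting for workers to finish')
        q.join()        # returns once every item has been marked task_done()
        stopping.set()  # now tell the idle workers to exit their loops
        for t in threads:
            t.join()
        print('done')

    master()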


Can the method of sending a single exit indicator for all threads (as explained in the second comment of https://stackoverflow.com/a/19369877/1175080 by Martin James) even work?

As you have noticed, it can't work: passing the indicator along means the last exiting thread puts one more item on the queue, and since every q.put() adds an unfinished task that nobody marks done, q.join() ends up waiting on a queue that is never fully processed, at least not with the code you have.
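A quick way to see this: queue.join() waits for the queue's internal unfinished-task counter to drop to zero, and the requeued indicator always leaves it at one. A minimal Python 3 sketch of that bookkeeping (unfinished_tasks is a CPython implementation detail, read here only for illustration):

    import queue

    q = queue.Queue()
    q.put(-1)       # master enqueues the single exit indicator -> 1 unfinished task

    data = q.get()  # a worker takes the indicator
    q.task_done()   # ...and accounts for it                    -> 0 unfinished tasks
    q.put(data)     # ...but requeues it for the other workers  -> 1 unfinished task

    # Every worker repeats this, including the last one to exit, so the counter
    # never returns to zero and the q.join() in master() blocks forever.
    print(q.unfinished_tasks)  # 1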

If the answer to the previous question is "No", is there a way to solve the problem in a way that I don't have to send a separate exit indicator for each worker thread?

You can join the threads instead of the queue:

    import Queue
    import threading
    import time

    def worker(n, q):
        # n - Worker ID
        # q - Queue from which to receive data
        while True:
            data = q.get()
            print 'worker', n, 'got', data
            time.sleep(1)  # Simulate noticeable data processing time
            q.task_done()
            if data == -1: # -1 is used to indicate that the worker should stop
                # Requeue the exit indicator.
                q.put(-1)
                # Commit suicide.
                print 'worker', n, 'is exiting'
                break

    def master():
        # master() sends data to worker() via q.
        q = Queue.Queue()
        # Create 3 workers.
        threads = [threading.Thread(target=worker, args=(i, q)) for i in range(3)]
        for t in threads:
            t.start()
        # Send 10 items to work on.
        for i in range(10):
            q.put(i)
            time.sleep(0.5)
        # Send an exit indicator for all threads to consume.
        q.put(-1)
        print 'waiting for workers to finish ...'
        for t in threads:
            t.join()
        print 'done'

    master()
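Note that the worker above treats -1 like ordinary data, printing it and sleeping on it for a second, before checking whether it is the exit indicator; a slightly tidier variant, sketched below in Python 3 syntax, checks right after get() and otherwise follows the same requeue-and-break idea:

    import time

    def worker(n, q):
        # n - Worker ID
        # q - queue from which to receive data
        while True:
            data = q.get()
            if data == -1:
                # Exit indicator: pass it on to the next worker and stop.
                q.put(-1)
                q.task_done()
                print('worker', n, 'is exiting')
                break
            print('worker', n, 'got', data)
            time.sleep(1)  # Simulate noticeable data processing time
            q.task_done()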

As the Queue documentation explains, the get method raises an exception once the queue is empty, so if you already know all the data to process, you can fill the queue first and then spawn the threads:

    import Queue
    import threading
    import time

    def worker(n, q):
        # n - Worker ID
        # q - Queue from which to receive data
        while True:
            try:
                data = q.get(block=False, timeout=1)
                print 'worker', n, 'got', data
                time.sleep(1)  # Simulate noticeable data processing time
                q.task_done()
            except Queue.Empty:
                break

    def master():
        # master() sends data to worker() via q.
        q = Queue.Queue()
        # Send 10 items to work on.
        for i in range(10):
            q.put(i)
        # Create 3 workers.
        for i in range(3):
            t = threading.Thread(target=worker, args=(i, q))
            t.start()
        print 'waiting for workers to finish ...'
        q.join()
        print 'done'

    master()

Here you have a live example
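When all the work is known up front like this, the same fill-then-consume idea can also be expressed with the standard library's thread pool, which handles starting the workers and shutting them down once the work runs out. A Python 3 sketch for comparison; the process() function and the workload are illustrative:

    import time
    from concurrent.futures import ThreadPoolExecutor

    def process(data):
        time.sleep(1)      # Simulate noticeable data processing time
        return data * 2    # Illustrative result

    def master():
        items = range(10)  # The work is known before any worker starts.
        with ThreadPoolExecutor(max_workers=3) as pool:
            print('waiting for workers to finish ...')
            for result in pool.map(process, items):
                print('got result', result)
        # Leaving the with-block waits for all items to be processed and
        # shuts the pool's threads down automatically.
        print('done')

    master()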


Just for completeness' sake: you could also enqueue a stop signal equal to -(thread count). Each thread then increments it by one and requeues it only if the incremented value is != 0.

    if data < 0: # negative numbers are used to indicate that the worker should stop
        if data < -1:
            q.put(data + 1)
        # Commit suicide.
        print 'worker', n, 'is exiting'
        break
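Wired into a complete worker/master pair, the counting stop signal looks roughly like this; a Python 3 sketch in which the worker count, the workload and the names are illustrative:

    import queue
    import threading
    import time

    NUM_WORKERS = 3

    def worker(n, q):
        # n - Worker ID
        # q - Queue from which to receive data
        while True:
            data = q.get()
            if data < 0:
                # Negative numbers indicate that the worker should stop.
                if data < -1:
                    # Pass the incremented stop signal on to the next worker.
                    q.put(data + 1)
                q.task_done()
                print('worker', n, 'is exiting')
                break
            print('worker', n, 'got', data)
            time.sleep(1)  # Simulate noticeable data processing time
            q.task_done()

    def master():
        q = queue.Queue()
        threads = [threading.Thread(target=worker, args=(i, q))
                   for i in range(NUM_WORKERS)]
        for t in threads:
            t.start()
        for i in range(10):
            q.put(i)
        # One stop signal for all workers: -(thread count), incremented
        # by each worker as it travels through the queue.
        q.put(-NUM_WORKERS)
        print('waiting for workers to finish ...')
        for t in threads:
            t.join()
        print('done')

    master()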

But I'd personally go with Travis Mehlinger's or Daniel Sanchez's answer.