
Keras + Tensorflow and Multiprocessing in Python


In my experience, the problem lies in loading Keras into one process and then spawning a new process after Keras has already been loaded into your main environment. But for some applications (e.g. training a mixture of Keras models) it's simply better to have all of these things in one process. So what I advise is the following (a little cumbersome, but working for me) approach:

  1. DO NOT LOAD KERAS INTO YOUR MAIN ENVIRONMENT. If you want to load Keras / Theano / TensorFlow, do it only inside the function environment. E.g. don't do this:

    import keras

    def training_function(...):
        ...

    but do the following:

    def training_function(...):
        import keras
        ...
  2. Run the work connected with each model in a separate process: I usually create workers which do the job (e.g. training, tuning, scoring) and run them in separate processes. What is nice about it is that the whole memory used by such a process is completely freed when the process is done. This helps with the load of memory problems you usually come across when using multiprocessing, or even when running multiple models in one process. So this looks e.g. like this:

    def _training_worker(train_params):
        import keras
        model = obtain_model(train_params)
        model.fit(train_params)
        send_message_to_main_process(...)

    def train_new_model(train_params):
        training_process = multiprocessing.Process(target=_training_worker,
                                                   args=(train_params,))
        training_process.start()
        get_message_from_training_process(...)
        training_process.join()
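
For reference, a minimal self-contained version of this pattern, using a `multiprocessing.Queue` in place of the pseudocode message helpers (the trivial "loss" computation is a stand-in for real Keras training, so the sketch runs without Keras installed):

```python
import multiprocessing

def _training_worker(train_params, queue):
    # In real code you would `import keras` here, build the model from
    # train_params, fit it, and send back metrics. A trivial computation
    # stands in for training so the sketch runs without Keras.
    result = {"loss": sum(train_params["data"]) / len(train_params["data"])}
    queue.put(result)  # report back to the main process

def train_new_model(train_params):
    queue = multiprocessing.Queue()
    p = multiprocessing.Process(target=_training_worker,
                                args=(train_params, queue))
    p.start()
    result = queue.get()  # blocks until the worker sends its message
    p.join()              # worker exits; all its memory is released
    return result

if __name__ == "__main__":
    print(train_new_model({"data": [1.0, 2.0, 3.0]}))
```

When `train_new_model` returns, the child process has exited, so any memory it allocated (including whatever a framework loaded) is given back to the OS.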

A different approach is simply to prepare separate scripts for the different model actions. But this may cause memory errors, especially when your models are memory-consuming. NOTE that for this reason it's better to make their execution strictly sequential.
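
A small sketch of that sequential-scripts idea: the inline `print` snippets below are stand-ins for real per-action script files (e.g. a training script and a scoring script, each doing `import keras` and one job before exiting):

```python
import subprocess
import sys

# Stand-ins for separate model scripts; in practice these would be files,
# each of which imports Keras, performs one action, and exits, so all of
# its memory is released between steps.
steps = [
    "print('training done')",
    "print('scoring done')",
]

results = []
for code in steps:
    # Each step runs in a fresh interpreter, and subprocess.run blocks
    # until it finishes, so the steps execute strictly one after another.
    results.append(subprocess.run([sys.executable, "-c", code], check=True))
```

`check=True` makes the loop raise if any step fails, which stops later steps from running against a half-finished pipeline.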


I created a simple example showing how to run a Keras model in multiple processes with multiple GPUs. I hope this sample can help you. https://github.com/yuanyuanli85/Keras-Multiple-Process-Prediction


I created a decorator that fixed my code.

from multiprocessing import Pipe, Process

def child_process(func):
    """Makes the function run as a separate process."""
    def wrapper(*args, **kwargs):
        def worker(conn, func, args, kwargs):
            conn.send(func(*args, **kwargs))
            conn.close()
        parent_conn, child_conn = Pipe()
        p = Process(target=worker, args=(child_conn, func, args, kwargs))
        p.start()
        ret = parent_conn.recv()
        p.join()
        return ret
    return wrapper

@child_process
def keras_stuff():
    """ Keras stuff here"""
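
To show the decorator actually returning a value from the child process, here is a self-contained sketch where a trivial `square` function stands in for the Keras work:

```python
from multiprocessing import Pipe, Process

def child_process(func):
    """Runs the decorated function in a separate process, returns its result."""
    def wrapper(*args, **kwargs):
        def worker(conn, func, args, kwargs):
            conn.send(func(*args, **kwargs))
            conn.close()
        parent_conn, child_conn = Pipe()
        p = Process(target=worker, args=(child_conn, func, args, kwargs))
        p.start()
        ret = parent_conn.recv()  # blocks until the child sends its result
        p.join()
        return ret
    return wrapper

@child_process
def square(x):
    # Stand-in for real Keras work; runs in a fresh process on each call.
    return x * x

if __name__ == "__main__":
    print(square(4))  # computed in a child process
```

Each call spawns a fresh process, so whatever the function loads (Keras included) is released when the call returns.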