How to serve a Keras model for inference using Flask?
I was facing the same problem; this is a Keras issue. It primarily seems to be triggered when there is an asynchronous event handler. Adding model._make_predict_function() right after loading the trained model worked for me. For example:

```python
from keras.models import load_model

model = load_model('yolo.h5')
model._make_predict_function()  # build the predict function eagerly, before any request arrives
```
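Since the question is about Flask, here is a minimal sketch of the serving side. The route name, the JSON payload shape, and the `fake_predict` stand-in are assumptions for illustration; in a real app you would load the model once at startup as shown above and call `model.predict()` inside the handler.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In a real app, load the model once at import time:
#   from keras.models import load_model
#   model = load_model('yolo.h5')
#   model._make_predict_function()

def fake_predict(data):
    # Stand-in for model.predict() so this sketch runs without Keras installed.
    return [len(data)]

@app.route('/predict', methods=['POST'])
def predict():
    payload = request.get_json(force=True)
    result = fake_predict(payload['inputs'])
    return jsonify({'prediction': result})
```

Loading the model once at startup (rather than per request) matters both for latency and because `_make_predict_function()` must run before Flask starts handling requests on worker threads.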
Another approach that has worked for other people is to grab the TensorFlow graph and run inference within its context, something like:

```python
import tensorflow as tf

global graph
graph = tf.get_default_graph()

with graph.as_default():
    res = model.predict(x)  # x is the input batch
```
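The idea behind this workaround is that Flask serves each request on its own thread, while the TensorFlow 1.x graph is only the default on the thread that loaded the model; capturing it once and re-entering it per request fixes that. A sketch of the pattern is below; `contextlib.nullcontext` stands in for the captured graph so the sketch runs without TensorFlow installed, and `run_inference` is a hypothetical helper.

```python
from contextlib import nullcontext

# In a real app (TF 1.x API), capture the graph right after loading the model:
#   import tensorflow as tf
#   graph = tf.get_default_graph()
# A no-op context manager stands in here so the sketch is self-contained.
graph = nullcontext()

def run_inference(predict_fn, batch):
    # Re-enter the graph that was current at load time before predicting.
    # Real code:  with graph.as_default():
    with graph:
        return predict_fn(batch)
```

A usage example with a dummy predict function: `run_inference(lambda b: [v * 2 for v in b], [1, 2])` returns `[2, 4]`.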
For more insights, please refer to the links below: