
Docker Tensorflow-Serving Predictions too large


The default message size limit in gRPC is 4 MB, but you can raise it on both the gRPC client and server. In Python, pass channel options as shown below; you will then be able to send and receive large messages without streaming.

channel = grpc.insecure_channel('localhost:6060', options=[
    ('grpc.max_send_message_length', MAX_MESSAGE_LENGTH),
    ('grpc.max_receive_message_length', MAX_MESSAGE_LENGTH)])
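As a minimal sketch of how this fits into a full prediction call, assuming a TensorFlow Serving container exposing its gRPC port on localhost:6060, a model named 'model' with the 'serving_default' signature and an input tensor key 'input' (all of these names are assumptions, not taken from the question):

import grpc
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

MAX_MESSAGE_LENGTH = 100 * 1024 * 1024  # 100 MB; pick a limit that fits your payload

# Open a channel whose send/receive limits are larger than the 4 MB default.
channel = grpc.insecure_channel(
    'localhost:6060',
    options=[('grpc.max_send_message_length', MAX_MESSAGE_LENGTH),
             ('grpc.max_receive_message_length', MAX_MESSAGE_LENGTH)])
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

# Build the PredictRequest; the model/signature/input names are placeholders.
request = predict_pb2.PredictRequest()
request.model_spec.name = 'model'
request.model_spec.signature_name = 'serving_default'
request.inputs['input'].CopyFrom(
    tf.make_tensor_proto(np.zeros((1, 1024, 1024, 3), dtype=np.float32)))

# With the raised limits the large response comes back in one piece instead
# of failing with RESOURCE_EXHAUSTED.
response = stub.Predict(request, timeout=30.0)
print(response.outputs)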

In Go, the corresponding server-side and client-side options are MaxMsgSize and WithMaxMsgSize; see:

https://godoc.org/google.golang.org/grpc#MaxMsgSize
https://godoc.org/google.golang.org/grpc#WithMaxMsgSize


In C++, the client's maximum receive message size can be set to unlimited (-1) on the channel as shown below:

grpc::ChannelArguments ch_args;
ch_args.SetMaxReceiveMessageSize(-1);
std::shared_ptr<grpc::Channel> ch = grpc::CreateCustomChannel(
    "localhost:6060", grpc::InsecureChannelCredentials(), ch_args);