
Asynchronous and synchronous HTTP request on server side, performance comparison


The following are my thoughts.

Whether a request is handled synchronously or asynchronously has nothing to do with the performance of HTTP itself; it relates to your application's performance.

A synchronous request blocks the application until it receives the response, whereas with an asynchronous request you hand the work off to a separate worker thread that takes care of the rest. In an asynchronous design, your main thread can therefore continue its own work.
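As a minimal sketch of that idea (the class and task here are illustrative, not from the question's code), handing work to a worker thread via an `ExecutorService` looks like this:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class AsyncSketch {
    static String runTask() throws Exception {
        ExecutorService worker = Executors.newSingleThreadExecutor();

        // Hand the expensive work off to a worker thread...
        Future<String> result = worker.submit(() -> {
            Thread.sleep(100); // stand-in for a slow operation
            return "done";
        });

        // ...the calling thread is free to do other work here.

        String value = result.get(); // block only when the result is needed
        worker.shutdown();
        return value;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runTask());
    }
}
```

The main thread only blocks at `result.get()`, i.e. at the point where the result is actually needed, not for the whole duration of the task.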

Let's say that, due to some limitation (not a resource limitation of the server), your server can handle only a limited number of connections (each connection is typically handled in a separate thread, though this differs between servers). If your server can handle more threads than connections, and if you don't need to return any data as a result of the async work you created, then an asynchronous design makes sense, because you create a new thread to handle the requested task.

But if you expect the results of the operation to be returned in the response, nothing will differ.


You are using @Suspended combined with async, which still waits for the response.

@Suspended will pause/suspend the current thread until it gets a response.

If you want better performance with async, write a different async method that returns an immediate response, using an ExecutorService and a Future:

private ExecutorService executor;
private Future&lt;String&gt; futureResult;

@PostConstruct
public void onCreate() {
    this.executor = Executors.newSingleThreadExecutor();
}

@POST
public Response startTask() {
    futureResult = executor.submit(new ExpensiveTask());
    return Response.status(Status.ACCEPTED).build();
}

@GET
public Response getResult() throws ExecutionException, InterruptedException {
    if (futureResult != null && futureResult.isDone()) {
        return Response.status(Status.OK).entity(futureResult.get()).build();
    } else {
        return Response.status(Status.FORBIDDEN).entity("Try later").build();
    }
}


Let's consider the following scenario:

Single Backend system

                        ____________
                       |  System A  |
     HTTP Request -->  |            |
                       |  1.        |
                       |  2.        |
     HTTP Response <-- |            |
                       |____________|

You have one backend system which does some processing based on the request received on a particular order ( operation 1 and then operation 2 ). If you process the request synchronously or asynchronously doesn't really matter, it's the same amount of computation that needs to be done ( maybe some slight variations like you have encountered in your test ).

Now, let's consider a multi-backend scenario:

Multi-Backend System

                        ____________
                       |  System A  |       __________
     HTTP Request -->  |            | -->  |          |
                       |  1.        |      | System B |
                       |            | <--  |__________|
                       |            |       __________
                       |  2.        | -->  |          |
     HTTP Response <-- |            |      | System C |
                       |____________| <--  |__________|

Still, two processing steps are required, but this time each step calls another backend system.

SYNC processing:

  1. Call System B
  2. Wait for a response from System B
  3. Call System C
  4. Wait for a response from System C

Total time spent: B + C
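The synchronous sequence above can be sketched with two simulated backend calls; the sleep durations (100 ms and 150 ms) stand in for the response times of System B and System C, and all names here are illustrative:

```java
public class SyncTiming {
    // Simulated blocking backend call, like waiting on a synchronous HTTP response.
    static void callBackend(long millis) throws InterruptedException {
        Thread.sleep(millis);
    }

    static long elapsedSync() throws InterruptedException {
        long start = System.currentTimeMillis();
        callBackend(100); // call System B, wait for its response
        callBackend(150); // call System C, wait for its response
        return System.currentTimeMillis() - start;
    }

    public static void main(String[] args) throws InterruptedException {
        // Sequential waits add up: total is at least B + C = 250 ms.
        System.out.println("sync elapsed ms: " + elapsedSync());
    }
}
```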

ASYNC processing:

  1. Call System B
  2. Go forward since the call is not blocking
  3. Call System C
  4. Go forward since the call is not blocking
  5. Receive a response from System B
  6. Receive a response from System C
  7. Complete the call to the client

Total time spent: max(B, C)

Why max? Since all the calls are non-blocking, you only have to wait for the slowest backend to reply.
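The asynchronous sequence can be sketched the same way using CompletableFuture; the durations again stand in for B (100 ms) and C (150 ms), and the names are illustrative. Both calls start before either response is awaited, so the total is roughly max(B, C):

```java
import java.util.concurrent.CompletableFuture;

public class AsyncTiming {
    // Simulated backend call that runs asynchronously on a pool thread.
    static CompletableFuture<Void> callBackend(long millis) {
        return CompletableFuture.runAsync(() -> {
            try {
                Thread.sleep(millis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
    }

    static long elapsedAsync() {
        long start = System.currentTimeMillis();
        CompletableFuture<Void> b = callBackend(100); // call System B, don't block
        CompletableFuture<Void> c = callBackend(150); // call System C, don't block
        // Both calls run concurrently; wait until the slower one finishes.
        CompletableFuture.allOf(b, c).join();
        return System.currentTimeMillis() - start;
    }

    public static void main(String[] args) {
        // Total is roughly max(B, C) = 150 ms, well below B + C = 250 ms.
        System.out.println("async elapsed ms: " + elapsedAsync());
    }
}
```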