
Long waiting (TTFB) time for scripts / styles on Azure Website


Although your static files are cached, the browser still issues requests with an If-Modified-Since header (which results in a 304).

While the browser doesn't need to download the actual content, it still has to wait out the RTT plus server think time before it can continue.
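To illustrate, the conditional request/response pair looks roughly like this (hypothetical URL and timestamp):

GET /scripts/app.js HTTP/1.1
Host: example.azurewebsites.net
If-Modified-Since: Tue, 15 Oct 2013 10:00:00 GMT

HTTP/1.1 304 Not Modified

The 304 carries no body, but the browser still pays a full round trip (plus server processing time) before it can use its cached copy.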

I would suggest two things:

  1. Adding Cache-Control and Expires headers - will help avoid the 304 in some cases (pretty much unless you hit F5); see the web.config sketch after this list.
  2. Using a proper CDN - such as Incapsula or others - which will minimize the RTT and think time. It can also be used to easily control cache settings for various resources.
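For the first suggestion, here is a minimal web.config sketch (the seven-day max-age is just an example value; pick whatever fits your release cycle):

<configuration>
  <system.webServer>
    <staticContent>
      <!-- Sends Cache-Control: max-age=604800 (7 days) on static files,
           so repeat visitors don't re-validate with If-Modified-Since -->
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>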

More good stuff here.

Good Luck!


From here:

As you saw earlier, IIS 7 caches the compressed versions of static files. So, if a request arrives for a static file whose compressed version is already in the cache, it doesn’t need to be compressed again.

But what if there is no compressed version in the cache? Will IIS 7 then compress the file right away and put it in the cache? The answer is yes, but only if the file is being requested frequently. By not compressing files that are only requested infrequently, IIS 7 saves CPU usage and cache space.

By default, a file is considered to be requested frequently if it is requested two or more times per 10 seconds.

So, the reason your users are being served an uncompressed version of the JavaScript file is that it didn't meet the default threshold for being compressed; in other words, the JavaScript file was not requested two times within 10 seconds.

To control this, there is one attribute we must change on the <serverRuntime> element, which controls compression: frequentHitThreshold. In order for your file to be compressed when it is requested once, change your <serverRuntime> element to look like this:

<serverRuntime enabled="true" frequentHitThreshold="1" />

This will have a slight CPU cost if you serve many JavaScript files and get requests fairly often - but if your traffic is frequent enough for that compression to impact CPU, those files have most likely already been compressed and cached anyway!
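For reference, here is a minimal sketch of where <serverRuntime> sits in web.config. (Caveat: on some IIS installations this section is locked at the applicationHost.config level and must be unlocked before a site-level override takes effect.)

<configuration>
  <system.webServer>
    <!-- Compress static files on their first request instead of waiting
         for them to cross the "frequent hit" threshold -->
    <serverRuntime enabled="true" frequentHitThreshold="1" />
  </system.webServer>
</configuration>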


My guess would be Azure's Always On feature.
If it works anything like the one CloudFlare provides, it essentially proxies the request and tries to cache it.
Depending on how exactly Azure implements this cache, it might wait for the script's output to complete in order to cache/validate it, and only then pass it on to the browser.

You might have some luck checking the caching configuration and disabling Always On for your scripts, if possible.