Recently, I took on the task of improving the performance of my application and shortlisted the probable causes. A few of them were:
- Large bundle size of the application.
- Incorrect handling of redirection logic, i.e. not using in-app redirection.
- Multiple third-party scripts delaying rendering.
- A high number of concurrent browser requests to download resources.
The first three could still be handled, but what about the last one? Since the API calls were also slow, I concluded that with so many concurrent requests, some resource downloads were probably getting delayed. But then I discovered that a single protocol version change could save me the extra effort of working around this. Let’s see how.
Below is an example of the resource timing in the Chrome Network tab.
The figures I found worrisome were the stalled and queueing timings: the request was stalled/blocked for around 200 ms before it was actually sent. According to Understanding Resource Timing, the blocking or queueing time can come from any of the following:
- The request was postponed by the rendering engine because it’s considered lower priority than critical resources (such as scripts/styles). This often happens with images.
- The request was put on hold to wait for an unavailable TCP socket that’s about to free up.
- The request was put on hold because the browser only allows six TCP connections per origin on HTTP 1.
- Time spent making disk cache entries (typically very quick).
Wait! What’s that third point? Do we need to redo our logic to minimize the number of concurrent requests?
Well, yes, ideally we should reduce the number of browser requests, as most SEO site-checkup tools recommend having no more than 20 external requests on a page.
But where does this limitation come from? Browsers limit the number of HTTP connections they will open; see Browser Connection Limitation for reference. There are multiple reasons behind it.
How does this slow the page down?
For example, if I send seven requests to the same host at once, the seventh request may be blocked until one of the six connections frees up. The page therefore may not get all the resources it needs in time, and that stalled time gets added to your loading time.
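The effect above can be sketched with a toy scheduling model. This is only an illustration under simplified assumptions (every request takes the same 200 ms, and the browser fills connections greedily — real browsers also apply request prioritization):

```javascript
// Toy model: schedule N equal-length requests onto a fixed number of
// per-origin connections and report when each request can start.
function startTimes(requestCount, maxConnections, requestDuration) {
  // Each slot holds the time at which that connection becomes free.
  const slots = new Array(maxConnections).fill(0);
  const starts = [];
  for (let i = 0; i < requestCount; i++) {
    // Pick the connection that frees up earliest.
    const earliest = slots.indexOf(Math.min(...slots));
    starts.push(slots[earliest]);       // the request stalls until this moment
    slots[earliest] += requestDuration; // that connection is busy again
  }
  return starts;
}

// 7 requests, 6 connections per origin, 200 ms each:
console.log(startTimes(7, 6, 200));
// The first six start at 0 ms; the seventh is stalled until 200 ms.
```

With six connections, the seventh request inherits the full duration of an earlier one as pure stalled time — exactly the ~200 ms visible in the Network tab.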
Let’s see what we can do to tackle this.
- As many sites do, we can serve different kinds of resources from different hosts, e.g. apis.example.com for server APIs, style.example.com for CSS resources, console-static.example.com for other static assets. This helps because the connection limit applies per origin, so using different hosts raises the total number of connections the browser can open.
- Reduce the number of requests using techniques like image or CSS sprites.
- Keep a balance between the front end and back end to optimize the browsing experience.
- Carefully choose only the API calls that are most needed in our application.
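As an illustration of reducing the request count, one common pattern is batching several small API calls into one. The endpoint shape and helper below are hypothetical, not part of any real API:

```javascript
// Hypothetical batching helper: instead of one fetch per id (seven
// requests competing for six connection slots), collect the ids and
// issue a single request to an assumed batch endpoint.
function buildBatchUrl(baseUrl, ids) {
  const unique = [...new Set(ids)]; // drop duplicate ids, keep order
  return `${baseUrl}?ids=${unique.join(",")}`;
}

console.log(buildBatchUrl("/api/items", [3, 1, 3, 2]));
// -> "/api/items?ids=3,1,2"
```

One batched call occupies a single connection slot, leaving the rest free for scripts, styles, and images.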
But can we avoid the above problem without making host changes, code changes, or logic changes?
Before jumping to the solution, let’s look at one more field in the Network requests section of Chrome: the Protocol column. It tells you which protocol each browser request used.
The limitations quoted above apply to the HTTP protocol versions <= 1.x.
In the original HTTP 1 protocol, every request needed its own TCP connection, repeating the handshake mechanism each time, which delayed requests and limited how many could run at once.
HTTP 1.1 improved on this by allowing connections to be kept alive and reused, but requests on a connection were still served first-in, first-out, so a slow response could block the ones queued behind it.
HTTP 2 brings a new binary framing layer and overcomes the performance limitations of the earlier protocols by providing the following:
- Single connection per origin => multiple requests are multiplexed over a single connection, so parallel requests are served without blocking.
- Server push => the server can send multiple responses for a single request. For example, if you know other resources will be needed along with a request, you can push them to the client without any extra call, improving performance.
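As a concrete (and version-dependent) illustration of server push: nginx exposed it through the `http2_push` directive, available from version 1.13.9 up to 1.25.0, after which it was removed in favor of mechanisms like preload hints. The paths below are placeholders:

```nginx
# Sketch only: push the stylesheet and script alongside the page.
# Requires nginx 1.13.9 - 1.25.0; removed in later versions.
location = /index.html {
    http2_push /styles/app.css;
    http2_push /scripts/app.js;
}
```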
The above was just an overview of HTTP2; getting into it deeply is beyond the scope of this article. You can read more on Google’s developer site on HTTP2.
Using the HTTP2 protocol can save us the extra effort otherwise needed to work around these networking limits and improve our applications’ performance.
What is needed to start using the HTTP2 protocol?
- From the UI side
Nothing needs to be done, since modern browser versions already support it. So as long as your browser is up to date, we are ready on the browser side to start using the protocol.
- From the server side
What if the browser says it can use HTTP2, but the server says it cannot? As we know, the protocol to use is decided during the handshake between the client and the server, so our server also needs to support the HTTP2 protocol.
In my case it was simple, as I was using nginx to serve my application’s resources. Just adding the line below did the trick:
listen 443 ssl http2;
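For context, that directive sits inside a server block. A minimal sketch, assuming nginx >= 1.9.5 with placeholder certificate paths and server name (browsers only speak HTTP/2 over TLS, so the `ssl` part is required in practice):

```nginx
server {
    listen 443 ssl http2;
    server_name example.com;

    # Placeholder certificate paths -- substitute your own.
    ssl_certificate     /etc/ssl/certs/example.com.crt;
    ssl_certificate_key /etc/ssl/private/example.com.key;

    root /var/www/app;
}
```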
And it’s done!
My browser is up to date and the server listens for the HTTP2 protocol, so I can now defer the task of reducing the number of calls for later.
No more blocking, no more delays.
Beyond the general point that we should keep our applications up to date with the latest technologies, bumping the protocol version to at least HTTP2 gives us some extra performance improvements for free: at minimum, our HTML, scripts, and CSS can be downloaded in parallel. Thanks, HTTP2, for taking care of this internally.
I hope this helps. Thanks for reading. Please add your feedback below and help me make it more useful.