
Improving Web Application Performance - Part Three: Client-Side Considerations

Posted on May 30, 2014


In the first two parts of this series, we covered the importance of defining metrics before starting any performance optimization work and of identifying any user- or role-based issues. In this part, we dive into the optimization itself. I will start by introducing key concepts to look for, and then we will review the tools of the trade that can help identify trouble spots. It is important to note that the information in this post applies equally to ALL server-side stacks, whether you use .NET, PHP, Java, or anything else.

Understanding Client-Side Operations

Before we can optimize our applications for the best client-side performance, it is important to know exactly what can impact it. The following are three of the most important behaviors to understand, and they will help as we move into the key items to look for when optimizing performance.

HTML Rendering Order

When looking at the performance of a web application, remember that the first request, the one that returns the HTML to be rendered, is the first critical point in the process. It has to reach the client BEFORE any further operations can be performed. We want this response to be as fast as possible, BUT don't let yourself get caught up on it: a client-side issue can still exist even when this initial response arrives in under 10 milliseconds.

Simultaneous Download Limits

Once a client's browser has downloaded the HTML to be displayed, the document is processed from the top down, and external resources are loaded as they are encountered. This means that CSS, JavaScript, images, and the like will be downloaded. This process is automatic, and the web browser does everything it can to optimize it. However, it is important to know that a browser will only download between 4 and 8 items from the same server at the same time. This means that if your application needs to download 12 external assets, at least 4 of them will have to wait for others to complete BEFORE they can even start downloading.
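To see why this limit matters, here is a small, simplified model (not browser code, and the timings are hypothetical) that computes how long a page's assets take to finish when at most a fixed number of downloads from one host can run in parallel:

```javascript
// Simplified model: each asset takes `t` ms to download, and at most
// `limit` downloads from the same host can run in parallel. Returns the
// total elapsed time until every asset has finished downloading.
function waterfallTime(assetTimes, limit) {
  // One "slot" per allowed connection; each slot tracks when it frees up.
  const slots = new Array(limit).fill(0);
  for (const t of assetTimes) {
    // The next asset starts on whichever connection frees up first.
    const i = slots.indexOf(Math.min(...slots));
    slots[i] += t;
  }
  return Math.max(...slots);
}

// 12 assets at 100 ms each over 4 connections: three "rounds" of downloads.
console.log(waterfallTime(Array(12).fill(100), 4)); // 300
// The same 12 assets over 8 connections finish in two rounds.
console.log(waterfallTime(Array(12).fill(100), 8)); // 200
```

Even in this idealized model, the per-host connection limit, not the size of any single asset, sets the floor on total load time once the request count grows.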

Blocking or Long-Running JavaScript

As discussed in the previous section, HTML is processed from the top down, which means that large JavaScript includes can block later items from loading. Often you will want to either move the JS to the bottom of the page or load it asynchronously after the page has loaded. (See Google Analytics for an example.) Limiting the JS that loads at the beginning, and/or limiting the size of the included JS files, will often help.
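Both approaches can be sketched in markup; the file names here are placeholders:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Stylesheets stay in the head; CSS is needed to render the page. -->
  <link rel="stylesheet" href="site.css" />
  <!-- `async` lets the HTML parser continue while the script downloads,
       which is how Google Analytics-style snippets avoid blocking. -->
  <script src="analytics.js" async></script>
</head>
<body>
  <p>Page content renders without waiting on the script below.</p>
  <!-- Alternatively, place large scripts just before the closing body tag
       so the visible content is parsed and rendered first. -->
  <script src="app.js"></script>
</body>
</html>
```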

Digging In! Finding & Fixing Problems

Now that we understand, at a high level, some of the aspects that can impact performance, we can discuss a methodical approach to researching these issues and the tools that can help. The following steps are presented in order but do not necessarily need to be followed that way. However, for a clean start on a site, it is often easier to work in this order, as problems tend to bubble up!

Identifying Excessive Number or Sizes of HTTP Requests

The first place to start with client-side optimization is to look for cases where your application makes an excessive number of HTTP requests, or where its total download size is larger than it should be. One key point to consider before we discuss the tools available to help: there is no "set" perfect number of HTTP requests per page load; popular sites range anywhere from fewer than 20 requests to more than 500. Therefore, remember that just because you have requests, they are not necessarily an issue.

There are two easy ways to start investigating these types of issues. Google PageSpeed can be used to identify larger assets, such as images that should be resized or converted to a different format to save space. A perfect example of a potential find here is an image that is 3 MB in size yet displayed at 100 x 100 pixels to the user. This image can be resized and reduced to less than 100 KB, improving overall performance, reducing bandwidth, and making your application better all around.

The next step is to use a tool that tracks HTTP requests from your machine to get a full picture of what is going on at a particular moment in time. My preference for this is Fiddler, a free client-side application that listens to all outbound HTTP requests. If enough demand is expressed, I will write a post that covers the detailed operation of this tool. The key with Fiddler is its timeline view of multiple HTTP requests, which lets you see the assets that are taking a long time to download and blocking the page for users.

Limiting Small Image Requests

One of the most common issues identified when reviewing individual HTTP requests is an application using 20-30 different small images for design-related elements. One way to improve performance here is to move from individual images to a CSS sprite-based layout. This involves combining the images into a single larger image and then using CSS to show only the particular asset as needed. The result is a single HTTP request rather than a bunch of little ones. The actual implementation is a job for the designer and outside the scope of this post.
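The general shape of the technique looks like this; the sprite file, class names, and offsets are all hypothetical:

```css
/* One sprite sheet containing all of the small icons (placeholder file). */
.icon {
  background-image: url("sprites.png");
  width: 16px;
  height: 16px;
  display: inline-block;
}

/* Each icon selects its region of the sheet by shifting the
   background position; only one HTTP request is made for sprites.png. */
.icon-search { background-position: 0 0; }
.icon-cart   { background-position: -16px 0; }
.icon-user   { background-position: -32px 0; }
```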

Content Delivery Networks (CDNs) Can Help

If you look at your application and find that you cannot reduce the number of items that need to be downloaded, it might be worth seeing whether a CDN can deliver some of the assets currently served from your own server. That way you can have 4-8 items coming from the CDN and 4-8 items coming from your web server at the exact same moment, which can improve relative performance greatly.
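The change is simply a matter of pointing some asset URLs at a second host, since the per-host connection limit applies to each host separately. The host names below are placeholders:

```html
<!-- Assets on the CDN host download in parallel with assets from your
     own server (cdn.example.com and www.example.com are placeholders). -->
<link rel="stylesheet" href="//cdn.example.com/css/site.css" />
<script src="//cdn.example.com/js/jquery.min.js"></script>
<img src="//www.example.com/images/header.png" alt="Header" />
```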

Static File Expiration

Another item that both Fiddler and Google PageSpeed will surface is static file assets missing a cache-time definition. In these cases, the files must be downloaded on every request, and the cache on the user's local machine cannot be used to speed up page loading. Google PageSpeed will call attention to this, and Fiddler will show the missing cache-expiration value. The fix is most likely very simple; for ASP.NET it is as easy as adding three lines to the web.config, and it can result in a major improvement for subsequent page loads while greatly reducing the load on your web server.
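For an ASP.NET application running on IIS, those three lines look roughly like this (the 7-day max-age is an example value, not a recommendation):

```xml
<!-- web.config: tell IIS to send a Cache-Control max-age header
     for static files so browsers can cache them locally. -->
<system.webServer>
  <staticContent>
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
  </staticContent>
</system.webServer>
```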


With a better understanding of how HTML pages are loaded, we took a look at a few key areas of potential performance impact and discussed options for mitigating those issues. I might do a follow-up post with details on how to use Fiddler if there is interest. Feel free to share your comments below!