AJAX and Network Performance: A Cautionary Tale

Posted by Keith McMillan

April 14, 2008 | Leave a Comment

In recent years, the web has seen a proliferation of Web 2.0, or rich client, applications such as Google Docs. These browser-based applications provide a dynamic user interface using a technology called AJaX, for Asynchronous JavaScript and XML. The user experience for these applications can rival traditional fat client applications while, at the same time, not needing to be installed and upgraded, since they're downloaded to the browser as needed.

One of the great promises of these AJaX technologies is increased application performance. In reality, this isn't always the case: if an application makes lots of small requests, its performance can be significantly worse.

Traditional web applications download entire pages when you click a link or submit a form. The smarter ones download IFrames or DIVs on the page, but they still download relatively large chunks of HTML in the process of doing their work. Each time a form is submitted or a link is clicked, the server is contacted for an entire new page of content to render. Here's a stereotypical diagram of this interaction.

Contrast this with AJaX applications, which download JavaScript programs to the browser; these programs in turn control both the retrieval of data and its display within the browser.

The data these scripts request come in chunks much smaller than the entire HTML pages of traditional applications. The data are usually (but not necessarily) structured as either the eponymous XML or another format called JSON (JavaScript Object Notation). Regardless of the format, the browser retrieves these smaller chunks of data from the server and updates the user interface using JavaScript, rather than fetching both layout and data and rendering them as a piece, as in a traditional application.
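As an illustration, here is a minimal sketch of that pattern (the endpoint URL, element id, and quote fields are all invented for the example): the browser asks the server for a small chunk of JSON, then JavaScript updates just one part of the page.

```javascript
// Pure helper: turn a parsed JSON payload into an HTML fragment.
// The 'symbol' and 'price' fields are hypothetical.
function renderStockQuote(quote) {
  return '<span class="symbol">' + quote.symbol + '</span>: ' + quote.price;
}

// Browser-only wiring: fetch a small JSON chunk and patch the page.
// The '/quote' URL and 'quote-box' element id are made up.
function refreshQuote(url, elementId) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url, true);  // true = asynchronous
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      var quote = JSON.parse(xhr.responseText);  // small data chunk
      document.getElementById(elementId).innerHTML = renderStockQuote(quote);
    }
  };
  xhr.send(null);
}
```

Only the small JSON payload crosses the network; the layout already lives in the browser.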

This gives a much livelier user interface in theory, but making these same fine-grained requests for information can get you into trouble. Network requests carry a certain amount of overhead, and making many small requests to a server can result in significantly poorer performance than downloading large chunks. In experiments I performed earlier this year, an application that requested lots of small files took over 100 times longer to get the content than one downloading a single file of the same total size. This is due to the overhead of setting up and tearing down the connections. The further the user is from the content, the worse this problem becomes.
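A back-of-the-envelope model shows why. The numbers below are illustrative assumptions, not measurements: every request pays a fixed round-trip cost before any data flows, so that fixed cost dominates when the chunks are small.

```javascript
// Rough model: each request costs one round trip plus transfer time.
// All numbers are illustrative assumptions, not measured values.
function totalMillis(requests, bytesEach, rttMillis, bytesPerMilli) {
  return requests * (rttMillis + bytesEach / bytesPerMilli);
}

// 100 requests of 1 KB each vs. one request of 100 KB,
// assuming an 80 ms round trip and 100 bytes/ms of bandwidth:
var manySmall = totalMillis(100, 1024, 80, 100);  // ≈ 9024 ms
var oneLarge  = totalMillis(1, 102400, 80, 100);  // ≈ 1104 ms
```

Same total bytes, roughly eight times slower, and the gap widens as the round-trip time grows.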

Performance for these types of applications may well be acceptable for intranet applications, where the user is close to the application server, or for companies large enough to afford a content delivery network that brings static content closer to the user. For everyone else, it can be miserable.

The problem of lots of small files is particularly apparent in applications that download lots of rich graphic eye candy, such as rounded borders and drop shadows, which increase the number of small image files that need to be transferred.

Tools such as the Google Web Toolkit combine these images into a single larger file called a CSS sprite, and then use cascading style sheets to cut them apart again on the client side. This overcomes the problem of transferring lots of small images, but can still leave us with the performance challenges of the AJaX application itself frequently requesting small blocks of data.
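The stylesheet side of the sprite technique looks roughly like this (the class names, file name, and pixel offsets are invented for the sketch): every icon shares one combined background image, and the style sheet shifts the background so only the right tile shows through each element.

```css
/* One combined image holds all the icons; offsets are hypothetical. */
.icon {
  background-image: url(sprite.png);
  background-repeat: no-repeat;
  width: 16px;
  height: 16px;
}
.icon-save   { background-position: 0 0; }      /* top-left tile */
.icon-print  { background-position: -16px 0; }  /* second tile */
.icon-delete { background-position: -32px 0; }  /* third tile */
```

One HTTP request now serves every icon on the page.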

Another approach, which can be used in conjunction, is to keep your network connections to the server open using HTTP keep-alives. Unfortunately, Internet Explorer has a broken implementation that causes errors when you try this: it attempts to reuse a connection that has been closed and, rather than retrying, throws up its hands and gives up.

So where does this leave us? If your AJaX application doesn't suffer performance problems from making lots of small requests to the server, count yourself fortunate. If you are seeing these problems, you can try a number of things to attack the problem of images:

– use CSS sprites to make bigger images out of smaller ones, which is rather invasive

– get rid of images where you can

– aggressively cache your images, which will only work for frequent users

– pre-load image files in the background
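The last idea in the list can be sketched in a few lines of JavaScript (the file names are hypothetical). Creating Image objects ahead of time triggers the downloads early, so the files are already in the browser cache when the interface needs them; the optional factory parameter exists only so the function can be exercised outside a browser.

```javascript
// Pre-load images in the background. createImage defaults to the
// browser's Image constructor, but can be replaced with a stub
// where no browser is available.
function preloadImages(urls, createImage) {
  createImage = createImage || function () { return new Image(); };
  var images = [];
  for (var i = 0; i < urls.length; i++) {
    var img = createImage();  // building the object and setting src
    img.src = urls[i];        // starts the download in the background
    images.push(img);
  }
  return images;
}

// In a browser, this would warm the cache for three small images
// (hypothetical file names):
// preloadImages(['corner-tl.png', 'corner-tr.png', 'drop-shadow.png']);
```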

These same techniques can be used, with some modification, to load the scripts themselves if you have lots of those too.

If your problems are with the actual data requests, you arguably have more control over these than over anything else: you can restructure your application to make them more efficient.
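One common restructuring, sketched below with an invented URL scheme, is batching: instead of issuing one request per item, collect the ids and ask the server for them all in a single round trip (the server, of course, has to offer such an endpoint).

```javascript
// Turn N small lookups into one batched request URL.
// The '/quotes' endpoint and 'ids' parameter are hypothetical.
function buildBatchUrl(baseUrl, ids) {
  var encoded = [];
  for (var i = 0; i < ids.length; i++) {
    encoded.push(encodeURIComponent(ids[i]));
  }
  return baseUrl + '?ids=' + encoded.join(',');
}

// One request replaces three separate ones:
var url = buildBatchUrl('/quotes', ['IBM', 'MSFT', 'T&T']);
// → '/quotes?ids=IBM,MSFT,T%26T'
```

You pay the round-trip overhead once instead of once per item, which is exactly the cost the experiments above showed to be dominant.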

A wonderful tool for identifying and fixing these problems is YSlow, from our friends at Yahoo!. It's a plugin that works with the Firebug plugin for Firefox, and it shows you which files come from cache and how long each takes, along with suggestions for making your application faster, based on the experience of the Yahoo! team. YSlow can be found at Yahoo!'s site.

Good luck, and happy coding!

