January 2009

Improving browser performance in Review Board

This past Sunday, I landed a set of changes into Review Board that improve performance through aggressive browser-side caching of media and pages. It’s just a start, but it has already significantly reduced page load times in all of my tests, in some cases by several seconds. We implemented these methods for Review Board, but they can be applied to any Django project out there.

There are several key things that Review Board now does to improve performance:

  • Tells browsers to cache all media for one year.
  • Only sends page data if new data is available.
  • Compresses all media files to reduce transfer time.
  • Parallelizes media downloads.
  • Loads CSS files as early as possible.
  • Loads JavaScript files as late as possible.
  • Progressively loads expensive data.

A lot of the performance improvements come straight from Yahoo!’s Best Practices for Speeding Up Your Site. We’re not doing everything there yet, but we’re working toward it. It’s a great resource, by the way, and I recommend that everyone who has ever made a website go and read it.

So what do the above techniques buy us, and how are we doing them? Let me go into more detail…

Caching all media for a year

The average site has one or more CSS files, JavaScript files, and several images. This translates to a lot of requests to the server, which may leave the site feeling slow. On top of this, a browser only makes a few requests to a server at a time in order to avoid swamping it, which further hinders load times. This happens every time a user visits a page on your site.

Aggressive caching makes a huge difference and can greatly reduce load times for users. Review Board now tells the browser to cache media files for a year. Once users download a JavaScript or CSS file, they won’t have to download it again, meaning that in general the only requests the browser needs to make are page requests and AJAX requests.
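
In Apache, for instance, this kind of far-future expiration can be set up with mod_expires. A minimal sketch, assuming media is served from a /media/ path (lighttpd’s mod_expire can do the same):

<Location "/media/">
    # Tell browsers they may keep anything under /media/ for a year.
    ExpiresActive On
    ExpiresDefault "access plus 1 year"
</Location>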

The big pitfall with long-term caching is that the cached resources can go stale. For example, if a new version of an image was uploaded, the browser wouldn’t even know about it, since it was told it should keep its old version for a year before checking again.

We solve this by introducing “media serials,” timestamps that are appended to all media paths. Instead of caching /js/myscript.js, the browser would cache /js/myscript.js?1273618736.

These media serials are computed on the first page request by our djblets.util.context_processors.media_serial context processor. This quickly scans all media files known to the program to find the latest modification timestamp. It then provides a {{MEDIA_SERIAL}} variable for templates to append to media URLs as part of the query string.
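
For illustration, a minimal version of such a context processor might look like this (a sketch, assuming all media lives under settings.MEDIA_ROOT; the real djblets implementation differs in its details):

import os

from django.conf import settings

_media_serial = None


def media_serial(request):
    # Compute the serial once, on the first request, by finding the
    # newest modification time among all media files. Later requests
    # reuse the cached value.
    global _media_serial

    if _media_serial is None:
        latest = 0

        for root, dirs, files in os.walk(settings.MEDIA_ROOT):
            for filename in files:
                path = os.path.join(root, filename)
                latest = max(latest, int(os.stat(path).st_mtime))

        _media_serial = latest

    return {'MEDIA_SERIAL': _media_serial}

A template can then write out {{MEDIA_URL}}js/myscript.js?{{MEDIA_SERIAL}} to produce the serialized URLs shown above.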

The benefit of this method is that we can cache media files for a year without worrying about users having stale cached resources the next time we upgrade a copy of Review Board. The requested filenames will be different, so browsers will see that the new files aren’t in their cache, request them, and cache them for another year.

Only send page data if new data is available

Aggressive caching of media files is great and saves a lot of time, but it doesn’t help for dynamically generated content. For this, we need a new strategy.

When a browser makes a request, it can send an If-Modified-Since header to the server containing the Last-Modified value it received the last time it downloaded that page. This is a very valuable header, and there are some things we can do with it to save both the server and the browser a lot of trouble.

If the browser sends If-Modified-Since, and we know that no new data has been generated since the timestamp provided, we can send an HttpResponseNotModified (HTTP response code 304). This tells the browser it already has the newest version of the page. The sooner we do this, the better, as it means we don’t waste time rendering templates or performing expensive database queries.

Djblets, once again, provides some functions to help out here: djblets.util.http.set_last_modified and djblets.util.http.get_modified_since.

The general usage pattern is that we first build a timestamp representing the latest version of the page. This could be the timestamp for a particular object that the page represents. We then check if we can bail early by calling:

if get_modified_since(request, timestamp):
    return HttpResponseNotModified()

Further down, after building the page, we must set the Last-Modified header, using the same timestamp as above, like so:

set_last_modified(response, timestamp)
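
Putting it all together, a view using this pattern might look something like the following sketch (the model, field, and template names here are hypothetical, not Review Board’s actual code):

from django.http import HttpResponseNotModified
from django.shortcuts import get_object_or_404, render_to_response
from djblets.util.http import get_modified_since, set_last_modified

from reviews.models import ReviewRequest  # hypothetical model


def review_request_detail(request, review_request_id):
    review_request = get_object_or_404(ReviewRequest,
                                       pk=review_request_id)
    timestamp = review_request.last_updated  # hypothetical field

    # Bail out early with a 304 before rendering templates or
    # running any expensive queries.
    if get_modified_since(request, timestamp):
        return HttpResponseNotModified()

    response = render_to_response('reviews/review_detail.html', {
        'review_request': review_request,
    })

    # Stamp the response so the browser can send If-Modified-Since
    # on its next request.
    set_last_modified(response, timestamp)
    return response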

We’re using this in only a few places right now, such as the review request details page, but it drastically improves load times. If the review request hasn’t changed and nobody’s commented on it since the browser last loaded the page, a reload of the page will be almost instant.

Compress all media files

Our Apache and lighttpd config files now enable compression by default. By compressing these files, we can turn a relatively large JavaScript file (such as the jquery and jquery-ui files) into a very small file before sending it over to the browser. This reduces transfer times at the expense of compression/decompression time (which is small enough not to worry about for deployments of this size, and which can be offset by caching compressed files server-side).
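
For Apache, the relevant module is mod_deflate. A minimal sketch (the list of types to compress is a deployment choice, and lighttpd’s mod_compress is the equivalent):

<IfModule mod_deflate.c>
    # Compress pages, stylesheets, and scripts on the way out.
    AddOutputFilterByType DEFLATE text/html text/css application/x-javascript
</IfModule>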

Parallelize media downloads

It’s important not to mix loads of media files of different types. The browser parallelizes media downloads of the same type, in page load order, but if you load one CSS file, one JavaScript file, another CSS file, and then another JavaScript file, the browser will only attempt one load at a time. If you load all of the CSS files before all of the JavaScript files, it will parallelize the CSS downloads and then the JavaScript downloads. By enforcing this separation of loads, we can achieve faster page download and render times.

Load CSS files as soon as possible

Loading CSS files before the browser starts to display the page can make the page appear to load more smoothly. The browser will already know how things should look and will lay the page out accordingly, instead of laying the page out first and then redoing the layout once the CSS files have loaded.

Load JavaScript files as late as possible

JavaScript loads block the browser, as the browser must parse and interpret the JavaScript before it can continue. Sometimes it’s necessary to load a JavaScript file early, but in many cases the files can be loaded late. When possible, we load JavaScript files at the very end of the document body so that they won’t even begin downloading until the page has rendered. This provides a noticeable performance improvement on script-heavy pages.

Progressively load expensive data

There are types of data that are just too expensive to load along with the rest of the page. For a long time, Review Board would parse and render fragments of a diff for display in the review request page, but that meant that before the page could load, Review Board would need to do the following:

  1. Query the list of all comments.
  2. Fetch every file commented on.
  3. Apply the stored patch to each file.
  4. Diff between the original and patched files.
  5. Render the portion of the diff commented on into the page.

This became very time-consuming, and if a server was down, the page wasn’t available until everything timed out. The solution to this was to lazily load each of these diff fragments in order.

We now display a placeholder table for each diff fragment at roughly the same size as the rendered fragment (to avoid excessive page scrolling during loads). The table contains a spinner showing that something is happening, and, one by one (to avoid dogpiling the server), we load each diff fragment.

The code to render the diff fragments, by the way, takes advantage of the If-Modified-Since header and is also cached for a year. We use an AJAX_SERIAL (same principle as the MEDIA_SERIAL above) to allow for changes in new deployments.
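
Because the AJAX_SERIAL is part of the fragment URL, the fragment responses can safely carry far-future caching headers too. A sketch of what that looks like on the view side (the view and helper names are hypothetical; cache_control is Django’s standard decorator):

from django.http import HttpResponse
from django.views.decorators.cache import cache_control


@cache_control(max_age=31536000)  # one year, in seconds
def comment_diff_fragment(request, review_request_id, comment_id):
    # Rendering is omitted; render_fragment() is a placeholder for
    # the diff-rendering code described above.
    return HttpResponse(render_fragment(comment_id))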

With these caching mechanisms in place, the review request page now loads in roughly a second in many cases (or less once cached), with diff fragments coming in lazily (and then almost immediately on future loads).

More to come…

This was a great first step, but there’s more we can do. Before we hit our 1.0 release, we’re going to batch together all of our CSS and JavaScript files into a couple of combined files and then “minify” them (compressing them in a way that allows for smaller files and faster load times of the interpreted data).

Again, these are techniques we’re now making use of in Review Board, but they’re not in any way specific to Review Board. Anyone out there developing websites or web applications should seriously look into ways to improve performance. I hope this was a good starting point, but seriously, do read Yahoo!’s article as well.

Infect your application with Parasite!


Ever find yourself stuck debugging an application because the UI is just doing something weird that you can’t track down? Maybe a widget isn’t appearing correctly, or you just need more information about the overall structure and logging statements aren’t doing you much good. Debugging complex UIs can be a pain.

We’ve had some real challenges at VMware, due to the complexity of our applications. It was enough to drive me mad one day, so rather than write more logging statements, I wrote Parasite.

Parasite is a debugging tool that David Trowbridge and I have been working on to give developers an interactive view of their entire application’s UI. It provides a number of really useful features, including:

  • See the entire widget hierarchy of your UI.
  • Watch properties update live.
  • Modify existing properties on a widget.
  • View all registered GtkActions.
  • Toggle GTK+’s debugging of graphic updates.
  • Inject custom code while the application is running.

Yes, you can inject new code into an application. With Python. Parasite runs in-process as a GTK+ module, so it has access to some internals of your application. We provide a Python shell equipped with PyGTK support for creating and modifying your UI on-the-fly, regardless of the language it was written in. Handy when you want to test out new concepts for a UI without writing new C code.

David has a nice screencast available showing some of what Parasite can do.

For more information on Parasite, including screenshots, a mailing list, and where to get the source code, see the Parasite homepage.

Review Board 1.0 alpha 1 released

Roughly two years ago, David Trowbridge and I began development of Review Board for use in our open source projects and our team at VMware. Since then, we’ve turned Review Board into a powerful code review tool that works with a variety of version control systems. Most of VMware has moved over to it, as have an estimated 50-100 companies worldwide. We’ve had over 100 contributors to the project, volunteers providing support on the mailing list, and third-party tools developed for integrating with Review Board.

After all this time in development, with this many people contributing, we decided it was probably time to get a release out there. Sure, we could have done this long ago, but there were a number of large things we were hoping to get in first (a recently-committed UI rewrite, for instance). Now that we have most of the major features we want for our 1.0 release, we decided it was time for an alpha.

Over the coming months, we’ll be working on stabilizing the codebase, fixing a few large remaining usability quirks, enhancing performance, and writing some proper documentation (which is coming along nicely).

We’re eager to get a quality product out there and to begin development on the next release. There are a lot of neat things planned:

  • Support for writing extensions to Review Board.
  • A fully-featured API covering every operation you’ll need to perform.
  • Some degree of policy support (specifying which users/groups can see which parts of a repository, for instance).
  • Reviews with statuses other than “Ship It”. This will probably be customizable to some degree.
  • Possibly some theme customization to allow Review Board to blend in better with corporate sites, Trac installs, etc.

Along with this, I plan to roll out a new website for the project that will have a browseable list of third party extensions, apps, Greasemonkey scripts, and more.

More information is available in our release announcement.
