Why Static Content is Just So Much Static

If you’re not familiar with the term, “static content” refers to any content that doesn’t change (images, SWF files, media files, CSS files, script files, etc.).  The average web page contains dozens of links to static content, while it is the HTML itself that typically carries the data the user is actually interested in.  This has an enormous impact on load testing, for reasons this blog post will address.

As you probably know, static content is usually cached on your local machine the first time you see a page and stays there until it changes.  This is why a site takes longer to load the first time you visit than on subsequent visits, and it is a huge design consideration for serving up complex pages quickly to returning visitors.  Furthermore, since the advent of CDNs (content delivery networks) like Akamai, static content gets delivered to users from a server close to them, reducing even further the time it takes to arrive at your desktop.
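As a rough illustration, here is a minimal Python sketch (assuming the third-party requests library and a placeholder URL) that fetches a static asset and prints the cache-related headers the server sends back; those headers are what tell browsers and CDNs how long they may keep a local copy.

    # Sketch: inspect the cache headers a server sends for a static asset.
    # The URL is a placeholder; substitute one of your own static files.
    import requests

    response = requests.get("https://example.com/static/logo.png")
    for header in ("Cache-Control", "Expires", "ETag", "Last-Modified"):
        print(f"{header}: {response.headers.get(header, '(not set)')}")

If Cache-Control or Expires allows a long lifetime, repeat visitors (and CDN edge nodes) will rarely ask your server for that file again.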

But even if you don’t use a CDN and host the content on your own servers, the data is cached in the server’s memory the first time it is accessed, so future accesses require almost no processing overhead on the server’s part – filling up the pipe is all it takes, and that is minimal in terms of CPU time.

So the question is, what are the pros and what are the cons of including static content in a load test?  I think you’ll see that the list of cons is fairly long and the list of pros is…well, almost non-existent!  (Unless you want to test your bandwidth, in which case you can simply pick the largest content file you have and hit it with as many users as you can; a rough sketch of that follows.)
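If bandwidth really is what you want to measure, a crude sketch along those lines might look like this.  It assumes the Python requests library; the URL and user count are placeholders you would replace with your own.

    # Sketch: a crude bandwidth check -- many concurrent "users" all
    # fetching the single largest static file you serve.
    import time
    from concurrent.futures import ThreadPoolExecutor

    import requests

    LARGEST_FILE_URL = "https://example.com/static/largest-asset.zip"  # placeholder
    CONCURRENT_USERS = 50

    def fetch(url: str) -> int:
        """Download the file and return the number of bytes received."""
        return len(requests.get(url).content)

    start = time.time()
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        sizes = list(pool.map(fetch, [LARGEST_FILE_URL] * CONCURRENT_USERS))
    elapsed = time.time() - start

    total_mb = sum(sizes) / (1024 * 1024)
    print(f"Transferred {total_mb:.1f} MB in {elapsed:.1f}s "
          f"(~{total_mb / elapsed:.1f} MB/s aggregate)")

Note that this tells you about your pipe, not about your application code or database.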

Besides filling up the test and its results with dozens of URLs that make them difficult to read and interpret, the test is not very realistic unless you assume that all your users are first-time visitors.  Even then, it’s the speed of the real end user’s connection that will really determine how fast the page loads, and if you have a way to control that then you should patent it at once!  Seriously, though, the amount of static content and its size have always been design considerations for web developers, and the trade-offs are well known.

Most importantly, however, it’s the dynamic content that most determines the performance of a web application, and that is where the rubber meets the road.  You have no control over static content other than eliminating it, reducing its size, or increasing your network bandwidth.  You do have control over dynamic content, however, and that’s where most bottlenecks occur (most often in database operations).

So the art of web site performance improvement often boils down to database performance, load balancing, and application code design.  For that, it’s best to eliminate all static content from the test and focus entirely on what is making the real demands on your server.  Otherwise you may end up with a load test that requires many times the number of load agents it would otherwise need, which adds cost that would be better spent on improving performance on the back end!
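As a small illustration of stripping the static noise out of a test, here is a Python sketch that filters a recorded URL list down to just the dynamic requests.  The extension list and example URLs are purely illustrative, not taken from any particular load-testing tool.

    # Sketch: drop obvious static-asset URLs from a recorded request list
    # so the load test exercises only the dynamic endpoints.
    STATIC_EXTENSIONS = (".css", ".js", ".png", ".jpg", ".gif", ".swf", ".ico", ".woff")

    def is_dynamic(url: str) -> bool:
        """Keep only requests that are not obviously static assets."""
        path = url.split("?", 1)[0].lower()
        return not path.endswith(STATIC_EXTENSIONS)

    recorded_urls = [
        "https://example.com/account/login",      # dynamic: hits the app and database
        "https://example.com/static/site.css",    # static: cached, skip it
        "https://example.com/orders?page=2",      # dynamic
        "https://example.com/images/banner.png",  # static
    ]

    dynamic_only = [u for u in recorded_urls if is_dynamic(u)]
    print(dynamic_only)

The resulting test is smaller, cheaper to run, and far easier to interpret, because every URL left in it is one your application code and database actually have to work for.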
