Today’s agile development methodologies are based on the notion that quality has to be built in from the beginning, not hoped for at the end. The same applies to performance, which is just as important as functionality – after all, who cares if the ordering system works if it’s too slow for anyone to tolerate?
One of the basic tenets of agile methodology is test-driven development, starting with unit tests that are written to fail until the code they exercise is proven to work. That is exactly the time to start measuring and managing performance, so why do most companies wait until the last minute and hope for the best?
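To make that concrete, here is a minimal sketch of what a performance-aware unit test might look like, assuming a Python shop: the place_order function and the half-second budget are hypothetical stand-ins, but the point is simply that the same test that proves the code works can also fail when the code gets too slow.

```python
import time
import unittest


def place_order(customer_id, items):
    """Hypothetical stand-in for the real ordering-system call under test."""
    time.sleep(0.05)  # simulate some work


class PlaceOrderPerformanceTest(unittest.TestCase):
    MAX_SECONDS = 0.5  # assumed budget; set this from your own requirements

    def test_place_order_meets_budget(self):
        start = time.perf_counter()
        place_order(customer_id=42, items=["sku-1001"])
        elapsed = time.perf_counter() - start
        # Fails the build the moment the operation blows its time budget,
        # just as it would if the operation returned the wrong answer.
        self.assertLess(
            elapsed, self.MAX_SECONDS,
            f"place_order took {elapsed:.3f}s; budget is {self.MAX_SECONDS}s")


if __name__ == "__main__":
    unittest.main()
```

A single-call timing like this is obviously not a load test, but it does get a performance expectation into the build from day one, which is the whole argument.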
One reason is that measuring performance requires subjecting the application to a level of load that at least approximates, if not duplicates, the production environment. Even with all the tools available to do this, it is a costly and time-consuming effort even when the tool itself is free. If I’m a developer working on a deadline, what’s in it for me to spend hours, if not days, creating and running load-testing scripts when I know that in the end the issue of performance is out of my hands? If I’m especially diligent I might do some code and database profiling to soothe my own conscience, but I’m not likely to waste much time with a load-testing tool unless I’m forced to. Are you?
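For what it’s worth, a throwaway load script doesn’t have to be an enormous investment either. The sketch below uses nothing but the Python standard library; the URL, worker count, and request count are assumptions, and it is no substitute for a proper load-testing tool driving production-like data and pacing, but it is enough to start putting numbers on a build.

```python
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "http://localhost:8080/orders"   # hypothetical endpoint under test
WORKERS = 20                           # concurrent "virtual users"
REQUESTS = 200                         # total requests to send


def hit(_):
    """Issue one request and return how long it took, in seconds."""
    start = time.perf_counter()
    with urlopen(URL, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start


if __name__ == "__main__":
    # Fire the requests from a thread pool and collect per-request timings.
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        timings = sorted(pool.map(hit, range(REQUESTS)))
    print(f"median {timings[len(timings) // 2]:.3f}s, "
          f"95th pct {timings[int(len(timings) * 0.95)]:.3f}s")
```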