There are no exceptions to this rule. Even shops that run the most sophisticated capacity-planning and performance-monitoring software must do extensive testing to determine their scalability needs and challenges. In fact, testing goes hand in hand with capacity planning (for more on capacity planning see the Playbook in this issue). Let's say your capacity-planning model indicates that, due to projected load increases, you'll need to move from a two-way server to a four-way server in about six months. You'd want to run the projected load through your existing hardware to see how it behaves, then run it through the new hardware to be sure the four-way will be sufficient. Testing lets you verify the accuracy of your models sooner rather than later.
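That verification step can be sketched in code. The snippet below is a minimal, hypothetical model — the request rates, server capacities and 25 percent headroom target are all invented for illustration — of checking whether a projected load fits within a box's measured capacity before and after the upgrade:

```python
import random

def simulate_load(capacity_rps, offered_rps, duration_s=60, seed=42):
    """Crude capacity check: feed a projected request rate at a server
    with a known capacity and report utilization plus a pass/fail flag.
    A real test would replay captured production traffic instead."""
    random.seed(seed)
    offered = 0
    for _ in range(duration_s):
        # Noisy arrivals averaging the projected rate (binomial approximation).
        offered += sum(1 for _ in range(offered_rps * 2)
                       if random.random() < 0.5)
    utilization = offered / (capacity_rps * duration_s)
    return utilization, utilization < 0.75  # insist on 25% headroom

# Hypothetical numbers: load is projected to double to 800 req/s; the
# two-way box handles ~600 req/s, the four-way roughly twice that.
util_2way, ok_2way = simulate_load(capacity_rps=600, offered_rps=800)
util_4way, ok_4way = simulate_load(capacity_rps=1200, offered_rps=800)
```

Run against these assumed figures, the two-way server fails the headroom check while the four-way passes — exactly the kind of early confirmation (or refutation) of the capacity model that testing provides.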
We Never Said It Would Be Easy
Testing takes time and money. It also takes expertise. You need the right people with the right training to float this boat. But finding qualified staff and developing appropriate test methodologies are among the biggest challenges faced by IT managers aiming to build efficient in-house test labs, according to our recent poll of 889 readers (see poll results, right). You can't test the performance of an enterprise messaging server just by measuring SMTP throughput, for instance--that wouldn't give you the complete picture.
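To see why raw SMTP throughput alone misleads, consider a toy benchmark harness. The numbers and the `fake_smtp_send` stub below are invented for illustration — a real harness would drive `smtplib` against the server under test — but the pattern holds: a server can post a healthy average throughput while its tail latency is unacceptable.

```python
import statistics

def fake_smtp_send(msg_id):
    """Stand-in for one SMTP transaction (hypothetical timings).
    Every 50th send stalls, modeling queueing inside the server."""
    return 0.02 if msg_id % 50 else 1.5  # latency in seconds

def run_benchmark(n_messages=500):
    latencies = [fake_smtp_send(i) for i in range(n_messages)]
    return {
        "throughput_mps": n_messages / sum(latencies),  # messages/sec
        "mean_ms": statistics.mean(latencies) * 1000,
        "p99_ms": sorted(latencies)[int(n_messages * 0.99)] * 1000,
    }

stats = run_benchmark()
```

With these assumed timings the harness reports roughly 20 messages per second and a mean latency near 50 ms — numbers that look fine in a throughput report — yet the 99th-percentile latency is 1.5 seconds. Only a test plan that also measures latency distribution, queue depth and behavior under sustained load gives the complete picture.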
Other major obstacles, according to our survey: securing ongoing funding, maintaining management support, demonstrating ROI (return on investment) and justifying TCO (total cost of ownership).
The size and scope of in-house test programs vary significantly from company to company. Only 27 percent of the reader organizations we polled insist on formal testing with established protocols before any product can be put into a live production system, though another 58 percent require formal testing of some products. And while 35 percent of respondents with in-house test labs say they have official testing facilities, 65 percent say they do ad hoc testing, typically by systems administrators or developers familiar with the systems under test.