By choosing arbitrary numbers, the vendor can average all the vague data and come up with a number such as 23. Varying the values used to compute the average changes the result, so the figure can be manipulated to present exactly the picture the vendor wants. The most accurate way to generate such a statistic would be to let survey takers specify the answer values themselves, rather than have them choose from a list.
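To see how this works, consider a sketch in Python. The response counts and both value assignments below are hypothetical; the point is that the same survey answers yield whatever average the chosen values produce.

```python
# Hypothetical survey responses: how many respondents picked each vague answer.
responses = {"rarely": 40, "sometimes": 35, "often": 25}

def weighted_average(value_map, counts):
    """Average the numeric values a vendor assigns to each answer choice."""
    total = sum(counts.values())
    return sum(value_map[answer] * n for answer, n in counts.items()) / total

# Same responses, two equally arbitrary value assignments.
conservative = weighted_average({"rarely": 5, "sometimes": 20, "often": 50}, responses)
aggressive = weighted_average({"rarely": 10, "sometimes": 30, "often": 60}, responses)

print(round(conservative, 1))  # 21.5
print(round(aggressive, 1))    # 29.5
```

Nothing about the respondents changed between the two runs; only the numbers the vendor quietly assigned to "rarely," "sometimes" and "often" did.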
"Hardware-accelerated SSL provides 32,000 transactions per second."
Throughput refers to how much data a product can process in a specific time period. The more throughput the better. But vendors often intentionally provide total throughput without referencing the size of the data used to benchmark the product. Since each product set has its own peculiarities, you must do your homework to determine whether the throughput numbers presented represent real-world traffic or sculpted patterns designed to produce performance metrics suitable for marketing material.
In the case of the IT Pro's hardware-accelerated SSL, the number of transactions per second quoted is likely to be the number of RSA operations per second that the product can execute. The number of these encryption and authentication operations per second is not directly translatable into transactions per second, as each transaction involves multiple cryptographic operations. Also curiously missing from the performance claims is the size of data being encrypted. Vendors often use 1-KB files in their performance tests to show a higher number of operations per second because bulk encryption rates decrease as the size of data being encrypted increases.
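The gap between the two metrics is easy to quantify. The figure of four cryptographic operations per transaction below is purely an assumption for illustration; real SSL transactions vary, but any multiple greater than one shrinks the headline number.

```python
rsa_ops_per_second = 32_000  # the headline figure from the marketing claim

# Assumed, for illustration only: each SSL transaction consumes several
# cryptographic operations (handshake plus bulk encryption and authentication).
crypto_ops_per_transaction = 4

effective_tps = rsa_ops_per_second / crypto_ops_per_transaction
print(effective_tps)  # 8000.0
```

Under that assumption, the "32,000 transactions per second" device actually completes 8,000 transactions per second, a quarter of the advertised figure.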
Firewall performance also varies based on data size. Using large packets and small numbers of sessions produces better performance numbers than using small packets and many sessions, which more closely resembles real-world traffic. Rather than show the devices' true capacity, therefore, vendors run the tests using the less taxing setup--large packets and few sessions--to make the products appear to perform better.
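The arithmetic behind this trick is simple. A device that can forward a fixed number of packets per second looks far faster in bits per second when fed large packets. The packets-per-second capacity below is a made-up figure for illustration:

```python
pps_capacity = 100_000  # assumed packets-per-second limit of the device

# Throughput in Mbit/s at the same packet rate, for three packet sizes.
for packet_bytes in (64, 512, 1500):
    mbps = pps_capacity * packet_bytes * 8 / 1_000_000
    print(f"{packet_bytes:>5}-byte packets: {mbps:7.1f} Mbps")
```

At the same forwarding rate, 1,500-byte packets yield 1,200 Mbps while 64-byte packets (closer to real-world small-packet traffic) yield only 51.2 Mbps, so the choice of test traffic alone swings the quoted throughput by more than a factor of 20.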
Also keep in mind that the number of rules configured on devices can dramatically affect their performance. Firewalls, IDSs and other packet-inspection products must inspect and direct packets based on a configured set of policies. The more rules or signatures (for IDSs and antivirus gateways) that a single packet must be compared against, the more work the device must do, which in turn degrades performance. Vendors will generally run performance tests with the fewest rules or policies required to show the best possible results.
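A toy model shows why rule count matters. Many packet filters evaluate rules in order until one matches, so the work per packet grows with the size of the policy. The port-based rules below are hypothetical:

```python
def comparisons_to_match(packet_port, rules):
    """Count rule checks until the first match, as in a first-match-wins firewall."""
    for i, (port, action) in enumerate(rules, start=1):
        if port == packet_port:
            return i
    return len(rules)  # no match: every rule was still checked

# A 10-rule policy vs. a 1,000-rule policy; the matching rule sits last in each.
small_policy = [(p, "allow") for p in range(10)]
large_policy = [(p, "allow") for p in range(1000)]

print(comparisons_to_match(9, small_policy))    # 10
print(comparisons_to_match(999, large_policy))  # 1000
```

A packet hitting the last rule of the large policy costs 100 times the comparisons of the same packet under the small policy, which is why a vendor testing with a minimal rule set will report much better numbers than you will see in production.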