Independent testing of vendor products is a rare and valuable thing. That's certainly true in the wireless networking market, as WLAN makers are pushing out ever more access points and associated hardware, and wireless architectures are becoming staggeringly complex. How do potential customers know what product is right for them, or if their current vendor is delivering the goods as advertised?
Independent test results can help--as long as they are kept in proper context. One of the more interesting independent tests comes from Keith Parsons, of Wireless LAN Professionals.
Parsons' "Wi-Fi Stress Test" is touted as a repeatable, vendor-independent access point analysis. The goal of the test was simple: pit an increasing number of Apple iPads against a single AP until the AP crumbled, and measure the same data points along the way for each unit under test.
APs from Aerohive, Aruba, Cisco, HP, Juniper, Linksys, Meraki, Ruckus, Ubiquiti and Xirrus were provided by each manufacturer, and the rest of the test environment was composed of client devices and test gear owned by Wireless LAN Professionals.
Parsons' testing team included representatives from seven WLAN vendors, plus two dozen vendor-unaffiliated volunteers eager to learn from some of the best in the industry. Two weeks ago, I spent time with Parsons and fellow industry experts and analysts at Wireless Field Day, and the Stress Test got a lot of attention.
Rather than regurgitate the results, I want to share my impressions of what I liked about Parsons' approach, and what I'm not so keen on.
The Ups and the Downs
If you read Parsons' report, you'll see he goes to great pains to explain the parameters of his test, as well as its limitations. In other words, he's not making this test out to be anything more than what it is.
His goal was to see which APs could stand up to the maximum load, measured in iPads pushing known traffic. That's it. He wasn't crowning anyone with the title of The Best Wireless System. He was clear about his methods, he had a great team of testers, and he kept an eagle eye on the participating vendors: no configuration tweaking or performance optimizing was tolerated from vendors with an active hand in the testing. In the end, Ruckus did well in this exercise, with Cisco close behind.
What didn't I care for in this analysis? Though Parsons made it crystal clear that he doesn't consider this a "real-world" test of WLAN products, I was puzzled that all testing used 20 MHz-wide channels in the 5 GHz band of 802.11n. I don't know of a single network configured for anything other than 40 MHz channels, as wider channels were one of the prime drivers for migrating to 11n from legacy 11a/g technology. I also feel conflicted about an iPad-only testbed; in my own network experience, Apple products can be maddeningly inconsistent depending on their OS version combined with the specific code running on the infrastructure APs and controllers.
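To put the channel-width point in rough perspective, here's a minimal sketch using a couple of values from the published 802.11n PHY rate tables (long guard interval) to show how much raw data rate a client gives up at 20 MHz versus 40 MHz. The numbers come from the standard's MCS tables; the snippet is purely illustrative and has nothing to do with Parsons' methodology.

```python
# Illustrative only: standard 802.11n PHY rates (Mbps, long guard interval)
# for two common MCS indexes, showing roughly 2x the raw rate at 40 MHz.
PHY_RATES = {
    # mcs: (20 MHz rate, 40 MHz rate)
    7:  (65.0, 135.0),    # 1 spatial stream, 64-QAM 5/6
    15: (130.0, 270.0),   # 2 spatial streams, 64-QAM 5/6
}

for mcs, (rate20, rate40) in PHY_RATES.items():
    gain = rate40 / rate20
    print(f"MCS {mcs}: 20 MHz = {rate20} Mbps, 40 MHz = {rate40} Mbps "
          f"({gain:.2f}x from channel bonding)")
```

Real-world throughput is lower than these PHY rates, of course, but the relative gap is why most 5 GHz 11n deployments I see run 40 MHz channels.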
Parsons also points out that his analysis excluded many other critical functions of a modern business WLAN: feature sets, management interfaces, location services and all of the other pieces that add up to TCO. Put another way, even though this stress test was not billed as real world (and yes, everyone's version of that notion varies), it was a bit too far from real world for my liking. I learned which APs best stand up to huge volumes of traffic as delivered under the test's specific parameters, but that is arguably a relatively minor data point in the bigger story of WLAN ownership.
At the same time, I truly appreciate Parsons' efforts to undertake this task and to gather the resulting data. I look forward to seeing how Parsons builds on it in future tests, and I wouldn't mind participating in the next round.