It's been about 10 years since I tested the first batch of 802.11 wireless products for Network Computing. It was very cool technology, to be sure, and it had obvious long-term potential. Unfortunately, it was slow (throughput of just over 1 Mbps); it was expensive (PC-Card wireless NICs cost several hundred dollars each); and interoperability pretty much sucked. Part of the interoperability problem stemmed from compromises in the standards-development process, which let vendors implement either of two different spread-spectrum radio technologies (frequency hopping and direct sequence) and still call the result 802.11. A lot has happened with Wi-Fi since then, almost all of it good.
As a grizzled veteran of the broadband wireless industry, I thought a retrospective piece would be a fitting first blog posting on this site. If you've looked at my bio, you'll know that I've divided my time over the past 10 years between serving as a faculty member and Assistant Dean at Syracuse University's School of Information Studies and working as an editor with Network Computing. Thinking back over that work, I've compiled the following list of lessons learned. I hope it stimulates some feedback about where we've been and where we're going. For this posting, I'll focus on Wi-Fi; in future postings, I'll address other sectors of the wireless industry.
1. If a vendor promises seamless interoperability, show them the door. In the early days of 802.11, interoperability was hit or miss at best. The emergence of the Wi-Fi Alliance's product certification program was a big step in the right direction, but getting Wi-Fi products from multiple vendors to work together can still be a headache, especially when new releases break things that used to work. There are still plenty of seams to overcome; a great example is Wi-Fi support in version 3.0 of Apple's iPhone operating system. Never think of interoperability in terms of black and white. It's always grey.
2. Never trust the .0 release. Over the years, I've learned a lot about the sausage-making process employed by technology vendors, and it isn't always pretty. Vendors are under immense pressure to get products out the door: to beat competitors to the punch, to meet artificial marketing deadlines, and to win orders from customers who have postponed purchases while waiting for the next release. Unfortunately, these .0 releases are almost always glorified betas that cause problems for IT professionals. And waiting for a stable release isn't much of a strategy either: it keeps you from taking advantage of new capabilities, and it can prove an unfulfilled dream when the maintenance release introduces a different set of problems.
3. Wireless performance is a quantitative quagmire. In my earlier days of managing Ethernet networks, benchmarking performance and provisioning capacity were pretty easy: tests were easily repeatable, and results in the lab correlated highly with experiences in the field. With Wi-Fi, the variables affecting performance are impossible to model accurately, despite the extremely sophisticated modeling tools a number of vendors have developed. Worse, the tried-and-true Ethernet remedy of over-provisioning (throwing bandwidth at the problem) isn't always practical with Wi-Fi. It's true that 802.11n is moving us in that direction, with improved raw performance and more consistent coverage, but the transition to 11n will take a few years as older products age out. In the meantime, most shops will need to trade performance for compatibility with legacy clients.
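To make the repeatability point concrete, here's a minimal sketch of the kind of measurement discipline Wi-Fi demands: run the same throughput test many times and report the spread, not just the average. Everything here (host, port, payload size, trial count) is a placeholder of my own choosing, and the traffic sink runs locally only to keep the sketch self-contained; in a real test you'd run the sink on a wired host so the traffic actually crosses the wireless hop.

```python
# Throughput-variability probe: push a fixed TCP payload several times
# and report the mean and standard deviation of the measured rates.
import socket
import statistics
import threading
import time

PAYLOAD = b"x" * (1 << 20)        # 1 MiB per trial (placeholder size)
TRIALS = 10                       # placeholder trial count
HOST, PORT = "127.0.0.1", 50007   # in practice: a wired sink across the WLAN

def sink():
    """Accept connections and drain whatever the sender pushes."""
    with socket.socket() as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen()
        while True:
            conn, _ = srv.accept()
            with conn:
                while conn.recv(65536):   # drain until the sender shuts down
                    pass

threading.Thread(target=sink, daemon=True).start()
time.sleep(0.2)                   # give the sink a moment to start listening

rates = []
for _ in range(TRIALS):
    with socket.create_connection((HOST, PORT)) as s:
        start = time.perf_counter()
        s.sendall(PAYLOAD)
        s.shutdown(socket.SHUT_WR)        # signal end-of-stream to the sink
        while s.recv(4096):               # block until the sink drains it all
            pass
        elapsed = time.perf_counter() - start
    rates.append(len(PAYLOAD) * 8 / elapsed / 1e6)   # convert to Mbps

print(f"mean {statistics.mean(rates):.1f} Mbps, "
      f"stdev {statistics.stdev(rates):.1f} Mbps over {TRIALS} trials")
```

On a quiet wired segment the standard deviation is usually negligible; over Wi-Fi, interference, rate adaptation, and client position make it anything but, which is exactly why single-number wireless benchmarks mislead.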