Those rosy numbers would seem to portend a revolution, so we set out to determine just where the market is headed, hoping to separate reality from hype. What we found is that utility computing holds great promise and in fact represents the maturing of IT. Just as sales, manufacturing, R&D and other corporate disciplines have grown up, now it's IT's turn. It's not a matter of whether IT will change, but when.
Evolution vs. The Big Bang
Before we drill down into definitions, viewpoints and timetables, it's worth noting that we're skeptical that a revolution is coming, and judging from our reader poll, so are you. Vendors have heralded more than one overnight transformation that never came to fruition. Why? Because Darwin had the right idea.
Our industry, like most, evolves, and with good reason: You, the architects and ultimate buyers of information technology, won't let things move at a breakneck pace. It's far too risky. With the possible exception of the dot-com revolution--an experience we'd rather not repeat--IT decision-makers have dictated that real, meaningful, business-affecting change happens over the span of a decade, more or less. First came the PC revolution, from the mid-'80s to the mid-'90s; then the networking revolution of the early '90s; then the client-server revolution of the past decade; and most recently, the Internet revolution. All were important turning points for IT, but each took the better part of a decade to move from first implementations to widespread business reliance.
It was this evolution of information technology that set the stage for utility computing. Until three years ago, the executive suite's conviction that IT spending was critical to business success bred a sometimes reckless disregard for actual cost--so much so that IT still accounts for more than half of the typical corporate capital budget, not to mention a sizable human-resource cost.