Is that gas I smell?

At the Oracle at Delphi, a priestess would go deep into the cave, inhale noxious fumes seeping up from a natural gas deposit below until she had Xes where her eyes used to be, and then prophesy cryptically. She'd say stuff like, "VisiCalc will live forever!" and "No one will ever lose their jobs if they buy IBM," and lots of people would believe her.

Nick Carr, a modern-day oracle, wrote a provocative article titled "IT Doesn't Matter" that appeared in the Harvard Business Review in 2003, and it freaked out a lot of computer industry people who thought they were going to lose their jobs. In that article, Carr argued that IT has been following a pattern similar to that of earlier technologies such as railroads and electric power.

Early adopters of new technologies could jump ahead of their rivals, but eventually everyone would acquire the same technologies, thereby neutralizing their competitive advantage, he wrote. "From a strategic standpoint, they become invisible; they no longer matter," Carr said. The article had the desired effect: People argued a lot, and Carr went on to write a book and give talks for good money.

Carr was (or is, depending on your time frame) wrong. Competitive advantage has more to do with how you use technology than with being the first to get it. Compare UPS and FedEx, for example, and you'll see what I mean.

Carr's at it again, this time with an article titled "The End of Corporate Computing" in the Spring 2005 issue of the MIT Sloan Management Review. On his Web site, Carr explains that the title has a dual meaning: Computing utilities will bring the traditional model of corporate computing to an end, and, put another way, utility computing is the end, the final destination, toward which IT is headed.

Carr reminds me of an editor I once worked for: a smart guy who used to say really dumb things all the time. Here's how he arrives at his prediction that utility computing is where IT's at (or, more accurately, where it will be):

"Three technological advances are enabling this change: virtualization, grid computing and Web services. Virtualization erases the differences between proprietary computing platforms, enabling applications designed to run on one operating system to be deployed elsewhere. Grid computing allows large numbers of hardware components, such as servers or disk drives, to effectively act as a single device, pooling their capacity and allocating it automatically to different jobs. Web services standardize the interfaces between applications, turning them into modules that can be assembled and disassembled easily."

From those three advances will spring forth a new IT industry with three components: IT utilities, IT suppliers and network operators, Carr says.

"IT's shift from an in-house capital asset to a centralized utility service will overturn strategic and operating assumptions, alter industrial economics, upset markets and pose daunting challenges to every user and vendor," Carr asserts. "The history of the commercial application of IT has been characterized by astounding leaps, but nothing that has come before-not even the introduction of the personal computer or the opening of the Internet-will match the upheaval that lies just over the horizon."

Supporting Carr's view is the notion that enterprises have overbought hardware and software, have hired too many people to run the stuff and are underusing all of it.

"When overcapacity is combined with redundant functionality, the conditions are ripe for a shift to centralized supply," Carr writes. "Yet companies continue to invest large sums in maintaining and even expanding their private, subscale data centers. Why? For the same reason that manufacturers continued to install private electric generators during the early decades of the 20th century: because of the lack of a viable, large-scale utility model. But such a model is now emerging..."

Yada, yada, you get the point. The new stuff looks like the old stuff, and because the old stuff eventually went away, the new stuff will go away too.

The problem with this big idea is that Carr is comparing apples and basketballs. Old-time businesses ran their own generators because they had no choice; there was no infrastructure to support anything else. When utilities came along, those businesses were happy to get out of the private power-generation business because it was easier and cheaper to hand it off. It's only electricity running through wires, right?

But what runs through technology's wires is data, and making data is far more critical to the existence of companies today than making electricity was to companies in the old days. It's not the generator; it's what you do with it.

Enterprise computing doesn't take place solely in data centers, which are what Carr really has in mind when he makes his argument for utilities. It's widespread and becoming more so every day. Data is not generated in one place, and it's not used in just one place.

Digital assets are not something most companies are going to hand over to anyone else, no matter how appealing the model is. Even if they did, they'd still want their assets close by, in the same way a hospital keeps backup power because it can't afford to rely solely on the electric utility.

Yeah, enterprises buy technology by the boatload. Why not? Technology is relatively cheap and getting cheaper by the day. As Carr points out, software is highly scalable, and adding incremental users costs very little. And while new technology is coming through the front door, the old stuff is going out the back; the overlap is inevitable. Most enterprises also wring every last cent out of the stuff they buy before they get rid of it.

So what if not all end users are running their PCs to the max? Not many people travel at 120 mph, but they buy cars that can because you never know; you might want to. Companies buy more than they need because they're preparing for sunny days, when they expect to grow, to adapt to competitive shifts or to switch business models. Enterprises can't adapt if they must acquire the technology first. No, they take what they have and build on it.

I could go on and on, but you get my drift. Now excuse me while I crawl back into my cave and get a whiff of reality.

About the Author

Michael Alexander is editor-in-chief of Application Development Trends.