Why Web services aren't just a rerun
Web services can seem too good to be true. After decades of disagreement, all
of the major vendors have finally bought into a common approach to connecting
applications. It's hard not to harbor some doubts about whether this technology
will live up to its promise.
Why should we believe that Web services will usher in ubiquitous connections
between applications when every previous attempt has failed? The most often-cited
counterexample is the OMG's Common Object Request Broker Architecture (CORBA), which shared
many of the goals that Web services strive for today. There are plenty of other
examples for those with long memories: the Open Software Foundation's Distributed
Computing Environment (DCE), and even the Open Systems Interconnection's (OSI's)
attempt to define standards in this area back in the mid-1980s.
But Web services really are different. The most important change from those
previous efforts is that all of the vendors with dominant market share in the
most relevant area -- application servers -- are backing the exact same set
of technologies. While there are occasional distracting efforts -- such as the
WS-Reliability specification -- from vendors with less power in this area, none
of these distractions will matter in the end. The Microsoft/IBM axis, now joined
by BEA, has all the market power required to make Web services the universal
approach for connecting apps. This has never been true before -- Microsoft remained
aloof from CORBA, both Microsoft and Sun resisted DCE, and nobody took the OSI
upper layers effort seriously. This broad-based agreement is a milestone for
our industry, and it's why Web services are succeeding where these earlier efforts failed.
Another popular concern: Is Microsoft just lulling us into believing they're
concerned with interoperability only to eventually add proprietary elements
that will lock us into a .NET-specific version of Web services? After all, Microsoft
is the company that coined the phrase "embrace and extend," and it
doesn't have a stellar reputation for sticking to interoperability standards.
Yet I'd argue that Microsoft's poor reputation in this area is no longer accurate.
For example, one of the most damning criticisms was that the Windows 2000 version
of Kerberos would not interoperate with standard Kerberos. I've read this in
many places and heard it angrily asserted in countless conversations as evidence
of Microsoft's protocol perfidy. But it's not true. Windows 2000 Kerberos will
interoperate with standard Kerberos, a fact that was fundamental to the success
of a large project I worked on a while ago.
Reputation aside, why would Microsoft go so far down the Web services path
only to drop in proprietary roadblocks at this late date? Imagine the backlash
from its enterprise customers if it did. Besides, without effective interoperability,
Microsoft technologies run the risk of being cabined off for desktops and departmental
use only, a fate that doesn't help the company achieve its growth targets. To
be a major player in the inevitably multivendor world of enterprise servers,
Microsoft's .NET technologies and the rest of its offerings need to work well
with the competition.
A final worry: the promise of improved software reuse. Web services make a Service-Oriented
Architecture (SOA) practical, and SOA allows for the easier reuse of existing
software because that software is accessible in a standard way. But object technology
made the same claims of enhancing reuse and wound up achieving much less than
was promised. Why will SOA be any different?
One possible answer is that it won't be. Even if SOA does not live up to its
claim of improved reuse, it will help to integrate new applications with existing
code, which is an important advance. But there are a few reasons to believe
that SOA will make all kinds of software reuse more likely. One of them is that
reuse at the level of services rather than objects makes more sense. Rather
than inheriting from a complex object, a developer can just use a simple service.
And the fact that most services are remote means that reuse will only be practical
for fairly large-grained business functions, which is exactly where reuse should
have the most value.
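The contrast between the two styles of reuse can be sketched in a few lines of code. Everything here -- the CreditChecker class, the call_credit_service function, and the message fields -- is hypothetical, invented purely to illustrate the point, and the service call is a local stand-in for what would really be a remote round trip:

```python
import json

# Object-level reuse: a developer who inherits from this class must
# understand its construction and internals to extend it safely.
class CreditChecker:
    """Hypothetical reusable class with non-trivial setup requirements."""
    def __init__(self, db_conn, cache, policy):
        self.db_conn, self.cache, self.policy = db_conn, cache, policy

    def score(self, customer_id):
        return 700  # stand-in for a complex, stateful calculation

# Service-level reuse: the caller sees only a coarse-grained contract --
# a request message in, a response message out. Nothing about the
# service's implementation leaks into the calling code.
def call_credit_service(payload: str) -> str:
    """Stand-in for the HTTP/SOAP round trip a real SOA would make."""
    request = json.loads(payload)
    return json.dumps({"customerId": request["customerId"], "score": 700})

response = json.loads(call_credit_service(json.dumps({"customerId": "c42"})))
```

The caller of call_credit_service depends only on the message format, which is exactly the kind of large-grained, contract-based dependency the paragraph above argues is where reuse pays off.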
Still, the unpleasant truth is that reuse is not fundamentally a technical
problem. Web services-based SOA can help, but the human barriers to software
reuse remain. How do developers know what's available to be reused? How do managers
overcome a developer's ingrained resistance to using somebody else's code? And
how do organizations handle the problems inherent in letting applications owned
by one group be used by another? None of these problems is technical; all of
them existed with objects, and all of them exist with SOA.
SOA may or may not bring about a reuse nirvana. But whatever happens, Web services
certainly will allow much better connections between applications. Despite the
false steps of the last 20 years, this time really is different.
David Chappell is principal at Chappell & Associates, an education and consulting firm focused on enterprise software technologies. He can be reached via E-mail at email@example.com.