In-Depth
A new component realism emerges
- By Richard Adhikari
- October 31, 2002
As the economy slows and the last lingering remnants of
the dot-com frenzy evaporate, a new realism is shaping work around components.
Realists are dismissing the religious wars between J2EE and .NET as a waste of
time because integrating the two technologies is easy.
Meanwhile, some say changes have to be made in approaches to testing and
software licensing to fit the new distributed model. Vendors agree that the
traditional testing lab is not suitable for the brave new world of distributed
computing, and one vendor says the traditional per-seat approach to licensing is
not workable.
At the same time, the idea of reuse, the big selling point of distributed
components, is taking on a new form: Some argue that assets other than
components, up to and including Web services, should be reused. Others say
components other than GUI widgets were not meant to be reused in the first
place.
Web services -- the ultimate distributed component technology -- are still not
ready for prime time, vendors agree. But they are working to change that.
Can't we all get along?
Will Ballard, .NET practice manager at
Momentum Software Inc., Austin, Texas, has deployed systems on J2EE, .NET and on
''those that have both, and they work together fine.'' While integrating J2EE and
.NET systems ''could be tricky,'' it is ''perfectly possible,'' Ballard said.
Linkage is the biggest problem during integration, he said, adding that there
are three types of solutions to this: intermediate message queues like MQSeries,
MSMQ or SonicMQ; SOAP-type services; or a classic data-to-data integration where
''you can use brute force if necessary.''
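To make the SOAP-type option concrete, here is a minimal Java sketch of a
client posting a hand-built SOAP envelope to a .NET-hosted service over plain
HTTP. The endpoint URL, SOAPAction and message names are illustrative
placeholders, not any particular product's interface.

import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;

public class SoapBridgeClient {
    public static void main(String[] args) throws IOException {
        // Hypothetical .NET-hosted endpoint -- a placeholder only
        URL endpoint = new URL("http://dotnet-host/orders/OrderService.asmx");
        String envelope =
            "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" +
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
            "<soap:Body><GetOrderStatus xmlns=\"urn:example:orders\">" +
            "<orderId>42</orderId></GetOrderStatus></soap:Body></soap:Envelope>";

        HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setRequestProperty("SOAPAction", "\"urn:example:orders/GetOrderStatus\"");

        OutputStream out = conn.getOutputStream();
        out.write(envelope.getBytes("UTF-8"));
        out.close();

        // Read the raw SOAP response; a real client would parse the XML
        BufferedReader in = new BufferedReader(
            new InputStreamReader(conn.getInputStream(), "UTF-8"));
        for (String line; (line = in.readLine()) != null; ) {
            System.out.println(line);
        }
        in.close();
    }
}

Because the envelope is just XML over HTTP, the same request works whether the
back end is .NET, J2EE or a message queue fronted by a SOAP listener -- which
is the sense in which the integration piece is becoming a commodity.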
However, Hugh Grant, chief technology officer at Dublin, Ireland-based Cape
Clear Software, a developer of middleware that lets Java and CORBA back-end
functions become Web services, said integrating a Microsoft front end with a
J2EE, CORBA or MQSeries back end is ''pretty simple'' because ''the
integration piece is almost a commodity. Everybody's got an XML interface, for
example.''
At the very least, J2EE and .NET will coexist within the enterprise.
Companies will select one of the two depending on their size, said David Holmes,
chief marketing officer at Atlanta-based Jacada Inc., which deals with Global
2000 companies. ''Small- and medium-sized companies run their applications on
Microsoft technology, so it's natural for them to move to .NET,'' he said. ''Not
easy, but natural.'' In the Global 2000 community, however, ''corporations moved
to Java as their enterprise language, so medium- to large-sized corporations
have Java and J2EE skills and are dominated by J2EE,'' Holmes said. Nevertheless,
the two will coexist in the corporate environment partly because corporations do
not want to commit to any one vendor and partly because they will acquire
companies that were Microsoft shops and will have to integrate them, Holmes
said.
Jacada Integrator takes back-office applications from mainframes, Unisys, DEC
VAX and other legacy platforms, wraps them and exposes them as Web services
so they can be invoked remotely. ''We're a low-cost, low-risk alternative
to rewriting or restructuring an application,'' Holmes said.
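The wrap-and-expose pattern Holmes describes can be pictured with a short,
hypothetical Java sketch: a plain facade hides the legacy transaction behind
a service-style interface that a SOAP toolkit could then publish. None of
these names are Jacada's API.

// A minimal sketch of the wrap-and-expose pattern: a plain Java facade
// hides a legacy transaction behind a service-style interface, which a
// SOAP toolkit could then publish as a Web service. Illustrative only.
public interface AccountLookupService {
    String getBalance(String accountNumber);
}

class LegacyAccountLookup implements AccountLookupService {
    public String getBalance(String accountNumber) {
        // In a real integration this would drive a 3270 screen or a
        // legacy API; here a stub stands in for the mainframe call.
        String screenBuffer = sendToHost("BALINQ " + accountNumber);
        return screenBuffer.trim();
    }

    private String sendToHost(String command) {
        return " 1,234.56 "; // stubbed legacy response
    }
}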
Testing revisited
Atesto Technologies, Inc., Fremont, Calif.,
ties in testing with performance management to give users a 360-degree view of
their applications. Ronnie Ray, Atesto's vice president of marketing, said that
because the Web is fully distributed, and Web services serve applications as
well as human users, testing and monitoring ''needs to shift to where
the work is getting done, which is to the edge of the network.''
Testing processes will define how people manage their environments in the
future, Ray said. While testing is currently done by QA departments, which have
little connection to operations departments, testing and operations will share
resources in the future, he said. ''Operations will be able to extract some of
the value from the work QA people do upstream,'' Ray said. Atesto tests
distributed components as well as Web services-based architectures. Its platform
conducts performance tests, and it also does system and application monitoring
enterprise-wide.
Testing will also have to move out of the laboratory. ''You need to mirror
your global environment in your testing environment instead of testing in a
clean room,'' Ray said.
Momentum's Ballard agreed. ''For us, testing is almost always based around
business scenarios, so we construct sample data that matches the use cases and
then run that data through the interfaces to test the system end to end,'' he
said.
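A hypothetical JUnit sketch of the scenario-driven approach Ballard describes
might look like the following; the order system here is an inline stub
standing in for the real interfaces under test.

import junit.framework.TestCase;
import java.util.ArrayList;
import java.util.List;

// Scenario-driven testing in JUnit 3.x style: sample data built from a
// use case is run through the system's interface and the outcome is
// checked end to end. OrderSystem is a stand-in for the real system.
public class PlaceOrderScenarioTest extends TestCase {

    // Stub system under test, kept inline so the sketch is self-contained
    static class OrderSystem {
        private List<String> invoices = new ArrayList<String>();
        String submit(String customer, String item, int qty) {
            invoices.add(customer);           // billing side effect
            return "CONF-" + invoices.size(); // confirmation number
        }
        int invoicesFor(String customer) {
            int n = 0;
            for (String c : invoices)
                if (c.equals(customer)) n++;
            return n;
        }
    }

    public void testStandardOrderIsAcceptedAndBilled() {
        OrderSystem system = new OrderSystem();
        // Sample data matching the "customer places a standard order" use case
        String confirmation = system.submit("CUST-1001", "WIDGET-7", 3);
        assertNotNull("order should be accepted", confirmation);
        assertEquals("billed exactly once", 1, system.invoicesFor("CUST-1001"));
    }
}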
Further, testers will need to upgrade their skills. Simon Galbraith,
marketing director at Red Gate Software, Cambridge, England, said corporations
will need a ''much higher level of skill for testing'' because testers ''must
understand the underlying technology more than if they were doing Web site
testing.'' That is because Web services do not have a user interface.
Red Gate offers ANTS -- Advanced .NET Testing System -- a Web services
testing tool that tests any SOAP-based Web service for scalability. This is
important because ''if you use Web services for integration, they'll handle quite
large numbers of requests,'' Galbraith said. If, for example, a corporation has a
payroll system tied into its employee database, the system might make 10,000
requests to the Web service on payday. The corporation must be able to ensure
that the Web service can stand up to that load.
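A bare-bones load test along those lines might look like this hypothetical
Java sketch, which fires the 10,000 requests of the payroll example from 50
concurrent callers and counts failures; the endpoint URL and the numbers are
illustrative only.

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// A minimal load-test sketch: many concurrent callers hit one Web
// service URL and failed requests are tallied at the end.
public class PaydayLoadTest {
    private static final String ENDPOINT = "http://payroll-host/EmployeeService";
    private static int failures = 0;

    public static void main(String[] args) throws InterruptedException {
        Thread[] callers = new Thread[50];          // 50 concurrent callers
        for (int i = 0; i < callers.length; i++) {
            callers[i] = new Thread(new Runnable() {
                public void run() {
                    for (int r = 0; r < 200; r++) { // 200 each = 10,000 total
                        try {
                            HttpURLConnection c = (HttpURLConnection)
                                new URL(ENDPOINT).openConnection();
                            InputStream in = c.getInputStream();
                            while (in.read() != -1) { /* drain response */ }
                            in.close();
                        } catch (Exception e) {
                            recordFailure();
                        }
                    }
                }
            });
            callers[i].start();
        }
        for (int i = 0; i < callers.length; i++) callers[i].join();
        System.out.println("failed requests: " + failures);
    }

    private static synchronized void recordFailure() { failures++; }
}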
License the people, not the seat
Atesto's Ray said the
current per-seat software licensing model is not good enough in the distributed
world. The location-specific licensing model used today ''doesn't make sense in
the distributed world because applications and components are spread across
the network,'' Ray said. Atesto uses the floating license concept. This is
essentially an enterprise-wide distributed license for the people who work with
Atesto's software.
''Say you have 10 people working on the system, and they're based anywhere and
connected through the corporate intranet; they'll be using the same resources,
software, hardware, test scripts, resource analyses. All that runs on one
integrated platform,'' Ray said. If a user leaves, that license can be
transferred to the new hire. When more users are added, the client has to take
out more licenses in their names.
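The named-user floating-license idea reduces to a simple pool of entitlements
that can be reassigned as people come and go. The Java below is purely
illustrative; it is not Atesto's licensing code.

import java.util.HashSet;
import java.util.Set;

// A minimal sketch of named-user floating licensing: a fixed pool of
// entitlements, not tied to any seat or location, that can be
// reassigned when someone leaves. Illustrative only.
public class FloatingLicensePool {
    private final int capacity;
    private final Set<String> holders = new HashSet<String>();

    public FloatingLicensePool(int capacity) { this.capacity = capacity; }

    // Grant a license to a named user, wherever on the network they sit
    public synchronized boolean assign(String user) {
        if (holders.contains(user)) return true;
        if (holders.size() >= capacity) return false; // pool exhausted
        holders.add(user);
        return true;
    }

    // When a user leaves, the license is freed for the new hire
    public synchronized void release(String user) {
        holders.remove(user);
    }
}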
The new face of reuse
The frenzy about reuse has died down, and
Momentum's Ballard said this is because the concept of reuse has been
misrepresented. ''Components, beyond GUI widget-type components, were never about
reuse; they were about defining interfaces on your systems and constructing
those systems in such a way that they're well defined, bite-sized, attackable
projects that you can integrate together to form your computing architecture,''
Ballard said.
To Greg Coticchia, CEO at LogicLibrary Inc., Pittsburgh, corporate interest
in reuse has fizzled because companies took the wrong approach to building
reusable components. ''These things are iterative,'' Coticchia said. ''You have to
start with some essential applications; you're not going to try to boil the
ocean because, if you do, you'll fail.'' When corporations try to make ''every
possible application under the sun'' reusable, they will fail, Coticchia said,
adding that ''you have to tackle applications you can handle first and get some
wins.''
LogicLibrary lets user companies create a catalog of essential software
assets, including both executable code and supporting architectures, use cases
and process models. The companies then use LogicLibrary Logidex, a component
discovery engine, to view their components within these contexts and identify
those that best fit a given application's technical and business requirements.
Logidex logs and registers queries to save developers from reinventing the
wheel.
''You can find out who asked a particular question when, and what the answer
to it was,'' Coticchia said. ''That makes a great starting point for building a
Web services initiative.''
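The query log Coticchia describes amounts to a small data structure: who
asked what, when, and what the answer was. A hypothetical Java sketch, not
Logidex itself:

import java.util.ArrayList;
import java.util.Date;
import java.util.List;

// Each discovery query is recorded with who asked it, when, and what
// the answer was, so the next developer can reuse the result.
public class QueryLog {
    static class Entry {
        final String who, question, answer;
        final Date when = new Date();
        Entry(String who, String question, String answer) {
            this.who = who; this.question = question; this.answer = answer;
        }
    }

    private final List<Entry> entries = new ArrayList<Entry>();

    public void record(String who, String question, String answer) {
        entries.add(new Entry(who, question, answer));
    }

    // Find out who asked a particular question, when, and the answer given
    public Entry find(String question) {
        for (Entry e : entries)
            if (e.question.equalsIgnoreCase(question)) return e;
        return null;
    }
}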
Andre den Haan, vice president of product strategy at Seagull Software in
Atlanta, said attempts at reuse failed for two reasons: Some were tied to a
specific platform, such as DCOM, and others were too difficult to use.
Seagull offers Transidiom, a bridging technology that supports ''all platforms
and environments,'' including J2EE, .NET and WSDL, den Haan said. Transidiom
extracts critical business processes, and users can reuse these in Java or .NET
infrastructures or as Web services by wrapping an additional layer around them.
This approach minimizes what den Haan calls skills separation.
''Traditionally, when you try to integrate something old with something new,
the project goes over budget or fails because people underestimate the number
and types of skills required -- they need to understand old and new technologies
and gobbledygook like low-level APIs to make the whole thing work,'' he said.
Transidiom ''lets domain experts leverage their skills without being dependent on
them, and [lets them] gradually grow into the new technologies,'' he
explained.
Adam Wallace, vice president of R&D at Flashline Inc. in Cleveland,
contends that reuse has not fizzled out, but that corporations are redefining
the meaning of the term ''reuse,'' expanding it instead of focusing just on
reusable software objects or components. Flashline's clients are increasingly
looking at reusing assets other than components, he said, adding that ''there's a
good body of other types of assets such as process-based or knowledge-based
assets, even best practices documents, and we realized that our customers are
looking to not just reuse components for the sake of programmatic reuse, but
also [to] reuse knowledge or expertise that's been packaged as a best practices
document or architecture.''
Flashline offers the CMEE meta data repository that integrates with source
control management (SCM) packages such as ClearCase or PVCS. It acts as an
extension of the meta data held in UDDI, providing a dynamic link to that
data so that changes are picked up without re-creating UDDI's functionality,
Wallace said. Flashline wrote a UDDI browser to enable this, and ''that turned
out to be one of the things our customers liked,'' he said.
Extending the meta data in UDDI gets around UDDI's lack of capabilities such
as dynamic binding, Wallace said. CMEE can also be used as a trusted repository,
sitting on top of customers' localized Web services to let customers control
access to and information about Web services, Wallace said. Flashline also
lets users distribute multiple UDDI directories across one organization or
several, all fronted by CMEE as a single point of access.
For Simon Peel, vice president of marketing at Mainsoft Corp., San Jose,
Calif., reuse failed because components ''were built and reused by people who
just built and used procedural code.'' Once larger teams or business partners of
a corporation were given access to the component, they would have to adapt it or
even change the component ''because inheriting it and building on it isn't good
enough,'' he said. Other issues that stymied the reuse of components revolved
around ownership of the code. ''Who has the responsibility to code and share
components? And if I give you access to my components, who's responsible for
them?'' Peel said.
Mainsoft's product, Visual MainWin 5, solves those problems by letting users
take components they built centrally in the Windows environment and reuse them
on Unix systems. Mainsoft has ported the Microsoft XML engine, MSXML, to Unix,
so developers can write their components to MSXML, and the components can
be reused in Unix. ''Typically, people think about reusability on the same
operating system; Mainsoft gives you reusability on multiple platforms,'' Peel
said.
Visual MainWin 5 lets users plug into Visual Studio .NET. After they build
unmanaged C++ applications on Windows and are satisfied that they have the logic
and syntax right, they press the ''build'' button, which sends a message to
the Visual MainWin server, where the Unix compilers build the application
natively. ''That way, developers have a copy of the native application
and they own the source base, which is in Windows,'' Peel said.
Web services
Web services are, by definition, distributed
component technology, and some say Web services themselves can be considered
distributed components. ''In the loosely coupled component architecture, which is
XML and Web services, the key focus is on code reuse at a bigger-grained level
than the traditional object component-type of code reuse,'' said John Montgomery,
group product manager for the .NET developer platform at Microsoft.
''In that sense, Web services are either a specific subset or a superset of
components, depending on whom you talk to,'' he added.
Demand for Web services will grow, said Stefan van Overtveldt, director for
WebSphere technical marketing at IBM Software Group, Somers, N.Y., because, with
Web services, ''we have a way to take the core apps the central IT shop provides
and make them easily accessible to lines of business.'' While the concept of Web
services is ''extremely simple,'' standards are right now ''far from complete,'' van
Overtveldt said.
Security is one of the glaring omissions in Web services. Earlier this year,
IBM, Microsoft and VeriSign announced Web Services Security, which is designed
to provide a broadly applicable way of securing XML Web services, Microsoft's
Montgomery said. Web Services Security gives users a common way of
authenticating and providing access authorization across various systems that
support Web services.
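The core idea -- credentials carried in a standard SOAP header instead of
each system's proprietary scheme -- can be sketched as follows. This
hypothetical Java snippet builds a much-simplified envelope; the real
WS-Security specification adds nonces, digests and signatures omitted here.

// A much-simplified sketch of the WS-Security idea: the credential
// travels in a standard SOAP header rather than in each system's
// proprietary scheme. Illustrates the shape of the message only.
public class SecuredEnvelope {
    public static String build(String user, String password, String body) {
        return
          "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
            "<soap:Header>" +
              // namespace from a 2002 WS-Security draft; deployments vary
              "<wsse:Security xmlns:wsse=\"" +
                  "http://schemas.xmlsoap.org/ws/2002/07/secext\">" +
                "<wsse:UsernameToken>" +
                  "<wsse:Username>" + user + "</wsse:Username>" +
                  "<wsse:Password>" + password + "</wsse:Password>" +
                "</wsse:UsernameToken>" +
              "</wsse:Security>" +
            "</soap:Header>" +
            "<soap:Body>" + body + "</soap:Body>" +
          "</soap:Envelope>";
    }

    public static void main(String[] args) {
        System.out.println(build("alice", "secret",
            "<GetPayrollStatus xmlns=\"urn:example:hr\"/>"));
    }
}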
Several features are missing from that standard, IBM's van Overtveldt said.
''There are no solid mechanisms to do logging of access, analyze who's accessing
what, and there's no mechanism to construct these access requests on the fly so
as to give two companies that want to do business a way to negotiate the
contract immediately and get access to it,'' he said.
Web services require other capabilities: ''There are other scenarios we need
to enable, like reliable messaging and being able to run a long-lived
transaction across multiple organizations,'' Microsoft's Montgomery said.
Still, the core services -- WSDL, UDDI and SOAP -- provide enough value that
''a lot of companies have started to implement Web services internally,'' IBM's
van Overtveldt said. For authentication, IBM offers the IBM Web Services
Gateway, a gateway server that is implemented in a corporate DMZ between
firewalls to create a secure and managed connection between internal and
external Web services-based applications. Its function for Web services is
similar to that of a proxy server for HTTP traffic: When an external Web
services request comes in, the gateway will perform authentication,
authorization and event logging before submitting the request to the internal
Web services provider. Alternatively, an internal Web services request can
access an external Web services provider application through the gateway,
provided the management authorizations are in place.
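The gateway pattern van Overtveldt describes -- authenticate, log, forward --
can be sketched as a simple Java servlet; the header check and the internal
URL here are placeholders, not IBM's implementation.

import java.io.InputStream;
import java.io.OutputStream;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.net.HttpURLConnection;
import java.net.URL;

// A minimal gateway sketch: authenticate, log, then relay the SOAP
// request to the internal provider, much as a proxy does for HTTP.
public class WebServicesGateway extends HttpServlet {
    private static final String INTERNAL = "http://internal-host/PayrollService";

    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws java.io.IOException {
        // 1. Authenticate: reject callers without credentials
        String auth = req.getHeader("Authorization");
        if (auth == null) {
            resp.sendError(HttpServletResponse.SC_UNAUTHORIZED);
            return;
        }
        // 2. Log the access event
        log("web service request from " + req.getRemoteAddr());

        // 3. Forward the request body to the internal provider
        HttpURLConnection conn =
            (HttpURLConnection) new URL(INTERNAL).openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", req.getContentType());
        copy(req.getInputStream(), conn.getOutputStream());

        // 4. Relay the provider's response to the external caller
        resp.setContentType(conn.getContentType());
        copy(conn.getInputStream(), resp.getOutputStream());
    }

    private static void copy(InputStream in, OutputStream out)
            throws java.io.IOException {
        byte[] buf = new byte[4096];
        for (int n; (n = in.read(buf)) != -1; ) out.write(buf, 0, n);
        out.close();
    }
}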
Van Overtveldt said that, in the future, the Web may move away from HTTP.
''HTTP has no acknowledgements and no guaranteed delivery, which is a problem if
you're trying to build apps with real-time transactions,'' he said. IBM is
working on two solutions. One is HTTPR, or Reliable HTTP, which is built on
HTTP/1.1 and adds acknowledgements and guaranteed once-and-only-once delivery.
Work on HTTPR is being done with the open-source community.
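The mechanics such a protocol layers onto HTTP -- message IDs,
acknowledgements and duplicate elimination -- can be illustrated with a
hypothetical receiver that processes each message at most once. This sketches
the idea only; it is not the HTTPR protocol itself.

import java.util.HashSet;
import java.util.Set;

// Each message carries an ID, the receiver acknowledges it, and
// duplicates from retries are discarded, so the message is processed
// exactly once even though plain HTTP guarantees neither delivery
// nor acknowledgement.
public class ReliableReceiver {
    private final Set<String> processed = new HashSet<String>();

    // Returns the acknowledgement the sender waits for; a sender that
    // never sees it simply retransmits the same message ID.
    public synchronized String receive(String messageId, String payload) {
        if (!processed.contains(messageId)) {
            processed.add(messageId);
            handle(payload); // side effects happen at most once
        }
        return "ACK " + messageId;
    }

    private void handle(String payload) {
        System.out.println("processing: " + payload);
    }
}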
The other solution is a Web services gateway that takes incoming requests
over HTTP and puts them on MQSeries, which is ''an extremely reliable protocol,''
van Overtveldt said. Companies doing this typically already have an MQSeries
infrastructure in place so they can leverage that for Web services
communications. For example, within an application they could use the Java
Message Service (JMS), or they could use RMI/IIOP with CORBA.
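Inside a Java application, handing a request to such a queue is a few lines
of standard JMS; in this sketch the JNDI names and the payload are
deployment-specific placeholders.

import javax.jms.*;
import javax.naming.InitialContext;

// A minimal JMS sketch: put an incoming SOAP request onto a queue so
// the reliable messaging infrastructure carries it the rest of the way.
public class GatewayToQueue {
    public static void main(String[] args) throws Exception {
        InitialContext ctx = new InitialContext();
        // JNDI names below are placeholders; they vary by deployment
        QueueConnectionFactory factory =
            (QueueConnectionFactory) ctx.lookup("jms/ConnectionFactory");
        Queue queue = (Queue) ctx.lookup("jms/InboundRequests");

        QueueConnection conn = factory.createQueueConnection();
        QueueSession session =
            conn.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
        QueueSender sender = session.createSender(queue);

        // The payload would be the SOAP envelope received over HTTP
        TextMessage msg = session.createTextMessage("<soap:Envelope>...</soap:Envelope>");
        sender.send(msg);
        conn.close();
    }
}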
However components are defined, the fact is that the distributed component
approach to computing is here to stay. It will be refined over time, as
technologies always are.