In-Depth

Tooling up for Components

With component-based development becoming today's hot buzzword, it is easy to forget just how recently this "plumbing" was developed. Just ask Countrywide Home Loans, the nation's largest independent home loan provider. When the firm began developing a loan origination application based on Microsoft OCX controls -- the forerunner of today's COM and DCOM component models -- it had to invent many of the basics. This included tasks such as reading and linking component interfaces, as well as maintaining database connectivity, all of which are taken for granted today.

What makes component-based development attractive to users is its potential to reduce application complexity and promote reuse. However, prior to the emergence of specifications from Microsoft Corp., Sun Microsystems Inc. and the Framingham, Mass.-based Object Management Group (OMG), the only choices available to application managers were inventing their own component models or relying on tool-specific ones.

Yet the new standards are just that -- new. For example, the Enterprise JavaBeans (EJB) 1.0 specification supports traditional transaction models, but will not offer "entity" beans attuned to newer, object-oriented transaction models until a future release that has yet to be scheduled.

Nevertheless, the emergence of COM, Corba and various flavors of client- and server-side JavaBeans is happening not a moment too soon. Packaged applications are taking a larger bite out of internal development, with the market expected to double to $100 billion in five years, according to International Data Corp. (IDC), a Framingham, Mass.-based research firm. Theoretically, components should make internal development a more competitive option, with tool-makers taking notice. Most development tools now provide the ability to automatically generate COM, Corba or Java components, or at least develop interfaces to them.

Generating a standard component is only the first step, however. How should the components be bolted together? Without guidelines that define the relationships between the various parts, conflicts can occur. In automotive manufacturing, for example, a screw used to attach a wheel might thread equally well into a fitting on the dashboard. You probably would not want to drive such a car, however.

Translated into software terms, components for a customer order credit limit and a product paint color could sport compatible wrappers or interfaces, and their behaviors might never conflict. But could you imagine an application piecing them together without additional logic? Consider the sketch below.
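To make the point concrete, here is a minimal, hypothetical Java sketch -- every name in it is invented for illustration. The two interfaces are structurally compatible, yet nothing but hand-written glue logic gives their combination any business meaning.

// Two components whose interfaces are structurally compatible -- both
// expose a getValue()-style call -- yet share no business meaning.
public class InterfaceMismatchDemo {

    // Component wrapping a customer's order credit limit.
    interface CreditLimitComponent {
        double getValue(String customerId); // dollars
    }

    // Component wrapping a product's paint color code.
    interface PaintColorComponent {
        String getValue(String productId); // e.g. "RAL 5010"
    }

    public static void main(String[] args) {
        CreditLimitComponent credit = id -> 25_000.00;
        PaintColorComponent paint = id -> "RAL 5010";

        // Nothing in the type system stops an application from calling both;
        // only additional, hand-written logic gives the pairing any meaning.
        System.out.println("Credit limit: " + credit.getValue("C-42"));
        System.out.println("Paint color:  " + paint.getValue("P-7"));
    }
}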

Not surprisingly, a growing number of tools are addressing the need to provide this elusive context. Some tools now bundle their own native repositories while offering grudging support of third-party repositories from Microsoft, Unisys Corp., Blue Bell, Pa., or other vendors. A few vendors, such as Select Software Tools Inc., Irvine, Calif., offer separate component manager products that document the components stored in those repositories. And other firms, such as Microsoft and Sybase Inc., Emeryville, Calif., bundle front-end and middleware components under a common umbrella.

Vendors of UML-based modeling tools, including Platinum Technology Inc., Oakbrook Terrace, Ill., Rational Software Corp., Cupertino, Calif., and Riverton Software Corp., Cambridge, Mass., are committed to generating not just raw objects, but components that are compatible with the Microsoft, Corba or Sun standards. However, the most popular feature is the framework: a library of generic processes for business functions, such as workflow, or for system services, such as mapping application logic to the operating environment. In some cases, frameworks are even going to the "next step" by including business templates that are almost applications in the raw.

Been there, done that

Component-based development did not dawn with COM/DCOM or Corba. During the heyday of two-tier client/server development, Forté Software Inc., Oakland, Calif.; Nat Systems, Paris; Magic Software Enterprises Inc., Irvine, Calif.; Netron Inc., Toronto; Progress Software Corp., Bedford, Mass.; Seer Technologies Inc., Cary, N.C.; Sterling Software Inc., Dallas; and Uniface Corp., now part of Compuware Corp., Farmington Hills, Mich., offered tools that supported multitier development. Because standards for component and application communication did not yet exist, many of these tools devised their own formats to modularize code. Forté, for instance, built its own format and messaging infrastructure layered atop TCP/IP.

In most cases, these veterans are slowly embracing the new generation of DCOM/ActiveX and Java/IIOP interfaces in order to protect their installed bases. There is also the common refrain that open systems standards take time to achieve the robustness of their proprietary counterparts. Forté, for example, offers the necessary interfaces but still claims that its proprietary format provides superior multithreading and application messaging capabilities; in the long run, it may offer DCOM/Corba/JavaBean bridges.

Other vendors have added component-based development capabilities to tools built initially on CASE and entity-relationship modeling foundations by extending their meta models to support component specification. Sterling's Cool:Gen product, for example, now generates Microsoft COM components.

But some vendors remain reluctant to support the new generation of de facto component standards. Netron maintains a proprietary "frames" architecture that relies not on distributed object request brokers (ORBs), but on traditional CICS-style transaction monitors. Like many tools on the market, Netron provides "frameworks" consisting of groups of generic functions, such as SQL call generation, reporting and Cobol screen painting. While Netron does not yet generate COM or Java components, it promises to do so in the future.

Suites anyone?

Microsoft and Sybase are among the best examples of vendors trying to be almost all things to some people. Microsoft's Visual Studio 6.0 -- which despite its release number is actually the second release -- combines all of Microsoft's development tools. It includes languages, an HTML application builder, a transaction server, a distributed application "wire" analysis tool, a lightweight version of the Rational Rose modeling tool and several component management tools built around the Microsoft Repository. Like Microsoft Office, Visual Studio features toolbar-level integration of a collection of separate tools that can generate -- what else -- COM components automatically. Although data modeling and testing are notable omissions, Visual Studio is one of the broadest component development environments available -- that is, as long as you are content working only in a Windows environment and with COM component models.

For its part, Sun has unveiled Java Studio, which includes a visual development tool, a UML modeler and configuration management. It also offers Java Workshop, which provides GUI testing capabilities and will eventually include a load-testing element. Java Studio currently supports JavaBeans; EJB support is still in the works. Of course, you can develop and test all you want, as long as you are working in a 100% pure Java virtual machine environment.

If the pure Microsoft or Sun solutions are too restrictive, Sybase's Enterprise Applications Studio provides a more agnostic alternative. It consolidates the company's PowerBuilder and Java tools with the Jaguar Transaction Server, which supports COM, C++ or Java components; a light version of the Riverton HOW modeling tool; and a development copy of Sybase's mobile database, SQL Anywhere. The Sybase offering might provide a migration path for classic PowerBuilder fat-client applications, but PowerBuilder developers will have to be patient: Jaguar will not support PowerBuilder objects until next year.

A framework in every pot

Next to "component," "framework" is probably the most overworked term in application development today. And virtually every tool supplier claims to offer one.

At their core, frameworks are pre-built system services that allow application components to work with operating systems, visual environments and data sources. The goal is to avoid the need for developers to reinvent some of the basic plumbing, a critical productivity requirement that keeps development teams focused on what counts: business logic. Providers such as Blueprint Technologies, McLean, Va., Persistence Software Inc., San Mateo, Calif., Riverton Software and Template Software Inc., Dulles, Va., deliver varying combinations of services, such as transaction state maintenance, database mappings, application logic mapping, and partitioning and process monitoring. Riverton backs this up with a UML-oriented modeling environment, while Blueprint Technologies relies on Rational Rose for application models. Platinum, meanwhile, is building a framework for component development based on the Catalysis methodology.
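As a rough illustration of the sort of plumbing a framework takes off developers' hands, the following Java sketch wraps database access behind a simple mapper. It is a sketch under stated assumptions: the Customer type, the table layout and the jdbcUrl parameter are invented for illustration, and only the standard JDBC calls are real.

// A framework-style mapping service: the business component asks for a
// Customer by key and never touches connections, statements or SQL.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class CustomerMapper {

    public record Customer(String id, String name, double creditLimit) {}

    private final String jdbcUrl;

    public CustomerMapper(String jdbcUrl) {
        this.jdbcUrl = jdbcUrl;
    }

    // The framework owns connection handling and SQL; callers see objects.
    public Customer findById(String id) throws SQLException {
        String sql = "SELECT id, name, credit_limit FROM customer WHERE id = ?";
        try (Connection conn = DriverManager.getConnection(jdbcUrl);
             PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setString(1, id);
            try (ResultSet rs = stmt.executeQuery()) {
                if (!rs.next()) return null;
                return new Customer(rs.getString("id"),
                                    rs.getString("name"),
                                    rs.getDouble("credit_limit"));
            }
        }
    }
}

A real framework layers transaction state maintenance, partitioning and process monitoring on top of the same principle: components stay focused on business logic while the framework supplies the system services.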

Most of these tools also offer business logic builders or templates. For instance, Blueprint Technologies and Template are planning business logic frameworks for specific vertical industry sectors. In some cases, these frameworks could morph into packaged applications in the rough that could stand on their own or fill the gaps not covered by popular enterprise resource planning (ERP) packages, such as SAP's R/3.

Scalability hurdles

One of the major issues when scaling up component-based applications is delivering robust back-end services. Distributed applications require more complex back-end services than a simple database connection. For instance, a simplistic view of a distributed securities trading application would replace monolithic logic with the following process: A trading order component invokes a customer credit history component, which in turn invokes credit threshold components. Each of these components, however, might reside on a different server. Therefore, there has to be a way for these disparate components to talk to and invoke one another.
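In Java terms, that call chain might look something like the following sketch. All of the interface and method names here are invented for illustration; in a real system, each implementation could sit on a different server behind whatever remoting layer is in use.

// The trading chain: order -> credit history -> credit threshold.
public class TradingChainDemo {

    interface CreditThreshold {
        boolean withinThreshold(String customerId, double amount);
    }

    interface CreditHistory {
        // Delegates the final check to one or more threshold components.
        boolean approve(String customerId, double amount);
    }

    interface TradingOrder {
        String place(String customerId, String symbol, int shares, double price);
    }

    public static void main(String[] args) {
        CreditThreshold threshold = (id, amt) -> amt <= 50_000.00;
        CreditHistory history = (id, amt) -> threshold.withinThreshold(id, amt);
        TradingOrder order = (id, symbol, shares, price) ->
                history.approve(id, shares * price)
                        ? "ACCEPTED " + symbol
                        : "REJECTED " + symbol;

        System.out.println(order.place("C-42", "ACME", 100, 250.0));
    }
}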

One approach is a transaction server that coordinates the messages sent back and forth. Providers such as Microsoft and Sybase promote their bundling of native transaction servers with development tools. This reflects a traditional, transaction-processing monitor view of the world, which regards messages as transactions, not necessarily as objects.
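The programming model this view implies can be sketched with the standard Java Transaction API, in which the server brackets a series of component calls as a single unit of work. The UserTransaction interface and the JNDI name are standard; the two component calls and the lookup environment are assumptions for illustration, and the code needs the JTA library and a transaction server to actually run.

// Transaction-server view: several component calls, one unit of work.
// (Newer platforms package this API as jakarta.transaction instead.)
import javax.naming.InitialContext;
import javax.transaction.UserTransaction;

public class TransferDemo {

    public static void main(String[] args) throws Exception {
        UserTransaction tx = (UserTransaction)
                new InitialContext().lookup("java:comp/UserTransaction");
        tx.begin();
        try {
            debitCustomer("C-42", 100.0);   // hypothetical component call
            creditBroker("B-7", 100.0);     // hypothetical component call
            tx.commit();                    // both succeed or neither does
        } catch (Exception e) {
            tx.rollback();
            throw e;
        }
    }

    static void debitCustomer(String id, double amt) { /* ... */ }
    static void creditBroker(String id, double amt) { /* ... */ }
}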

The other approach is the object request broker model, which involves multiple, individual interactions between distributed objects. Surprisingly, Scotts Valley, Calif.-based Inprise Corp., which offers a spectrum of development tools (JBuilder, for example) and ORB middleware (such as the VisiBroker ORB it gained in the Visigenic acquisition), has not yet integrated those products. Conversely, New York City-based Prolifics Inc., which offers only development tools, provides interfaces to Microsoft Transaction Server (MTS) and the Tuxedo transaction monitor from BEA Systems Inc., Sunnyvale, Calif. Prolifics will also add support for BEA's new M3 Object Transaction Manager.

Of course, the choice of transaction server vs. ORB as a back-end strategy for building an engine capable of supporting high-throughput applications is a philosophical debate. ORBs are the purer, but less mature, approach. Proponents claim distributed object-oriented approaches are, in the long run, more scalable because they avoid monolithic transaction monitors or servers, which limit flexibility and threaten bottlenecks and single points of failure. Transaction processing promoters respond that their scalability and robustness are proven. Object transaction monitors theoretically combine the best of both worlds, but BEA's M3 product is barely a few months old.

Another approach to the problem is to apply the well-worn technique of caching inside the component itself. Persistence Software uses this method, and has developed an application server that caches frequently used logic and data to minimize database I/O for EJB and C++ components.
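The idea reduces to something like the following minimal Java sketch, in which a component answers repeat requests from memory instead of returning to the database. The CustomerRecord type and the loader function are invented for illustration; a production-grade cache would add eviction, invalidation and size limits.

// Component-level caching: repeat lookups never reach the database.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

public class CachingComponent {

    public record CustomerRecord(String id, double creditLimit) {}

    private final Map<String, CustomerRecord> cache = new ConcurrentHashMap<>();
    private final Function<String, CustomerRecord> loader; // hits the database

    public CachingComponent(Function<String, CustomerRecord> loader) {
        this.loader = loader;
    }

    // Frequently used records are served from memory, cutting database I/O.
    public CustomerRecord find(String id) {
        return cache.computeIfAbsent(id, loader);
    }

    public static void main(String[] args) {
        CachingComponent component = new CachingComponent(id -> {
            System.out.println("database hit for " + id); // simulated I/O
            return new CustomerRecord(id, 25_000.00);
        });
        component.find("C-42"); // loads from the "database"
        component.find("C-42"); // served from the cache
    }
}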

The other side of the scalability issue is deploying components. Admittedly not a showstopper for small applications, the issue becomes more critical for high-throughput applications, which are likely to be more complex and involve larger populations of components. Repositories and component managers take care of half the equation: They store and organize components, but few, if any, are capable of deploying them at runtime. When large numbers are involved, it becomes necessary to ensure that the right versions of the components are deployed at the right time.
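What "the right versions at the right time" might mean in practice is sketched below, assuming components are packaged as jars whose manifests carry an Implementation-Version entry. The required-version map and file names stand in for what a repository or component manager would supply.

// Version-aware deployment check against jar manifests.
import java.io.File;
import java.io.IOException;
import java.util.Map;
import java.util.jar.JarFile;
import java.util.jar.Manifest;

public class DeploymentCheck {

    // Returns true if the jar's manifest declares the version we expect.
    static boolean versionMatches(File jar, String required) throws IOException {
        try (JarFile jf = new JarFile(jar)) {
            Manifest manifest = jf.getManifest();
            if (manifest == null) return false; // unversioned jar: reject
            String found = manifest.getMainAttributes()
                                   .getValue("Implementation-Version");
            return required.equals(found);
        }
    }

    public static void main(String[] args) throws IOException {
        // Component versions this release expects; in practice a repository
        // or component manager would supply this map.
        Map<String, String> required = Map.of(
                "credit-history.jar", "1.2",
                "trading-order.jar", "2.0");

        for (Map.Entry<String, String> entry : required.entrySet()) {
            File jar = new File("deploy", entry.getKey());
            boolean ok = jar.isFile() && versionMatches(jar, entry.getValue());
            System.out.println(entry.getKey() + " deployable: " + ok);
        }
    }
}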

InLine Software Corp., Sterling, Va., has proposed an interim solution: a meta data modeling tool that acts as a bridge between Java development tools and application servers. It can extract EJB archive files characterizing business functions and convert them into XML Metadata Interchange (XMI) files, a format built on the eXtensible Markup Language (XML). At runtime, the XMI files are loaded into a temporary repository. Nonetheless, InLine admits that this is not a long-term substitute for a real, permanent repository.

Loose ends

Testing, however, remains the Achilles' heel of component-based development. Automated tools still tend to focus on aggregate behaviors, such as how well the screen navigation works and how well servers can handle the logical processing loads. For instance, Mercury Interactive Corp., Sunnyvale, Calif., recently added the capability to test DCOM calls on MTS. Meanwhile, Rational has consolidated its various load and screen testing tools from SQA, Performance Awareness and other acquisitions into the Darwin Performance Studio.

But as organizations deploy larger applications involving larger groups of components, only careful design can prevent components from being deployed with overlapping or conflicting functionality. Today's automated testing technologies can test DCOM or JavaBean wrappers, as well as perform data-driven regression tests to check the soundness of the logic under stress. However, they cannot look under the hood to judge whether, say, a customer component overlaps with a credit history component that duplicates customer information functions.
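The kind of data-driven regression test those tools automate looks roughly like the following JUnit sketch; the approve() stub and its credit rule are invented for illustration. Note what the test cannot do: it verifies the component's own contract, not whether some other component quietly duplicates it.

// Data-driven regression test of a single component's contract.
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;

class CreditHistoryTest {

    // Stand-in for the deployed component's wrapper; the rule is invented.
    static boolean approve(double amount) {
        return amount <= 50_000.00;
    }

    @ParameterizedTest
    @CsvSource({
            "1000.00,  true",
            "50000.00, true",
            "50000.01, false"
    })
    void approvalMatchesCreditRule(double amount, boolean expected) {
        assertEquals(expected, approve(amount));
    }
}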

Testing is but one of a number of issues that affect component development. The most effective means of scaling up component-based applications remains a topic of debate. And, of course, the question of just how standard "standard" components will be remains open. For instance, could the developer of a back-end COM business component be confident generating applications in the COM-compliant language or tool of their choice?

There is little question that components are in most developers' futures. But for all these questions, there is currently only one sure answer. In the world of component-based development, waiting for solutions remains a virtue.