Let the data do the talking -- or, keep an eye on objects: The state of application integration

It is a bit like a soap opera and a bit like a baccarat game. Yes, it is application integration at the end of the decade. It is a high-stakes game that used to be limited to technical obfuscation. Now marketing obfuscation is part of the muddle. Last month, we noted that the successes of packaged application vendors like SAP and PeopleSoft are the models the 'packaged app integration' vendors hope to follow. This month, we look at a few of the data-centric integrators coming out of the data warehouse world. They also seek to imitate the packaged crew with what they call 'turn-key' data marts.

We will also look at object-middleware issues, for the fundamental rules still apply: Application development managers must balance elegant, long-term strategies against immediate tactical requirements. Since database data can sometimes have more pull than a specific application, managers will often consider data-centric systems coming out of the data warehouse community among their host of alternatives.

It is a little too early to prognosticate, but new attention on application integration may represent the end of middleware as we know it. This may be a good thing. While operating systems, databases and applications have become fairly well defined over time, that stuff in between has been anything but clear. The motto has been 'If it doesn't go in another category, throw it in with middleware.'

But the importance of a middle-layer glue has been growing, as has the industry buzz. Fast-to-implement integration software that somewhat resembles the packaged applications it is meant to connect is a hopeful sign for many harried managers.

Clearly, issues of marketing and "eveningwear" aside, application integration middleware draws far more attention today than the tools with which it is built. But there are many ways to go, and some peril too. Said one system integrator: "If you look at how application integration is done these days, there are 50 ways to skin this cat -- but 42 of them will put you in trouble."

Developers continue to approach middleware decisions from quite diverse perspectives.

* For some, the focus is on applications and adapters. The corporation may have made a decision to standardize on off-the-shelf enterprise resource planning (ERP) packages. Integration may be a simple question of hooking up a sales automation package to that system. The major complexity challenge here comes when I/S has to deal with updates to the separate packages. Small things change and APIs have to be rewritten. Is this done in C? In a proprietary language or script? Start-ups like CrossWorlds Software, Burlingame, Calif.; Frontec, Stamford, Conn.; Active Software Inc., Santa Clara, Calif.; Vitria Technology Inc., Mountain View, Calif.; New Era of Networks (Neon) Inc., Englewood, Colo.; and others seek to play in this space. They may also urge I/S to focus on the process more than on the application. After all, the issue today is that processes span applications.

* Others concentrate on messages or transactions. The larger the number of integrations, the greater the need for guaranteed and (perhaps) asynchronous message handling, and the more likely message- or transaction-oriented middleware will be the choice. In these cases, development managers have overarching concerns that lead them to focus on the "pipes" more than the specific applications.

* Still others look for a more encompassing solution that embraces object methods. Here is where some of the toughest tradeoffs are made. If a customer at my bank has six accounts, do I want to handle those all as one customer relationship? Advocates say the object way enables flexibility down the line -- but admit that building the flexible system takes a very significant up-front effort. For advocates of objects, it is crucial that the interface method be language-independent, even in an era when, according to the hype, the last programming language (Java) has been written. An ace for the object enthusiasts: Corba offers the object user a standard that spans languages.

* For some, the data does the talking. Data mining and data transformation are essential, and their need to move volumes of data influences their thinking. This can lead to anything from emulator-like screen scraping to massively scaled data warehouses. Still others focus on data, but on an as-needed basis. This latter group eschews the big warehouse approach.
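The first of these perspectives, adapters wrapping packaged applications, can be sketched in a few lines. This is a minimal, hypothetical illustration (no vendor's actual API): each package hides behind an adapter exposing a common record shape, so an integration "process" never touches package-specific calls directly.

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of the adapter-centric approach: each packaged
# application is wrapped in an adapter exposing one common record format,
# so the integration layer never calls package-specific APIs directly.

class Adapter(ABC):
    @abstractmethod
    def read(self, key: str) -> dict: ...
    @abstractmethod
    def write(self, record: dict) -> None: ...

class ErpAdapter(Adapter):
    """Wraps a (hypothetical) ERP package's customer master."""
    def __init__(self):
        self._store = {"C100": {"id": "C100", "name": "Acme Corp"}}
    def read(self, key):
        return dict(self._store[key])
    def write(self, record):
        self._store[record["id"]] = record

class SalesAdapter(Adapter):
    """Wraps a (hypothetical) sales automation package."""
    def __init__(self):
        self._accounts = {}
    def read(self, key):
        return dict(self._accounts[key])
    def write(self, record):
        self._accounts[record["id"]] = record

def sync_customer(source: Adapter, target: Adapter, key: str) -> None:
    """One integration 'process': copy a customer record between packages."""
    target.write(source.read(key))

erp, sales = ErpAdapter(), SalesAdapter()
sync_customer(erp, sales, "C100")
print(sales.read("C100")["name"])  # → Acme Corp
```

The fragility the article notes is visible here: when a package update changes its API, the adapter wrapping it must be rewritten, which is exactly where the maintenance cost lands.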

Any mix of the above could be a solution -- a fact influenced by a major trend of the day that sees, for example, data-transform hub vendors in deals with transaction-processor makers, or transaction-process houses buying into object request brokers. And the focus of start-ups like CrossWorlds on immediate app-to-app connections should not obscure the fact that the "off-the-shelf" solution may, as in CrossWorlds' case, be built on Corba object technology.

What is only clear at the moment is that these perspectives represent distinct camps from which development efforts begin a march toward new age application integration.

Off-the-shelf, up-and-coming

How are the "off-the-shelf" up-and-comers likely to fare? Not too badly, suggests Mike Gilpin, vice president, Giga Information Group, Norwell, Mass. He is among a handful of analysts who have gotten out in front of this issue. He understands that the quick fix is the choice in many cases, but sees more-encompassing object solutions coming along over time.

"Today, customers are more likely to respond to application integration needs in a more tactical way," he said. "Most often, they seem to care about whether the adapters that come with the package suit their needs for fast implementation. So that favors the new people."

Adapters tend to be transaction- or package-based, commented Gilpin. "With some, you have to change the source code in order to change the adapter."

There is also a data-centric view of the world, noted Gilpin, which can lead to a choice of other types of solutions. However, he added, "There's a lot of overlap between kinds of solutions."

Gilpin joins a growing chorus of observers who state that, whichever path is taken, the larger solution is to be found in effective use of meta data. "If you have effective meta data capability, you have more flexibility," said Gilpin. In other words, if you have adequate information about your data (how, for example, an SAP date format maps to a date format in another application), then you can more quickly adapt to change when needed. [Ed Note: For more on a related topic, see "The culture of components."]
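Gilpin's date-format example can be made concrete. The sketch below is illustrative only (the format strings and application names are assumptions, not any vendor's meta data model): because each application's field format is described declaratively, a format change means editing the meta data table rather than the conversion code.

```python
from datetime import datetime

# Hypothetical meta data table describing how each application formats
# the same field. "erp_app" uses an assumed SAP-style YYYYMMDD string.
META = {
    "erp_app":   {"ship_date": "%Y%m%d"},    # e.g. "19981123"
    "sales_app": {"ship_date": "%m/%d/%Y"},  # e.g. "11/23/1998"
}

def map_field(value: str, field: str, source: str, target: str) -> str:
    """Convert a field value between applications driven only by meta data."""
    parsed = datetime.strptime(value, META[source][field])
    return parsed.strftime(META[target][field])

print(map_field("19981123", "ship_date", "erp_app", "sales_app"))  # → 11/23/1998
```

When a package release changes its date layout, only the corresponding `META` entry needs updating, which is the flexibility Gilpin describes.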

Stand by your data

As Gilpin noted, for some, the data is the decider -- and data-centric solutions hold sway. And, like application integration firms focused on SAP- and PeopleSoft-related sales, there are some data warehouse companies focusing on easy connectability to off-the-shelf ERP applications. Sometimes these are described as turn-key data warehouses or data marts.

The fact is that many data warehouse projects quickly become application integration projects. Data-centric application integration can successfully take the form of a data warehouse, said at least one off-the-shelf application integration software vendor, when data and applications are more tightly coupled, and implementation is less time-sensitive. Naturally, if different applications have different semantics, a good deal of time must be spent making sure they do not step on one another.

Include SmartDB Corp., Palo Alto, Calif.; Enterworks Inc., Ashburn, Va.; Influence Software Inc., Sunnyvale, Calif.; Constellar Corp., Redwood Shores, Calif.; and others among a host of data-integration alternatives for staging legacy, client/server and ERP data.

While others tend to emphasize the data warehouse more, Enterworks and Constellar are surely in the ranks of those companies approaching application integration from a data-centric point of view. Constellar is one of several firms that use the term "hub and spoke" to describe its architecture.

"We've been doing enterprise application integration for 10 years," said Brian Donnelly, Constellar chairman and CEO. "Sometimes the source happens to be a data warehouse, but we don't care. To us, stores are just data sources." The Constellar Hub itself sits on an Oracle database.

We talked recently to a user of Constellar Hub. As he enumerated the applications and platforms he works with, Drew Cifordelli, data analyst, Princeton University, Princeton, N.J., said, "At Princeton we have so much diversity that it's hard to get your arms around it all."

New applications are mandated to run on the Oracle database, he said, but there is plenty of legacy to connect, and for that he employs Constellar Hub software.

"Our role is to try to give access to data at its source," said Cifordelli. "But elements like addresses are handled differently in different applications. When you change an address in one department, it doesn't filter to others. It's a data problem, not a tool problem. Constellar makes a stride in making that a little bit easier to get to."

He continued: "Our source data is in legacy environments. How we get it out into the Oracle environment is the issue. And it takes more than just Constellar to do that. It takes some policies on extracting data. [By using the Hub,] you can see the meta data in one place.

"It's useful when you're able to pull it all together in one interface from a data administrator's point of view. You have to have a place where you can manage [the data transformation]," he said.

From Cifordelli's perspective, when you place a Constellar-style hub in the middle of your distributed environment, you still have point-to-point interfaces. And "point-to-point" architecture is generally associated with the idea of "application spaghetti." But with Constellar, said Cifordelli, you can now view all of the integration points from a single vantage point.
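The arithmetic usually cited for hub-and-spoke is simple to sketch. The figures below are purely illustrative: directly integrating n applications needs an interface per pair, while a hub needs one spoke per application. As Cifordelli notes, the logical interfaces may remain point-to-point even with a hub; much of the practical gain is the single place to see and manage them.

```python
# Back-of-the-envelope comparison of interface counts for n applications.
def point_to_point(n: int) -> int:
    return n * (n - 1) // 2   # one interface for every application pair

def hub_and_spoke(n: int) -> int:
    return n                  # one spoke per application into the hub

for n in (5, 10, 20):
    print(n, point_to_point(n), hub_and_spoke(n))
# 5 apps: 10 vs 5; 10 apps: 45 vs 10; 20 apps: 190 vs 20
```

The gap widens quadratically, which is why shops with Princeton-scale diversity find the hub argument persuasive.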

As discussed in Part 1, Enterworks takes data transformation as a starting point to integrate applications. The company's Virtual DB, built in part on object engine technology from GemStone Systems Inc., Beaverton, Ore., is said to provide a unified view of enterprise data across multiple platforms. Virtual DB user David Brown attested to the value of the system's object capabilities. Brown is principal specialist for programming and analysis at Boeing's McDonnell Aircraft and Missile Systems business unit in St. Louis.

"We've been in production for two years with several systems that use this basic tool," said Brown. His work began with system integrator Telos, from which Enterworks was spun off. The first project was an information system covering the Navy's F/A-18E/F Fighter construction.

The Navy would request assembly drawings and related data. Yet gathering that kind of information -- pulling the data from varied sources -- could take a couple of weeks, said Brown. The data-accession list, said Brown, included IMS and DB2 mainframe databases, Sherpa [Corp.] application views residing on Oracle Unix-based databases, and "straight-up" Oracle on Unix. The accession list has since been trimmed to about 10 items from the 50 McDonnell-Douglas started with.

Said Brown: "Virtual DB for us is really a marriage between the GemStone object engine and a Sybase Omni SQL system. You connect to the database, run a program that interrogates the database and then, with Enterworks, collect Sybase data into the GemStone engine. It becomes a meta catalog, and now there's a mapping with the object store, and now you can build objects that reflect what is in the database.

"We've not had the concept of data warehouse [in this development]," said Brown. "That was not our view of it. That may be due to our [implementation] view because we go against live databases. What we're doing is spanning those databases. We automatically go from one to the next -- they were silos before. It is the object medium that allows us to do that." Brown said the new cross-process application has its own flow. Now drawings from different applications are presented in the context of the way people work.

Ben Kaplan speaks for many in the data warehouse community when he carefully distinguishes his approach from the likes of CrossWorlds, which dubs its offering a form of processware. "There are two different views and two types of integration," said Kaplan, director of product marketing with Broadbase Information Systems, Menlo Park, Calif. Broadbase offers what it calls enterprise performance management solutions. SAP R/3 data is a major target.

"So-called processware tries to speed up the time it takes to integrate, for example, an SAP and a Siebel Systems' [sales automation] app," noted Kaplan. "That is different than extracting the data from the system. It doesn't address analysis. What we're doing is taking the information from SAP and Siebel Systems and establishing a view of customer profitability. We enable companies to build analytical applications fast."

There is often a wariness about the cost of big data warehouses. This is borne out in part by the approaches newer, turn-key data warehouse companies are taking, and by senior I/S staff comments. "I can't afford a big migration or reengineering cost," said Jim Jennis, senior specialist in manufacturing and information systems for Imation, a $2.3-billion maker of imaging and information tools. For his part, Jennis, based in Middleway, W. Va., has gone with a component-based application integration environment from SuperNova Inc., New York City. "I can provide complex remapping of data without disrupting the flow of my business," he said.

Timing is an issue that distinguishes message-oriented application integration from data-centric application integration. And, developers confront bandwidth and execution timing issues as 'back-room data warehouses' quickly become 'operational data warehouses.' "After all," said one observer, "a data warehouse is just like a big, temporary data store.

"But you can't, in near real-time, make it event-driven as with a message broker," this observer continued.

"Part of application integration clearly relates to data integration," said Ron Zahavi, director of object technology at software and services firm Concept Five Technologies Inc., McLean, Va. "At some point you are going to have to deal with transformations or semantics."

This leads to a basic truism of development, one that is often forgotten. Said Zahavi: "Different people start toward different solutions. And they discover they have to do things they didn't think about initially.

"They think they are just integrating data. They later realize they are integrating applications," he explained. "They need an overall architecture or approach. Not having that is one of the easiest ways to have a cost overrun." The growing emphasis on application integration among some data warehouse tools vendors may help address these issues.

What's the object?

Increasingly, corporations demand that I/S move quickly. Long-term object-oriented projects may be in some disfavor as a result. Many observers and players advise not to jettison object hopes altogether, however.

Today, most corporate departments have a very specific problem, said Giga's Gilpin, and the first thing they consider is the availability of a needed adapter. The second thing they consider is the amount of systems integration required. "None of these solutions are complete -- they all require integration," said Gilpin. So the question then becomes, as with any long-term object or meta data project, who has the resources? For the work that does need to be done, does the I/S department have the necessary people? Does the vendor?

"This all needs to be part of your thought process," said Gilpin. "You have to consider the total package of product and services."

He continued: "The majority of solutions people are using today are based on messaging technology. Messaging provides 'looser coupling.' That means it's easier to plug systems together. You don't have to do as much work to bring about the connection. But if the SAP module changes in the next release, you may have to go back and change your interface."

Moreover, you are less able, with messaging, to create an integrated view of, for example, a customer as may be available from a bunch of merged application views. Objects have merit if you want a richer level of functionality, Gilpin noted.

"Banks used to think of you as five separate accounts. Today, some banks are building middle-tier business objects that look at relationship information. That level of richness can't be supported in messaging," he said.

"As we go forward, more and more object technology will be used," forecasted Gilpin. "But the action is not so much about Corba as it is with the application servers." In this active category of object-oriented middleware, he mentioned NetDynamics Inc. (now part of Sun), Menlo Park, Calif., GemStone, BEA Systems, Sunnyvale, Calif., and Inprise Corp., Scotts Valley, Calif. "Most intend to support EJB [Enterprise JavaBeans]. Some also support Corba. The Corba support seems to be more attractive to enterprise customers. It gives you multilanguage support," he said.

How are the object middleware makers positioned today? Framingham, Mass.-based research group International Data Corp. (IDC) estimates that both commercial message-oriented middleware and distributed transaction-processing middleware sales are more than double the size of object-oriented middleware today. While nearly all middleware segments are projected to grow -- only the use of commercial remote procedure call (RPC) methods is expected to decline -- IDC expects message-oriented middleware to garner an even more significant portion of the middleware market in 2002 than it does today.

Of course, 2002 may look quite different. As stated, these categories are blending. "Message people are adding object capabilities. Object people are adding messaging," said Concept Five's Zahavi.

If there is a blending of technologies underway, as some suggest, a different approach to distinguishing between alternatives is in order. For example, Vytas Kisielisu, vice president of operations for Frontec AMT Americas, said it may be worthwhile to consider whether middleware tools support a few or many points of integration. Developers should also distinguish whether the type of integration is simple or complex, and whether a solution sits at the "architectural" end of the spectrum (requiring more hands-on work by end users) or at the "off-the-shelf" extreme. The key elements of application integration, by Kisielisu's estimation, should include data transformation, protocol conversion, error handling, queuing, data validation, logging, change management and workflow support.
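Kisielisu's checklist can be read as stages a message passes through on its way between applications. The sketch below is illustrative only (the stage names and rules are invented): validation, transformation, logging and queuing each appear as one step, with error handling surfacing wherever a stage rejects the message.

```python
# Toy pipeline covering several of Kisielisu's key elements: validation,
# data transformation, logging, queuing, and error handling via exceptions.
log, queue = [], []

def validate(msg):
    if "id" not in msg:
        raise ValueError("missing id")  # error handling surfaces here
    return msg

def transform(msg):
    # Hypothetical rule: normalize the id to the target app's convention.
    return {**msg, "id": str(msg["id"]).upper()}

def audit(msg):
    log.append(f"passed {msg['id']}")   # logging for later diagnosis
    return msg

def integrate(msg):
    for stage in (validate, transform, audit):
        msg = stage(msg)
    queue.append(msg)                   # queued for the target application

integrate({"id": "po-7", "qty": 3})
print(queue[0]["id"], log)  # → PO-7 ['passed PO-7']
```

The remaining elements on his list (protocol conversion, change management, workflow) would slot in as further stages; the point of the pipeline shape is that each concern stays a separable step.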

It's the process ...

Sometimes lost in the application integration hoopla is the emphasis CrossWorlds and others have placed on processware -- a perhaps heavy-handed term that tries to convey the importance of the business process over the technology choice. The potential has not been overlooked by industry forces such as Hewlett-Packard (HP) Co., Cupertino, Calif.

Like so many others, Hewlett-Packard sees a future in application integration. Its approach takes the form of server software that provides process automation for the enterprise, in effect separating out business logic from the application so that change is better enabled. Clearly HP's middle-tier computing strategy is quite different from that of competitor Sun Microsystems, recent purchaser of application-server boutique (at least when compared in size to Sun) NetDynamics.

"We will be a competitor to CrossWorlds," said Leith Anderson, worldwide marketing manager for Hewlett-Packard's Electronic Business Software Division. The vehicle for this competition is Changengine. More specifically it is Changengine 2.0 Admin Edition, released July 27. This process automation solution grew out of workflow research conducted by HP in the United Kingdom. For now, the company is working with select ISVs and systems integrators to prove that process automation software can work as billed.

"Changengine technology could be an enabler," said Anderson. "You may connect one application data source with another application, but what do you do with the data? For example, you may want to read an employee file from a PeopleSoft application, qualify the employee for a pension offer, process a pension application, and finish off in an SAP application that would eventually send a check."

This is inherently a business process automation task, noted Anderson. It becomes "application integration" when it requests data from the application.

The product arose out of people asking the question: 'Can we measure and monitor events as they take place?' Thus, you use Changengine (and, in effect, it is a middleware server) as a pipe between, say, a packaged HR app from PeopleSoft and an accounting package from SAP. But it is a "smart pipe" -- you theoretically can systematize (and measure) your business processes as you connect the different sources.
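The pension scenario Anderson outlines can be sketched as a process flow. This is an illustrative sketch only, not Changengine's API; all function names are hypothetical stand-ins for the PeopleSoft and SAP steps, and the point is that the business rule lives in the process layer, outside either package.

```python
# Hypothetical process-automation flow: the process layer sequences steps
# across packages, keeping the business logic out of both applications.
def read_employee(emp_id):            # stand-in for a PeopleSoft read
    return {"id": emp_id, "years_of_service": 22}

def qualifies_for_pension(emp):       # business rule in the process layer
    return emp["years_of_service"] >= 20

def issue_check(emp):                 # stand-in for the closing SAP step
    return f"check issued to {emp['id']}"

def pension_process(emp_id):
    emp = read_employee(emp_id)
    if not qualifies_for_pension(emp):
        return "not eligible"
    return issue_check(emp)

print(pension_process("E42"))  # → check issued to E42
```

Because the qualification rule sits in `pension_process` rather than in either package, it can be changed (or measured, per the "smart pipe" idea) without touching the applications themselves.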

As the name Changengine implies, the point is to make a system that makes packaged applications more flexible, and more quickly changeable. Besides providing a high-level interface, Changengine supports C++, COM, Java and Corba programming interfaces.

Such ease of programmability (and note that HP is early in providing a full Changengine solution) drives some middleware choices today. It was a factor that drove Costa Mesa, Calif.-based Optum Software Inc. in August to select the Mercator integration solution from TSI International Software, Wilton, Conn., as the connectivity agent for its supply-chain execution portfolio. TSI, which arose from the electronic data interchange world, has been cited by IDC as the market-share leader in a newly described application integration engine segment of the middleware market.

Again showing the blending of middleware categories, TSI recently forged deals with other middleware forces BEA Systems and Tibco, Palo Alto, Calif. TSI's Mercator has been selected by SAP as a means of compliance with standards of the Open Applications Group (OAG). Founded in 1995, OAG was among the first to place the spotlight on application integration in a world inhabited by off-the-shelf ERP packages.

The drive toward changeability is driven in turn by programmer resource constraints. This, too, is an issue for object technologists. Asked Frontec's Kisielisu: "Where do you want to spend your resources? How fast do you need to build? That is the real challenge."

Big I/S shops are not unable to build their own message brokers. In fact, "You could probably build the single message broker to end all message brokers," said Kisielisu, implying that time might be better spent on other tasks.

People who need people

At the end of the day application integration may boil down to people. Jerry Donolin is CIO for Cedel, a European clearing and settlement depository based in Luxembourg. Cedel is using Tibco technology to move from a batch model to a real-time model of financial transaction settlement.

Cedel uses software from Tibco, one of the traditional middleware leaders. Cedel uses TIB/Rendezvous as the medium of data exchange, and Tibco Message Broker to translate formats over this medium. Tibco's Object Bus product provides a Corba framework over the "TIB." In the future, Donolin hopes to feed data to an SAP system with a Tibco adapter. His architecture approach makes provision for a variety (some object-oriented, some not) of middleware types.

Donolin was asked what he sees as the major issues in application integration. He responded: "I think the most important thing is to have people who have a view that integration is a very important part of the development process. Integration should be at the forefront of people's thinking. You need an architecture that from the beginning deals with diversity.

"Often people don't consider integration until they're too far down the road," he added. "Going back later in the game is guaranteed to cause problems."

Donolin's general view on objects mirrors the approach he took with the Cedel architecture. "There's a multiplicity of uses of middleware," he said. "Some have use for objects and some do not." The blend that is the new integration software landscape reflects Donolin's view.