In-Depth

Still coding and tweaking

Sally Cusack, an analyst at IDC, said that the use of development tools remains strong as organizations concentrate on extending and improving on their installed packaged applications. "In theory, the use of packaged applications allows IT organizations to put their developers on more creative tasks designed to add unique value to supporting the business itself," she said. "However, while much of the integration technology is still in its early stages, developers are still spending time on 'down and dirty' coding and tweaking tasks even with the proliferation of packaged solutions."

Greg Leake, general product manager for Microsoft Visual Studio, agrees that "some tools, like old-style CASE tools unable to adapt to the Web paradigm, are likely on the wane and are being used much less." Leake places Sybase's PowerBuilder, Inprise's Delphi and Oracle's Dev 2000 in this category. "But on the whole, we see that development tools have become even more central to an organization's success. Without the right development tools, there is no way an organization could be effective with an e-commerce initiative," he noted. "From a Microsoft tools perspective, we just completed our best year ever in terms of overall licenses sold."

"Sun defines the battle around Java the language and EJB the component standard. But Microsoft frequently defines where the industry is focused," said Tom Keffer, chairman of Rogue Wave Software, a Boulder, Colo.-based company with products spanning both the Microsoft and non-Microsoft camps. "They pretty well define the development environment."

Change spares no one

Adjusting to the change in IT focus in recent years has been difficult for many suppliers of application development products, according to several experts.

"Many companies in this area are struggling due to an inability to move their technology quickly enough to the mainstream," said IDC's Cusack. "Others are struggling due to problems resulting from actions like mergers and acquisitions."

"My thesis is that most vendors out there have an incomplete story -- only IBM has both the breadth and depth
of requirements," commented IBM's Swainson. "You need a complete life-cycle solution. You need strong application development tools to build high-performance applications and management tools to keep everything up and running.

"Microsoft has a complete story, but its horizontal with no depth. It only runs within the context of Windows NT," added Swainson. "NT won't be providing a platform that's capable of running an enterprise any time soon. Sun has a story that's reasonably deep, but not horizontal. [It] has a collection of tools without tight integration with application servers. And BEA has a tight focus on transaction processing." [Ed Note: BEA was also quick to ride the Java wave with its purchase of specialty server house WebLogic in 1998.]

Said Swainson: "Oracle runs on lots of things. Oracle does the database piece well, but the application server, systems management and tools pieces are problematic."

Components

Industry analysts and a slew of experts have been touting the benefits of component-based development for large-scale systems for more than a decade. Yet only Microsoft has found success selling the concept on client machines. Microsoft's ActiveX components are supported only by the Windows platform. IT groups building applications for the enterprise still seek a silver bullet to cut down the substantial barriers to multiplatform, server-side component-based development. The latest candidate to fulfill the promise again comes courtesy of Java and its JavaBeans and Enterprise JavaBeans (EJB) component models.
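For readers who have not worked with the Java component models, a rough sketch may help. Under the JavaBeans conventions, a component is little more than a serializable class with a public no-argument constructor and matched get/set accessors that a builder tool can discover by reflection; the class and property names below are invented for illustration.

    import java.io.Serializable;

    // A hypothetical JavaBean: tools find the "sku" and "price" properties
    // by reflecting on the get/set naming pattern.
    public class OrderItemBean implements Serializable {
        private String sku;
        private double price;

        // JavaBeans convention: a public no-argument constructor
        public OrderItemBean() {}

        public String getSku() { return sku; }
        public void setSku(String sku) { this.sku = sku; }

        public double getPrice() { return price; }
        public void setPrice(double price) { this.price = price; }
    }

A visual builder can instantiate such a class, inspect its properties and wire it to other beans without any code written specifically for that component.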

Experts have long blamed a lack of standards for the dearth of components beyond the Windows platform. A widespread embrace of the EJB specifications by both large user organizations and the vendor community could quickly expand the use of component technologies in corporate development organizations. But based on the dubious history of standards that were touted by supporters as vital to the future of computing -- witness the former Open Software Foundation's OSF/1 Unix standard -- skepticism abounds.

"There is huge potential for component-based development that has not been realized yet," said Li at Sybase. "It can become popular if there are standards, but the software industry has not matured to that stage yet. There are many political issues." Nonetheless, Li said he is convinced that "in the long run [widespread component-based development] will happen."

IBM's SanFrancisco component architecture, unveiled more than five years ago, has suffered through several fits and starts before latching onto EJB. The 1994 plan called for IBM and a parade of partners to build reusable components for the ISV community. That formula failed for several reasons, but mostly because the components were based on a proprietary model that kept the SanFrancisco components from working with those based on other models. IBM changed the model several times in the intervening years before settling on EJB with the SanFrancisco version that began shipping in mid-1999.

"Component-based development has emerged more slowly than we thought it would due to a lack of a clear standard
for components on the server," said IBM's Swainson. "Now Enterprise JavaBeans has emerged as a workable server standard, and applications based on that model are starting to emerge."
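The server-side model Swainson points to is more elaborate. As a sketch only -- the interfaces and class here are hypothetical, and the deployment descriptor that accompanies them is omitted -- an EJB 1.1 session bean splits into a remote interface holding the business methods, a home interface the client uses as a factory, and a bean class whose lifecycle callbacks the container drives.

    // Cart.java -- remote interface: the business methods a client may call
    import java.rmi.RemoteException;
    import javax.ejb.EJBObject;

    public interface Cart extends EJBObject {
        void addItem(String sku) throws RemoteException;
    }

    // CartHome.java -- home interface: the factory a client looks up, typically through JNDI
    import java.rmi.RemoteException;
    import javax.ejb.CreateException;
    import javax.ejb.EJBHome;

    public interface CartHome extends EJBHome {
        Cart create() throws CreateException, RemoteException;
    }

    // CartBean.java -- the container manages the lifecycle; the developer supplies business logic
    import javax.ejb.SessionBean;
    import javax.ejb.SessionContext;

    public class CartBean implements SessionBean {
        private java.util.List items = new java.util.ArrayList();

        public void addItem(String sku) { items.add(sku); }

        public void ejbCreate() {}
        public void ejbActivate() {}
        public void ejbPassivate() {}
        public void ejbRemove() {}
        public void setSessionContext(SessionContext ctx) {}
    }

The division of labor is the point: transactions, security and pooling come from the container rather than from code the developer writes.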

"Component-based development is finally coming into the mainstream," said IDC's Cusack. "With the continuing solidification of industry standards like Java-Beans, EJB and [Microsoft's] ActiveX, developers are beginning to take advantage of the reuse capability promised by components." Cusack said that IDC "projects significant growth in this area during 2000."

IBM's Swainson was a bit more cautious, reckoning that "it's going to take the better part of the next decade before this becomes the prevalent programming style. I see two phenomena. First, major companies like IBM are providing useful components, and second, the application vendors, the ISVs, are restructuring applications along component lines. I think SAP has seen the success of [rival applications supplier] Siebel Systems and its component-based applications."

Downside of components and Internet

What does a genuine Web phenom make of all this? "In the Internet space, where you're dealing with software deployed outside your organization, all of these newly emerged standards and architectures can collapse on themselves because they are built on binary protocols and low-latency networks where you can rely on the platform that you're using to integrate your software," said Jeremy Allaire. But the Web is different.

"On the Internet, you have high latency. You have no control over the platforms your customers and suppliers use, and you have this flat protocol called HTTP, which is what is used to enable the Web," said Allaire. "In that new universe, we're going to have to build XML middleware completely independent of language and platforms from scratch. That's a completely new set of requirements."

The inward focus of key vendors' component strategy will play out in a variety of ways, said Allaire. He points to the upcoming Windows NT remake, Windows 2000, which embeds significant application server and component technology. The rollout of this platform, clearly, will be closely watched by all in the year to come. At Comdex, Microsoft went to lengths to allay fears about the stability of the new system.

"Core pieces of the basic COM infrastructure have been in NT in service packs and commercial versions for a while," noted Allaire. "MTS has been in NT. Microsoft has been successful in getting the top tier of its customer base to migrate to that. But the vast majority of Web apps do not have to use that infrastructure. There is not a huge percentage of apps that need that on the Internet today.

"Just as Windows 2000 is hitting the market next year with a heavy infrastructure, and likewise J2EE with its focus being inside the organization," added Allaire, the most sophisticated customers will want to go beyond that and distribute applications across the Internet. "That's a different set of challenges," he noted.

Allaire went on to say that the appeal of Linux has not been as an app platform. "It's been more commodity plumbing for hosting Web [functions]," he said. He agrees that this has affected NT, which Allaire depicts as a commodity server offering.

Modeling and objects

At the root of many advances during this decade is object technology. While it dares not speak the name 'object,' Java may be the biggest object success yet.

"Java essentially takes the object-oriented model and simplifies those structures," said IBM's Steve Mills, general manager of software solutions.

"It is a better language for writing object implementations, said Alan Brown, CTO at Sterling Software, Dallas.

Meanwhile, the object modeling products that carry the flag of CASE usually do so under the Unified Modeling Language (UML) banner. The question of whether components are objects or objects are components does not even interest the academics these days. Many experts predict that Web development will prove a boon for modeling tools, a category long maligned in IT development circles.

"I think in the early days of object technology, the focus was on languages and methods," said SEI's Northrup. "People had trouble scaling to bigger systems. As a result, you have UML. Objects are now being viewed in the context of system building."

"Modeling has had a renaissance of sorts in the development community due to the increasing need to develop, distribute and integrate different programs, services and applications across multiple platforms, departments and organizations," said Cusack at IDC. "Without a consistent, systematic approach to the development life cycle, or a blueprint if you will, the entire development effort can come apart literally at the parametric seams."

"We are seeing growth, but any evidence of significant growth is mixed," said IBM's Swainson. "Modeling is still used predominately for more complex projects. There's no other way to build a complex application. IBM is preparing for further growth by tightening the integration of its tools and application servers with the Rational Rose modeling tool." In mid-1999 IBM and Rational signed a broad agreement that includes the integration project.

Rational's steady growth strategy, much of it based on its Rose UML line, has earned high grades from many industry players. Rational achieved $411.8 million in revenue for the fiscal year ending March 31, with net income over $59 million. "Why is Rational successful?" asks IBM's Mills. "They are maniacally focused on serving the buyer -- the user of the tool."

Through its use of UML and the support of the "Three Amigos" (the nickname for the Rational team that first penned UML), Rational aims to drive sales in a lucrative life-cycle support tools domain, said Sterling's Brown.

Certainly Rational Software's measured merger plan has, to date, outperformed its most visible alternative: merger-and-acquisition dervish Platinum Technology. Platinum's stunning tools buying spree came to an end this year when Platinum itself was purchased by Computer Associates.

While UML co-inventor Grady Booch, chief scientist at Rational, sees some bumps ahead in the UML road -- people may be trying to make it do too many things, he suggests -- UML modeling can bring order to chaotic projects. The language should prove useful in e-commerce applications, where much today is jury-rigged with low-level technology and grunt effort. "Many e-commerce sites use Perl as if it's duct tape," Booch said at OOPSLA '99 in Denver. "They throw a lot of bodies at [the problem]."

But the Internet gold rush may win over sound method in the short term. People are taking less time to use UML with a method because "we're on Internet time," said Chris Kobryn at OOPSLA. Kobryn is chief architect of the E.solutions unit at Dallas-based EDS. Also at issue, said John Vlissides, prominent design patterns advocate and IBM researcher, is the gap between design and implementation, which must be better addressed.

Will the pendulum swing back toward tools -- object or otherwise? Perhaps. Among the hot tools companies has been San Francisco-based Macromedia Inc., once thought of as a multimedia tools house but now, with Web awareness everywhere, perhaps ready to thrive. In recent months, IBM's alphaWorks and developerWorks Web sites have provided a constant stream of (sometimes offbeat) tools and utilities. A whole new set of software tool types is arising from the worldwide effort to create new wireless apps. There is always a chance some significant tool can bubble up from research and development labs, just as Java arose from a failed set-top box project.

And just as there has been a (slight) withering away of the operating system with Java, there may someday be a withering away of the application, say some. "There should not be an OS," said Smalltalk specialist Dan Ingalls, a researcher at Walt Disney Imagineering. "A good language collects together the features you need. You can talk to a disk, for example, as an object.

"I hope there would be less of a notion of an application in the future," said Ingalls. 'Component,' he conceded, might be a more suitable term for applications of the future.

When keynotes collide

At the top level of computing, the development battle is sometimes presented in basic terms. At this year's Las Vegas Comdex keynote, recalling his early days at California Computing Faires, Microsoft boss Bill Gates recollected the then-common arguments of small computers vs. big computers. "How do we stop it being forever the big box world or the PC world? Being software scalability vs. hardware single points of failure?" asked Gates.

"Even the big box is not delivering what people expect in terms of hard-core scaling and reliability," said Gates, who off-handedly alluded to Microsoft's legal wrangles by way of asking if anyone had heard any good lawyer jokes lately.

Arrayed against Gates and Microsoft are a host of enterprise hardware and software giants. Speaking only for Oracle, but perhaps delivering a message others might second, was company chief Larry Ellison at Fall Internet World in New York City.

"You want to keep your personal data and your personal applications on your PC," said Ellison "But shared data and the associated applications to access that shared data should be on shared servers. And those servers shouldn't be all over the place.

"You shouldn't have one in every bank branch, you shouldn't have one in every retail store. You should have as few as possible. The immutable law of data [is that] every time you take two databases and put them together, you gain information." While Oracle's tool business is healthy, critics point out the database is the driver at Redwood Shores, as Ellison's occasional comments may attest.

The Internet revolution and the reaction to year 2000 warnings have profoundly changed the way IT organizations develop software. No longer do business units throw requisitions over a wall and wait for development groups to fill the orders. Run-of-the-mill backlogs of five years ago have become intolerable.

Those traditional development backlogs, coupled with the looming year 2000 "crisis," prompted corporations to turn to packaged applications from suppliers like SAP America Inc., PeopleSoft Corp. and Baan, which promised to deliver integrated enterprise resource planning systems and allow IT developers to focus on developing proprietary systems.

"The key change for developers is dealing with a transition from client/server to Web-centric computing," said Microsoft's Leake. "This has meant that developers have to learn new computing paradigms like stateless server-side development, HTML page-based user interfaces and content-centric development. They also have to deal with new processes required to build Web applications. Namely, developers have to work much more closely with content editors and graphic designers than ever before, so having effective team processes and team-enabled tools has become paramount."

Internet hyperbole aside, the need to connect existing systems continues to overshadow interest in new system development. Will the coming decade be a wild ride? Maybe that is the only thing anyone can say for sure. Plenty of IT professionals will spend the eve of the millennium baby-sitting 30-year-old computer systems, watching for bugs. ADT fearlessly predicts that 100 years from today, someone somewhere will be baby-sitting a 130-year-old system. And if you built it, you can start being proud right now.