In-Depth
Is UML heading for fragmentation?
- By Richard Adhikari
- September 29, 2003
Earlier this year, members of the Object Management Group (OMG) adopted the Unified Modeling Language (UML) 2.0 infrastructure. While work is proceeding apace on the language, one problem screams for attention: there is no way to enforce compliance with the UML 2.0 standard.
This means that different dialects of UML 2.0 could spring up over the next few years, resulting in vendor lock-in -- which is exactly what UML 2.0 was designed to prevent.
UML 2.0 supporters are divided in their approach to this issue. Some say it is important to enforce compliance and urge the OMG to set up a reference implementation, a test suite and a model interchange standard. Others, including the OMG itself, contend that standardizing proven de facto implementations is more important, and that feedback from customers struggling with failed dialects will help to improve the standard.
As for the language itself, both sides agree that UML 2.0 offers strong improvements over previous versions, especially its ability to support complex object representations and component definitions. This will help to create platform-independent applications where the business logic is paramount -- or would be, if some form of compliance could be enforced.
Technical stuff
The requirements for UML 2.0 are specified in four complementary RFPs issued by the OMG. They are:
* Infrastructure RFP -- primarily concerned with architectural alignment, restructuring and extension mechanisms;
* Superstructure RFP -- primarily concerned with the refinement and extension of UML 1.x semantics and notation;
* OCL RFP -- primarily concerned with defining a meta model for the Object Constraint Language (OCL); and
* UML 2.0 Diagram Interchange RFP -- primarily concerned with defining the meta model for diagram interchange using the XMI (XML Metadata Interchange) facility.
There are also three Meta-Object Facility (MOF) committees. But breaking down the task of overhauling UML into several parts has its pros and cons, said Cris Kobryn, chief technologist at Irvine, Calif.-based Telelogic, and chair of the OMG Analysis and Design Task Force. The downside is that there can be communication problems. “Sometimes breaking something down into parts makes it go faster, and at other times it doesn’t,” he said. “Alignment between all the standards committees takes a lot of time, and there’s a lot more standardization overhead than in the past.”
There are several aspects of UML 2.0 that are “very significant” to users, said Kobryn. One is bona fide support for component-based development, “so now you can model components in a very straightforward manner, which you could not do with UML 1.x,” he explained. “UML 1 wasn’t able to handle the complexity of large complex structures, but UML 2.0 can.”
Users can now recursively decompose large structures and behaviors, and integrate them so they can define components -- how they are broken down into sub-components, for example -- and assemble them. This hierarchical decomposition capability is "very important" to accommodate large, complex architectures, Kobryn said.
For example, if an organization is modeling something like an enterprise application or an aircraft, it has to break systems down into subsystems and sub-subsystems, and components into sub-components. UML 2.0’s component modeling is “a paradigm that cuts across all branches of engineering,” explained Kobryn; it is also being used as a lingua franca in aerospace firms, which engage in projects where engineers from various disciplines have to communicate.
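To make the idea concrete, here is a minimal, hypothetical Java sketch of the kind of recursive part-whole structure that UML 2.0's composite structures are designed to express. All class and method names are illustrative; they come from no UML tool or API.

// Hypothetical sketch: a component decomposed into sub-components,
// which may in turn contain sub-sub-components, to arbitrary depth --
// the recursion UML 2.0's composite structures let you model directly.
import java.util.ArrayList;
import java.util.List;

public class Component {
    private final String name;
    private final List<Component> parts = new ArrayList<Component>();

    public Component(String name) { this.name = name; }

    // Decompose this component by attaching a sub-component.
    public void add(Component part) { parts.add(part); }

    // Walk the hierarchy, indenting one step per level of decomposition.
    public void print(String indent) {
        System.out.println(indent + name);
        for (Component part : parts) {
            part.print(indent + "  ");
        }
    }

    public static void main(String[] args) {
        Component aircraft = new Component("Aircraft");
        Component avionics = new Component("Avionics");
        avionics.add(new Component("FlightControl"));
        avionics.add(new Component("Navigation"));
        aircraft.add(avionics);
        aircraft.add(new Component("Airframe"));
        aircraft.print("");
    }
}

Running the sketch prints the aircraft's decomposition tree. UML 2.0's contribution is that such hierarchies can now be drawn, analyzed and exchanged at the model level, instead of living only in code like this.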
UML 2.0 has a "very advanced activities diagram capability," noted Kobryn, adding that its structure and behavior are "very much integrated." The technology also shows "significant advances" over earlier versions in terms of the tools that implement it and its ability to help solve real problems, he said.
With UML 2.0, said Grover Righter, members of the OMG are trying to broaden process representation and behaviors beyond what was available with earlier versions of the standard. “A lot of this stuff used to be hidden in the way people implemented C++ or Java libraries,” said Righter, who is vice president of marketing and technical strategy at San Rafael, Calif.-based Kabira Technologies Inc. “All you saw were the exposed APIs and, while those implied a way of doing things, it wasn’t open for analysis.”
UML 2.0 “takes business process methods and brings them up to an object level -- how do you deal with meta data repositories, object repositories, etc.,” added Righter. “While earlier versions of UML saddled users with the need to represent and implement code, UML 2.0 gets rid of this, and all they have to deal with now is business process analysis.”
This focus on business process analysis is a strong selling point, Righter said. “We see the advancement of using models for serious software development. Our firm has seen an expansion from telecommunications into the financial services field, particularly around card and payment processing for the same reason as the telcos; there were the software and hardware layers, and now we have a network services layer with transaction switching all over the place.”
Currently, 10,000 to 15,000 financial transactions are being conducted over networks worldwide, and that will expand “10 to 15 times over the next five years,” Righter said. That will lead to reduced transaction cycle times and an increased risk of fraud, so “you’ll need more and more business logic at the network level. That’s being done in UML 2.0 today,” he said.
These enhanced capabilities are all very well, but standards compliance could be an issue.
Fragmentation or healthy divergence?
While UML 2.0 allows users to focus on business process analysis, this analysis can be instantiated in different ways. Righter admits that, for the near- and mid-term, there could be a problem with fragmentation of the language. “With more than one representation, how do you know you are conforming to the standard?” he said.
“We have never actually, ourselves, directly done compliance testing,” said Andrew Watson, vice president and technical director at the Object Management Group. “Our strategy from the beginning was to work with other organizations on compliance.”
He noted that, with UML, “it’s actually difficult to automate the testing. It is a visual language. The complexity comes from the fact that it is used not for an interaction between computers, but for an interaction between persons.”
Watson added that UML tools have been somewhat monolithic and are thus difficult to test for compliance. This is changing, he said, as monolithic UML tools become “chains of UML tools.”
Compliance has historically been an issue with UML, noted Telelogic’s Kobryn. “For example, we were supposed to provide a standard for model interchange back in 1997 and there’s still no standard for that yet,” he said.
For example, Andover, Mass.-based I-Logix Inc. -- one of the seven founding companies behind UML -- states on its Web site that “there is no compliance authority at the OMG to attest to whether something is or is not compliant with UML 2.0. There is no certification procedure or test that can be run to pass or fail. Thus it is a hard argument to make as to who is and who is not compliant.”
There are three compliance layers in UML 2.0: Basic, Intermediate and Complete. The specifications include a compliance table that allows any vendor to state how it complies with the various levels or sub-packages. But Kobryn doubts vendors will take up this offer.
The OMG's inability to enforce compliance with UML 2.0 could cause problems for users, stated Kobryn. "There will be a challenge for users to determine which implementations are UML 2.0 and which are really UML 1 dressed up," he said. UML 2.0 has vastly improved its capabilities for specifying large, complex structures, and its activity diagrams can now specify complex business workflows, "but without reference implementations and test suites, it's difficult for the user to determine whether a vendor is selling UML 1.x or the really good stuff," Kobryn said. He suggests that OMG members adopt a procedure similar to that of the Java Community Process, which provides test suites so vendors can verify that their implementations comply with specifications.
Kabira’s Righter disagrees. “The school of thought that I belong to, and which is characterized by the IEEE and the OMG, is that you ultimately have to standardize the proven, existing practice,” he said. “One of the most successful standards, IEEE 802.3, for example, was taken wholly from Xerox PARC, industrialized, manufactured and debugged, and we ultimately invented Ethernet and Ethernet Type 2 from one existing, proven implementation.”
Different industries will develop different subsets of the language according to their needs, said Righter. That “will cause a feedback loop to the OMG as to what needs to be changed or implemented,” he explained. Righter sees UML 2.0 as “the equivalent of a new template or a new dictionary for UML and, because of industry and business requirements, there will be some variants; we’ll learn a lot and there’ll be UML 2.1 or 2.4,” he said. Some users and vendors will have to re-do some of their work, but “it’s better to do this as a collaborative process than to have 20 vendors pick UML 1 as a frozen standard and declare everything as a variation,” noted Righter.
Righter also said that a lot of work will be done in translating UML down to an XML or XMI interface, which will provide a compatibility layer. “The question is not how you implement UML as a tool but, once you export UML as an XML or XMI interface, can multiple tools import it from different sources? That’s the industrial goal. Are there going to be bumps along the road and shouting matches? Yes. But what we’re doing is creating a higher standard for what component behaviors are like,” Righter said.
The OMG’s Watson points to a record of improvement in UML standards under OMG’s tutelage: “Before we began to work on UML standardization there were different notations and techniques -- at least a half a dozen. Moving from one technique to another took time. There was confusion in the marketplace when we started. By 1998, with UML 1.0, there was a market growing at more than 20% and it continues that way as far as we know.”
That has been accomplished without compliance requirements, he noted. However, that is something “we are working on starting with the XMI testing,” he added.
XMI is an OMG-backed interchange format for meta data, including models. Model sharing is essential to interoperability, explained Watson, and it is something the OMG is preparing to address. A recent XMI standard includes Diagram Interchange extensions for such a purpose.
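As a rough illustration of what tool-to-tool interchange involves, consider a minimal Java sketch that reads an XMI export and lists the model elements it finds. The element and attribute names used here follow common XMI 2.x usage, but they vary across XMI/UML versions and across vendors -- which is precisely the interoperability question at stake -- so treat them as assumptions rather than a fixed contract.

// Illustrative sketch only: parse a UML model serialized as XMI and
// list its named elements. In many XMI 2.x exports, model elements
// appear as <packagedElement xmi:type="uml:Class" name="..."/> entries;
// other versions and tools use different names.
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class XmiLister {
    public static void main(String[] args) throws Exception {
        // Parse the exported model file, e.g. "model.xmi" from a UML tool.
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(args[0]);

        NodeList elements = doc.getElementsByTagName("packagedElement");
        for (int i = 0; i < elements.getLength(); i++) {
            Element e = (Element) elements.item(i);
            System.out.println(e.getAttribute("xmi:type")
                    + " : " + e.getAttribute("name"));
        }
    }
}

If two tools' exports disagree on these element and attribute names, the round trip fails -- the practical symptom of the dialect problem the article describes.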
"This is somewhere where we can get a lot of visible user benefit. Because that is a very specific data point, users want to know how the tools interoperate. That's what the user cares about." The real push in this regard, according to Watson, "will come next year. People want some type of guarantee of application portability from us."
The real-world view
In an e-mail interview, Branislav Selic, principal engineer at IBM Rational, and Sridhar Iyengar, distinguished engineer at IBM WebSphere, said that the model-driven development (MDD) approach of UML “automates many of the error-prone mechanistic aspects of software development, such as translating application-specific concepts into computing technology equivalents. This is the primary reason why UML 2.0 is important to practically all software developers and their customers,” Selic and Iyengar said. MDD focuses on creating models that are then translated into equivalent programs. This improves product reliability and productivity.
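As a toy illustration of that model-to-code translation -- not the method Selic and Iyengar describe, and far simpler than what a tool such as Rational XDE does -- the following hypothetical Java sketch mechanically turns a small model fragment into a class skeleton:

// Hypothetical sketch of the MDD principle: a model element is
// translated mechanically into equivalent code, removing one
// error-prone manual step. Typing every attribute as String is a
// deliberate simplification for illustration.
public class SkeletonGenerator {
    // Translate a (class name, attribute list) model fragment into a
    // Java class skeleton with private fields.
    static String generate(String className, String[] attributes) {
        StringBuilder src = new StringBuilder("public class " + className + " {\n");
        for (String attr : attributes) {
            src.append("    private String ").append(attr).append(";\n");
        }
        src.append("}\n");
        return src.toString();
    }

    public static void main(String[] args) {
        System.out.println(generate("Customer", new String[] {"name", "accountId"}));
    }
}

The point is only that the translation is mechanical: once automated, a whole class of hand-coding errors disappears, which is the reliability and productivity gain the two engineers cite.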
UML 2.0's backward compatibility with earlier versions was designed to be user-friendly -- users can adopt only the new features they need, according to Selic and Iyengar. Tool vendors, however, will have to put in quite a bit of work. Depending on how their tools were built, vendors may have to rewrite them completely, or at least put in "a significant development effort" to remodel them, they said. But because the majority of UML 2.0's new features are based on capabilities already proven in earlier UML tools, "many of the tool vendors have the necessary infrastructure in place in their tools," Selic and Iyengar said. Rational's XDE technology base, for example, was built on these premises, they said.
So, is standardization necessary? If you think back to the Unix wars, the answer is yes. Experience has shown that letting vendors guide technology implementations is like letting the fox guard the henhouse. And while the OMG stresses participation by both vendors and users, users are not being effective enough if they cannot insist on enforcement of compliance with OMG standards.
Please see the related story, "How to manage requirements," by Dan Romanchik.