Reporter’s notebook: A random walk through the modeling firmament
- By Jack Vaughan
There may actually be only a handful of really good ideas in the entire history of computer technology, which may be why we tend to see the same ideas cropping up again and again. But that in itself is not a bad thing.
The journey from idea to execution, as this year’s batch of business self-help books attests, is a difficult one. At any step along the way, you can get lost. A technology can come to market with incredible promise, be snapped up by buyers eager to excel, but be oversold and inadequately supported.
Often, someone in the department quietly scales back the most extravagant objectives and the corporation finally settles for some minor improvement on what it could have bought off the shelf. Sometimes the minor improvements can have major significance. IT managers may even sometimes find themselves far enough ahead of the competition to take a weekend off.
This process is not altogether unlike Gartner Inc.’s fabled Hype Cycle -- Technology Trigger, the Peak of Inflated Expectations, the Trough of Disillusionment, the Slope of Enlightenment and the Plateau of Productivity -- in which technology goes through steps as if it were in a laundry cycle.
I spent 10 years in the Trough of Disillusionment one night. And the Slope of Enlightenment proved to be a roller coaster ride.
* * *
“You know in every other industry they plan things. Look at architecture; they have blueprints, they use off-the-shelf components. They don’t re-invent the wheel each time they start a project.”
Have you heard this before? There are slight variations on the theme. If you work in the computer hardware industry, the building trades look better. If you work in software, the computer hardware industry looks better. It is trite to say planning ahead is key but, again, cynicism cannot take full charge; there is some truth in the notion that “we wouldn’t be in this mess if we had planned better.”
The main message is that somewhere someone is modeling their problems and solutions, and they are able to move more quickly from one problem to another and better able to manage technology, which, let’s face it, can be a bear, a tiger or both.
“Patterns” have been widely discussed in software circles for at least a decade -- they are an effort to come up with repeatable processes to make an engineering science of software. But they do not necessarily promote the idea of blind adherence to blueprint plans. Most patternists claim architect Christopher Alexander, who studied the principles of design governing growth in villages and the like, as an inspiration.
Alexander said a natural organic order emerges from a meta plan, as opposed to a master plan. It is sort of a cross between best practices and paleontology’s punctuated equilibrium.
The flexibility implied in software patterns strikes a receptive chord in the developer community. Master plans are not popular among developers, as too many have failed. The great ‘ubermethodologies’ promoted during the CASE era -- the first big Silver Bullet Era of software marketing -- will never, it seems, be forgotten.
Developers are quick to point out the CASE modeler toolbox sitting on the shelf, unused since that bygone project ended and the long-ago head of engineering departed. This forms a backdrop for the recent history of interest in software modeling.
* * *
Modeling and code generation tools have gone through the wringer and emerged none the worse for wear in the software business in recent years. The Model Driven Architecture (MDA) has been the center of modeling focus in some circles of late, largely due to the committed proselytizing of the OMG.
Next big things are at issue here. Unsympathetic observers suggest that MDA is itself an effort by the OMG to find “the next big thing” after its flagship CORBA standard was bypassed by Java in the hype race. But OMG members, the people who actually have to work with this stuff, have had to cope with more than Java. .NET, Web services and other architectures have come along in Java’s wake, and there is little doubt that something new is in the wings as we write.
The OMG and its members want to get out of this less-than-benevolent cycle. They are looking for an abstraction layer and meta data means to isolate implementation from planning. When the next thing comes along, and the last new Java app has been writ, you can go back to your model library and redeploy to that next thing. It makes a certain sense.
UML, which came under OMG’s tutelage as Rational Software sought to standardize models and notation, set the stage for a rebirth of interest in models. As UML grew, the usual complaints about modeling were heard, the big one being that the model, several months into a project, typically no longer resembles anything near the reality of the code.
The other knock on UML came from deep in the modeling camp itself. UML was not tied closely enough to code-generation tools, some software-engineering types said. UML notation could be done with just a whiteboard and dry marker. But why not put the computers to work on generating systems from UML? (Dyed-in-the-wool, command-line coder types didn’t need no stinkin’ code generator and didn’t see a problem here.) UML’s links to code generation are stronger in UML 2.0, although the fruits of all this are, for now, for the ranks of ISVs (who will work to update their products) and the real-time embedded system crowd.
The buzz on UML diminished a little, as Extreme Programming (XP) came forward as a potentially faster-acting alternative. On one level, XP was an indictment of UML, although it did share some of the same principles associated with UML development. No matter, Agile Modeling subsequently came forward as something of an interesting amalgam of modeling and “working fast.”
Software companies like Borland and Compuware are taking MDA pretty seriously. But model interest is not limited to the MDA-kind. BEA, Sun and IBM have all begun work on Java tools that abstract out complexity to ease Java development. Call it RAD, call it encapsulation . . . the gist of it is that you describe what you want and the software wizards generate the basic skeleton of the code.
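The describe-and-generate idea can be sketched in a few lines. This is a toy illustration only -- the model format, names and output here are invented for the example, not the actual mechanics of any vendor’s MDA or wizard tooling. The developer supplies a declarative description of an entity; a generator emits the skeleton, leaving the business logic to be filled in:

```python
# Toy "wizard": turn a declarative model into a class skeleton.
# The model schema (name/fields/operations) is invented for illustration.
def generate_skeleton(model):
    lines = [f"class {model['name']}:"]
    params = ", ".join(model["fields"])
    lines.append(f"    def __init__(self, {params}):")
    for field in model["fields"]:
        lines.append(f"        self.{field} = {field}")
    for op in model.get("operations", []):
        lines.append(f"    def {op}(self):")
        lines.append("        raise NotImplementedError  # developer fills this in")
    return "\n".join(lines)

# A hypothetical business entity described as data, not code.
order_model = {
    "name": "Order",
    "fields": ["order_id", "customer"],
    "operations": ["submit", "cancel"],
}

print(generate_skeleton(order_model))
```

The point of the exercise: when “the next thing” arrives, only the generator changes; the model of Order survives.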
So here come models!
* * *
Some of the new attention on models stems from a renewed interest in automating, orchestrating, documenting and rationalizing business workflows using intelligent middleware routing. Efforts to ease the burden of integrating these processes sometimes concentrate on visual drag-and-drop business process modelers. Some activity is in the Java camp, while some of it is in the Microsoft camp. Microsoft has been a little quiet on modeling of late, but that may change.
Even as long ago as 1997, modeling and UML might have been part of the show at a Visual Studio tools rollout. But the modeling tool then was a somewhat mysterious tool, certainly overshadowed by audience questions about the latest Microsoft object marshalling architecture, ADO, the IntelliSense code-completion feature, or what have you.
Microsoft’s UML capabilities came in part by way of a Rational Rose UML diagrammer that came with the Visual Studio product. An overlooked diagrammatic competitor at the time was Visio, which was largely seen as a network diagrammer, but which in fact had begun to add UML modeling facilities. Visio certainly wasn’t overlooked by Microsoft, which bought the company. Visio is still a popular standalone tool, and elements of it are embedded here and there in the Microsoft product line.
A few industry pundits noted that Microsoft founder, Chairman and Chief Software Architect Bill Gates mentioned -- though not too prominently -- model-based programming at this summer’s MSFT Financial Analyst Meeting.
“Model-based programming, it’s the idea of very high-level specs that get used to describe how software is supposed to be pulled together,” said Gates, “and it’s not just something off on the side. We see that as literally part of the source code, and generating as much of the code from that model as possible.”
One thing that scares people about Microsoft: They don’t give up. Some other companies would have crawled back into their holes after something like Windows 1.0 was released; Microsoft kept on working to get it right.
What has occurred so far could be described as tentative stabs at software modeling on the part of Microsoft. But the company is watching interest in MDA, and could be preparing improved modeling tools of its own. Don’t bet that the company will give up on modeling; after all, it is an idea whose time will come again and again.
Jack Vaughan is former Editor-at-Large at Application Development Trends magazine.