Testing 1 2 3
By Johanna Ambrosio
The bottom line is that there are few organizations actually testing their distributed object-oriented applications, and even fewer tools to help. There are also no standards or widely accepted methodologies for testing distributed OO applications. And the very fact that an application is distributed means all kinds of things can go wrong — network or machine outages, slowdowns — that you cannot necessarily test for ahead of time.
But there is a silver lining in all this. First off, there is a raft of tools that have recently hit the market to help specifically with this task. Also, even though the technology is much changed from the days of procedural coding, many of the basic tenets of what makes for good testing have not changed — the mantra is still “test early and test often.” And there is much less actual code to have to test, assuming the application has been developed with an OO language and uses accepted OO techniques.
The new tools (see “My kingdom for a testing tool,” p. 48) promise to automate the creation
of test harnesses built especially to bang against distributed object-based systems. Some will also help with load testing and scalability testing to ensure, for example, that the app does not spring a memory leak or that performance does not slow to a crawl after 100 users start doing their thing. There are some families of tools that will test everything from the basic OO model, its classes and subclasses, to unit testing to make sure that the GUI and data layers are fine, to black-box testing to ensure that the application does what it is supposed to do in terms of functionality. Other tools will even help to isolate which object or component in a distributed app is the culprit if problems should occur. And yet others will help with regression testing to make sure any changes in a program (due to functionality or fixing problems) do not cause even more problems.
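The tool categories above can be illustrated with a minimal hand-rolled harness. The sketch below is hypothetical — the `Account` class and its methods are invented for illustration — but it shows the shape of the unit and regression tests that such tools automate the generation of:

```python
import unittest

class Account:
    """Hypothetical middle-tier business object, invented for illustration."""
    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount

class AccountRegressionTest(unittest.TestCase):
    # Black-box check: the object honors its stated contract.
    def test_deposit_increases_balance(self):
        acct = Account(balance=100)
        acct.deposit(50)
        self.assertEqual(acct.balance, 150)

    # Regression check: a fix for invalid deposits must stay fixed
    # across future releases.
    def test_negative_deposit_rejected(self):
        acct = Account()
        with self.assertRaises(ValueError):
            acct.deposit(-1)

if __name__ == "__main__":
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(AccountRegressionTest)
    unittest.TextTestRunner(verbosity=2).run(suite)
```

Commercial harness generators produce dozens of such cases from the class's interface; the value is that the suite can be re-run unchanged after every modification.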
But it is a fragmented market, tool-wise, and there are few that work with each other. That may change slowly over the next couple of years if the vendor community adopts a new specification that Rational Software and IBM have collaborated on and submitted to the Object Management Group (OMG), Needham, Mass. The specification suggests a way to use eXtensible Markup Language (XML) meta data information to allow testing, debugging and other tools to share information. The goal is to allow customers to mix and match tools from vendors, secure in the knowledge that they will work together.
That is a future development, however, and in the meantime it is not easy to get started. And it is not necessarily due to a technology problem, either. It has as much to do with the difficulty in getting a proper testing group funded, a perception problem that has dogged testing ever since the early days of computing. Billie Shea, director of research at The Newport Group in Barnstable, Mass., calls testing “the Rodney Dangerfield of application development —it just doesn’t get any respect.” And that often translates into not getting much money for tools or a dedicated testing staff.
“It still gets back to money. By and large, testing happens when there’s a measurable reward or punishment for not testing,” said Mark Grand, an OO consultant working as a senior software architect for Atlanta-based eHatchery, an incubator for start-ups. “There’s a general correlation between what’s ‘cool’ and ‘in,’ and what’s profitable.”
The start of something big
Thanks to the Web, some believe testing may soon be seen as “cool.” “The overall testing market is growing pretty fast, and it’s being driven by the e-business world where the cost of failure is much higher,” said Dick Heiman, research director of application development at IDC, Framingham, Mass. It is one thing to foist a poorly built and tested application on your employees who have to live with it. It is quite another to do that to customers or on the Web for the entire world to see.
The Newport Group predicts that the overall market for automated testing tools for distributed environments will grow from $609.1 million in 1999 to $827.5 million this year, a 36% increase. In 2003, Newport expects the market to be a whopping $2 billion. This figure includes tools to test object- and component-based applications, as well as the more traditional procedural market.
“We’re all the way at the beginning” as an industry, noted Todd Scallon, director of the California lab at Lexington, Mass.-based Segue Software Inc., one of the very few vendors with a tool aimed specifically at testing distributed objects. “If people are doing [distributed OO] testing at all, they’re building test drivers by hand,” he added.
“Most testing has been focused on either the client side or the back end; there’s very little done in the middle tier. And in a distributed system, the business logic is in the middle-tier objects,” noted Sam Guckenheimer, senior director of technology at Rational Software’s Automated Testing Business Unit, Cupertino, Calif. He pointed out that most testing requires “building an environment around the components you’re building — and for most, that means building the system twice. You have to build a driver to test what you’ve just built.”
Richard Soley, chairman of the Object Management Group, agrees. “We’re not exactly starting from scratch, but it’s fundamentally more difficult to test distributed objects, and we do need other tools. There are plenty of tools for debugging individual pieces of software, but not as much integration among the testing tools.”
All of which is why, when it came time to build and test its new Java-based customer care and billing system, Empire District Electric Co. chose to go pretty much tool-less and build its own environment. “We chose a different path,” said Ron Yust, director of information services at the Joplin, Mo., utility. “We came up with our own API that defines how a developer interacts with the core programs. We have this framework we created, and it helps build business-specific components.”
The utility also uses its own, in-house-built wrapper tool to examine the data model and to generate programs. “It generates 90 percent of the code,” said Yust. “It lets the developer concentrate on the business logic and eliminates the need for housekeeping code for message transfer” and other things. For the most part, he noted, integration testing has been “eliminated,” and his group concentrates instead on unit-testing individual objects.
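Empire's generator is proprietary, so the following is only a sketch of the general technique it describes: emit the housekeeping code from a declarative data model so developers write only business logic. The model format and names below are assumptions for illustration, not Empire's actual API:

```python
# Hypothetical declarative data model; a real tool would read this from
# a schema repository rather than a literal dict.
MODEL = {
    "Customer": ["customer_id", "name", "address"],
    "Meter": ["meter_id", "customer_id", "reading"],
}

def generate_class(name, fields):
    """Emit Python source for a simple value object with stored fields."""
    params = ", ".join(f"{f}=None" for f in fields)
    lines = [f"class {name}:", f"    def __init__(self, {params}):"]
    for f in fields:
        lines.append(f"        self.{f} = {f}")
    return "\n".join(lines) + "\n"

# Generate the boilerplate for every entity and load it at runtime.
source = "\n".join(generate_class(n, f) for n, f in MODEL.items())
namespace = {}
exec(source, namespace)
Customer = namespace["Customer"]
Meter = namespace["Meter"]
```

A developer would then use `Customer(customer_id=1, name="A. Smith")` directly and write only the billing logic by hand, which is the division of labor Yust describes.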
Yust said he has looked at various third-party testing tools but is not sure how they would fit into his environment. “Some seem overly complex to set up. I’m not sure the benefit would be worth the effort,” he said.
Iona Technologies, the Waltham, Mass.-based vendor that sells tools based on CORBA and Enterprise JavaBeans (EJBs), has its own set of in-house testing tools. In CORBA, the interfaces are written in Interface Definition Language (IDL). Iona has come up with a way of “applying IDL to the problem of creating testing interfaces,” said Steve Vinoski, the firm’s chief architect. Called the Parameter Passing Testing Suite (PPTS), it generates permutations of randomly created parameter values for distributed objects. The server knows which values to expect and will check those values as objects land on the server. Although PPTS is used internally at the moment, Iona would consider shipping it to customers if there is a demand.
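PPTS itself is internal to Iona, but the underlying idea — permute parameter values on the client, then verify on the server side that each value arrived intact — can be sketched without any CORBA machinery. The value sets and the simulated round trip below are invented for illustration:

```python
import itertools

# Candidate values for each parameter of a hypothetical two-argument
# remote operation; boundary and non-ASCII values are the usual suspects.
INT_VALUES = [0, 1, -1, 2**31 - 1]
STR_VALUES = ["", "a", "über"]

def marshal_roundtrip(args):
    """Stand-in for the wire: a real harness would invoke the operation
    through the ORB. Here the round trip is simulated as lossless."""
    return tuple(args)

def server_check(expected, received):
    """The server knows which values to expect and verifies them."""
    return expected == received

cases = list(itertools.product(INT_VALUES, STR_VALUES))
failures = [args for args in cases
            if not server_check(args, marshal_roundtrip(args))]
print(f"{len(cases)} permutations tested, {len(failures)} failures")
```

A failed permutation pinpoints a specific value that was corrupted in marshalling, which is exactly the class of defect that is hard to find by testing client and server in isolation.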
And then there is an OO testing methodology called Full-Life-Cycle Object-Oriented Testing. Known as Floot, its basic idea is to test throughout the entire development cycle, from requirements to coding. “It’s based on proven software engineering concepts — the earlier you test, the less expensive it is to fix the defects you find,” said Scott Ambler, president of Ronin International, a consultancy in Castle Rock, Colo., and the author of several books on OO.
Rational’s Guckenheimer, however, is pushing a methodology that is less conventional. “We believe in iterative development where you prioritize the development by risk,” he explained. For that reason, he suggests starting by testing “the scenarios that prove architecturally significant issues. That implies you do integration testing early, before you do unit testing. You worry about how components work together in a system before you worry about individual components. Integration is frequently a major risk in distributed systems and so is performance.” With this method, he explained, “you get to look at those issues early.”
But not everyone believes this. The Newport Group’s Shea, for one, said it makes more sense to test the individual components or objects first because that way if there is a problem, you know where it is.
The scripting trend
In addition to the nascent tools and methodologies for testing distributed OO applications, there is another emerging trend: the use of scripting languages. A big part of the hassle and cost involved with testing is the need to write a number of different test cases to cover all of the various situations that might occur. Those test cases then have to be compiled; and if the situation being tested ever changes, the tests need to change as well.
But scripting languages do not need to be compiled. “Perl, Python and the Tool Command Language [Tcl] have been around for years and a lot of programmers understand them,” said Iona’s Vinoski. “You can write tools that are easy to change and extend.” The OMG recently standardized a couple of scripting languages, among them Python and CORBAscript, that people are using to do testing, added Vinoski.
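The appeal Vinoski describes is easy to show: in a scripting language, test cases can live in plain data that anyone can edit, and the suite re-runs with no compile step. The function under test here is a hypothetical placeholder; in practice it would call into the distributed application through its client stubs:

```python
# Hypothetical rate-lookup function standing in for a remote call.
def rate_for(usage_kwh):
    return 0.08 if usage_kwh <= 500 else 0.10

# Test cases as plain data: adding or changing a case means editing this
# table and re-running the script, not rebuilding a compiled driver.
CASES = [
    (0, 0.08),
    (500, 0.08),    # boundary value
    (501, 0.10),
]

for usage, expected in CASES:
    actual = rate_for(usage)
    assert actual == expected, f"rate_for({usage}) = {actual}, want {expected}"
print("all cases passed")
```

The same pattern works in Perl or Tcl; the point is the edit-and-rerun cycle, not the particular language.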
In the future, most observers expect even more automation — for example, to be able to run tests unattended. Adam Kolawa, CEO at ParaSoft, a Monrovia, Calif.-based maker of several OO testing tools, expects to see an even more fundamental change. “The whole idea of OO technology is a struggle,” he candidly explained. “It’s very unnatural.” He is forecasting that in the “near future, XML will replace all of that. Remote procedure calls will return the objects you want inside the program.” Kolawa said that XML is much more than a basic tagging language, “it’s a protocol about exchanging information between programs. With distributed applications, you may have components operating properly but not interacting properly. I believe XML is the only way of doing it.”
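Kolawa's forecast — XML as the medium through which programs exchange objects — can be sketched with nothing but a standard XML library. The element names and the dict-shaped "object" below are invented for illustration:

```python
import xml.etree.ElementTree as ET

def to_xml(customer):
    """Serialize a dict-shaped 'object' into an XML document."""
    root = ET.Element("customer")
    for key, value in customer.items():
        child = ET.SubElement(root, key)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

def from_xml(text):
    """Reconstruct the object on the receiving side."""
    root = ET.fromstring(text)
    return {child.tag: child.text for child in root}

sent = {"id": "42", "name": "Empire District"}
wire = to_xml(sent)        # the components interact via this document
received = from_xml(wire)
assert received == sent    # each side can be tested against the format alone
```

Because both sides depend only on the document format, each component's handling of the exchange can be tested in isolation — which is the interaction problem Kolawa says XML addresses.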
Christy Stefani, lead engineer at Tellabs Operations Inc. in Lisle, Ill., said her wish list is fairly simple. “What I’d really like is to make it easier to share testing information throughout the entire development life cycle,” she said. She wants the development and QA teams to be able to communicate about problems and fixes without having to reinvent the wheel each time.
All of which goes to show that “this stuff is really tough, but it is achievable,” said Gary Barnett, director of research at Ovum North America in Wakefield, Mass. “Planning, design, upfront investment — all the things we know are true are really true. We all know the words, but we have to do it.”
OO testing: Resources
The Object Management Group, Needham, Mass., is the source of most things OO, and has plenty of resources and links to other groups and companies. Reach them via phone at 781-444-0404 or on the Web at www.omg.org.
The Component Vendor Consortium (CVC) has a testing program to ensure the quality of the components its members sell. Find them on the Web at www.components.org.
A pretty long list of tools suppliers — although not just for OO testing — is maintained by Reliable Software Technologies, a Dulles, Va.-based firm that provides assurance solutions. Visit www.rstcorp.com/marick/faqs/tools.htm for tool information. Also check out the link to other sources of tool information.
Visit Scott Ambler’s Web page for developers (www.ambysoft.com), which includes links to his OO books, white papers and UML resources. Ambler is president of Ronin International, a Castle Rock, Colo., OO development consultancy.
There are also numerous books available on the subject. Here are just a few: Testing Object-Oriented Software, by David C. Kung, Pei Hsia, Jerry Gao a