Bring on the Java testers
Java developers have found that a good testing tool is essential for building a high-quality application; such tools are a must for meeting the schedule and performance requirements of Java systems.
- By Deborah Melewski
- January 1, 2001
Once considered a rather mundane task, often relegated to the tail end of the development cycle, application testing is a hot topic in today's Web-enabled world. The need for testing applications has never been more obvious, and even the fast-paced Java application development environment has become test-tool-friendly.
Interestingly, the Java development world would seem the least likely place to add a technology that has historically been perceived as "difficult" and "time-consuming." Yet it is here that developers and test/QA teams alike have found the greatest need for a good testing methodology and associated automated testing tools.
Automating the test cycle can provide a better understanding of how the application will behave in the real world. The challenge lies in knowing what to test, how to test, when to test and how often to test. Because Java applications are distributed and usually complex, it is difficult to apply conventional testing methods to the entire application.
At any given time, New York City-based Internet consulting firm Concrete Inc. may be building 10 to 15 Java-based projects. In addition to partnering with Internet start-ups, the 150-person company works with established corporations to leverage the power of the Internet. Concrete typically works with multitier Internet development projects that range in size from two to 10 engineers, and in price from $100,000 to eight-figure engagements.
"We don't classify ourselves as being in the I-builder category," said Michael Flickman, CTO at Concrete. "They [I-builders] tend to view Web development as something other than true software development, and from our many years of experience, we've always known that building Web sites is indeed true software development. Internet development requires all the same processes and methodologies that we've used in our past lives.
"You can't view Web development as just a bunch of designers or HTML programmers throwing a Web site together—and away we go," continued Flickman. "We need to follow core processes that are pretty strict as far as guidelines go, and that pretty much adhere to what you might expect a true software engineering house to be."
According to Flickman, it is never too early to test. "We test, even in theory, with documentation," he said. "Everything is built and derived from Use Cases. Early on, as designers or the User Interface architects are defining a site map and Use Cases, we begin writing test cases against the Use Cases. We can then validate them on paper or white boards.
"We may find in this walk-through that there are open questions, or we may reveal holes in the Use Case itself," noted Flickman. "Testing is beneficial here as well, because once you've nailed your Use Cases, the architecture and everything based around building the product is derived from there."
Concrete Inc. then implements a four-step development methodology that includes the following:
- The assessment phase—requirements gathering.
- The shape phase—finalizing the project plan, prototyping and proof of concepts. Designers start to craft the look and feel, and the User Interface architects begin to map out the site. It is here also that front-end HTML and JSP pages take shape, and the software engineers begin the second tier.
- The actual build phase.
- The grow phase—deployment and monitoring.
The company's actual testing begins in the shape phase. Prototypes are tested as they are built. "It's important to ensure that the prototype from an architectural standpoint performs well," said Flickman. "For example, we need to know if we're passing data in some format that may be corrupted."
Java's promise of reusability presents another reason for Concrete's rigorous test cycle. The company's Tools, Architectures and Reusable Products (TARP) Group defines reusable architectures and components for inclusion in subsequent client cases. The group's primary goal is to maintain a repository from which EJBs can be harvested.
"When an engineer on another client engagement is going to harvest those EJBs, or anything reusable out of that repository, we have to feel confident they're not harvesting a defective product," explained Flickman. "If a defective product goes into a half dozen other projects, you can figure out the math as to how many problems I'll have."
Concrete has a very strict process and criteria for entering EJBs into that repository. The e-Test Suite from RSW Software Inc., Waltham, Mass., is Concrete's standard automated testing environment. The RSW tools are used extensively for application functionality testing, performance testing and load testing.
Figure 1. RSW Software's Bean-test application tests Enterprise JavaBean middle-tier applications.
"Even though Java is considered an open standard, we tend to deal with a number of different third-party vendors in our work and each may have implemented the language in different ways," explained Flickman.
"For instance, if you're dealing with EJBs, you may write a suite of EJBs for BEA's WebLogic Server," he said. "But that doesn't necessarily mean that they'll run out of the box on IBM's WebSphere, or ATG's Dynamo Application Server. So the notion of 'write once, run anywhere' is true in a sense, but when you start mixing third-party application servers and servlet engines, there is some work to be done there."
Concrete has found specific strengths in the RSW toolset. "When you talk about Java, it's bigger than just programming," said Flickman. "It's EJBs, Servlets and JSPs, and it's JavaBeans as opposed to EJBs. It embraces many different techniques, including RMI, JDBC and more."
Rather than providing an "all-inclusive" testing product, which Flickman compares to a Swiss army knife that does not really do all you need it to, RSW Software's offerings are becoming more granular. For example, RSW offers Bean-test for testing the scalability and functionality of EJB middle-tier applications. Flickman is hoping to see more applications along those lines; for example, a Servlet Test or JSP Test product.
While a number of products test clients written in Java, and a fair number of products are available for testing HTML in a browser, there are not many tools to date that test the middle tier explicitly. Rational Software Corp., Cupertino, Calif., is among the companies moving to fully support middle-tier testing. In December, the firm will release Rational Quality Architect, a component of its Rational Suite.
Because the architecture promoted by Sun's J2EE platform encapsulates the business logic on the middle tier's EJB level, a significant need has surfaced to test directly at this level. "Companies need an easy, automated way to test those EJBs, initially as components and again when they are integrated into subsystems and used in conjunction with third-party licensed components," said Sam Guckenheimer, Rational's senior director of technology for automated testing.
According to Guckenheimer, testing the large systems that people are building on the Java platform will account for more than half of the effort expended. At the same time, there is a new e-software paradox: much shorter time to market, higher quality requirements, and a greater complexity in the technology being deployed.
If testing is more than half the effort, then the greatest way to achieve the business benefit of better quality in a shorter amount of time is to take testing activities and move them in parallel, said Guckenheimer. "The problem with that, in practice, is that early testing requires you to test pieces like EJBs that are encapsulating business logic before there is a user interface through which to exercise them.
"Current practice is that if you're going to do a good job testing early, you need to build the components [as well as] a test harness around them in order to exercise them," he went on to explain. "Over time, you replace the test harness with the real running components for those other pieces, so effectively you have to build the system twice. And that's really expensive."
Added Guckenheimer, "Developers are measured by how much code they produce, and how quickly. They don't like to do this because they perceive the testing as getting in the way of their real job, which is writing code. Therefore, the early integration testing doesn't tend to get done very well."
Rational's December release for component testing will automate the process of generating the test harness for components. This will allow components to be exercised, and early integration tests to be generated automatically out of the visual model. Without having to effectively write the system twice, early testing can be done in parallel with the development activity, and by the developers themselves rather than by testers later on.
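Guckenheimer's hand-built test harness can be illustrated with a small sketch. All names here are hypothetical, and this is plain stub-by-hand Java, not the generated harness Rational describes: the middle-tier business logic is written against an interface, and a throwaway stub stands in for the back-end pieces that do not exist yet, so the component can be exercised before any user interface is built.

```java
// Hypothetical sketch: exercising middle-tier business logic through a
// hand-written stub before the real back end (or any UI) exists.
import java.util.HashMap;
import java.util.Map;

// The dependency the business logic needs, expressed as an interface.
interface InventoryGateway {
    int roomsAvailable(String hotelId);
}

// The middle-tier component under test.
class ReservationService {
    private final InventoryGateway gateway;

    ReservationService(InventoryGateway gateway) {
        this.gateway = gateway;
    }

    boolean canBook(String hotelId, int roomsWanted) {
        return gateway.roomsAvailable(hotelId) >= roomsWanted;
    }
}

public class HarnessDemo {
    // A stub standing in for the real persistence tier.
    static InventoryGateway stub(Map<String, Integer> fixture) {
        return hotelId -> fixture.getOrDefault(hotelId, 0);
    }

    public static void main(String[] args) {
        Map<String, Integer> fixture = new HashMap<>();
        fixture.put("comfort-42", 3);
        ReservationService service = new ReservationService(stub(fixture));

        // Early integration checks run with no app server and no UI.
        if (!service.canBook("comfort-42", 2)) throw new AssertionError();
        if (service.canBook("comfort-42", 5)) throw new AssertionError();
        if (service.canBook("unknown", 1)) throw new AssertionError();
        System.out.println("harness tests passed");
    }
}
```

The expense Guckenheimer notes is visible even in this toy: the stub is code that must be written, maintained and eventually thrown away once the real component replaces it.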
Choice Hotels International, Phoenix, is a complete Rational shop from development through QA. The company is using the current version of Rational's PerformanceStudio to test an all-Java rewrite of its Web site. Function and performance testing are of critical importance to the company.
Figure 2. Choice Hotels International's Web site allows users to find hotel and room availability at more than 4,900 hotels and inns by specifying some 16,000 points of interest.
Choice Hotels franchises more than 4,900 hotels, inns, all-suite hotels and resorts in 39 countries under the brand names Comfort Inns, Comfort Suites, Quality Inns, Clarion Hotels, Sleep Inns, Rodeway Inns, Econo Lodge and MainStay Suites. Visitors to the choicehotels.com site can make reservations online, obtain current rates and availability, and find detailed location information as well as special features.
The Web site uses geocoding, a search process that allows users to find hotel and room availability by specifying some 16,000 points of interest. In addition, a personalized point-to-point map program allows users to chart specific routes. The site runs on two Solaris-based Sun 450 servers, with an Informix database on the back end. A Cisco load balancer is used between the two servers.
"Testing is a huge part of our development cycle," noted Chad Mason, manager, test automation at Choice Hotels. "We have great management support for that, which is helpful. They recognize that we need to schedule adequate testing time into the process because we stand to lose a lot of money if any of our applications go down due to defects."
While Mason and his team anticipated that there would be problems testing in the new Java environment, all went rather smoothly. "We focused on load testing in about the middle of the testing effort," said Mason. "Functional testing was done earlier, with manual testers and standard function test scripts recorded in Rational Robot prior to that.
"We then built and executed load test scripts to ensure that in the event of a complete rewrite we'd know our application was stable, and I would not have to do much maintenance on the scripts I'd already done," he explained.
According to Mason, it was particularly important to test how the site would perform in the real world. "We needed to identify potential bottlenecks, and you can't test these things manually.
"With Rational PerformanceStudio, we can test 50 people at any one time clicking the reservations button, for example," he noted. "We may see 500 people online simultaneously, but they would all be doing different things at the site. Virtual user scripts with several different scenarios of users hitting the Web page were used."
The new rewrite went online a few months ago. Mason credits the Rational tools with helping his group get up to speed very quickly. "Rational allowed us to hit the ground running," he said. "It was very easy and quick to build schedules, record small pieces of scripts that fit together easily, and to then massage them and move them around once they are in place to simulate different scenarios, user loads and such."
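The virtual-user idea Mason describes can be sketched in plain Java. This is a toy stand-in, not PerformanceStudio's actual scripting: a thread pool plays a number of concurrent users, each running a scripted scenario several times, and a shared counter stands in for the metrics a real load tool would collect.

```java
// Hypothetical sketch of a "virtual user" load test: N threads each run a
// scripted scenario concurrently, and completions are tallied as a crude metric.
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class LoadSketch {
    // Stand-in for one scripted user action; a real script would hit the site.
    static void reservationScenario() throws InterruptedException {
        Thread.sleep(5); // simulate a server round-trip
    }

    static int runVirtualUsers(int users, int iterations) {
        ExecutorService pool = Executors.newFixedThreadPool(users);
        AtomicInteger completed = new AtomicInteger();
        for (int u = 0; u < users; u++) {
            pool.submit(() -> {
                for (int i = 0; i < iterations; i++) {
                    try {
                        reservationScenario();
                    } catch (InterruptedException e) {
                        return; // abandon this virtual user on interrupt
                    }
                    completed.incrementAndGet();
                }
            });
        }
        pool.shutdown();
        try {
            pool.awaitTermination(30, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return completed.get();
    }

    public static void main(String[] args) {
        // 50 virtual users each "clicking the reservations button" four times.
        int done = runVirtualUsers(50, 4);
        System.out.println(done + " scenario runs completed");
    }
}
```

Varying the scenario per thread is what turns this into the mixed workload Mason describes, where 500 simultaneous visitors are all doing different things at the site.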
A sampling of Java testing tools

| Company | Product | Java testing support |
|---|---|---|
| Coast Software Inc. | Coast WebMaster 5.0 | Scans JavaScript, validates links and checks for broken links |
| NuMega (Farmington Hills, MI) | DevPartner Java Edition | |
| Flashline | QA Lab | Testing service; tests third-party JavaBean components, EJBs and Java code for structure, performance and server-side capacity |
| McCabe & Associates Inc. | | Test planning, monitoring and measurement |
| Mercury Interactive Corp. | | Load and performance testing |
| RadView Software Inc. | | Load and performance testing; unit-level tests of Java components and EJBs |
| Rational Software Corp. | Rational Suite PerformanceStudio; Rational Quality Architect | Load, performance, function and regression testing; targets the Java middle tier |
| RSW Software Inc. | e-Test Suite; Bean-test | Tests EJB middle-tier applications |
| Segue Software Inc. | | Performance and regression testing for pure Java; EJB servers |
| Software Research Inc. (San Francisco, CA) | TCAT for Java | Test coverage analysis for Java applets/applications |
When, and what, to test
With a focus on e-business, Lexington, Mass.-based Segue Software Inc. has attacked Java's distributed nature by providing tools focused on very specific areas. SilkTest is the company's traditional functional and regression test product. SilkPilot is a regression test tool directed at the middle tier, and the tester can automatically build test cases against back-end Java components. SilkPerformer addresses performance and scalability.
Figure 3. Segue Software's SilkPerformer is shown here testing transactions on a Web site.
Todd Scallan, vice president of Segue's California development labs, believes in testing early and often. Having come from a software development background, Scallan said he has always liked the iterative model, "where you develop some, and then you test some."
"In the end, this produces a higher quality product, faster," he noted. In terms of a good methodology, Scallan also cites the importance of a good design. "When you have a good design and good interface specs, you can begin to build your tests even before you've implemented a lot of code," he explained. "Getting skeletons in place forces you to think about what behavior you want your Java program to exhibit."
Scallan said there needs to be a clear distinction between testing Java as a GUI, and testing a Java program that is an API. "There are various reasons for testing a GUI-less Java program; maybe the GUI doesn't exist yet, maybe you're testing an SDK you want to deliver, maybe you want to do unit testing so you're not testing the entire application at once," he said. "One distinction people need to make is that testing a GUI is different than testing a server."
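Scallan's point about skeletons and GUI-less testing might look like the following hypothetical sketch, which is not Segue code: the API is agreed on as an interface before any real implementation exists, a conformance check is written against that interface, and a throwaway in-memory implementation lets the check run today while the eventual server-side version is tested with the same method later.

```java
// Hypothetical sketch of skeleton-first, GUI-less testing: behavioral tests
// are written against an interface spec and reused across implementations.
import java.util.HashSet;
import java.util.Set;

// The agreed-upon API, defined before any real implementation exists.
interface TrademarkIndex {
    void add(String mark);
    boolean contains(String mark);
}

public class SkeletonFirst {
    // A reusable conformance check that any implementation must pass.
    static void conformance(TrademarkIndex index) {
        index.add("ACME");
        if (!index.contains("ACME")) throw new AssertionError("added mark not found");
        if (index.contains("MISSING")) throw new AssertionError("phantom mark found");
    }

    // A throwaway in-memory implementation so the check can run immediately.
    static TrademarkIndex inMemory() {
        return new TrademarkIndex() {
            private final Set<String> marks = new HashSet<>();
            public void add(String mark) { marks.add(mark); }
            public boolean contains(String mark) { return marks.contains(mark); }
        };
    }

    public static void main(String[] args) {
        conformance(inMemory());
        System.out.println("conformance passed");
    }
}
```

Because the check exercises an API rather than a GUI, it is exactly the kind of unit test Scallan distinguishes from testing the entire application at once.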
One Segue customer, Thomson & Thomson, Quincy, Mass., provides online research services for trademark, copyright and script clearance. The company developed its Saegis Web-based application to provide its subscribers with online access to the firm's research. During the past two years, the Saegis user base has grown from 250 to 10,000.
When Saegis first went online in 1996, the site was running with C and C++ code. A later version added Perl code, and in 1998, Thomson & Thomson decided to move to Java. In November 1999, a new system was released.
The new version maintains sessions, and can search against a series of trademark information databases, the U.S. Patent and Trademark Office, individual state registrations, a Canadian database and 13 European databases.
"One of the biggest problems we found with Java was memory leaks," said Brian Chase, QA manager at Thomson & Thomson. "The release we did last November was very high profile for the company because it had the stateful search ability where you can do multiple queries at a time, and build search logic from preceding queries," he said. "We found a lot of problems with memory leaks in the way we stored the sets when we brought the session back to a person."
To resolve the problem, Thomson & Thomson deployed Segue's SilkPerformer and SilkTest tools. "SilkPerformer allowed us to do multiple transactions and track memory leaks over a period of time," said Chase. "SilkTest allowed us to do a looped set of individual query types to see what would happen if you did the same query over and over again."
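The kind of leak Chase describes, result sets accumulating as sessions are brought back, is commonly addressed by putting a bound on what a session may retain. The following is a minimal hypothetical sketch, not Thomson & Thomson's code, using the LRU eviction hook that java.util.LinkedHashMap provides:

```java
// Hypothetical sketch of bounding per-session result-set storage so repeated
// queries cannot grow memory without limit: an LRU cache with a fixed cap.
import java.util.LinkedHashMap;
import java.util.Map;

public class SessionCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public SessionCache(int maxEntries) {
        super(16, 0.75f, true); // accessOrder = true gives LRU behavior
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries; // evict the oldest entry past the cap
    }

    public static void main(String[] args) {
        SessionCache<String, String> cache = new SessionCache<>(2);
        cache.put("q1", "result set 1");
        cache.put("q2", "result set 2");
        cache.put("q3", "result set 3"); // q1 is evicted here
        System.out.println(cache.keySet());
    }
}
```

A looped test of the sort Chase ran with SilkTest, the same query over and over, would show a map like this holding steady at its cap instead of growing with every iteration.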
Another problem area was response time. The company's client base comprises predominantly intellectual property attorneys who were relatively new to the Internet, and who compared response times over the Internet with the direct modem connections most of them had been using.
"Our customers had an idea of what a response time should be," said Chase. "The Internet and its related graphics could slow speed. We used to get the comment, 'Give me the information, but not the window dressing.' [Customers] were critical of having too many buttons, waiting for images to load and so on."
Thomson & Thomson has geared up from the notion of testing at the point of release, when developers were done with their code. "That created process problems because you'd try to set a date for release, spend four to six weeks for testing and, if you found any bugs during that time, there wouldn't be enough time to hand the code back and still keep the release dates," said Chase.
Prior to its success with the automated testing tools from Segue, all tests were done manually. For a good load test, 50 to 100 people would log onto the system at the same time. The system would then be monitored from the back end. Some light scripting for event-type situations was incorporated, but no metrics were collected.
When the Thomson & Thomson Web site first went live, the results were catastrophic. The site had to be torn down, and it took the company four days to recover. "We learned from that experience that we needed to do a much broader job of testing. Not just regression testing on the code side, but [regression testing] on the performance side to validate that the architecture would work correctly before kicking it into production," said Chase.
"We'd had a lot of successes in 1996 and 1997, so it just seemed second nature that we could put the site up," explained Chase. "We'd started with small boxes and a small architecture, then began adding new boxes to it and modifying it. It became old hat after a while.
"The failure set a new tone in the way we handled testing," he continued. "We knew we'd need to be diligent in testing our return and database info, the interfaces and the back end, as well."
In 1998 and 1999, the company did not have a good development process or version control system in place. Because of that, code was constantly breaking due to incompatible versions. Bringing in a good regression test tool and a version control system (VCS) allows the company to know its code is maintainable. "It was a learning experience," said Chase. "The developers were doing good work, but we were constantly finding bugs coming back again. With the new VCS, the old bugs can't be re-introduced."
Segue was chosen for testing because of its high level of automation and ease of use, but also because the company had success stories in the Web area at the time. "They seemed to support the Web side better than the other tools providers we looked at," said Chase. "They seemed to have the Internet market; and I had compared job postings on Monsterboard to see which product was the most popular within companies I considered competitive or [with those that] did a lot of Web work. Segue came up a lot."
Test early, test often
While testing challenges remain, the Internet's rapid-release cycles and continuous changes make automated testing a must-have to ensure quality. When to test, where to test and how often to test? The old mantra "Test early and test often" seems popular among most in the industry.
The needs of the Java testing community are great. "Traditionally, the thinking was that these tools may slow us down, and we already have a way to get the job done," noted Segue Software's Scallan. "But the interesting thing with Java is that people want the tools. They come to us and say, 'We have a very short window to get this application developed and out the door, and we need help.'"
One thing that has been proven beyond a shadow of a doubt is that you cannot run an online business with performance problems or errors. When an Internet application fails, it means lost business for a dot-com, or it could push the company into the newly created black hole known as outofbusiness.com. Companies are learning that in the ramped-up Internet development cycle there is more potential for costly errors than ever before.