Peace of Mind Is Priceless: Java Testing with jtest!
Version Reviewed: 3.01
- By Tad Kershner
- January 19, 2000
Current Version: 3.01
Cup rating system: 5 = Outstanding, 4 = Very Good, 3 = Acceptable, 2 = Has Potential, 1 = Poor
Like many developers, I've always found the hardest part of the debugging and QA process to be the end, when everything we've found has been fixed and we sit in eerie silence and some disbelief that it actually works. Our unease derives not from a distrust of our own code, but of code in general, where experience dictates that for every bug found there are two hiding. Matters are even worse now that shortened development cycles and feature creep on the front end steal so much time from testing at the back.
Fortunately, jtest! from ParaSoft provides a complete set of automatic testing tools that integrate well with manual testing. In the past six months I've watched jtest! undergo both a major and a minor upgrade. The first, from 2.0 to 3.0, expanded its feature set and put it in the true big leagues. The 3.01 upgrade streamlined several annoyances and added a general level of polish to the package, which is very welcome for a tool in constant use.
Installation and Setup
Installation is a simple matter of downloading the installer from ParaSoft's Web site and calling to obtain a license. ParaSoft is careful with its licenses, because it's quite possible to run an entire project through the evaluation version and fix everything (although the demo only reports the first three instances of a given error). This isn't likely to be a big problem because it's easy to get hooked once you get a look at the work and headaches that it saves.
Once you've registered for an eval or even a purchase, expect someone from ParaSoft's sales staff to check in with you on a fairly regular basis. This is actually something of a blessing, because it's an easy way to get immediate technical support for any issues you may be having. I've had good results with ParaSoft's tech support, an experience that is unfortunately becoming uncommon.
The package comes with fairly complete documentation in HTML, and your rep will walk you through the process of using jtest! for the first time. I poked around and was fairly conversant with it by the time he rang me back. Except for a few unobvious parameters, the interface is easy to use and fairly self-explanatory once you understand the paradigm.
Also included is a fairly extensive set of code examples illustrating most of the features and many of the errors that jtest! finds, but you'll probably be anxious to dive into your own code. The only real barrier is setting the classpath, but even that has been streamlined in 3.01. The older interface required the traditional long list of semicolon-separated values, which frequently overflowed the small text-entry box provided and made editing or deleting entries difficult.
With 3.01, ParaSoft introduced an easy-to-use list that maintains each item in a classpath as a separate entry. Click an existing entry to edit or delete it. To add an entry, either browse to it or type it in, then click the button. A further nicety that I hope will catch on and spread to other applications: if you click away with an entry still waiting to be added, jtest! prompts you to add it instead of silently discarding it for you to discover later.
Before testing, there are two sets of parameters to set up: "Global," which sets default behaviors for all tests, and "Params" (which should be called "Local") for the current test. These control which tests to run, and various parameters for each test. Pay specific attention to Globals>Test Case Generation>Test Methods, where you can select the types of methods to test in your classes. By default, package-private and private are disabled, but turning them on provides a more extensive test.
In the "Params" section for the current test, you can set each test to true or false or to "inherit" from its Global setting. At the beginning of a test, all parameters default to "inherit," and changing them involves a little interface weirdness. Clicking on a parameter toggles it between true and false; setting it back to "inherit" requires a right-click.
A little time spent tuning the parameters up front pays off with better and more usable results later.
jtest! performs three types of dynamic testing: Black-Box, Regression, and White-Box. With the familiar Black-Box Testing, the user specifies inputs and other testing parameters, which are then fed into methods for comparison with desired results. Regression Testing stores method inputs and outputs between different code versions and alerts you to differences, which can then be scanned for undesirable side effects of new features or bug fixes. Regression Testing should be disabled or ignored for methods that are not intended to produce the same results every time.
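The black-box idea can be sketched in plain Java: feed known inputs to a method and compare against expected results. (jtest! generates and manages such cases itself; this hypothetical example just illustrates the principle.)

```java
// Minimal hand-rolled black-box test: a table of inputs and
// the outputs we expect the method under test to produce.
public class BlackBoxExample {
    static int max(int a, int b) {
        return a > b ? a : b;
    }

    public static void main(String[] args) {
        int[][] inputs   = { {1, 2}, {5, 3}, {-4, -4} };
        int[]   expected = { 2, 5, -4 };
        for (int i = 0; i < inputs.length; i++) {
            int actual = max(inputs[i][0], inputs[i][1]);
            if (actual != expected[i]) {
                throw new AssertionError("case " + i + ": got " + actual);
            }
        }
        System.out.println("all cases passed");
    }
}
```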
The key component of the package is White-Box Testing, and it is the one that is truly unique to jtest!. It incorporates what ParaSoft calls a "symbolic virtual machine": it actually runs through the bytecodes in a simulated Java virtual machine (JVM). For each bytecode, it solves backward to see whether any conditions or values could cause it to throw a runtime exception.
A common example that also points up a caveat of this type of testing is a Divide By Zero exception. In the White-Box model, when the VM encounters a statement such as x=foo/bar, jtest! solves backward to see if bar could ever be 0. If it could, jtest! marks the potential exception, along with its location, stack trace, and the test case input that could cause it.
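A hypothetical method of the kind this analysis flags looks like the following; if the divisor can ever be 0, the division throws an ArithmeticException at runtime, which is exactly the condition jtest! solves backward to find.

```java
// The shape of code that white-box analysis reports: nothing here
// prevents bar from being 0, so the division can throw at runtime.
public class Ratio {
    public static int divide(int foo, int bar) {
        return foo / bar;   // potential Divide By Zero when bar == 0
    }

    public static void main(String[] args) {
        System.out.println(divide(10, 2));      // a safe case
        try {
            divide(10, 0);                      // the case jtest! would report
        } catch (ArithmeticException e) {
            System.out.println("caught: " + e);
        }
    }
}
```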
In most cases, these will occur inside code you control, and the error message points out legitimate logic or flow errors. On the other hand, if your class or method is to be used somewhere outside of your control, validity checking isn't always useful. Consider the case where bar is a method parameter: including code to pre-check every parameter and preemptively throw an exception for each potential Divide By Zero is somewhat redundant. At a certain point, you have to trust the users of your classes to pass in valid parameters.
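For completeness, the defensive version of the earlier hypothetical method would look like this; a guard clause trades the potential ArithmeticException for a documented IllegalArgumentException, at the cost of a redundant check if callers already validate.

```java
// Defensive variant: pre-check the parameter and fail with a clear
// message rather than letting the division throw on its own.
public class SafeRatio {
    public static int divide(int foo, int bar) {
        if (bar == 0) {
            throw new IllegalArgumentException("bar must be non-zero");
        }
        return foo / bar;
    }

    public static void main(String[] args) {
        System.out.println(divide(12, 4)); // prints 3
    }
}
```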
Still, it's nice to be informed. The foregoing example illustrates that it's not always the best course to run out and blindly try to fix everything that jtest! lists. It's important to remember that the exceptions that crop up are potential and not actual. When proper care is taken with method parameters, for example, many null pointers will never occur. Like a spell-checker, jtest! is necessarily literal and requires a little human guidance. Fortunately, that guidance is easy to provide because a right-click on the error brings up options to "Ignore the Outcome" or "Suppress." Saving the test saves the configuration along with it so that unwanted errors will never show up again. Or you may prefer to leave them in as a reminder.
The static analysis was folded in at the 3.0 release from a stand-alone product, and this addition greatly enhances what was already a must-have tool. It is essentially a style and rules checker for the source code itself. It encompasses 52 rules that range from the sublime ("always capitalize static finals") to the ridiculous ("don't explicitly import package java.lang"). Rules are classified by severity and can be enabled or disabled by setting the severity level for the test.
I prefer to tune the rules individually and it takes only a left-click to toggle. Right-clicking brings up a contextual menu that allows you to either enable/disable again or view the rule's description. Because the former is already handled by the left click, I would have preferred going directly to the definition. That's something you'll only be doing once or twice per rule though, so it's not a major issue.
Depending on your development environment, you will be turning many of them off and using a few of them a lot. Two of my favorites are "Avoid unused variables" and "Unused private class variable," which can reveal spelling and search-and-replace errors in variable names, as well as scoping problems.
The static checking is also good at finding gotchas from well-meaning junior developers, such as the ubiquitous "Use 'equals' instead of '=='" or the harmless but expensive "Use StringBuffer instead of String."
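The two gotchas just mentioned can be shown side by side in a short hypothetical class. (StringBuffer was the idiom of the day; StringBuilder is its later, unsynchronized equivalent.)

```java
// The two classic junior-developer gotchas the static checker catches.
public class Gotchas {
    // '==' on Strings compares object identity, not contents;
    // equals() is almost always what was meant.
    public static boolean sameText(String a, String b) {
        return a.equals(b);
    }

    // Repeated String concatenation in a loop allocates a new String
    // each pass; a StringBuffer appends in place.
    public static String join(String[] parts) {
        StringBuffer sb = new StringBuffer();
        for (int i = 0; i < parts.length; i++) {
            sb.append(parts[i]);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(sameText(new String("abc"), new String("abc")));
        System.out.println(join(new String[] { "a", "b", "c" }));
    }
}
```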
One feature that could be either particularly useful or not used at all is the ability to enforce naming standards. Standards can be set (or ignored) individually for names of just about everything from classes and interfaces to method parameters and instance variables. Just turn the rule on, specify a regular expression, and enforce away. Thankfully, I haven't had occasion to use these!
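To give a feel for the mechanism (this is an illustrative sketch, not jtest!'s own configuration syntax), a regular-expression naming check amounts to something like this, using the java.util.regex package:

```java
import java.util.regex.Pattern;

// Illustrative naming-standard checks: static finals in upper case
// with underscores, instance variables in camelCase.
public class NamingCheck {
    static final Pattern CONSTANT_NAME = Pattern.compile("[A-Z][A-Z0-9_]*");
    static final Pattern FIELD_NAME    = Pattern.compile("[a-z][a-zA-Z0-9]*");

    public static boolean isValidConstantName(String name) {
        return CONSTANT_NAME.matcher(name).matches();
    }

    public static boolean isValidFieldName(String name) {
        return FIELD_NAME.matcher(name).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidConstantName("MAX_SIZE")); // true
        System.out.println(isValidConstantName("maxSize"));  // false
        System.out.println(isValidFieldName("maxSize"));     // true
    }
}
```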
The main window has two dynamically updating results areas: Test Progress and Errors Found. Test Progress shows the number of static rules currently enabled, and contains statistics about dynamic cases executed and total coverage. Due to complexities in code and dependencies, the total coverage percentages can be a little low sometimes for particularly complicated classes. If I didn't have to be realistic, I'd like to see the coverage hover closer to 100%. It's a vital area that I'm sure ParaSoft is working on.
Of course, the most important area is the actual Errors Found window. Here you'll find the results neatly organized with a familiar expand/collapse tree structure. Each dynamic error entry provides the error message itself, as well as the stack trace and full information about the test case that produced the error.
Right-clicking on the error line in the stack trace lets you view the code or edit it in your editor of choice. The viewer brings you right to the appropriate highlighted line; this is not currently possible in the editor but would be a nice addition. As a convenience, jtest! even provides a javac front end to recompile your changes so you don't have to kick off your IDE for little fixes.
To get the full context of an error, you can view or edit any line in the stack trace, not just the one where the error occurred. This is, of course, limited to anything you actually have the source for; if you derive from a standard superclass, you'll be confined to viewing your own code.
As mentioned earlier, you can instruct jtest! to ignore the error in the future. If the cause of the error is not apparent, you can view the whole test case input and even see a sample test case as a fully formed Java class. For static errors, you can still view or edit the code as well as view the rule description itself.
Because you'll often find more issues than can be resolved in one sitting, tests (both individual and project) can be saved and reloaded.
For the detail-happy, the report provides all the information you could ever want and can be generated for either single classes or projects. It shows a full listing of all parameter settings for the test, including on/off status for each rule, the complete suppressions table, and the classpath. Each error lists the stack trace and the input. It even ends with a code listing showing exactly which lines were covered and which weren't. There's an option to publish the report to HTML format for group usage.
Where jtest! really shines is its Project batch processing with which you can analyze an entire directory, .zip, or .jar. Pressing the button in the Class window brings up a second window for entering control parameters for the project test. The Project window is almost identical to the Class window and works intuitively.
In another great 3.01 improvement, if Project encounters a .class file it can't find the source for, it opens a browser dialog that lets you navigate to the source directory and then optionally add it to the project's source path. This is a powerful addition, because the source tree is often considerably more convoluted than the deployment tree, and having to dump all the source files into one big directory before use is a huge pain, especially if you're taking advantage of jtest!'s convenient source editing and recompile features. Source path settings are saved with the project.
During the run, the top section of the Project window shows cumulative errors by error type for all classes, while the bottom shows results by class. The Project window is actually controlling the Class window underneath, so you can watch testing results as they show up in the Class window.
Scanning a whole project is decidedly time-consuming, so be sure to have a good book (or another computer) in hand before you start. Better yet, run a project scan overnight. Just be sure to have your settings all configured beforehand: There's nothing like that sinking feeling when you return in the morning to find it stopped with a prompt on the second classfile!
At the end, the Project window displays results in the familiar tree structure. Right-clicking on each class brings it up in the Class window, but everything you need is in the Project window.
You're going to want a fast machine and lots of memory, not just for test execution but for driving the GUI. The GUI is written in Swing and as such suffers from the sluggishness we know and love from Java GUIs. Trying to fine-tune a rule set on a slow box is a maddening exercise in waiting for toggle switches to catch up with mouse clicks, only to have to reset several when they get out of sync. Fortunately, this annoyance goes away with a speedy processor.
One major limitation is that jtest! currently only runs under Windows. The fact that the UI is written in Java suggests that a port or two may be around the corner, which would be quite welcome.
Being literal, jtest! tends to come up with many superfluous errors. With careful tuning and a little work, it's easy, if a bit tedious, to sort through them to find the important ones. I'd much rather have that than a wizard that thought it knew better than I did.
Having used 2.x and 3.0 for more than 6 months now on projects of varying complexity, from single-developer one-class projects to team projects with more than a hundred classes, I found jtest! to be a great tool with a long list of minor UI annoyances. The 3.01 release, which was just made available at this writing, fixes all but a few of them and makes a truly useful tool truly usable as well.
ParaSoft, 2031 South Myrtle Avenue, Monrovia, CA 91016