In-Depth
This is not a test
- By Sandra Taylor
- July 12, 2001
Client/server systems were hard to debug, but there were comforts associated with the technology. The software
had matured to the point of relative stability, plus the systems were generally restricted to a predefined and
internal user population that could be trained in system policies and procedures. Similar comforts are not associated
with Web technology. Information systems (I/S) groups must now deal with a host of new issues, including browser-based
clients. In addition, Java is a young language based on a new programming paradigm composed of components and
objects. And there are new middleware infrastructures such as COM/COM+, DCOM and CORBA. The scene is made more
complex by the fact that the user population can range from zero to thousands, or even millions, at any time. These
issues are set against a backdrop of increasingly complex back-end systems.
Are the testing/debugging tools keeping pace? The answer is twofold. The world of the Web consists of systems
which run the gamut of complexity -- from the basic Web server with HTML pages, to the high-end system with a distributed
application spanning application servers, database servers and, with increasing frequency, a mainframe. The existing
and emerging tools available to test these environments are as different as the environments themselves, both in
form and function. The organizations typically responsible for administering the tests are similarly distinct.
Therein lies the challenge of bringing together the right tools with the right individuals from various groups
in an I/S organization.
Six point releases a year and counting
The idea
of "Web time" quickly went from amusing motto, to insider joke,
to cruel mandate. In so-called Web time -- also known as Internet time
or Web years -- time is accelerated until development efforts are measured
in weeks, not months. This is especially true at the highest of Internet
high flyers, Netscape Communications Corp. At Netscape, truly, Web-time
forces are helping to break down the traditional wall between developers
and testers.
"Internet time equals insane development schedules," said Michael Lopp, Java platform
manager at the Mountain View, Calif.-based Web browser market leader.
"When Navigator 1.0 came out, we created a QA [Quality Assurance] lab overnight," Lopp
said. It was simple in the beginning. "We had people just surfing the Web, clicking on links, saying if [the
presentation] looked all right to them." But the pressure to decide if a 'build' was good, on 16 platforms,
was mounting. "Now we have a point release every two months. Each needs a full level of testing on the multiple
platforms," said Lopp.
Automation of testing was imperative. "Builds were arriving every day. And we spent half
the day tracking changes in the builds," said Lopp, who turned to Segue Software for some test tools solutions.
Lopp (whose business card includes the whimsical title "bit sniffer") said he found
Segue's 4Test language sophisticated enough to allow developers to "get creative" in the test sphere.
Moreover, "Segue has the only credible cross-platform story," he maintained.
Today, Netscape is using Intranet technology ["Eating our own dogfood," said Lopp]
to support its test efforts. Bug tracking is Web-based. An Oracle back end is used to store test cases. Lab machines
each run a distributed version of SilkTest or QAPartner from Segue. The QA department can execute unattended automation
runs, and results are posted to all team members via E-mail. "Netscape virtually eliminated manual acceptance
testing using SilkTest and Web-based distributed testing," said Lopp.
Such successes in Web browser testing are particularly welcome these days -- burdened Web servers
and gateways are quickly becoming the next frontier for many of the bit sniffers of the world.
-- Jack Vaughan
The test classes considered below only touch the surface of the available tool inventories. For example, native
testing facilities of various operating systems are not discussed.
QA testing
Other than the Web site and link testing tools addressed below, QA testing software is the newest of the distributed systems
testing technologies. Typically developed and managed by the QA organization, these tool suites offer the most
exhaustive pre-deployment trials for a new application. No other class of software has the facility to exercise the entire application
and the underlying infrastructures (middleware, operating systems, networks, databases, etc.) in a real-world environment.
These tool suites, however, are not inexpensive. License costs and, perhaps more importantly, costs to train and
secure the staff needed to run an automated QA effort can significantly increase expenses. Even though test tool
vendors have made major strides in ease-of-use, this is still a case of developing software to test software. And
that is non-trivial.
The investment may be hard to justify for I/S organizations developing applications that are relatively simple
or limited in scope. For a growing number of companies, however, the Net and E-commerce have raised the stakes.
Net-enabled business-to-consumer and business-to-business operations are fast becoming a competitive essential.
A Net-enabled L.L. Bean catalog without a competitive offering from Lands' End? Not likely! Or an online brokerage
system from Charles Schwab while Fidelity sits back and watches? Not on your life! In the world of E-commerce,
the stakes are high, the applications and underlying systems are complex and the mission is critical.
How prevalent is E-commerce? Consider the profiles of the prospective buyers coming through the doors of automated
testing companies. Said Brian Leseur, executive vice president, research and development, Segue Software Inc.,
Newton Centre, Mass., "There's been a noticeable change in the general tenor of the questions we're asked.
In the beginning, we had to spend a lot of time explaining what automated testing was and why it was important.
Now we see companies, driven by intense time-to-market pressures, building very serious applications on very young
technology. These companies are no longer asking why they should test, but how to do it intelligently," commented
Leseur. Jayaram Bhat, vice president of marketing at Mercury Interactive Corp., Sunnyvale, Calif., sees the same
change. "Companies are willing to invest in QA testing for their E-commerce applications because,
very simply, the cost of failure is so high. With E-commerce, that cost is quantifiable, and in some cases, that
cost is astronomical," said Bhat.
The current mission of automated testing tools is to uncover problems, but probing to find the specific cause
of the problem often lies in the domain of the next two technologies: network traffic testing/analysis tools
and program debugging aids.
Network traffic analysis
Response time may well be the Achilles heel of the Net. Time spent staring at messages such as "Web site
found," "Connecting to ... " or "Waiting for a reply" does nothing to ease a user's impatience.
And remember the early HTML implementations? More than a few page authors could not understand why screens were
taking so long to complete. They had no idea that TCP/IP was disconnecting and reestablishing connections between
transmissions. The connections can now be kept open but the example is noteworthy. The programming paradigm is
changing as the era of distributed objects that reside on Net-enabled clients and back-end servers begins. Like
the early HTML page authors, today's developers will face their share of network surprises. In a complex distributed system,
the culprits can lurk in a myriad of places.
System management tools typically provide insight regarding network traffic issues. Problems that lie deeper
within the system and require more in-depth probing, however, must employ a class of tool (in the form of hardware,
software or a combination of the two) known as packet sniffers. Initially, these products offered a fairly rudimentary
capability that produced volumes of cryptic data to represent network traffic. Today's products offer dramatically
improved deciphering and information presentation facilities. Some packages have progressed to the point where
they can be programmed to respond to different conditions, such as a predetermined traffic threshold or a given
number of detected packet failures. The responses include alerts that are automatically sent to network administrators
and describe the offending condition. Some products even watch for packet sequences representative of a hack attack.
Still other packages operate at a higher level of the network stack and have the ability to capture database-related
protocols such as Oracle's TNS.
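The decoding step these tools perform can be illustrated with a short sketch. The Python example below is purely hypothetical -- real sniffers capture traffic through privileged raw sockets, libpcap-style libraries or dedicated hardware -- so a hand-built IPv4 header stands in here for a captured packet:

```python
import struct

def parse_ipv4_header(packet: bytes) -> dict:
    """Decode the fixed 20-byte IPv4 header at the front of a captured packet."""
    (version_ihl, _tos, total_len, _ident, _flags_frag,
     ttl, proto, _checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", packet[:20])
    return {
        "version": version_ihl >> 4,
        "header_len": (version_ihl & 0x0F) * 4,  # IHL counts 32-bit words
        "total_len": total_len,
        "ttl": ttl,
        "protocol": proto,                       # 6 = TCP, 17 = UDP
        "src": ".".join(str(b) for b in src),
        "dst": ".".join(str(b) for b in dst),
    }

# A hand-built sample header standing in for a sniffed packet.
sample = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 0, 0, 64, 6, 0,
                     bytes([192, 168, 0, 1]), bytes([10, 0, 0, 7]))
info = parse_ipv4_header(sample)
```

Everything beyond this raw decoding -- thresholds, alerts, attack signatures -- is layered on top of fields like these.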
Software development
Classic debugging tools generally provide the oldest type of testing facility for developers. With the ability
to set breakpoints, step through a code sequence or set/display data, these tools continue to be the mainstay in
the developer's testing tool kit, regardless of whether the tool suite is a 3GL or one of the newer 4GLs. Add the
increasing sophistication of tool suites that scan code looking for usage errors not typically detected by compilers,
and the developer has a formidable arsenal of debugging tools.
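The stepping facility at the heart of these debuggers can itself be sketched with the tracing hook most scripting runtimes expose. The following is an illustrative example only (the function names are hypothetical, and no commercial debugger is implemented this simply), using Python's `sys.settrace` to record each line as it executes:

```python
import sys

def trace_lines(func, *args):
    """Run func under a trace hook, recording the offset of each source line
    executed -- the same mechanism a debugger uses to implement stepping."""
    executed = []

    def tracer(frame, event, arg):
        # Only record 'line' events inside the function under inspection.
        if event == "line" and frame.f_code is func.__code__:
            executed.append(frame.f_lineno - func.__code__.co_firstlineno)
        return tracer

    sys.settrace(tracer)
    try:
        result = func(*args)
    finally:
        sys.settrace(None)
    return result, executed

# A hypothetical routine to step through.
def triangle(n):
    total = 0
    for i in range(n):
        total += i + 1
    return total

result, executed = trace_lines(triangle, 3)
```

A breakpoint is the same idea with a condition attached: the hook pauses instead of logging when a designated line is reached.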
Automated QA tests
For those not familiar with automated testing, the
functionality typically falls into two general categories: regression and load testing. Many vendors apply different
names and offer variations on these themes.
Regression testing
Driven from the client perspective, the automated testing software "feeds" defined
user inputs into the application and verifies that the application's output matches the anticipated results.
The "defined user inputs" and the "anticipated results" are derived most
commonly through the use of the tool's scripting language. The base theory is that all subsystems involved in the
process are functioning accurately and consistently if the application is generating the correct output.
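Stripped to its essentials, a regression pass is a feed-inputs/compare-outputs loop. The sketch below is hypothetical -- the "application under test" is a stand-in function, not any vendor's tool -- but it shows the shape of what a 4Test-style script automates:

```python
def run_regression(app, cases):
    """Feed each defined input to the application under test and verify the
    actual output against the anticipated result, collecting mismatches."""
    failures = []
    for given, expected in cases:
        actual = app(given)
        if actual != expected:
            failures.append((given, expected, actual))
    return failures

# A hypothetical stand-in for the application under test.
def order_total(quantity):
    unit_price = 4.0
    return quantity * unit_price

# Defined user inputs paired with their anticipated results.
cases = [(0, 0.0), (1, 4.0), (3, 12.0)]
failures = run_regression(order_total, cases)
```

An empty failure list is the signal that the application -- and the subsystems beneath it -- produced the correct output for every defined input.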
Load testing
Most automated testing software
drives the application from the client perspective. With load or stress testing, however, the emphasis is on ensuring
acceptable response times to an increasing number of clients and verifying, to the extent possible, there are no
interaction or timing-related problems.
Perhaps more interesting than the debugging practice itself is what is being debugged. While Java has certainly
piqued the interest of the development community, it must be noted that research from SPG Analyst Services, Natick,
Mass., shows the percentages for Visual 3GLs and 4GLs are also increasing. In fact, respondents in mid- and large-size
companies predict the use of these languages in their companies will more than double over the next two years.
Additionally, the point spread between Java IDEs and Visual 3GLs or 4GLs disappears when respondents were asked
to rank the two classes by perceived importance. The numbers show the two environments were, in fact, dead even
at a 7.65 ranking out of a possible 10.
SPG Analyst Services believes that Java is filling the niche for a general purpose, portable language. Although
it is not commonly acknowledged, the language is fully capable of building a variety of application types and is
not restricted to Net-enabled systems. Unlike many 4GLs and scripting languages, Java is a 3GL. As such, Java is
fully capable of directly accessing operating system primitives (at the expense of portability); interacting with
hardware; or targeting non-traditional data sources such as object-oriented (OO) databases or realtime data feeds.
Java's designers also eliminated some problematic features associated with C++, such as GOTOs, automatic data
type coercions, preprocessing, operator overloading, pointers and others. Lastly, Java's support for automatic
memory management and garbage collection reduces the all-too-common problem of "memory leaks."
For many, Java is a better OO language than C++. Like C++, however, the difficulty of programming in Java can
limit its usability for whole classes of developers. While Java is a simpler programming language than C++, it
is still a 3GL and developers who know and love RAD 4GLs will soon see the difference. Happily, the Visual Java
tools have abstracted much of the programming, using technology such as wizards and templates, and, like 4GL RAD
tools, they ship with a variety of data-aware objects. This still may not be enough; at some point developers will
need to drop down to Java.
JavaBeans provides one solution to this issue when used as a bridge between the minority of technically savvy
Java developers and the business-savvy domain developers. The technically sophisticated developers will construct collections of JavaBeans. These Beans will subsequently be linked using a 4GL, script-based tool or even a highly abstracted visual Java tool to build complete applications.
Link testing
Even in basic Web sites, such as a home page with links to other local pages, things can go awry. Research on
the packet sniffer vendors turned up a site where some of the pages were perfectly readable and others were totally
undecipherable. The pages in error appeared to be text-based HTML documents. Imagine what a few home-grown applets
or a few inter-company links would have generated.
Web sites form the basis for the critical first impression since they are often the first point of contact between
a company and prospective buyers. Make it interesting, easy and fast, and the notoriously less-than-patient electronic
visitor will stay. Make it difficult to navigate, slow or error-prone, and with the click of a mouse the customer
is off to search the competitor's site. Interestingly, these situations represent lost opportunity costs and are
virtually impossible to measure. In the vast majority of cases, a company will never know if prospects leave because
the products are inappropriate or because the site itself is poorly designed with respect to images or content.
The industry can do little to help with a company's products or pricing, but it is helping with site maintenance.
Using tool suites that visually map the Web site and validate links, a Webmaster can ensure paths through the site
are relatively simple and remain intact. On the proactive side, a Webmaster can also trace visitors' paths through
the Web site, tracking metrics such as where they went and how long they stayed on a page. Some products, such
as Segue's Silk, will additionally capture the access time required to walk the page hierarchy. Such timing statistics
have generated more than a few surprises when the development effort was deployed.
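The link-validation half of these suites reduces to a simple idea: extract every anchor on every page and verify that each target resolves. The sketch below is a hypothetical stand-in that checks anchors against an in-memory map of pages, rather than fetching them over HTTP as a product like Silk would:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href target of every anchor tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def find_broken_links(site):
    """site maps each path to its HTML source; report links pointing nowhere."""
    broken = []
    for page, html in site.items():
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            if link not in site:
                broken.append((page, link))
    return broken

# A toy in-memory site; /pricing.html is referenced but never published.
site = {
    "/index.html": '<a href="/products.html">Products</a>',
    "/products.html": '<a href="/index.html">Home</a> <a href="/pricing.html">Pricing</a>',
}
broken = find_broken_links(site)
```

Timing each fetch during the same walk is how a suite adds the access-time statistics mentioned above.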
Many vendors and industry pundits believe the initial and simplistic use of the Web as a static information
warehouse has become antiquated, and they view such management tools as insignificant.
From their logic, it follows that most companies will progress beyond simple text and animation in their Web
applications to Web-enabled, complex transactional systems. According to this mindset, HTML documents and HTTP
will soon fade into oblivion.
Not likely. Have fat clients and ODBC completely given way to multitiered, partitioned applications and high-performance,
proprietary database drivers? Will they ever? How many DOS users are still out there? For most of us, the only
three things we can count on in life are death, taxes and DOS. Yet no one would seriously argue that two-tier client/server
is effective for large, complex distributed systems. Nor would anyone argue that ODBC outperforms native database
drivers. DOS does not provide better user interaction than Windows, yet it lives
on. HTML and HTTP will also survive the tests of time.
According to Sally Cusack, senior analyst, SPG Analyst Services, "for typical business systems, there will
be a move to more sophisticated functionality. But even in these systems, HTML over HTTP will be used to invoke
that first Java applet or ORB. More importantly, our research shows that browsing for content will not disappear
with the advent of Net-based transactional systems. To the contrary, the amount of static content on the Web will
continue to increase. For this type of information brokering, the ubiquity and simplicity of HTTP and HTML are
all that is required," argued Cusack.
Cross-discipline cooperation
In spite of productivity advances, the industry has yet to discover how to generate error-free software. Software
vendors and I/S organizations face this predicament; it appears to go with the territory. Add today's accelerated
rate of development spurred by time-to-market considerations, and product release decisions become even more complicated.
There is not a single solution to the testing issue. Many of today's systems are just too complex for one person,
or even a development team, to understand the ramifications and interactions of all the hardware and software subsystems.
These subsystems are specialized in functionality, yet for the most part they operate in concert. The I/S teams
that develop and care for these subsystems must perform likewise. Each team, be it development, QA or operations,
can provide knowledge and experience that will make the delivered application both usable and reliable. Said Oliver
Stone, director of Internet products at Rational Software Corp., Santa Clara, Calif., "The Web is the only
environment I know of where real success can become a quality nightmare." Preparing applications for that
success will require cross-discipline efforts from the entire I/S team, and high-performance testing tools are available
to support the endeavor.
Representative Automated Testing Vendors
- Autotester
- Bellcore
- Cyrano
- Mercury Interactive
- Radview Software
- Rational
- RSW Software
- Segue Software
- Softbridge
- Sun Microsystems
Representative Network Analysis Vendors
- Accrue Software
- Cinco Networks
- Computer Associates
- Compuware
- Hewlett-Packard
- Network General
- Optimal Networks
- Tivoli
Representative Software Development Tool Vendors
- McCabe
- NuMega
- Parasoft
- Platinum
- Rational
- Sun Microsystems
Segue Software at AT&T
Even though its name has changed over the years, AT&T built a telephone network that is the most reliable electronic phenomenon in the world. Today, like all companies, AT&T is constantly looking for competitive openings. E-commerce was a natural for this company, but what evolved was E-commerce with a definite AT&T twist.
AT&T's new SecureBuy electronic storefront (www.securebuy.com) was designed for companies that want to conduct business over the Internet but, for various reasons, cannot or do not develop the complex transactional software essential to this environment. Mike Maney, E-commerce spokesman for AT&T, headquartered in New York City, noted that "this endeavor was almost a natural for AT&T. We've been bringing consumers and business together via voice lines for a long time. With SecureBuy, we're changing the medium to the Internet. That's the idea, but obviously it's not quite that simple. We supply the electronic storefront where our merchants can display their catalogs, we provide the transaction engine, we take care of authenticating and ensuring the security of credit card purchases. We can even help our merchants design their Web pages. And once an order's been placed, our electronic customers can come back to check the status of their order -- what's been shipped, what's back-ordered, etc.," explained Maney.
No pressure
This is a high-exposure application by any metric. AT&T's name is on the storefront, so browsing consumers know that AT&T is driving the transaction. Subscribing merchants, like Rubbermaid and Better Homes and Gardens, are also investing in the AT&T reputation for reliability and dependability.
Sharing the responsibility for the quality of the underlying software is David Sherman, technical manager, service testing and market trial integration at AT&T's Internet applications services laboratory. While Sherman's title is a mouthful, his task is easily defined -- make sure the software works correctly. No pressure there!
Said Sherman, "these E-commerce applications are so large and so sprawling, there's just no way to test these systems without automation." A long-time believer in automated testing, Sherman initially found the market offered several client/server testing tools, but proposed few serious Internet-enabled tool suites. After looking for over a year, AT&T Labs linked up with Segue Software and its Silk product, an Internet-enabled automated load/regression/link testing tool suite. According to Sherman, "my group's not large. We needed to multiply our productivity through automation products like Silk and cover our applications as comprehensively as possible. Luckily, my management has supported both my group and the concept of automated testing. And it's paid off. We've found everything from broken links to bugs in the browsers ... all of which would have impacted our applications even though we weren't directly responsible. Of course, we're constantly staging tests to stay ahead of the curve in terms of users and transactions, and to make sure our engine can handle the load," added Sherman.
The bottom line
AT&T is a microcosm of the E-commerce evolution as it symbolizes new business opportunities presented by this unique medium. For companies like AT&T that are willing to step into the ring, the true potential is just beginning to appear. For example, the company already envisions expanding its line of services.
The inherent risk in using what still amounts to an emerging technology is one of the downsides. AT&T's reputation for quality and reliability is on the line. Still, the investment in Sherman's group and the Silk automated testing facility is but one example of the commitment to maintain that reputation.