In-Depth

ASQ is more than testing tools

True or false: Automated testing is the only way to ensure the quality of an application. The answer depends on whom you ask and the depth of his or her experience. Software developers are told over and over that testing is the last chance to "get it right" prior to deployment.

Yet a number of tools can be used to improve the quality of business applications long before the testing process begins and, in many cases, long before a line of code is written. "You've got to take the definition of quality and make it much broader," said Carl Zetie, senior industry analyst at Giga Information Group, Cambridge, Mass. "Some organizations unfortunately equate quality with software that doesn't crash. Instead, they should think about 'fitness for purpose' and trace development activities back to requirements."

Development organizations that understand requirements can effectively model apps and build test cases that accurately validate the features/functions that meet their business needs, say observers. Applying automated software quality (ASQ) disciplines early in the development life cycle can improve the quality of the application and, over the long run, reduce correction and rework.

"Statistically, 60% to 70% of application failures can be attributed to problems early in the life cycle," said Graham Titterington, senior analyst at Ovum Inc., a London-based consulting firm. "The most important aspect of quality application delivery is using the correct process that helps provide early notification of a problem."

For many programmers, though, "process" provokes a reaction not unlike the one "fur" provokes in an animal rights activist. Faced with time-to-market pressures, many developers view "process" as a stumbling block to creativity.

"Software developers have an aversion to process. They think about programming as a craft," said Randy Rice, president of Rice Consulting Services, Oklahoma City. "A consistent approach is needed because quirky software leads to quirky testing, and non-standard disciplines cause problems to ripple outward from the beginning of the project."

Thus, many consultants do away with the term "process" in favor of monikers like "framework" and "workbench." Or perhaps "methodology," as is the case at Stockholm, Sweden-based telecommunications giant Ericsson. The company's Network Core Products group, which develops hardware and software components, relies on an automated development methodology that flows from requirements through development and testing, and into production deployment. "The need for an automated methodology is very important for the delivery of quality products," said Lars Taxen, a member of the Ericsson team responsible for implementation and support of new network products. "It is especially important for us because we have 10,000 design engineers."

It is also important because development requirements in the highly competitive telecommunications market are never fixed, requiring flexibility without sacrificing quality, noted Taxen. While fluid requirements can give most programmers fits, Taxen said the eMatrix toolset from MatrixOne Inc., Chelmsford, Mass., has let Ericsson implement automated collaborative and incremental development practices while tracing requirements through modeling and testing.

"Incremental development is guided by requirements and allows us to develop and test applications in smaller, more meaningful portions," Taxen said. "We then expand the methodology to test greater numbers of increments prior to a product rollout."

In effect, Ericsson expanded the definition of automated software quality by wrapping the disciplines of requirements traceability, modeling, incremental development and testing into a single comprehensive methodology.

Defining the requirements

In the best of worlds, the quality engineer would be involved as early as possible in the application development life cycle - usually during the requirements definition stage - because open communication between the QA and development teams helps build product components that are truly "testable," said Giga's Zetie. "How can you test a requirement that reads 'the product must be easy to install'?" he said.

Just as the quality of an application is dependent upon extensive testing, the quality of testing is dependent on the extent of the requirements. "Testers must build test cases based on application requirements," said Mary Walker, knowledge engineer at Logica North America, Lexington, Mass. "Otherwise, there's a lack of communication on what actually needs to be validated." Using the RequisitePro toolset from Rational Software Corp., Cupertino, Calif., Logica records application requirements from a variety of sources, including clients, field personnel, technicians and others. These recorded requirements are reviewed by software architects, developers and quality assurance staff in an effort to determine resource needs and - deemed equally important by the company - testing requirements.

"As part of the process, testing requirements are defined and approved based upon application requirements," said Walker. "No code is written prior to review and approval by all groups. The added communication helps everyone build the best quality product possible."

Another company moving automated quality disciplines earlier in the development life cycle is ESPS, a developer of custom publishing software for a number of industries that must meet strict government regulation demands. "We require a much more structured development environment," explained Bob Schatz, executive director of development and operations at the Ft. Washington, Pa., firm. "It was critical that we capture all of our requirements and link them with development and testing activities."

The need for requirements traceability became evident for ESPS shortly after the first release of its product, Core Dossier. According to Schatz, ESPS must respond to the guidelines and auditing requirements of its customers, which are directed by government entities such as the Food and Drug Administration (FDA) or the Environmental Protection Agency (EPA). "We had a long way to go after the first release of our product," he said. "Instead of trying to find a way to re-create requirements from memory, we implemented a tool that helped us reverse engineer our existing functionality and build a baseline of requirements."

ESPS implemented the DOORS toolset from Quality Systems and Software Inc., Mt. Arlington, N.J., to reverse engineer existing software into functional requirements. Additionally, the company divided and defined requirements based on product lines, first creating a baseline for the core components contained in each product, and then creating and linking separate requirements for the various attributes of each individual product.

"While we maintain one set of requirements for the base product, the requirements differ across the add-on products," Schatz said. "Now they are all linked in the DOORS database so that we can immediately learn the impact of changes across different product lines."

According to Ovum's Titterington, the value of requirements management tools can only be fully leveraged when open communication exists between the various IT organizations. "It's part of an inspection and review process where objectives are clearly stated, documented, understood and followed through," he said.

Titterington also noted that while automated integration with modeling and test-planning products does improve the software delivery methodology, manual decision points still exist along the way. For example, while Web front ends to such tools allow more users to enter requirements, today's tools cannot determine whether a requirement is useful or contradictory. "Even with automation, a decision-maker must be involved," he said.

Application modeling

The application modeling discipline pushes automated software quality still earlier in the development life cycle. Typically used after requirements are created, modeling tools can support rapid application development and help relieve ever-present time-to-market demands. "The ultimate dream is to have a modeling tool produce 100% fully functional code," said Mike Budd, a senior consultant at Ovum. "A good tool will provide anywhere from 70% to 80% of code generation."

Modeling can be viewed from several different angles, each with the goal of reducing app development time and helping to promote reuse of app components.

From the development perspective, modeling helps define the application flow while generating and managing source code and objects. From the business user's perspective, modeling provides an opportunity to define business functions that require underlying code. An example of the latter approach is Business Process Template (BPT) from Template Software Inc., Dulles, Va., which is said to help organizations capitalize on business knowledge contained within the organization by graphically depicting processes.

"Reusability is the key," said Joe Damassa, vice president of application development marketing in the Software Solutions Division at IBM, Somers, N.Y. "Crucial to the delivery of quality applications is ease of use in application development tools and processes."

Damassa said that ease of use and reusability are the keys to IBM's oft-changed SanFrancisco e-business framework, which is gaining support from third-party tool makers like Princeton, N.J.-based Princeton Softech and its Select Enterprise modeling toolset. Through an automated pattern transformer, Select Enterprise applies the standard IBM SanFrancisco code-generation rules to a high-level model and generates a "design"-level view. This design-level view can be reviewed and modified prior to code generation.

"A fully expanded view of the application prior to code generation or changes goes a long way to improving the quality of an application," said Ovum's Budd. "In effect, good use of a modeling tool can help produce a diagrammatic view of the tasks required by the business. Modeling helps you define the 'what' of the system, not the 'how.'"

Test planning tools: Expanded capabilities

At the core of automated quality disciplines reside test-planning tools. Over the years, these tools have evolved to the point where the name "test planning" no longer adequately describes the capabilities of most products.

For example, requirements management tools such as Quality Systems and Software's DOORS and Rational's RequisitePro today have interfaces with test-planning products such as TestDirector from Mercury Interactive Corp., San Jose, Calif., and Rational's SQA Manager. These interfaces are designed to provide two-way communications so that changes made in one product are automatically reflected in the other. "Bringing requirements right into test planning represents a dramatic improvement in quality assurance capabilities," said Ovum's Titterington.
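
The two-way interface described here can be pictured as a simple reconciliation: whichever side most recently changed a shared item propagates its copy to the other. The record layout and update counters below are hypothetical and are not based on the internals of DOORS, RequisitePro, TestDirector or SQA Manager.

    # Hypothetical sketch of two-way synchronization between a requirements
    # repository and a test-planning repository, keyed by requirement ID.
    req_tool  = {"REQ-001": {"text": "Customer can transfer funds", "updated": 20}}
    test_tool = {"REQ-001": {"text": "Customer can transfer funds between own accounts",
                             "updated": 35}}

    def synchronize(side_a, side_b):
        """Propagate the most recently updated copy of each shared item to both sides."""
        for key in set(side_a) | set(side_b):
            a, b = side_a.get(key), side_b.get(key)
            if a is None:
                side_a[key] = dict(b)
            elif b is None:
                side_b[key] = dict(a)
            elif a["updated"] >= b["updated"]:
                side_b[key] = dict(a)
            else:
                side_a[key] = dict(b)

    synchronize(req_tool, test_tool)
    print(req_tool["REQ-001"]["text"])   # both sides now carry the newer wording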

"Communication between the technologies provides an ideal means for eliminating any interface constraints," added Giga's Zetie. "It allows for interface dependence and collaborative conversions as more people can access the requirements and review the test results."

Outside of the requirements realm, though, integration with other technologies enables test-planning tools to tout support for functions such as version management, test-data generation, systems management testing, defect tracking and an array of reporting capabilities. Mercury Interactive's TestDirector includes a Web interface that can be used to view all aspects of an app from requirements to test results. This integration becomes especially important for those firms deploying packaged apps where requirements and application models are unavailable. Furthermore, these functions can provide a centralized view of quality assurance activities.
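
That kind of centralized view amounts to joining requirements, test results and defect records on a common key, as the short sketch below suggests. The record layouts and field names are hypothetical and do not describe TestDirector or any other product.

    # Hypothetical sketch: a consolidated quality view built by joining
    # requirements, test results and defects on the requirement ID.
    requirements = {"REQ-001": "Customer can pay bills online"}
    test_results = {"REQ-001": "failed"}
    defects      = {"REQ-001": ["DEF-0042: payment date off by one day"]}

    def quality_view(req_id):
        return {
            "requirement": requirements.get(req_id, "unknown"),
            "last test run": test_results.get(req_id, "not yet executed"),
            "open defects": defects.get(req_id, []),
        }

    for field, value in quality_view("REQ-001").items():
        print(f"{field}: {value}")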

The "P" word

Whether it is called a process, a framework or a methodology, organizations serious about quality application deployment have implemented steps to ensure communication between various IT groups, while instilling the disciplines needed to meet business requirements.

"Regardless of how much of an aversion there is to formal, tool-driven methods," said Zetie, "there are procedures already in place by most organizations whether they use E-mail, meetings or MS-Word documents."

At People's Bank in Bridgeport, Conn., the use of a non-tool-driven process enabled the firm to successfully and automatically test its new Web-based banking system - complete with fully active streaming video - using the e-Test Suite from RSW Software Inc., Waltham, Mass. "We first identified and reviewed our requirements," explained Steve Guidone, principal in People's Bank's Emerging Technologies department. "Then we built our test cases based on those requirements and loaded them into the testing tool."

Some of those requirements included validation of typical customer tasks such as reviewing an account balance, paying bills and transferring funds. They also included validation of architectural requirements to ensure that the system could handle anticipated loads. "We began with testing some of the more non-intrusive functions and rippled the process outward, adding more as we went along," said Guidone. "The goal was to mimic the application and business requirements."
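
The requirement-driven functional and load checks Guidone describes can be approximated in a brief sketch. The task names, expected content and user count below are hypothetical, and the placeholder fetch() stands in for the HTTP requests that a tool such as e-Test Suite would actually drive.

    # Hypothetical sketch of requirement-driven functional and load checks.
    from concurrent.futures import ThreadPoolExecutor

    def fetch(task):
        """Placeholder for a real HTTP request to the banking front end."""
        return f"<html>{task} ok</html>"

    def test_view_balance():
        # Functional requirement: a customer can review an account balance.
        assert "balance ok" in fetch("balance")

    def load_test(task, concurrent_users=50):
        # Architectural requirement: the system handles the anticipated load.
        with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
            results = list(pool.map(fetch, [task] * concurrent_users))
        return all(f"{task} ok" in page for page in results)

    test_view_balance()
    print("load test passed:", load_test("balance"))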

In more formal situations, disciplines can be driven by tools such as MatrixOne's eMatrix or IS*Integrity from Chain Link Technologies Inc., Sunnyvale, Calif., which uses a Web interface to route requests for change through the actual change process. This interface automatically links items such as requirements and defects with approvals for production deployment.
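
That request-for-change flow can be sketched as a simple gate: a change is approved for deployment only when it is linked to a requirement or defect and its tests have passed. The field names below are hypothetical and do not describe the IS*Integrity or eMatrix interfaces.

    # Hypothetical sketch of a change-request gate for production deployment.
    change_request = {
        "id": "CR-314",
        "linked_items": ["REQ-001", "DEF-0042"],   # requirements and defects it addresses
        "tests_passed": True,
        "approved": False,
    }

    def review(cr):
        """Approve deployment only if the change is traceable and tested."""
        cr["approved"] = bool(cr["linked_items"]) and cr["tests_passed"]
        return cr["approved"]

    print("deploy CR-314:", review(change_request))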

Disciplines can also be embraced with IBM's SanFrancisco which, according to the firm's Damassa, is delivered with a process and a workbook wrapped around development tools. "The goal is to take people who are not very skilled in object and Java technologies and have them learn how to quickly deliver new applications that contain the quality needed by the business," he said.

But care must be taken when seeking tools, contends Rice at Rice Consulting Services, because tools cannot always be made to fit a process. "Although a consistent approach is needed, it seems that most people have a product-driven thought process," he said. "The questions always seem to evolve toward 'What are the features and functions?'"

Ovum's Titterington agreed, adding that IT organizations must avoid the distinct danger of becoming enamored of technologies without a good methodology already in place. "Building a process around your tools will help you end up in tears," said Titterington. "The tools should be selected based on your requirements as opposed to making your requirements meet the functions of a tool."

Automating software quality

So, is automated testing the only way to ensure the quality of an application? No. Instead, automated testing represents a portion of a comprehensive automated software quality practice that includes requirements definition and tracking, app modeling, test planning and execution, defect tracking and change management.

Equally important, although automation can help guide development activities from concept to finished product, the delivery of business systems will still require vital manual intervention at key decision points.

"You can't automate design or requirements differences," said Titterington. "And you can't automate the creation of the business plans, marketing plans or product launch plans. Automation can assist [you], but it cannot replace people."