Testing SAP apps is key to the implementation process

ERP applications are supposed to help large companies solve problems by integrating general functions, avoiding the year 2000 problem and saving users time in building these huge integrated applications. Yet companies migrating to these packages are cautious about jumping on the ERP bandwagon. Users want to know that their ERP applications will do exactly what they are intended to do: handle thousands of users and work with other legacy, third-party and homegrown applications to ensure smooth operation. This has created a set of test issues that is both familiar and new.

Tool vendors have come out with software to test these large ERP applications. But are they necessary? Don't the ERP vendors themselves test products before they go to market? They do. To ease user concerns, Walldorf, Germany-based SAP AG offers tools for systematic testing in its R/3 ERP suite. The R/3 Test Workbench, which consists of the Computer Aided Test Tool (Catt) and a test planning tool, supports the organization of test procedures. It manages all relevant test activities inherent to R/3 -- Catt scripts, test instructions for manual tests, or scripts belonging to external tools -- in the form of a test catalog.

Catt systematically tests SAP transactions in R/3. And it allows customer-specific modifications to standard transactions to be tested during development, as well as when a release is upgraded. It also checks for any side effects resulting from changes to the system configuration. Catt contains a set of functions for creating, starting, managing and logging test procedures.

So what more can third-party testing tools do? Third-party tools do not test the ERP system itself. Instead, they test the way in which the ERP system is tied in with an end user's other systems or infrastructure. But as Dick Heiman, an analyst for Framingham, Mass.-based International Data Corp. (IDC), points out, no ERP system is truly an off-the-shelf product -- they all require a good deal of customization. "It amounts to doing internal development and programming," he said. "Thus the process is susceptible to bugs and needs testing."

The problem only grows as use of SAP solutions increases. "Growth rates [for packaged applications] have been pretty phenomenal," noted Heiman. Worldwide testing tools revenue for 1997 was $645 million, according to IDC, which projects a 20% compound annual growth rate through 2002. In comparison, the overall software market for 1997 was $118.5 billion, with a projected compound annual growth rate of 14% through 2002.

Heiman attributes the growth in testing tools to the increased importance of software, the Web and the increased complexity of client/server networks. In addition, he said, there is an increasing backlog of applications to be developed, not enough people to develop them, and a need for applications to work properly when they are released.


There are three main kinds of testing: functionality, load or stress, and regression testing. Functionality testing addresses the issue of whether or not the application does what it was intended to do. This involves testing the specific functions to be accomplished.

Load or stress testing examines what happens when an application is under a load. For instance, an application may work fine with one or two users, but what happens when there are 1,000 or 2,000 users? These testing tools simulate or provide virtual users to test these scenarios. This also helps users to determine how much of a load is too much.
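The ramp-up such tools perform can be pictured with a minimal sketch. Everything here is hypothetical: threads stand in for the virtual users a commercial tool would supply, and `run_transaction` is a placeholder, not any real ERP call.

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def run_transaction(user_id):
    """Stand-in for one virtual user's transaction (e.g. an order entry).
    A real load-testing tool would drive the actual client or server here."""
    start = time.perf_counter()
    time.sleep(0.01)  # placeholder for the real request/response round trip
    return time.perf_counter() - start

def load_test(virtual_users):
    """Run one transaction per virtual user concurrently; report timings."""
    with ThreadPoolExecutor(max_workers=virtual_users) as pool:
        timings = list(pool.map(run_transaction, range(virtual_users)))
    return {"users": virtual_users,
            "mean_s": statistics.mean(timings),
            "max_s": max(timings)}

# Ramp the load up to find the point where response times degrade.
for n in (10, 100, 500):
    print(load_test(n))
```

Watching the mean and worst-case timings climb as the user count ramps is exactly how "how much of a load is too much" gets answered.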

Regression testing is based on the fact that nothing ever stays the same. There are always modifications, such as new or revised software versions and customer-initiated changes. "When you make a change, you can have some unintended side effects," Heiman said. Regression testing ensures that things that used to work still do and that the system has not regressed.

According to Jim Hare, product marketing manager for packaged applications at Mercury Interactive Corp., Sunnyvale, Calif., there are five challenges in testing packaged applications:

  • EACH IMPLEMENTATION IS UNIQUE. Modifications and enhancements are often what cause problems.
  • SYSTEMS ARE NOT ISLANDS UNTO THEMSELVES. The systems need to be able to tie in legacy, third-party and homegrown applications.
  • MANY ERP SYSTEMS ARE GEOGRAPHICALLY DISPERSED, MANDATING A 24X7 OPERATION. Before migrating to a packaged application, companies and end users need to understand what will happen throughout the day as different people log on and off.
  • PEOPLE WANT TO GO "LIVE" AS SOON AS POSSIBLE. Testing tools provide a way of managing that whole process.
  • THERE ARE A LOT OF ALTERNATIVE FRONT ENDS OUT THERE, SUCH AS THE WEB, SELF-SERVICE APPLICATIONS AND JAVA. Testing tools provide a way to test these alternative delivery mechanisms.

While these challenges confront packaged application customers daily, Joshua Greenbaum, principal, Enterprise Applications Consulting, Berkeley, Calif., believes that the overall issue in testing packaged applications is convincing customers of the value of testing.

"Customers try to get the implementations done first and then when things don't go right, they try testing," he said. Instead of using testing as a last-minute alternative, ERP customers need to think about testing as a primary function within the implementation process. "Customers need to treat testing as a preventive measure, not as an after-the-fact, pain-relief measure," he added.

To help organizations address this, software vendors such as Dallas-based AutoTester Inc. are taking a quality assurance (QA) approach to the problem. The whole reason companies go through testing, said Randy Jesberg, AutoTester's packaged ERP product line manager, is to answer the following questions: "Can the system support our business?" and "Are we going to be able to support our business process once we put the application into production?"

Companies purchasing ERP packages typically focus on test execution, assuming they already know what to test. However, "a lot of shops don't really know what to test," noted Jesberg. Because of this, AutoTester has decided that QA is more important than the testing tool itself. IDC's Heiman agrees. "You can buy me the most expensive paintbrush in the world," he said, "but I'm still not going to be Rembrandt." Rather than simply selling a testing tool, vendors are now also showing companies how to set it up and use it.

The problem is familiarizing customers with the differences between QA and testing, as many believe the two are synonymous. Jesberg is quick to point out that they are not. "Testing is a set of events," he explained. "QA is a process that involves test planning, test creation, test environment preparation and definition, test execution, and results measuring and reporting.

"From an ERP standpoint," continued Jesberg, "a well-defined QA process is key to obtaining the benefits of test automation. Another key is having a QA presence on the project that is separate from the test group, the testing event." It also helps, he said, to maintain a pure test environment and to have well-defined tests that represent the way the system will be used in production.

"The QA methodology provides a basis for measuring the effectiveness of the testing effort, allows configuration and test groups to coordinate, and eliminates the rework involved with invalid test cycles," he noted.

The biggest challenge in testing an ERP system, according to Jesberg, is making sure that the people brought together by a testing effort understand each other well enough to perform a test that is meaningful on the system. "It requires technologists and end users to work together," he said.

The answer to testing performance and/or functionality where diverse applications overlap, he said, is planning ahead of time and identifying the data dependencies that exist. "If you're testing ERP and other legacy systems, you need to be able to have one system hand off information to another, and have something to drive both systems," said Jesberg. In order to test performance, he continued, every layer involved needs to be tested. "You need to drive the system from the very end node where the end user is sitting," he said.
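One way to picture that hand-off is a small driver script with entirely hypothetical stubs for the two systems, where the identifier created on the ERP side is a hard data dependency of the legacy side:

```python
# Toy "driver" that sequences two systems and carries the data dependency:
# the order ID created by the ERP stub must flow into the legacy billing stub.

def erp_create_order(sku, qty):
    """Stub for system 1 (the ERP side)."""
    return {"order_id": f"ORD-{sku}-{qty}", "sku": sku, "qty": qty}

def legacy_bill_order(order_id, qty, unit_price):
    """Stub for system 2 (the legacy billing side)."""
    return {"order_id": order_id, "amount": qty * unit_price}

def end_to_end(sku, qty, unit_price):
    """Drive both systems, handing system 1's output to system 2."""
    order = erp_create_order(sku, qty)
    invoice = legacy_bill_order(order["order_id"], order["qty"], unit_price)
    assert invoice["order_id"] == order["order_id"]  # dependency satisfied
    return invoice

print(end_to_end("A100", 5, 2.0))
```

Identifying dependencies like `order_id` up front is the planning step Jesberg describes; the driver is the "something to drive both systems."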

Enterprise Applications Consulting's Greenbaum added that this can be very complicated; testing where applications mesh depends on what kind of interface is being used. "[Mercury Interactive's] LoadRunner and others create sort of a dummy test set and ram it through the interface to make sure that interface holds up under the load," he said. The key to testing application programming interfaces (APIs), he added, is ensuring that data moves quickly through the interface without error and without tying up the network or application.

AutoTester's Jesberg agreed: "In testing for APIs, the main thing is to make sure you're going through every level, every layer that's going to be involved. APIs need to be driven the same way they're going to be used once they're in production."
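The contrast Jesberg describes -- driving an API through the same layers production uses, rather than poking an internal shortcut -- can be sketched like this (hypothetical functions, not any real system's API):

```python
def _compute_tax(amount):
    """Internal layer: a helper production code never calls directly."""
    return round(amount * 0.07, 2)

def create_invoice(amount):
    """Public entry point: validation + tax + assembly, the layer
    production traffic actually goes through."""
    if amount <= 0:
        raise ValueError("amount must be positive")
    tax = _compute_tax(amount)
    return {"net": amount, "tax": tax, "gross": amount + tax}

# Driving only _compute_tax would miss validation and assembly bugs;
# the test enters through create_invoice, as production would.
print(create_invoice(100.0))
```

A test that only exercised `_compute_tax` would pass even if `create_invoice` were broken, which is precisely the gap "every level, every layer" testing closes.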

Lessons learned

Numerous lessons have been learned through testing ERP applications. Ken Battles, senior systems analyst/SAP project manager for Siemens Telecom Networks, Lake Mary, Fla., used Mercury Interactive's LoadRunner to perform stress tests on Siemens' SAP production system prior to cutover. Once the tests were completed, he found there was not a proper distribution of users across application servers. And after identifying which transactions took a large number of database hits, Battles was able to pinpoint those areas that required more efficient database indexing.

Battles' biggest challenge in load testing the firm's SAP production system was setting up a test environment that would simulate the concurrent running of 45 key business transactions with up to 350 users. The LoadRunner tool made it possible to meet this challenge with a team of 10 people in five weeks, he said.

Siemens Telecom is taking a "big bang" implementation approach that will use most of SAP's R/3 modules all at once. "We're flipping the switch with greater confidence that our application and database servers can handle our production load," Battles explained.

Pete Nitchman, technical operations manager at the Glass Technologies Division of Osram Sylvania, the light bulb manufacturer headquartered in Danvers, Mass., learned that he could never do enough testing. "There's always something that comes up that you hadn't thought of, or you say, 'Well, we don't really need to,' and that's usually the gotcha," he explained.

Osram Sylvania used AutoTester from AutoTester Inc. to make changes to SAP data when it needed to change the receiving location for all of its production materials. "I changed 400 finished goods in about 15 minutes instead of having two or three people involved," Nitchman said. "It would have been very time-consuming to do it manually."
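The kind of scripted mass change Nitchman describes can be pictured with a toy sketch. The record layout here is invented for illustration; a real tool would drive SAP transactions, not a dictionary.

```python
# Hypothetical sketch of scripted mass maintenance: apply one field change
# across many material records instead of keying each change by hand.

materials = {f"FG-{i:03d}": {"receiving_loc": "PLANT-1"} for i in range(400)}

def change_receiving_location(records, new_loc):
    """Set the receiving location on every record; return the count changed."""
    for rec in records.values():
        rec["receiving_loc"] = new_loc
    return len(records)

changed = change_receiving_location(materials, "PLANT-2")
print(f"updated {changed} finished goods")
```

The time savings come from exactly this shape of loop: one scripted pass over 400 records versus two or three people keying changes manually.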

Osram Sylvania has also used AutoTester for regression testing. The company has been bringing SAP up a division at a time. Osram Sylvania is also undergoing a corporate realignment and picking up a plant that has not yet gone live on SAP, Nitchman said. The firm will use AutoTester on that project, and intends to take advantage of AutoTester's QA methodologies.

When asked what his biggest challenge is in testing, Nitchman replied: "[The] ability to do as much as we need to do with the resources we have."

'Good insurance'

Enterprise Applications Consulting's Greenbaum noted three major lessons ERP testers have learned. First, testing is a cost-effective way to avoid having an extremely expensive implementation go-live failure. "It's good insurance," he said. Testing also lets users find more efficient ways to fix bottlenecks. And finally, efficiency and effectiveness in the system are extremely important.

AutoTester's Jesberg said there is no such thing as starting too early. "Don't think that test automation is going to eliminate the need for the up-front work that you need to do in order to get the most out of test automation," he warned. "Up-front planning and organization are commonly overlooked." Another lesson, he added, is to involve end users in the planning process, in creating scripts and in executing tests.

Mercury Interactive's Hare said users should be careful not to customize without understanding. "Understand the real business environment so that you can create a realistic load test," he said. There will be patches and last-minute changes, even when going live, added Hare. "Plan for those changes. Make sure that doesn't trip you up."