Columns

Snapshots from the Web front -- developers find performance counts

Okay, so you have a Web site. It seems to be functioning smoothly, and the CEO is impressed with the colors and graphics. The sales and marketing staff call you daily with suggestions (some are valid, and some are apparently drawn from last night's Star Trek episode), but your real challenge is how to link this new advertising and information tool into the company's business and, subsequently, its business systems.

Developers across the country may take heart from the fact that they are not alone in this challenge. Consider this: when Software Productivity Group, Natick, Mass., surveyed Fortune 2000 users as to how long their organizations have been using Internet/Intranet development tools, more than two-thirds (68.1%) indicated less than one year. Another 24.8% responded less than two years, 4.8% have implemented these tools over the past two-to-three years and a minuscule 2.4% have been actively utilizing Internet development tools for three-plus years. There is not a wealth of experience to draw from in this area.

While Internet/Intranet technology is clearly in the early phases of adoption, it is accelerating at a phenomenal rate. However, there is one striking differentiator between Internet/Intranet technology and the vast majority of other computing technologies -- the active involvement of the business managers. This creates an interesting dilemma for information systems (I/S) managers. On one hand, I/S departments have been mandated (and even funded) by top brass to exploit this technology. On the other hand, the technology itself is immature and still evolving. The challenge for I/S is to successfully walk the high-wire by selecting Internet tools that will not only support current efforts, but extend far into the future as well. All this must be done under the umbrella of showing almost immediate ROI to the business itself. Sounds like fun, right?

While companies are scrambling to develop new Web-enabled, front-end systems, these initiatives are often completed in such haste that the "big picture" business benefits the corporation could derive from this technology are never thought through. As such, users must be careful that the tools they select can not only accomplish the quick-fix efforts, but enable longer-term initiatives as well. Fortunately, there is hope on the horizon. Vendors are striving to create products and to differentiate themselves by rapidly meeting user needs in this arena. However, our data shows that more than half (55.2%) of respondents to the SPG survey cite "immature technology and lack of standards" as the primary disadvantage in using Internet/Intranet tools today.

In fact, in order of importance, the 300+ developers in SPG's survey list the disadvantages associated with using the I-net tools as:

  1. Immature technology and lack of standards;

  2. Too many products -- market confusion;

  3. Lack of security mechanisms for ensuring data integrity, application integrity and authentication;

  4. Lack of support for rigorous development techniques; and

  5. Limitations in a "stateless" Web environment (a minimal workaround is sketched after this list).
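That last item deserves a moment of explanation. Because each HTTP request arrives with no memory of the one before it, a Web application that needs continuity (a shopping cart, a trading session) must manufacture its own state. The sketch below is a minimal Python illustration of the common cookie-based session-token technique, with invented names (SESSIONS, the sid cookie); it is a sketch of the idea, not any particular vendor's solution.

    import uuid
    from http.cookies import SimpleCookie
    from http.server import BaseHTTPRequestHandler, HTTPServer

    SESSIONS = {}  # session id -> state the server keeps between requests

    class StatefulHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # HTTP itself remembers nothing, so we look for a token the
            # client carries back to us in a cookie.
            cookie = SimpleCookie(self.headers.get("Cookie", ""))
            sid = cookie["sid"].value if "sid" in cookie else None
            if sid not in SESSIONS:
                sid = uuid.uuid4().hex          # first visit: mint a token
                SESSIONS[sid] = {"hits": 0}
            SESSIONS[sid]["hits"] += 1
            self.send_response(200)
            self.send_header("Set-Cookie", "sid=" + sid)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            body = "visits this session: %d\n" % SESSIONS[sid]["hits"]
            self.wfile.write(body.encode())

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), StatefulHandler).serve_forever()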

Users also referred to product cost, lack of team development and life-cycle management, inability to scale to mission-critical systems, and weak development interfaces and environments as major inhibitors to using these tools. Further down the list, but nonetheless important to consider, is the fact that the generated applications often exhibit poor performance and are difficult to maintain. The reason these latter categories are not ranked higher, we suspect, is that most users are in the first phases of their Internet/Intranet initiatives; few have actually created full-blown distributed applications for widespread use in this environment.

One company that has had success in the world of mission-critical Internet development projects is Charles Schwab & Co. Inc. The financial services giant, headquartered in San Francisco, was early in the market with full-blown, Internet-based services for its substantial customer base. Ron Welf, manager of network capacity planning at Schwab, related some of the concerns and issues he has faced daily since the arrival of I-net.

Welf states up front that the Internet "changed my life dramatically -- it really hit six months ago when we experienced significant volumes on our Web trading applications." Schwab has received many kudos from trade press and customers alike for its quick-to-market delivery of a Web front-ended trading system that empowered customers to place orders directly from their PCs. According to Welf, one of the paramount issues involved in creating this type of world-class trading system is performance. "We immediately became aware of the performance and response experienced by our customers. This provides a total quality of service focus for I/S. Our intention is not to be only visual and user friendly, but maintain peak response times as well," said Welf.

Internal encryption

The I-net trading system at Schwab utilized many tools (most written in-house) to create the back-end portion, and the Web itself functions as a delivery system. As Schwab is obviously very focused on security for these transactions, all Web trading is encrypted using internally created encryption systems. One thing that became immediately apparent to the Schwab I/S staff is that a Web application behaves differently across a network than a traditional application does in a similar environment. The architectures and methods required by Web servers demand different communications models.

Says Welf, "HTTP is very inefficient on top of TCP/IP, and when you're looking at things like employing dynamic scripts [or] using Java, you need to employ tools to get the processing where you need it." This can mean bigger clients, bigger network pipes -- or both.
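Welf's complaint is easy to demonstrate. Early HTTP usage opens a fresh TCP connection, and pays the handshake cost, for every request, while a persistent connection pays it once. The rough Python timing sketch below makes the comparison; the host, port and path are placeholders for a server you control, and the persistent case assumes that server honors keep-alive.

    import http.client
    import time

    HOST, PORT, PATH, N = "localhost", 8000, "/", 50   # placeholder target

    def new_connection_each_time():
        # HTTP/1.0-style usage: a fresh TCP handshake for every request.
        for _ in range(N):
            conn = http.client.HTTPConnection(HOST, PORT)
            conn.request("GET", PATH)
            conn.getresponse().read()
            conn.close()

    def one_persistent_connection():
        # Keep-alive style: one handshake, reused across all N requests.
        conn = http.client.HTTPConnection(HOST, PORT)
        for _ in range(N):
            conn.request("GET", PATH)
            conn.getresponse().read()
        conn.close()

    for fn in (new_connection_each_time, one_persistent_connection):
        start = time.perf_counter()
        fn()
        elapsed = time.perf_counter() - start
        print("%s: %.3fs for %d requests" % (fn.__name__, elapsed, N))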

At Schwab, as at other Internet/Intranet installations across the country, applications must run across many different environments with absolutely no internal I/S control over end-user connection mechanisms. All these different elements have different implications for the overall performance characteristics of applications running within the I-net framework. To help alleviate these performance issues, Welf and his team are currently using a product from Optimal Networks Corp., Palo Alto, Calif., to measure performance, impact and peak loads across the Internet environment.

The Optimal Networks product line is designed to focus on network modeling and analysis. While this has always been critical to information processing, it is going to be crucial as more and more companies leverage the Web to conduct true, interactive business transactions. To this end, Optimal offers Application Expert, a Windows- and Web-based application analyzer. The software is used to troubleshoot, analyze, report on and model live application traffic. Additionally, the tool is constructed so that developers, deployers and administrators can work collaboratively to solve problems as (or before) they arise. The vendor has also enhanced its Internet Monitor product, which monitors Internet/Intranet application performance, warns of application slowdowns, and works in conjunction with Application Expert to pinpoint the source of problems.

Before using Optimal's products, Welf and the Schwab I/S organization were using in-house utility scripts to provide the type of tracking information that Optimal provides. With the Internet development paradigm, the question becomes "how do we capture application-oriented data?" Welf says. "We need to drive performance models and build data simulation models. We need to measure and quantify the size and network impact of these applications. We must continually forecast and plan for growth."

To capture moving data

Products such as Optimal's allow Welf and his staff to efficiently capture this ever-moving data at the application level by using application-oriented conversational threads and by tracking the number of bytes flowing across the network for a particular business function. All this data can be aggregated with other pieces of data to capture user profiles. These profiles, in turn, can be extended to build a location profile, and then I/S may forecast and size link capacity, says Welf. (For more on tools that provide integrated applications testing, see "When objects collide, you must rethink your test strategies.")
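To make the idea concrete, here is a hand-rolled Python sketch of that profiling pipeline (not Optimal's product; all names and traffic figures are invented): tally the bytes each business function moves per conversation, roll the tallies up into user and location profiles, and derive a first-cut link size from the location totals.

    from collections import defaultdict

    # (user, location, business function, bytes on the wire), one entry per
    # application-level conversation, as a capture tool might report them.
    conversations = [
        ("alice", "sf",  "quote_lookup", 1800),
        ("alice", "sf",  "place_order",  6400),
        ("bob",   "nyc", "quote_lookup", 2100),
        ("bob",   "nyc", "place_order",  5900),
    ]

    user_profile = defaultdict(lambda: defaultdict(int))   # user -> function -> bytes
    location_bytes = defaultdict(int)                      # location -> total bytes
    for user, location, function, nbytes in conversations:
        user_profile[user][function] += nbytes
        location_bytes[location] += nbytes

    for user, functions in sorted(user_profile.items()):
        print("%s profile: %s" % (user, dict(functions)))

    # First-cut link sizing: spread each location's daily total over an
    # 8-hour business day, then add 50% headroom for peak loads.
    BUSY_SECONDS, HEADROOM = 8 * 3600, 1.5
    for location, total in sorted(location_bytes.items()):
        bps = total * 8 / BUSY_SECONDS * HEADROOM
        print("%s: %d bytes/day -> size the link for roughly %.0f bps"
              % (location, total, bps))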

"This kind of thing has been done for years on the mainframe," Welf said. "However, in a TCP/IP environment, it's pretty new stuff. Most development organizations today don't consider that performance is important when building the application."

This situation is likely to change over the next several months as corporations move their Web development initiatives past the advertising/company-identification stage and into the world of interactive business and electronic commerce. As heterogeneous database connectivity, ease of use and consistent views across server and client platforms create challenges for developers, they must heed Welf's warning and take performance into consideration as well. With this in mind, they may be able to stay ahead of the game.

Things to come

SPG forecasts that the number of corporations building browser front ends to corporate database and application servers will increase 200% over the next two years. As such, developers and users will be dealing with more complex issues than originally imagined in the early days of simple Web site development.

Remember, Web server-based applications can be likened in many ways to the two-tier phase experienced in client/server development. They are the first necessary steps in bringing the Web into the world of serious business applications. The second phase of this initiative is client access to corporate application and database servers -- and this can be regarded as the equivalent of three-tier (or n-tier) processing, the model used by serious client/server deployments today.
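For readers who think in code, the contrast can be drawn schematically. In the Python sketch below (with hypothetical table and service names, and the middle tier stubbed in-process), the two-tier style has the Web script embedding SQL and hitting the database itself, while the three-tier style routes the same request through a service layer that owns the connections and business rules.

    import sqlite3

    # Phase one, two-tier: the Web script embeds SQL and talks straight
    # to the corporate database.
    def get_balance_two_tier(account_id):
        conn = sqlite3.connect("accounts.db")   # hypothetical database file
        row = conn.execute(
            "SELECT balance FROM accounts WHERE id = ?", (account_id,)
        ).fetchone()
        conn.close()
        return row[0] if row else 0.0

    # Phase two, three-tier: a middle tier owns the SQL, the connections
    # and the business rules; the Web tier sees only a service interface.
    # (Stubbed in-process here; in practice this would be a separate
    # application server reached over the network.)
    class AccountService:
        def __init__(self, db_path="accounts.db"):
            self._db_path = db_path

        def balance(self, account_id):
            conn = sqlite3.connect(self._db_path)
            row = conn.execute(
                "SELECT balance FROM accounts WHERE id = ?", (account_id,)
            ).fetchone()
            conn.close()
            return row[0] if row else 0.0

    def get_balance_three_tier(service, account_id):
        return service.balance(account_id)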

As Web technology, user expectations and the tools mature, we believe the next two years will see deployment of Web-based systems in the same scenarios seen in mission-critical client/server systems today.