ADT News

eADT for Monday, April 23, 2001
This complimentary newsletter is sent to opt-in subscribers, and to subscribers of other 101communications publications that might be interested in its content. If you would like to receive this newsletter, subscribe by visiting our website at http://lists.101com.com/main.asp?NL=adt.

eADT, a weekly electronic complement to Application Development Trends, provides independent, timely insight into the tools, technologies and trends affecting corporate business application development.


If you are not currently a subscriber to Application Development Trends and would like to be, go to: http://www.adtmag.com and click on the "free subscription" button.

In this issue of e-ADT

Data Management
Kana and Broadbase Merger Bears Fruit in a Ripe CRM Market
by John K. Waters

eBusiness
.NET's New Evangelist Emphasizes Openness
by John K. Waters

Application Integration
Despite Meltdown, Internet Remains a Strategic Business Tool
by Rich Seeley


Data Management
Kana and Broadbase Merger Bears Fruit in a Ripe CRM Market
by John K. Waters

REDWOOD CITY, CA—The Customer Relationship Management (CRM) market was and still is ripe for consolidation. So say analysts at AMR Research. They point to the recent merger of two troubled companies as a harbinger of things to come: CRM vendor Kana Communications and market analytics software maker Broadbase, which joined forces last week in a stock deal reportedly worth about $75 million.

Redwood City, CA-based Kana Communications provides Web-architected enterprise relationship management (eRM) solutions, delivering integrated e-business and interaction applications on a modular, scalable platform for both Internet and Global 2000 companies. Headquartered a few miles to the south in Menlo Park, CA, Broadbase Software provides customer-focused analytic, marketing automation, and e-service applications that analyze customer data from multiple touch points, and use that information to execute marketing campaigns, improve online merchandising and content, increase site stickiness, and personalize customer interactions.

Between them, the two companies will have a base of more than 1,300 customers, including American Express, Bank of America, Boeing, British Airways, Cisco, Fidelity Investments, Ford, General Motors, Hewlett-Packard, MCI, Microsoft, Nokia, Sprint and United Airlines. The merger agreement is subject to shareholder approval and is expected to close in the third quarter. The combined company will be called Kana Software. Chuck Bay is slated for the positions of president and CEO, and Jay Wood will serve as chairman.

"Customers are demanding a relationship management solution that creates extraordinary relationships," said Jay Wood in a media release. "This requires the combination of analytics, knowledge, marketing, and service with a scalable e-Business platform. When we look at the combined strengths of the Kana and Broadbase product offerings, we have a unique and unbeatable eRM [Enterprise Relationship Management] solution."


eBusiness
.NET's New Evangelist Emphasizes Openness
by John K. Waters

SAN JOSE, CA—According to Microsoft's new man in Silicon Valley, the company's vision for the next generation of Web-services-based computing is "not at all proprietary," but is "totally open." And the "core and key behind it all" is XML. "Without this core descriptor for data, the impact of the Web and Web services would take a lot longer," Dan'l Lewin said. "It's broad, it's industry supported, and it's open. With XML emerging, and some of the tools associated with it, the revolution is on its way."

Speaking to attendees at this month's Software Development West conference, held here, Lewin described and demoed Microsoft's XML-based .NET initiative and technologies, which he said are designed with plenty of room for competitors. Openness was practically the theme of his presentation.

"We now have some open, interoperable standards," he said. "The question is, what is the computing model? I think intelligent devices at the edges of the Network are the key. It's really all about smart clients, and at the end of the day, [it's about] what kinds of standards required to bring... new Web-services-based computing to the end user.... Smart devices are the natural evolution of distributed computing," he said. "It's a natural progression, and that's the computing model that is going to win over time."

Lewin is vice president of .NET Business Development at Microsoft's new Mountain View, CA, facility. Lewin's job is to promote the .NET strategy to potential partners, software developers, and corporate customers in this area. He also acts as the Redmond, WA-based company's liaison for other matters in the valley. "As much as there has been historically this stuff against Microsoft in the valley," he recently told the San Jose Mercury News, "you have to take some of that with a sense of humor and recognize it's just business."

Fully half of Lewin's presentation was devoted to a demonstration of the new .NET tools. Rob Howard, a program manager at Microsoft, walked the audience through the drag-and-drop addition of a new third-party Web service to an existing Web site and showed how the Web service itself could be created quickly, starting with nothing more than a database.

Lewin announced that version 2.0 of the SOAP toolkit would be shipping later this month. The toolkit works with Visual Studio 6.0 and supports any language and any application that supports COM. He also said that the current beta of the Windows XP operating system now contains a native SOAP processor.
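For those who haven't seen one, a SOAP message is nothing more exotic than an XML document riding on an HTTP POST. The following sketch shows what such a request might look like on the wire; the GetQuote operation and example.com addresses are hypothetical, but the envelope structure follows the published SOAP 1.1 specification.

    POST /StockQuote HTTP/1.1
    Host: www.example.com
    Content-Type: text/xml; charset="utf-8"
    SOAPAction: "http://example.com/GetQuote"

    <SOAP-ENV:Envelope
        xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
        SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
      <SOAP-ENV:Body>
        <m:GetQuote xmlns:m="http://example.com/quotes">
          <symbol>MSFT</symbol>
        </m:GetQuote>
      </SOAP-ENV:Body>
    </SOAP-ENV:Envelope>

Any client that can open an HTTP connection and emit this document can invoke the service, which is precisely the openness Lewin was selling.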

.NET deliverables announced during the keynote included "HailStorm," the current code name for a set of Web services; a new programming model supported by Visual Studio.NET; and the .NET Enterprise Servers. "Passport," a currently available user authentication service seen in the product demo, is also part of the .NET platform.

Microsoft has been expanding its presence in the valley. Chairman Bill Gates recently christened the company's new technology center in Mountain View. Microsoft is betting its survival on a new Internet strategy that depends on winning over local software developers and forging partnerships with other technology leaders.


Application Integration
Despite Meltdown, Internet Remains a Strategic Business Tool
by Rich Seeley

Now that the Internet bubble has finally burst and expectations have come back down to Earth, Sun's technology evangelist expects the Internet to emerge as a strategic business tool—and, like other tools, it will now be measured by traditional metrics.

In an interview at last month's Internet World, George Paolini, Sun's vice president of technology evangelism, told ADT, "The bottom falling out of the entire dot.com movement doesn't mean that using the Internet as a strategic tool is by any means invalid. What we've learned is that all of these vaporous new measurements that we came up with for determining whether we were successful or not were invalid. It's not about eyeballs, it's still about revenue, it's still about profit, it's still about the bottom line and it's still about traditional metrics for measuring success for a company."

"I think in some ways the economy being where it's at could actually expedite this," Paolini said. "Because a lot of what we're talking about is really trying to leverage current investments that enterprises have made, to extend that out through the Web."

To this end, Sun announced a new product initiative aimed at the more pragmatic e-business customer seeking ways to use the Internet to reduce costs and boost productivity. The first customer shipment of its Java Web Start (JWS) software, which launches Java applications from any standard Web browser when the end user clicks a link, is now available. More information on these products is available at: http://java.sun.com/products/javawebstart.
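Under the hood, a JWS application is described by a JNLP (Java Network Launching Protocol) file served from an ordinary Web page; clicking the link hands the file to Java Web Start, which fetches the listed jars and launches the application. A minimal sketch of such a descriptor follows; the URLs, file names and main class are hypothetical.

    <?xml version="1.0" encoding="UTF-8"?>
    <jnlp spec="1.0+" codebase="http://www.example.com/apps" href="demo.jnlp">
      <information>
        <title>Demo Application</title>
        <vendor>Example Corp.</vendor>
      </information>
      <resources>
        <j2se version="1.2+"/>
        <jar href="demo.jar"/>
      </resources>
      <application-desc main-class="com.example.demo.Main"/>
    </jnlp>

Because the descriptor names the JRE version and jars it needs, the same link works for users who have never installed the application before.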



eADT for Monday, April 16, 2001

In this issue of e-ADT

Ovum (http://www.ovum.com) on e-business
Hype and Anti-Hype
by Gary Barnett

Application Integration
Demand for IT Workers Is Sliding, but the Jobs Are Still Out There
by John K. Waters

eBusiness
Analysts Hold Forth on Microsoft's HailStorm
by John K. Waters


Ovum (http://www.ovum.com) on e-business
Hype and Anti-Hype
by Gary Barnett

It seems astonishing to me that the entire technology market can oscillate from one extreme of hype to the other in such a short time. It's almost as if, just as it is with matter and anti-matter in particle physics, for every piece of hype there's another 'opposite' piece of 'anti-hype'.

Over the past months we've entered a seemingly unending field of anti-hype—'business-to-consumer e-commerce is a dud', it seems, and 'online advertising doesn't pay'; indeed, you might be forgiven for wondering 'how can we all have been so stupid....'

The basic truth is that it's just as stupid to let yourself be lured by the new doomsayers (many of whom—rather oddly—were the old promoters...) as it was to believe the hucksters and promoters of the early hype.

The problem is that during a gold rush the guy in the corner who mutters things like 'Pick the ground carefully', or 'make sure you buy the right pickaxes', or (worst of all) 'don't expect to hit a big seam for some time' is usually trampled, or just ignored as a boring old geezer who simply doesn't understand.

History shows that in the real-world gold rushes of the 19th and 20th centuries it was the slow-moving, dull types who made the most money, and for the longest period of time. Whilst a few pioneers got extraordinarily lucky, the real riches went to the people who took their time, picked the ground carefully and made sure they had the right pickaxes.

There is still a ton of gold in 'them there hills'. E-commerce expenditure is still growing sharply, and even with concerns about a decline in consumer spending, the importance of business-to-consumer e-commerce will continue to grow. As it does, you can expect more of the pioneers to fall by the wayside—leaving the riches to be divided between the hardy survivors and the savvy types who stood aside to let the stampede pass, in the sure and certain knowledge that most would fail, just as they'd nearly completed blazing the trail.


Application Integration
Demand for IT Workers Is Sliding, but the Jobs Are Still Out There
by John K. Waters

ARLINGTON, VA—This is one of those good-news-bad-news announcements: According to the Information Technology Association of America (ITAA), nationwide demand for IT workers this year is down 44 percent from last year. That's the bad news. The good news (for IT workers, at least) is that, even in a climate of reduced demand, employers are still unlikely to fill all their open positions with qualified people.

"There are a lot more people out there with the skills than there were five years ago," said ITAA president Harris Miller. "On the demand side, the demand simply has slowed down because the U.S. economy is slower."

Based on interviews in December and January with managers who hire IT workers, the study covered both tech and non-tech companies, and it looked at both purely technical positions (systems administrators, programmers, etc.) and quasi-technical jobs, such as technical support. The study concluded that employers would need to fill 900,000 information technology jobs (due to new positions, retirements and turnover) in 2001, down from the 1.6 million jobs employers needed to fill in 2000.

But the number of qualified IT workers available to fill those jobs will fall short by 425,000, which is half the shortfall of 850,000 reported last year. The overall high-tech workforce is estimated at 10.4 million people, up from 10 million in last year's survey. That number does not include workers in government or non-profit jobs.

The study found that demand for the less technical jobs in particular is declining. Demand for technical support personnel has softened, too, but not to the same extent; a fourth of all IT positions to be filled this year will be in technical support. Demand has also dropped substantially for workers in technical writing and digital media, the study found. However, the demand for enterprise systems professionals and network designers and administrators is likely to increase, the study concludes, with many positions in those two categories going unfilled.

The study was sponsored by the American Association of Community Colleges, American Management Systems, Cisco Systems, Hall Kinion, Intel, ITT Educational Services, Knowledge Workers, Microsoft and SRA International.


eBusiness
Analysts Hold Forth on Microsoft's HailStorm
by John K. Waters

SAN JOSE, CA—Last month, Microsoft announced Project HailStorm, an Internet-based architecture boasting an extensive set of user-definable XML-based Web services. The software building blocks grouped under the HailStorm rubric can be used to create personal networks of applications, devices, and services.

HailStorm, considered to be a key component of the Redmond, WA-based software giant's evolving .Net software-as-a-service strategy, is still in beta, with promises from Microsoft of an early 2002 release.

In a research note, Merrill Lynch analyst Henry Blodget said that HailStorm was important to Microsoft because it might increase the size and loyalty of Microsoft's consumer user base, which could ultimately create opportunities for Microsoft to charge users a monthly fee. More importantly, he wrote, "...HailStorm should make the .Net platform more attractive to third-party developers. These developers will be able to leverage both the HailStorm code and user base when building their own Web services—similar to how they leverage the Windows OS when building PC applications."

In a recent FirstTake report, analysts at the Gartner Group saw the HailStorm announcement as a clear sign that the software giant is gassing up its Web service engine and attempting to spur the emergence of a large Web services industry. "...Microsoft will likely emerge a big winner," the report concludes. "[HailStorm] will accelerate movement to Windows XP and software subscriptions." In the opinion of the Gartner researchers, the HailStorm initiative will give Microsoft "...the mantle as the industry visionary..."

The report goes on to state that, if successful, HailStorm will make Microsoft specifications the "de facto" standard, completing Microsoft's transition from a vendor of desktop software to a vendor of software solutions. It would also reduce Microsoft's dependence on version upgrades for revenue.

However, while Microsoft's dominance and leverage on the desktop is undisputed, on the Web, competitors abound—including IBM, Sun Microsystems, and Hewlett-Packard, all of which are wooing developers to their own Web-based services platforms. Nonetheless, Gartner predicts that HailStorm and .NET are likely to significantly increase Microsoft's influence in the industry. By 2006, at least half of midsize and large enterprises will open their networks and rely on HailStorm or similar services for critical business processes, the Gartner report predicts.



eADT for Monday, April 9, 2001

In this issue of e-ADT

Ovum (http://www.ovum.com) on e-business
Integrate to Compete
by Gary Barnett

Application Integration
Curling up with Hot Technology Transfer
by John K. Waters

eBusiness
3Com's IA Decision: Audrey We Hardly Knew Ye
by John K. Waters


Ovum (http://www.ovum.com) on e-business
Integrate to Compete
by Gary Barnett

As an economy cools down, it's a basic truth that competition heats up. Businesses have to accept that margins are going to suffer, and that they're going to have to work even harder to attract and retain customers.

The imperatives of cost management and service innovation should ring some bells—these were the same things we were urged to consider at the height of the e-commerce boom. So the list of things we have to do hasn't actually changed that much.

At the heart of cost management and service innovation (and e-commerce as a whole, for that matter) lies what is without question the biggest IT challenge that any business of any scale faces—the integration of lots of different systems and sources of data.

Integration is extraordinarily difficult, and there are no short cuts. Despite what the enthusiastic promoters of EAI solutions or B2B integration technologies would like you to believe, the products you can buy don't yet offer anything like a complete solution. The best products do offer a great framework within which you can create the integrated information system your heart desires—but it is up to you to create it.

One of the most common mistakes people make when considering EAI, and EAI technologies, is to vastly underestimate how much effort will be required to complete an EAI project. Interestingly enough, another of the most common mistakes is to underestimate the benefits of EAI.

If you successfully integrate your information system, then attributes like 'flexibility' and 'adaptability' become a lot easier to achieve—and you'll have a stable platform on which you can rapidly build new commerce functionality to support supply-chain integration, B2B e-commerce, and the voracious demands of your B2C site for new functionality and broader access to corporate data.


Application Integration
Curling up with Hot Technology Transfer
by John K. Waters

CAMBRIDGE, MA—It's not enough these days for college professors to while away their tenured hours in ivory towers. What university presidents want is "technology transfer," that often elusive conjunction of pure science and filthy commerce. No one can accuse the folks at the Massachusetts Institute of Technology's Laboratory for Computer Science of loafing in their lab coats. Under the direction of Professor Steve Ward, the Curl research project has just produced its first commercial release: the Curl Surge 1.0 software environment.

The Curl project has some big names attached to it. It was instigated by local business leaders and such MIT luminaries as World Wide Web creator Tim Berners-Lee and Dr. Michael L. Dertouzos, Director of the MIT computer lab. According to Robert A. Young, chairman and CEO of the newly founded Curl Corporation, Curl's technology is "the future of the Web."

"With it," Young said, "organizations can reap tremendous financial benefits by harnessing the power of client-side computing, reducing the size and volume of downloads, and integrating the fragmented development technologies of the Web into a seamless whole. For users, Curl technology delivers the fast, rich, highly interactive Web experience that has only been a promise until now."

Specifically, Curl is a new language for creating Web documents with almost any sort of content, from simple formatted text to complex interactive applets. The Curl Content Language was designed specifically for use on the Web. It integrates mark-up functionality, scripting functionality, and a full-featured, object-oriented programming language—all within one environment. Curl technology can be used with existing Web technologies, such as HTML, CGI, and JavaScript, and with multimedia animation tools, or it can be used in place of them. Curl is intended to be a "gentle slope system," Young said, accessible to content creators at skill levels ranging from authors new to the Web to experienced programmers.

According to Young, an applet written in the Curl Content Language is delivered by a server, like most Web content, but "lives" on the client and works well when embedded within an HTML page or when it replaces the HTML page altogether. The applet can "talk" to any server-side technology, such as a CGI script or JSP/ASP, with no modification to the server. Because JavaScript is usually embedded into an HTML document, the Curl Content Language can complement or completely replace JavaScript on the client, providing the interactivity and integrated dynamic content that the Web developer needs.

The Curl Surge product, now available for download from the company Web site, is a browser plug-in for viewing Curl content. Beta 3 of Curl Surge Lab, a developer environment for creating Curl content, is also available for download.

For a brief overview of Curl and its underlying philosophy, point your browser to http://www.cag.lcs.mit.edu/curl/. Or check out the Curl Corp site at http://www.curl.com/html/.


eBusiness
3Com's IA Decision: Audrey We Hardly Knew Ye
by John K. Waters

SANTA CLARA, CA—When 3Com Corp. announced substantial third-quarter losses last week, the company also announced plans to drop two product lines: the Audrey Internet appliance and the Kerbango Internet radio. 3Com said it would suspend all marketing activity on both products on April 1, and would cease operations on them permanently on July 1.

Because Audrey and Kerbango were the only two products to emerge from 3Com's Internet appliance division, industry watchers are expecting the company to disband the entire division, which was created only last year. At press time, no one at 3Com would comment on this prediction, but 3Com spokesman Brian D. Johnson did confirm that the company will continue to make home-networking products.

In a phone interview, Johnson drew an analogy between 3Com's experience with the Audrey and the IA market and Apple Computer's experience with the ill-fated Newton and the hand-held market. "The Newton was ahead of its market," he said. "Today, there's simply no doubt in our mind that the hand-held computer category is a great category to be in. We believe in the potential of the IA category. The Audrey was a great product that was reviewed fabulously, and we have a lot of enthusiasm for it. But we have neither the time nor the financial resources to wait for that particular market to happen. It's important for 3Com to become profitable right now."

According to Brian O'Rourke, senior analyst at In-Stat, a unit of Cahners Business Information market research group, 3Com's action could have significant effects on the IA market.

"The Audrey was the most visible, well-known IA on the market," O'Rourke wrote in a recent In-Stat Information Alert. "3Com was seen as a leading proponent of the device segment, and one of the market leaders in this sector. When coupled with Netpliance's exit from the market in late 2000, it leaves Compaq as the only one of 2000's top three Internet terminal suppliers still in the market."



eADT for Monday, April 2, 2001

In this issue of e-ADT

eBusiness
Support.com scales Great Wall
by Barry Zellen

Data Management
IBM's Data Mining Tool Comes of Age
by John K. Waters

Application Integration
KDE vs. GNOME: Call it a 'Friendly Competition'
by John K. Waters


eBusiness
Support.com scales Great Wall
by Barry Zellen

One of every five people on Earth can now use Support.com's automated Web-based technical support software in their own language. That's because its support services are now available in Chinese, spoken by 1.1 billion people.

The addition of Chinese extends Support.com's reach into Asia, after earlier releases in both Japanese and Korean. Support.com's first customer in China is the China American Petrochemical Company Ltd. (CAPC), which was brought on board by Taiwan-based Support.com reseller Sysage Technology.

Stamford, Conn.-based Gartner predicts that the number of Internet users in the Asia-Pacific region will increase nearly two-fold by the end of this year, from 41 to 72 million. Ang Ban Leong, general manager of Support.com, Asia-Pacific, said, "Personalized, automated support is a universal need in a world of rapidly increasing technical complexity and widespread product use. Localization of our support software infrastructure in Asian languages allows enterprises in the region to increase user satisfaction and reduce support costs in ways never before possible—and do it for the first time in their native language."

Support.com automates the support process by using the Internet to speed up the resolution of technical problems experienced by the customers, partners and employees of a variety of enterprises. Its customers include GE, Cisco and Boeing.


Data Management
IBM's Data Mining Tool Comes of Age
by John K. Waters

SAN JOSE, CA—Now that data mining has emerged as a key competitive technology, sophisticated tools that allow users to browse, merge, manipulate, and analyze vast stores of raw enterprise data, and display the results in a wide range of useful forms and formats, are maturing fast. Among the most interesting of the current class to hit puberty is IBM's Intelligent Miner Scoring 7.1, which was just released.

Big Blue's new product has been implemented as an extension to its DB2 Universal Database. Intelligent Miner Scoring works directly against the relational database, speeding up the data-mining process. And because it is an extender to DB2, it can easily read not only DB2 data but also data stored in competing databases; the product is compatible with Oracle databases, for example.

Bringing the data mining functions into the database engine reduces the number of instructions the computer has to execute to respond to a user. The result, says Jeff Jones, senior program manager in IBM's Data Management Solutions Group, is real-time, interactive data mining.

"By taking the mining capabilities and building them as an extension to DB2, the notion of real-time data mining—data mining while you're talking on the telephone with a customer—is feasible," Jones says. "We believe that we are the first to implement integrated data mining into the database engine."

In line with the industry's efforts to enhance the exchange of information on the Web with tools like the Extensible Markup Language (XML), Intelligent Miner Scoring also supports the industry standard for predictive modeling, PMML (Predictive Model Markup Language). PMML is an XML-based language that gives applications a vendor-independent way to define predictive models, eliminating proprietary incompatibilities. IBM is among the first to support the PMML standard in an implementation of a mining tool.
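PMML documents themselves are ordinary XML. The abbreviated sketch below suggests what a simple classification-tree model might look like; the field names and values are a hypothetical illustration rather than output from IBM's tool.

    <PMML version="1.1">
      <DataDictionary numberOfFields="2">
        <DataField name="age" optype="continuous"/>
        <DataField name="risk" optype="categorical"/>
      </DataDictionary>
      <TreeModel modelName="risk_tree" functionName="classification">
        <Node score="low">
          <True/>
          <Node score="high">
            <SimplePredicate field="age" operator="lessThan" value="25"/>
          </Node>
        </Node>
      </TreeModel>
    </PMML>

Any PMML-aware tool can, in principle, read a model like this regardless of which vendor's software produced it.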


Application Integration
KDE vs. GNOME: Call it a 'Friendly Competition'
by John K. Waters

PALO ALTO, CA—Anyone watching the action in the Linux space knows that rival factions have developed two feature-rich and rapidly maturing desktop environments for the open source operating system. What those on the outside looking in might not know is that what began as a religious war has simmered down in recent months to more of a friendly competition.

At least that's how Kurt Granroth, KDE core developer working for SuSE, and Leslie Proctor, marketing coordinator for the GNOME Foundation, see it. "You certainly have people who are passionate about one or the other," says Proctor, "but on another level, we're trying to build bridges and have user interfaces that are common."

"Until fairly recently, [online] conversations between KDE and GNOME people would be filled mostly with flames," Granroth says. "Now we get along better than we ever have."

Initially called the Kool Desktop Environment, KDE is the brainchild of Matthias Ettrich, who began the project in October 1996. The first version was released in July 1998. The GNOME (GNU Network Object Model Environment, pronounced "guh-NOHM") project began in 1998 as a direct reaction among free software proponents to the licensing requirements of Trolltech—maker of the Qt library that KDE relies on for its graphical widgets. Trolltech initially made its library available in source code form for free software development, but anyone wanting to sell the applications they developed using it had to buy a license.

"KDE is an 'open source' desktop," Granroth explains. "GNOME is a 'free software' desktop. It sometimes doesn't make sense to people outside the community, but we have very different philosophies." But in late 1999, the Qt library essentially became open source, and partisan passions cooled.

"After that, we were no longer heretics," Granroth says.

Truce, however, doesn't equal surrender; don't expect to see some kind of merger of the two environments anytime soon.



eADT for Monday, March 26, 2001

In this issue of e-ADT

Application Integration
Web Services Description Language (WSDL) 1.1 now in the Hands of W3C
by John K. Waters

eBusiness
PacketVideo Pockets Big Bucks from Tech Firms
by John K. Waters

Data Management
Beyond.com's Government Systems Group to Resell Hyperwave's KM Solutions to the U.S. Federal Government
by Barry Zellen


Application Integration
Web Services Description Language (WSDL) 1.1 now in the Hands of W3C
by John K. Waters

SAN JOSE, CA—The cause of open standards moves forward apace. The latest development: The W3C has accepted the submission of the 1.1 release of the Web Services Description Language (WSDL) specification for standardization. The submission by WSDL co-authors IBM and Microsoft, along with 23 other companies, represents the highest number of co-submitters ever on a specification contributed to the W3C. Among the companies endorsing the WSDL submission were Ariba, Hewlett-Packard, Oracle, Intel and Commerce One. The previous record was 11 co-submitters of the Simple Object Access Protocol (SOAP), also co-authored by Big Blue and MS.

WSDL is an XML-based language used to describe programs accessible via the Internet (or other networks), and the message formats and protocols used to communicate with them. WSDL is important because it enables Web services to describe their capabilities in a standard way, which allows for easier interoperability among Web services and development tools.
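Concretely, a WSDL document is a plain XML file built from a handful of standard elements: the messages a service exchanges, the operations that group them (a portType), a binding to a concrete protocol such as SOAP, and a network address. The skeleton below sketches a hypothetical stock-quote service in the style of the WSDL 1.1 specification.

    <definitions name="StockQuote"
        targetNamespace="http://example.com/stockquote.wsdl"
        xmlns:tns="http://example.com/stockquote.wsdl"
        xmlns:xsd="http://www.w3.org/2001/XMLSchema"
        xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/"
        xmlns="http://schemas.xmlsoap.org/wsdl/">
      <message name="GetQuoteInput">
        <part name="symbol" type="xsd:string"/>
      </message>
      <message name="GetQuoteOutput">
        <part name="price" type="xsd:float"/>
      </message>
      <portType name="StockQuotePortType">
        <operation name="GetQuote">
          <input message="tns:GetQuoteInput"/>
          <output message="tns:GetQuoteOutput"/>
        </operation>
      </portType>
      <binding name="StockQuoteBinding" type="tns:StockQuotePortType">
        <soap:binding style="rpc"
            transport="http://schemas.xmlsoap.org/soap/http"/>
        <operation name="GetQuote">
          <soap:operation soapAction="http://example.com/GetQuote"/>
          <input><soap:body use="literal"/></input>
          <output><soap:body use="literal"/></output>
        </operation>
      </binding>
      <service name="StockQuoteService">
        <port name="StockQuotePort" binding="tns:StockQuoteBinding">
          <soap:address location="http://example.com/stockquote"/>
        </port>
      </service>
    </definitions>

A development tool that reads this file knows everything it needs to generate client code for the service, which is the interoperability Sutor describes.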

"We spent the 90s working out the standards for Web documents, things like HTML," said Bob Sutor, IBM's program director for e-business standards strategy in Somers, NY. "What we need now is a similar stack of standards for the way we use the Web for business. And that boils down to open standards. It's got to be about interoperability."

According to Sutor, WSDL complements the UDDI (Universal Description, Discovery and Integration) registry, the cross-industry initiative designed to accelerate and broaden business-to-business integration and commerce on the Internet. The UDDI initiative was intended to form a collection of registries and databases describing what businesses do and how to access their services electronically.

"WSDL gives us a good start toward helping people to describe their services in a standard way," Sutor said. "UDDI helps people to publish [those services]."

WSDL was developed outside of the UDDI group and will either be formally submitted to the coalition as a specification or be submitted to a separate standards organization. Sutor said WSDL would not be the only choice for describing Web services. Microsoft and IBM felt it made sense to combine their efforts, he said. "This is a straightforward merger of the two approaches," Sutor said. "They were very close to begin with."

IBM is sponsoring a W3C workshop, scheduled for April, focusing on Web services descriptions, security, reliability, and workflow. Co-chaired by David Fallside, IBM's senior technical staff member for XML standards development and head of the W3C's XML Protocol working group, the workshop's mission is to facilitate discussion on what new standards activities in the Web services area should be started and how to best implement these working groups quickly and efficiently.

After the workshop, the W3C is likely to set up several working groups relating to Web services, Sutor said. It is within its working groups that the W3C develops its specifications. In this case, the working groups would be starting with a fairly advanced version of WSDL, but the process is bound to change it. The W3C established the XML Protocol Working Group in response to last year's IBM/Microsoft SOAP submission.

The WSDL specification is currently available on IBM's developerWorks Web site and Microsoft's site. IBM, although fully supporting the open standards approach, was never one to wait around for the specs to arrive fully cooked. The company is offering developers the WSDL Toolkit, the Web Services Toolkit, the Web Services Development Environment, and other tools for SOAP and UDDI, all available from its alphaWorks Web site.


eBusiness
PacketVideo Pockets Big Bucks from Tech Firms
by John K. Waters

The economy may be in a downturn, but at least one startup is bucking the trend. Wireless multimedia company PacketVideo got a big nod from some industry heavyweights this week in the form of $90 million in funding. Investors in the privately held developer of software for streaming video and audio content over the Internet included Intel, Motorola, Qualcomm, GE, Sun Microsystems, and Texas Instruments.

Since the San Diego-based company was formed in July 1998, it has raised $145 million, a company spokesperson said. The latest investment alone ranks among the largest private technology-financing deals this year.

Does this funding frenzy mark PacketVideo as a company to watch? The CEO sure thinks so. "Our ability to 'beat the odds' by raising significant funds during poor market conditions is a testament to the quality of our technology, our people, our partners, and the tremendous market opportunity we face," said James Carol, chief executive officer and co-founder of PacketVideo. "This new funding will help to propel us into 2001 and beyond and allow us to continue to execute on our strategy."

Whether PacketVideo is the company to watch remains to be seen, but this kind of support from such heavy hitters certainly should put its technology on everyone's radar. The company makes multimedia playback software for portable wireless devices. Its MPEG-4 compliant software enables the delivery, management, and viewing of video, audio, and multimedia applications over current wireless networks to mobile devices, such as SmartPhones, PDAs, and laptops.

According to Anjeanette Rettig, PacketVideo's VP of corporate communications, the attention the company's technology is getting lets it be selective about investors. "A lot of the investors we have are also partners or customers of the company," Rettig said. "We look at these investment relationships on a much more strategic level than just the investments themselves."

PacketVideo is working with such wireless semiconductor providers as ARM, Intel, Lucent Technologies, QUALCOMM, Texas Instruments, and Zucotto Wireless. The company recently announced that its PVPlayer decoding software will ship with Mitsubishi Electric's new Trium Mondo, one of the world's first integrated GPRS/GSM mobile phone/PDA devices. PacketVideo's software also operates on laptops and on Microsoft Pocket PC devices, including the Compaq iPAQ, Casio's CASSIOPEIA, and the Hewlett-Packard Jornada.

Rettig says that the company's primary competitors are Real Networks, Microsoft and Apple, all of which are already making streaming multimedia software for wired networks.

The company filed for a $64 million initial public offering of stock in March 2000, at the peak of Wall Street's passion for tech-related IPOs. But the company withdrew the offering in late April, because of "unfavorable market conditions." The funds from this round of financing will be used to further PacketVideo's growth, provide working capital, fund research and development, and support additional resources for engineering, sales, and marketing.


Data Management
Beyond.com's Government Systems Group to Resell Hyperwave's KM Solutions to the U.S. Federal Government
by Barry Zellen

Beyond.com, an e-commerce service provider that builds, manages and markets online stores, and Hyperwave, a provider of enterprise-class knowledge management and portal software, have joined forces. Under the terms of the partnership agreement, Beyond.com's Government Systems Group will resell Hyperwave's knowledge management solutions to numerous U.S. government agencies.

Beyond.com's Government Systems Group provides digital distribution of software and related services to the Bureau of Engraving and Printing (BEP), Defense Logistics Agency (DLA), Department of Defense (DoD), Internal Revenue Service (IRS), Office of Thrift Supervision (OTS), National Imagery and Mapping Agency (NIMA), and the Patent and Trademark Office (PTO). Beyond.com's General Service Administration (GSA) custom online store allows government employees to purchase software over the Internet.

"We chose to partner with Hyperwave because of its strong integrated knowledge management and corporate portal offerings," said Don Beery, vice president and general manager of Beyond.com's Government Systems Group. "We're excited about partnering with Hyperwave and providing its offerings in the government sector."

The partnership agreement provides Hyperwave with a GSA-approved schedule for its Hyperwave Information Server, Hyperwave Information Portal products and future product offerings. And it provides Beyond.com with additional knowledge management offerings for its customers.

Hyperwave Information Server delivers a platform for developing intranet and extranet knowledge-sharing applications. Its features include bi-directional link integrity, granular security, document/content management, collaborative tools, an open API, and universal access from browsers and desktops. Hyperwave Information Portal (HIP) is an out-of-the-box corporate portal solution that is built on Hyperwave Information Server infrastructure, offering both aggregation and collaboration and giving users access to internal and external resources, including unstructured, structured and groupware content on one or more servers.

"Federal agencies have a clearly demonstrated need for our knowledge management and corporate portal solutions," said Tim Kounadis, vice president of North America Marketing for Hyperwave. "Through our partnership with Beyond.com, we expect to better service our existing government customers and greatly expand our distribution arm to this important market."



eADT for Monday, March 19, 2001

In this issue of e-ADT

Ovum (http://www.ovum.com) on e-business
The Road to Nirvana—Component Re-use
by Gary Barnett

Component Strategies
WebGain and ComponentSource Partner on Enterprise JavaBean Component Development
by Barry Zellen

Application Integration
AOP Modularizes Crosscutting Concerns
by John K. Waters

eBusiness
The Return of the Open Source Manifesto
by John K. Waters


Ovum (http://www.ovum.com) on e-business
The Road to Nirvana—Component Re-use
by Gary Barnett

Component-based development is touted as many things.

We are told that it will lead to better software; that the software we create will be flexible and adaptable and that we'll soon be able to 'compose' applications from a menu of pre-built components in half the time.

Indeed, given all of these obvious benefits, it seems mystifying when component-based development turns out to be fabulously difficult, results in poor software that is almost impossible to change, and any thought of re-use dissolves into frustration at never quite being able to get your hands on that great component you know exists but just can't find.

The truth is that component-based development is hard—and the benefits of component use and re-use only come after you've stumped up the price of a ticket to Component Nirvana.

Begin by looking at those areas where component re-use really does work: start with all those 'widgets' on the menu bar of your favorite development tool. What is it about these 'text boxes' and 'option buttons' that makes you use them rather than writing your own?

Firstly, they are easy to find; they're there—right in front of you. Secondly, it's relatively easy to figure out what they do; you don't need to spend hours figuring out what impact changing the 'FontSize' attribute is going to have on the contents of your text box. Thirdly, and rather obviously, every application needs a text box or two.

These three attributes—accessibility, simplicity and utility—are the cornerstones of re-use in component-based development. And no matter how well crafted your component, if these qualities aren't present, your component is likely to remain a very well kept secret.

Accessibility—the ease with which the component can be found.

Simplicity—the ease with which you can understand what it does.

Utility—a measure of how readily 're-usable' the component is. If it is too generic it isn't particularly valuable, but if it's too specific you might have to wait a decade for another identical requirement to emerge.

The first of these attributes is actually external to the component, as it relates to providing a component yellow pages—a directory that describes the components it can 'see', and what they can do in a manner that makes it easy to discover a component that meets a given requirement.

The final two attributes are entirely down to you—it is up to you to make sure that it doesn't take a genius (or the source code) to figure out what your component does. The whole point of a component is that its purpose should be unambiguous, and you shouldn't have to pry it open in order to figure out how to use it.

Utility is the most difficult quality to get right—in component-based development 'size does matter', but there is no hard and fast rule about which size is optimal. Aside from the basic rules, 'make it cohesive' and 'keep it loosely coupled', the process of defining a component that is big enough to be valuable and small enough to be applicable frequently comes down to 'Component Zen', and may the force be with you.
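To make the simplicity point concrete, consider what the humble text box looks like as a component. The sketch below, written in the JavaBeans style, is a hypothetical illustration rather than any vendor's actual widget; the point is that its entire surface is a handful of self-describing properties.

    // A minimal, self-describing component in the JavaBeans style.
    // Every capability is exposed as a named property; no source-diving
    // is required to figure out what changing 'fontSize' will do.
    public class TextBox {
        private String text = "";
        private int fontSize = 10;

        public String getText() { return text; }
        public void setText(String text) { this.text = text; }

        public int getFontSize() { return fontSize; }
        public void setFontSize(int fontSize) { this.fontSize = fontSize; }
    }

Accessibility comes from the tool palette that lists the component, simplicity from the property names, and utility from the fact that every application needs a text box or two.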


Component Strategies
WebGain and ComponentSource Partner on Enterprise JavaBean Component Development
by Barry Zellen

WebGain, Inc. and ComponentSource have formed a partnership to promote component construction and usage at the enterprise level. They will integrate their technologies to simplify Component-Based Development (CBD) using Java and enhance the supply of off-the-shelf reusable Enterprise JavaBean components.

WebGain Studio provides an integrated application development environment (ADE) for modeling, developing, deploying and integrating Enterprise Java applications, and ComponentSource has assembled what it describes as the world's largest source of commercial-grade components and related services, with its component re-use automation technology, ComponentFind.

WebGain will integrate ComponentFind with its WebGain Studio development modules to help developers locate off-the-shelf Java components throughout their application development process. WebGain will also become part of ComponentSource's Component Authoring Program, which supports education efforts that include enabling component authors to work with WebGain's EJB development products to facilitate the creation of commercial-grade EJBs.
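As a reminder of what such a component looks like to its buyers, here is a minimal sketch of the two interfaces an EJB 1.1 session bean publishes. The CatalogService names are hypothetical, and in practice each interface would live in its own source file.

    import java.rmi.RemoteException;
    import javax.ejb.CreateException;
    import javax.ejb.EJBHome;
    import javax.ejb.EJBObject;

    // Remote interface: the business methods a client sees.
    public interface CatalogService extends EJBObject {
        String lookupTitle(String isbn) throws RemoteException;
    }

    // Home interface: how clients obtain a reference to the bean.
    interface CatalogServiceHome extends EJBHome {
        CatalogService create() throws CreateException, RemoteException;
    }

A marketplace such as ComponentSource trades on exactly these published interfaces: a buyer can evaluate what a bean does without ever seeing its implementation.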

"We are extremely pleased to partner with WebGain to offer Java component developers vastly improved technology and resources," added Sam Patterson, CEO of ComponentSource. "Integration with our online marketplace via our SOAP-enabled automation will provide an intuitive environment for WebGain's enhanced, modular Java development platform."

Patterson adds, "The speed to change—this is what it's all about. The ability to discover and find standards based, tried and tested off-the-shelf components in whatever context you are in, means that you can focus on writing code that truly differentiates your business and speeds up your time to market."

"A lot of companies today are revisiting the question—should I buy or should I build?," Patterson observes. "We have seen an 800% increase in the demand for off-the-shelf Java and EJB components over the past year. Our partnership with WebGain will help further our efforts in boosting the supply of off-the-shelf Java expertise, by providing our component author community a mature application development environment for the development of Java-based components."


Application Integration
AOP Modularizes Crosscutting Concerns
by John K. Waters

PALO ALTO, CA—A new generation of software development tools supporting a new approach to programming is sliding down the birth canal. As early as this summer, coders can expect to see the first Aspect Oriented Programming (AOP) products becoming generally available.

AOP allows global properties of a program to determine how it is compiled into an executable program. Traditional units of modularity in programming languages include objects, functions, modules, and procedures. But some functions can't be encapsulated in single modules; instead, their implementations end up scattered across the class hierarchy. AOP deals with new units of modularity, called "aspects," which involve more than one functional component, such as synchronization, integrity control, persistency, and interaction. Aspects can't be neatly separated using traditional units of modularity.

Put another way, code relating to aspects is often expressed as small code fragments tangled and scattered throughout several functional components. Because of the way they cross module boundaries, it is said that aspects "crosscut" the program's hierarchical structure. Aspects encapsulate crosscutting concerns.

AOP can also be seen as a complementary design and implementation technique to object-oriented programming. In fact, one of the early AOP development tools, AspectJ, is specifically designed to work with one of the leading OO languages: Java.

"AspectJ doesn't replace Java," says Gregor Kiczales, professor of computer sciences at the University of British Columbia, and a principle scientist at Xerox Palo Alto Research Center. "It makes Java even more powerful by solving some of those problems that Java doesn't handle well."

AspectJ is a general-purpose AOP extension to Java. It enables the modularization of such crosscutting concerns as system-wide error-checking strategies, design patterns, synchronization policies, resource sharing, distribution concerns, and performance optimizations. An open source project, AspectJ is the product of years of research at Xerox PARC. The project is partially supported by the Defense Advanced Research Projects Agency (DARPA).
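To give a feel for the language, the sketch below shows a minimal aspect that modularizes one of those system-wide policies, in this case tracing every public call in an application. The package name is hypothetical, and the syntax follows the pre-1.0 AspectJ releases.

    // One module captures a tracing policy that would otherwise be
    // scattered across every class in the application.
    public aspect TraceLogging {
        // Match every call to a public method of any type
        // under the (hypothetical) com.example.app package.
        pointcut tracedCalls():
            call(public * com.example.app..*.*(..));

        // Run this advice before each matched call.
        before(): tracedCalls() {
            System.out.println("Entering: " + thisJoinPoint);
        }
    }

Removing the policy later means deleting one file, not combing through the class hierarchy.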

Another promising AOP tool is IBM's HyperJ, currently distributed through the alphaWorks Web site. IBM doesn't use the AOP moniker, but says that HyperJ "...supports advanced, 'multi-dimensional' separation and integration of concerns in standard Java software..." which "... facilitates improved modularization, adaptation, composition, integration, and even non-invasive remodularization of Java software components."

This first HyperJ release is a file-driven, batch version. Some key features planned for future releases include: an interactive GUI, support for composition of Java library classes, and more convenient ways to specify a number of common compositions.

The implication for developers of the emergence of real-world AOP tools is simple: if it works the way it's supposed to, AOP could become a powerful means of coping with the staggering complexity that increasingly characterizes software applications. AOP tools hold the potential for simplifying, and all but eliminating, some of the most time-consuming, frustrating, and difficult aspects of OO development.

"AOP is an idea whose time has come," says Kiczales. "It solves one of this generation's biggest programming problems."

Expect to see the official 1.0 release of AspectJ this summer, but evaluation copies are available now for download from the Web site (http://aspectj.org). "It's ready for early adopters to beat on it," says Kiczales.


eBusiness
The Return of the Open Source Manifesto
by John K. Waters

SEBASTOPOL, CA—Anyone in IT who really wants to understand what open source software is all about should go out right now and pick up a copy of the newly updated and revised edition of Eric S. Raymond's The Cathedral & the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary. The accolades for this book are hyperbolic, to be sure ("the most important book of the software frontier of the 1990s," "the definitive work on the open source evolution," "a landmark piece of information technology"), but that shouldn't stop anyone from reading it. The Computer Press Association named it Best Nonfiction Computer Book of 2000, and it should probably win for 2001.

Reissued last month by O'Reilly Press, Raymond's collection of essays dives deep into the open source philosophy. The book takes its title from an essay Raymond read at the 1997 Linux Kongress, and later published on the Web. In that essay, he documents his acquisition, re-creation, and numerous revisions of an e-mail utility known as "fetchmail." In describing the fetchmail development process, he illuminates the "bazaar" development method he uses with the help of volunteer programmers, and demonstrates the efficacy of the open-source process.

Called a "hacker philosopher," Raymond's influence on the software development world has been profound. His evangelism helped persuade Netscape to release its browser as open source, and arguably, he put Linus Torvalds on the cover of Forbes.

Bob Young, CEO of Red Hat, called the book "...Eric Raymond's great contribution to the success of the open source revolution, to the adoption of Linux-based operating systems, and to the success of open source users and the companies that supply them."

The book includes other essays as well, among them new material on open source developments in 1999 and 2000. New essays address the economics of open source and open source as a competitive weapon. Predictions in the chapter "Revenge of the Hackers" are examined from the perspective of one year later, and new ones are added.

"There's a juicy new section on the mechanics of bazaar development that discusses communications structures and the nitty-gritty of parallel debugging and why it works so well," Raymond said in a published press release. He also develops more detailed analysis of "project forking," looks at the economics of open source, and includes a statistical appendix on the growth of the "fetchmail" project.

Raymond's book has been called "...the manifesto and the declaration of independence of a revolution in progress." Again, hyperbolic praise, but it's that kind of talk that keeps the big players up at night.



eADT for Monday, March 12, 2001

In this issue of e-ADT

Ovum (http://www.ovum.com) on e-business
Web Services—Pointing the Way
by Gary Barnett

Application Integration
Intel's Barrett: Prepare for the Upswing
by John K. Waters

eBusiness
P2P: the Bandwidth Killer
by John K. Waters


Ovum (http://www.ovum.com) on e-business
Web Services—Pointing the Way
by Gary Barnett

Microsoft's .NET announcement, and the subsequent slew of 'me too' announcements by the company's arch-rivals, heralds what promises to be a huge sea change in the way we build and deploy business processes.

But remember that object databases promised to "transform the way we store and use data". We know that there is a huge difference between a 'promising technology' and a 'mainstream' technology, and the road from 'cool' to 'mainstream' is littered with the bones of technologies that paid the price of promising way too much way too soon.

The good news is that web services have some powerful friends—Microsoft, IBM, Sun, and HP, to name but a few. It also seems as if the time is right. E-commerce, the need to adapt business processes quickly, and the very real need to find effective ways of sharing those business processes with partners and customers all cry out for some means of easily exploiting existing business processes and making them available to other people.

Most interesting, however, is the appeal that web services are going to have for developers—the majority of whom would gladly swap the need to know the intricacies of IIOP and COM RPC for a straightforward and clear mechanism that allows them to build applications that interact with other applications over a network. This is, after all, the primary purpose of all those 'heavy duty' distributed computing technologies.

The business of making a service available to others via the Internet is actually relatively simple. From the CGI scripts of the early nineties to the richer XML-based mechanisms of today, the task has been straightforward: first you define a simple interface, and then some means to call that interface and retrieve any output your process might produce. Indeed, the very act of entering a URL follows exactly these steps.

By using a ubiquitous and simple protocol like HTTP with a ubiquitous and simple mechanism for formatting data (XML), you have the seeds of a model for distributed computing that is functional and, above all, simple to use. Technical simplicity often draws scorn, but I'm a huge fan.
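That simplicity is easy to demonstrate. Calling a service of this kind needs nothing more exotic than an HTTP GET and a look at the XML that comes back. The Java sketch below, with a hypothetical URL, is the entire client.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;

    public class QuoteClient {
        public static void main(String[] args) throws Exception {
            // One HTTP GET, with the parameters on the query string.
            URL service = new URL("http://example.com/quote?symbol=SUNW");
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(service.openStream()));

            // The response is just an XML document; print it as-is.
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
            in.close();
        }
    }

No ORB, no IDL compiler, no stub generation: that is the whole appeal.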

Whilst there is a slew of 'minor details' that still need to be addressed by the owners of technologies like SOAP and UDDI, two facts stand out: so many major players on the supply side of the technology world find web services compelling, and the approach could lead to a model of distributed computing that doesn't require a Ph.D. in nuclear physics to exploit. Of all the 'Promising Technologies' I've seen for a while, this one seems likely to keep some of its promises.


Application Integration
Intel's Barrett: Prepare for the Upswing
By John K. Waters

SAN JOSE, CA—Despite an economic slowdown that is causing serious belt-tightening throughout the technology industry, Intel Corporation will invest some $12 billion in capital expansion and R&D during the coming year. So says Intel CEO Craig Barrett, speaking to attendees at last week's semi-annual Intel Developer Forum.

"You never save your way out of a recession," Barrett said. "The slowdowns are going to end, and you need to prepare for the upswing. The build-out of the Internet, the build-out of this digital world is still in its infancy."

During his keynote presentation, Barrett demonstrated Intel's successor to the long-delayed Itanium chip, code-named "McKinley." According to Barrett, Intel plans to begin pilot production of the new chip before year's end. The chip will have a much larger integrated cache than the Itanium for better performance, Barrett said, and three times the data throughput. Intel claims that on some benchmarks the McKinley chip will provide eight times the performance of an equivalent UltraSparc chip from Sun Microsystems.

Barrett and Paul Otellini, general manager of the Intel Architecture Group, outlined other company initiatives for the coming year and on into 2002, including:

  • Itanium server chips: The company plans to announce the server version of the long-delayed chip in the second quarter of 2001.
  • Copper technology: Intel is investing in equipment to manufacture copper-based chips via the 0.13-micron process.
  • The "extended PC": Intel will promote its vision of home entertainment networks with Pentium 4 desktop machines running Microsoft's new OS, Windows XP, at the center. Both Intel and Microsoft are expected to push this idea aggressively in the coming months. Microsoft is touting XP as the perfect OS for delivering video, audio, photographs and other media.
  • Mobile Pentium IIIs: The company announced a 700MHz Pentium III at the conference; look for a 1GHz mobile Pentium III later this quarter.
  • Mobile Pentium 4s: Intel expects to be producing P4 chips for portables sometime in 2002. A brand-new mobile chip architecture designed for power-efficiency and wireless connectivity is expected in 2003.

"I think the investment will pay for itself in lower costs," Barrett said of Intel's plans. "It is new technology and products that let you walk out of a recession."

Although Barrett's keynote could fairly be described as upbeat, he was anything but sanguine about current economic conditions. "There is absolutely a slowdown in U.S. manufacturing," he said. "It hasn't spread to other markets yet, but if the malaise does move overseas, a wider recession could hit."


eBusiness
P2P: the Bandwidth Killer
By John K. Waters

PALO ALTO, CA—Peer-to-peer computing evangelists say we're in for an IT revolution as earthshaking as the advent of the Internet itself. And it's clear that P2P has earned Next-Big-Thing status, with a range of companies swarming into the market with new offerings. One can't help but wonder: what's the downside to a future in which clients at the edges of the Internet connect to each other directly?

Here's one: increased bandwidth consumption.

Users running P2P applications—whether they are file-sharing Napster-like apps, or MIPS-mining distributed computing programs a la SETI@home—could end up devouring great hunks of bandwidth, forcing Internet service providers (ISPs) to rethink their usage and pricing models.

ISPs are able to charge flat rates because most subscribers log onto the network to collect their e-mail and do a little Web surfing, and then they log off. P2P applications can turn these intermittent users into always-on bandwidth hogs. Furthermore, many ISPs "overbook" their networks by as much as 40 to 1; most simply don't have the capacity to cope with widespread P2P interactions.
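
A back-of-the-envelope calculation shows why. The numbers in the Python sketch below are invented, but the shape of the problem is real: flat-rate pricing works only as long as average demand stays a small fraction of the advertised rate.

    # Oversubscription arithmetic (all numbers invented for illustration).
    subscribers = 10_000
    line_rate_mbps = 1.5           # advertised per-subscriber rate
    overbook_ratio = 40            # the 40-to-1 overbooking cited above

    # Capacity the ISP actually provisions for those subscribers:
    provisioned_mbps = subscribers * line_rate_mbps / overbook_ratio   # 375 Mbps

    def aggregate_demand(duty_cycle):
        """Average total demand if each user transmits this fraction of the time."""
        return subscribers * line_rate_mbps * duty_cycle

    print(aggregate_demand(0.02))   # casual surfing: 300 Mbps, fits within 375
    print(aggregate_demand(0.50))   # always-on P2P: 7,500 Mbps, 20x over capacity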

How likely is this scenario? Remember back in 1999, when a number of universities banned Napster because users overwhelmed their campus networks? Plenty of corporations have banned Napster and Napster clones from the company net. Today, Napster boasts some 62 million users, and whatever the resolution of its current legal troubles, there are plenty of P2P enterprises surfacing with equal potential to jam an unprepared network.

IT managers should keep in mind that P2P apps have some serious network-clogging potential, and that they could bring additional charges from service providers. Analysts at the Yankee Group expect ISPs to attack the problem first with new pricing schemes. According to Yankee analyst Bob Lancaster, rather than raising rates across the board, the top service providers are more likely to offer an additional tier of service for high-bandwidth users.

David Kopans, chief financial officer at distributed computing company Applied MetaComputing, advises IT managers and software developers alike to roll with the P2P punches.

Dwight Davis, industry analyst for Boston-based Summit Strategies, offers an even more succinct admonition: "The genie is out of the bottle. IT may find that, in order to be a player, it has to buy into the P2P model. They might have to say, 'Okay, fine, you guys like this technology so much, we will support it, but we will support it by having our own subset of an accepted list of components that you can download and add to your computers.' Can they enforce that? I don't know, but it seems more intelligent over time for the IT orgs to be players rather than holding their fingers in the dike while the water flows over them."



eADT for Monday, March 5, 2001

In this issue of e-ADT

Component Strategies
NexWave Raises $7 Million in VC
by Barry Zellen

eBusiness
P2P By Any Other Name
by John K. Waters

Data Management
IBM's iSCSI Offering
by John K. Waters


Component Strategies

NexWave Raises $7 Million in VC
By Barry Zellen

NexWave, a French developer of embedded systems software, has completed its first round of financing, raising $7 million from Mentor Valley.

NexWave had earlier received seed funding from two regional seed-funding organizations, aided by ANVAR (Agency for Commercialized Research) and DATAR (French National Development Agency). The company provides technology solutions for the design and development of embedded electronic systems for the telecommunications and electronics markets (wireless phones, Webphones, PDAs, and navigation systems).

With the completion of its first round, NexWave plans to fund its international expansion by opening subsidiaries in the US, England, Germany, and Asia by the end of 2001.

"This first round of financing will allow us to realize our ambition, which is to establish NexWave's solutions as the standard for software in the embedded systems market," says Michel Hodzaj, CEO of NexWave Solutions. "To accomplish this goal, NexWave will begin by expanding its team on an international level. We expect to hire more than 100 people by the end of the year for R (Research and Development) and international offices."

Krishna Gopala, co-founder of Mentor Valley, added, "We were looking for a company combining genuine technological innovation with a very strong management team capable of developing and deploying internationally. NexWave perfectly matches our requirements."

Founded in France in 1998, NexWave offers a component-based framework under which developers can create operating systems and applications for the embedded market. The solution is optimized for the rapid creation of intelligent systems, as well as for those requiring Internet access. NexWave has entered into numerous partnerships with major companies in these markets to make its technology a new standard.


eBusiness

P2P By Any Other Name
By John K. Waters

You can't open a newspaper or load a tech-news Web site these days without seeing "Napster" splashed all over the place. There's a reason Napster keeps getting all the peer-to-peer (P2P) computing headlines: the music-file-sharing service has signed up more than 58 million users in the two years since it was launched by an 18-year-old college student, and it continues to register more than 300,000 people every day. Throw in a couple of big lawsuits from music industry moguls who want to shut the service down, and it becomes a veritable headline machine.

But all the Napster noise is stealing the thunder from what may be the more important P2P model; to differentiate, call it "distributed computing." Where Napster-like systems support file-sharing among individual PCs interacting directly with each other with little or no server involvement, distributed computing architectures organize linked machines for the purpose of sharing computing cycles. The first approach is about people interacting; the second is about marshalling computing power, and in the long run, it's the second that has the most potential to influence a company's bottom line.

Computers running software from companies like San Diego-based Entropia use idle computer time drawn from a network of PCs to solve large computational problems, run financial analyses, and search for cures for diseases. Working in the background, these programs take on small workloads that have been divided and distributed to various devices across the Internet, essentially borrowing processing power from other computers on the network. This process is also called "grid computing."
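
The pattern is simple to sketch. In the hypothetical Python example below, a large job is cut into independent work units and handed to whatever workers are available; a local process pool stands in for the volunteer PCs on the network, and the work unit itself (counting primes in a range) is invented for illustration.

    # Sketch of the divide-and-distribute pattern behind grid computing.
    from multiprocessing import Pool

    def count_primes(bounds):
        """One independent work unit: count the primes in [lo, hi)."""
        lo, hi = bounds
        def is_prime(n):
            return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))
        return sum(1 for n in range(lo, hi) if is_prime(n))

    if __name__ == "__main__":
        # Divide the big job into small, independent chunks...
        chunks = [(i, i + 100_000) for i in range(0, 1_000_000, 100_000)]
        # ...distribute them to idle workers, then recombine the results.
        with Pool() as workers:
            print(sum(workers.map(count_primes, chunks)))

In a system like Entropia's, the "workers" are PCs scattered across the Internet and the scheduler must cope with machines that disappear mid-task, but the divide, distribute, and recombine structure is the same.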

"Through the use of these resources, your computing device can be both a consumer and a producer on what becomes a universal platform," explains Scott Kurowski, Entropia's founder. "In as little as five years, we fully expect end users to be running applications that invisibly tap into huge amounts of computing capacity on the backend.

Researchers at the University of Wisconsin estimate that most companies use less than 25 percent of their computing and storage capacity. By some estimates, 90 percent of America's computer capacity is idle at some point during the day, much of it in the evening after people go home. Entropia's software "rescues" these wasted computing cycles between keystrokes while applications are running on the PC.

Entropia's network encompasses nearly 100 countries around the world, from which it aggregates 50 teraflops of potential computing capacity. In fact, Entropia squeezes so many MIPS out of its network that it can afford to give away the extra computing power to good causes, such as the AIDS@home project. "This is a good way not to have to buy a lot of servers to get massive amounts of computing resources," Kurowski says.

Probably the best-known example of the distributed computing model is the SETI@Home project. Under the auspices of the SETI organization (the Search for Extraterrestrial Intelligence), SETI@Home marshals unused computing cycles of a global network of volunteer PC owners to search for radio signals from an extraterrestrial intelligence. To date, the project has attracted more than two million participants.

The technology behind distributed computing is not new. Scientists have linked supercomputers this way for years, and companies have run distributed-computing software on internal networks to get extra work done at night and on weekends. Some argue that distributed computing shouldn't even be called peer-to-peer.

"I find this a bit of a stretch," says Dwight Davis. "I know that they're sharing resources among desktop machines, but to me, that's not peer-to-peer. That's really just taking the model of a massively parallel processor and extending it out to the broader universe of PCs. But the PCs themselves are not communicating with each other; it's all orchestrated by a central server, in other words a controlling authority. For me at least, P2P has to have some element of direct communication between individual computers."

Nevertheless, distributed computing companies make up a significant portion of the Peer-to-Peer Working Group's membership; Entropia was a founding member. And the model does something else: while companies like Oracle and Sun tout centralized computing models ("the network is the computer"), distributed computing schemes suddenly make the desktop PC relevant again. Expect to see Intel and Microsoft emerge as big supporters of this kind of P2P computing, if the Napster racket ever quiets down.


Data Management

IBM's iSCSI Offering
By John K. Waters

SAN JOSE, CA—IBM took a big step into a smaller world with its recent announcement of a new network storage solution aimed at small to medium-sized businesses. Big Blue's new Linux-based, iSCSI-enabled network storage appliance, the IP Storage 200i, supports networks of storage products built using TCP/IP instead of the Fibre Channel communication standard typically used by high-end storage area networks (SANs).

The move highlights a serious investment by IBM in the new iSCSI standard, which is an Internet version of the small computer system interface (SCSI) technology used to connect hard disks and other devices to computers. IBM is working with Cisco Systems to make iSCSI a standard approved by the Internet Engineering Task Force. (Cisco acquired iSCSI developer NuSpeed in July.)

iSCSI is a form of IP storage: IP storage systems send block-level data over an IP network, and iSCSI in particular transmits native SCSI commands over the IP stack. iSCSI lets a corporate network transfer and store SCSI commands and data at any location with access to the WAN or, if transmitted over the Internet, at any location with Internet access. It also allows smaller, localized SANs to be built on common Ethernet infrastructure. Consequently, iSCSI puts SANs within reach of a broad, mainstream market.
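
Schematically, the layering amounts to putting a SCSI command descriptor block (CDB) inside an ordinary TCP payload. The Python sketch below illustrates only that idea; it is not the real iSCSI wire protocol, which defines 48-byte headers, a login phase, and sequence numbering. The target address is a placeholder, though TCP port 3260 is the port registered for iSCSI.

    # Illustration of iSCSI-style layering: a SCSI READ(10) command
    # descriptor block carried as the payload of a TCP connection.
    # This is not the real iSCSI PDU format; it shows only the concept
    # of sending native SCSI commands over TCP/IP.
    import socket
    import struct

    READ_10 = 0x28                    # SCSI READ(10) opcode
    lba, num_blocks = 1024, 8         # starting logical block, blocks to read

    # 10-byte CDB: opcode, flags, 4-byte LBA, group, 2-byte length, control.
    cdb = struct.pack(">BBIBHB", READ_10, 0, lba, 0, num_blocks, 0)

    # A real initiator would log in to an iSCSI target on TCP port 3260;
    # the address below is a documentation placeholder.
    with socket.create_connection(("192.0.2.10", 3260), timeout=5) as sock:
        sock.sendall(cdb)                       # the SCSI command rides over TCP
        data = sock.recv(num_blocks * 512)      # block data returns the same way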

The iSCSI standard makes it possible to use ordinary TCP/IP hardware instead of specialized Fibre Channel equipment in the development of a SAN. SANs are highly scalable networks of servers and storage devices interconnected through Fibre Channel hubs and switches. Although a SAN is typically clustered in close proximity to other computing resources (say, an IBM S/390 mainframe), these networks may also extend to remote locations for backup and archival storage, using wide area network (WAN) carrier technologies such as Asynchronous Transfer Mode (ATM) or Synchronous Optical Network (SONET).

As more and more IT organizations find their data storage requirements passing the terabyte mark, SANs and related technologies are becoming a hot topic. But it's not the data storage explosion that's driving the growth of this market; it's the data management demands. Hard disk space is cheap and plentiful, and IT administrators rarely give a second thought to adding 50GB here and there when the need arises. But the overhead associated with directly attached storage is considerable, and managing all that data is becoming a nightmare in some organizations, especially when you throw in the high-availability demands of e-commerce.

"As e-business and the Internet move storage from the back room to the heart of the IT network," said Linda Sanford, head of IBM's storage group, "customers are looking to take the islands of storage they have built and create interoperable, open storage networks, be it within a single department or a worldwide enterprise."

IBM has targeted its iSCSI offerings for workgroups and small departments that cannot justify the expense of bringing Fibre Channel-based SANs to the desktop, but which already have the infrastructure in place to support TCP/IP.

But IBM has also made it clear that it is not abandoning its larger enterprise customers that depend on Fibre Channel solutions. In addition to the IP Storage 200i announcement, the company announced a series of storage technology enhancements and customer-focused solutions, including native Fibre Channel for IBM's 3584 UltraScalable Tape Library, a 32 GB cache upgrade to IBM Shark, and a Web-based support program aimed at storage customers.

The company also announced the industry's first open network-attached storage (NAS) gateway, the IBM TotalStorage 300G, which allows LAN-based clients and servers to interoperate with an existing SAN. The idea is to leverage the features and performance of a SAN with the ease and convenience of a NAS product.

According to IBM's Sanford, the IP Storage 200i appliance will be available in the first half of this year at a starting price of around $20,000.