News

Can data quality elude commoditization?

Now that IBM, Microsoft and Oracle have built (or are in the process of building) robust ETL facilities into their flagship relational databases, the commoditization of ETL is all but certain. At least it would seem so. Whether data quality tools will follow the same path is a more open question.

"Data quality is still a relatively new area in tools," says Philip Russom, senior manager of research and services with TDWI. Unlike ETL, Russom says, the data quality space hasn't yet experienced the erosion of pure-play market share by encroaching commodity vendors. He points to ETL powerhouse Informatica's own data quality strategy, which--like that of BI giants Business Objects and Cognos--taps data quality technology from third-party partners such as FirstLogic and Trillium.

FirstLogic VP Frank Dravis isn't exactly a dispassionate observer of the data quality marketscape. "I would even position [being a pure play] as a place of strength," he says. "Microsoft in their new [SQL Server 2005] Integration Services...is indeed coming out with some basic data quality functionality. OK, that's fine. But that only works on SQL Server 2005," he comments.

Nevertheless, some industry watchers believe mainstream commoditization in the data quality segment is inevitable. "I think there's an ongoing trend to recognize the importance of data quality upfront," says Mike Schiff, a principal with data warehousing and business intelligence consultancy MAS Strategies. "I think [data quality pure plays are] going to be hot commodities, and I think a lot of vendors are going to move to own the technology rather than relying on a partnership."

About the Author

Stephen Swoyer is a contributing editor for Enterprise Systems.