News

Dash on real-time, BI and the road ahead

The real-time enterprise is a concept to be pursued, not avoided. It is an intriguing idea, and our bet is that it will come to life over the coming years. That view was borne out in a recent conversation with Jnan Dash, an eminent technologist who has taken a keen interest in the real-time enterprise of late.

In stints at IBM and Oracle, Dash played important roles in the development of DB2, Oracle8i and Oracle9i. Today, he consults and “acts as mentor” to a number of promising start-ups in real-time enterprise software, networks and related fields. We spoke to Dash as he was preparing a session he will moderate at the Application Development Trends Management Symposium in Boston (August 13–14, 2003).

For CIOs today, the drive is on to cut down on latency and labor, said Dash. He provided a quick view of how we got here.

“We started this thing back in the late ’80s at IBM where two of my colleagues coined the phrase ‘data warehouse.’ In those days, a 50-Mbyte-size data warehouse was considered a big deal. It is as hard to remember those days as it is to recall what life was like before electricity!

"You would take your transactions over time and that would become your data warehouse. The poster child for this was Wal-Mart.”

Star schemas appeared, then OLAP and multidimensional data views. At times it got silly, Dash remarked.

“We got into a funny debate of ROLAP vs. MOLAP. What a waste of time. Who cares as long as I can give you the information in four different ways. You are the sales manager and you want to see sales by region. You are the product ‘guy’ and you want to see forecast by product type.

"Then there was another group that said ‘This pie-in-the-sky warehouse is too big a risk.’ We’d like to do this departmental thing called data marts. So some people got carried away on that path and created too many data marts, and then we were back to square one.”

Without an overarching architecture to guide planning, the means used to solve the problem tended to compound it, Dash said.

Then came the Internet, click-stream analysis, customer profiles and the like. Another force that came into this picture was the packaged application.

“The packaged apps were mostly designed for clerks who might do order entry all day. High-level executives and knowledge workers were left out,” he noted.

So IT started to pursue the notion of key performance indicators, which were rolled up to the top executives.

“Now we have gone through the Internet boom and bust -- everybody’s gloomy, but the truth is that’s a very natural cycle. It is nothing new. Too many companies ran into this space [and] now we are saying: ‘Let’s be careful and figure out what this is all about.’ And this is the beginning of the notion of the real-time enterprise.

"I meet with CIOs, and they tell me: ‘The pressure on me is to cut down latency and labor. Everything that took four hours should happen in four minutes. Anything that took four days, I’d like to do it in four hours,’" said Dash.

Dash tells technologists weighing the risks and rewards here to visualize two circles sitting side by side. One circle represents operational and transactional elements such as ERP, CRM and mainframe data. The other represents a snapshot of that data: the data warehouse, or analytical, sphere. As the two circles move to partially overlap, Dash suggests, the real-time enterprise appears.

It is the frequency with which the analytical snapshot is refreshed that defines how much “real time” is involved. A company pursuing a strategy like Wal-Mart’s may have to opt for what Dash calls “extreme crispness” in its snapshot. Others may rightly decide day-old data will do for certain functions.
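How that refresh frequency plays out can be pictured with a small sketch. The code below is a hypothetical illustration (the extract function, data structures and interval are assumptions, not anything Dash describes): the analytical snapshot is simply replaced on a schedule, and the length of that schedule is the knob that decides how “real-time” the analytical side really is.

import time

REFRESH_INTERVAL_SECONDS = 4 * 60  # e.g. "four minutes instead of four hours"

def extract_operational_data():
    """Hypothetical stand-in for pulling from ERP, CRM and mainframe sources."""
    return {"pulled_at": time.time(), "rows": []}  # illustrative only

def refresh_snapshot(warehouse):
    """Replace the analytical snapshot with a fresh extract."""
    warehouse["snapshot"] = extract_operational_data()

def snapshot_staleness_seconds(warehouse):
    """How old the analytical view is at this moment."""
    return time.time() - warehouse["snapshot"]["pulled_at"]

warehouse = {}
refresh_snapshot(warehouse)

# In practice the refresh would run continuously, e.g.:
# while True:
#     time.sleep(REFRESH_INTERVAL_SECONDS)
#     refresh_snapshot(warehouse)

Shrinking the interval pushes the snapshot toward Dash’s “extreme crispness”; widening it accepts older data in exchange for less load on the operational systems.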

ADT Contributing Editor and Columnist Tony Baer has taken a close look at these issues. Last year, in ADT’s pages, Baer posed some of the questions IT managers must ask as they weigh the pros and cons of going real-time.

In “Analyzing data in real time” (ADT, April 2002), Baer asked whether adding current transaction data to the list of analytical targets is really worth all the associated development headaches. His other questions included:

• Is it truly necessary to supplement managed query environments, star schema databases or OLAP cubes with applications that also draw some real-time transactional data into the mix?

• Will infrastructure upgrades, such as those incurring added hardware expenses, throw the cost-benefit ratio of the real-time project out of kilter?

• Is real-time data inherently less accurate?

• Could it throw analysts off more generally?

The issue can perhaps be stated too baldly, but the upshot is telling. If, for example, you want to know a lot about a customer in real time, gaining that capability will cost you something, and the customer had better spend enough, over enough time, to justify it. Thus, Baer states that the most important factors in deciding whether to conduct real-time analysis exist “in direct relationship to what the customer is worth to the business.”

About the Author

Jack Vaughan is former Editor-at-Large at Application Development Trends magazine.