Evaluation criteria for dashboards and scorecards
- By Wayne W. Eckerson
- January 1, 2005
Many vendors offer application development environments (ADEs) for building dashboards and scorecards. Some solutions are largely canned, prescribing a certain look and feel, while others present developers with a flexible portal interface that they can populate and arrange with considerable freedom.
Dashboard- and scorecard-specific ADEs should support:
Specialized visualization components. These include speedometers, gauges, dials, maps, and other icons that visually depict performance against plan.
Dashboards evaluate performance against metrics using predefined goals. Dashboard developers define a range of acceptable performance goals across multiple time intervals. To support this, an ADE needs a strong rules engine that lets developers define high and low markers of performance for each metric. Ideally, the rules engine should have a simplified Boolean interface so that end users, not developers, can define and manage the rules.
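The idea of high and low performance markers can be sketched as a simple threshold rule. This is a minimal, hypothetical illustration, not any vendor's actual API; the class and metric names are invented.

```python
# A minimal sketch of a threshold-based metric rule; the names here are
# hypothetical, not a specific product's rules-engine API.
from dataclasses import dataclass

@dataclass
class MetricRule:
    """Low and high markers defining the acceptable range for one metric."""
    low: float   # below this value, performance is unacceptable ("red")
    high: float  # at or above this value, performance meets the goal ("green")

    def evaluate(self, value: float) -> str:
        """Map a measured value to a simple traffic-light status."""
        if value >= self.high:
            return "green"
        if value >= self.low:
            return "yellow"
        return "red"

# With a simple Boolean/threshold interface on top of the engine, an end
# user could maintain a rule like this without writing code.
on_time_delivery = MetricRule(low=0.80, high=0.95)
print(on_time_delivery.evaluate(0.97))  # green
print(on_time_delivery.evaluate(0.85))  # yellow
```

A real rules engine would also scope each rule to a time interval (month, quarter, year), as the paragraph above describes.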
Alerts and agents.
The rules engine should also support event management. That is, it should let developers and end users define rules about when and how they should be notified when a given metric exceeds its goals (i.e., alerts), as well as when and how to initiate automated actions based on those alerts (i.e., agents). Visual alerts should be accompanied by text that explains the problem, a report that users can click to see the underlying data, and a URL to initiate additional action, such as refreshing a report or displaying contact information for someone to notify. The rules engine should also accept events from third-party systems.
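The alert-versus-agent distinction above can be sketched as follows: alerts are messages queued for people, while agents are callbacks the engine fires automatically. All names in this sketch are illustrative assumptions.

```python
# Hypothetical sketch of event management: alerts notify users, agents
# run automated actions. Class and metric names are invented.
from typing import Callable

class EventManager:
    def __init__(self) -> None:
        self.alerts: list[str] = []                           # messages for users
        self.agents: list[Callable[[str, float], None]] = []  # automated actions

    def register_agent(self, action: Callable[[str, float], None]) -> None:
        self.agents.append(action)

    def metric_event(self, metric: str, value: float, goal: float) -> None:
        """Raise an alert and run all agents when a metric exceeds its goal."""
        if value > goal:
            self.alerts.append(f"{metric}: {value} exceeds goal {goal}")
            for agent in self.agents:
                agent(metric, value)

manager = EventManager()
refreshed = []
# Example agent: simulate refreshing a report when the alert fires.
manager.register_agent(lambda metric, value: refreshed.append(metric))
manager.metric_event("support_backlog", 120, goal=100)
print(manager.alerts)   # one alert message
print(refreshed)        # ['support_backlog']
```

Events arriving from third-party systems would simply call `metric_event` (or an equivalent entry point) from outside the dashboard.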
Monitoring performance over time against goals involves time-series analysis, which is computationally intensive. To ensure rapid response times when users access, filter, or drill into time-series tables and charts, ADEs should store metrics, rules, and results in a local repository. This lets the ADE apply rules to historical, time-series data without repeatedly querying the source database and bogging down performance.
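The local-repository idea is essentially a cache in front of the source database. The sketch below is a hypothetical illustration; the class name and data are invented.

```python
# Sketch of a local repository that caches time-series results so rules
# can be reapplied without re-querying the source database.
class MetricRepository:
    def __init__(self, source_query):
        self._source_query = source_query  # callable that hits the source DB
        self._cache = {}
        self.source_hits = 0               # how many times we actually queried

    def series(self, metric: str):
        """Return the cached time series, querying the source only once."""
        if metric not in self._cache:
            self.source_hits += 1
            self._cache[metric] = self._source_query(metric)
        return self._cache[metric]

# Simulated source database returning monthly values for any metric.
repo = MetricRepository(lambda metric: [100, 104, 97, 110])
repo.series("revenue")
repo.series("revenue")    # second call is served from the local repository
print(repo.source_hits)   # 1
```

Repeated filtering and drilling then operates on the cached series rather than bogging down the source system.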
Drill to detail.
Dashboards should enable users to drill down three levels: from graphical performance icons to metric tables/charts to detailed transaction data.
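The three drill levels can be pictured as nested data: a status icon on top, chart-level aggregates below it, and transaction rows at the bottom. The metric names, keys, and values here are all made up for the sketch.

```python
# Illustrative three-level drill path: status icon -> metric chart/table
# -> transaction-level detail. All data is invented for this example.
dashboard = {
    "on_time_delivery": {
        "status": "yellow",                      # level 1: performance icon
        "by_quarter": {"Q1": 0.93, "Q2": 0.86},  # level 2: metric table/chart
        "detail": {                              # level 3: transaction data
            "Q2": [("ORD-1007", "late"), ("ORD-1031", "on time")],
        },
    }
}

def drill(metric, quarter=None):
    """Each call descends one level deeper into the same metric."""
    entry = dashboard[metric]
    if quarter is None:
        return entry["by_quarter"]      # from icon to chart/table
    return entry["detail"][quarter]     # from chart/table to transactions

print(dashboard["on_time_delivery"]["status"])  # yellow
print(drill("on_time_delivery"))                # quarterly values
print(drill("on_time_delivery", "Q2"))          # individual orders
```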
The purpose of most dashboards is to foster better communication between managers and staff, not to punish staff for poor performance. The metrics themselves communicate what is important to top executives, namely strategy, and how to implement it at each level. Conversely, staff can communicate back what the numbers really represent, including historical context and upcoming trends and events. A good dashboard, therefore, enables users to attach comments to metrics and reply to those comments in a threaded discussion. It should also let users create activity workflows, such as distributing a report to a group of people for their review and comments.
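A threaded discussion attached to a metric can be sketched as a tree of comments. This structure is illustrative only, not a particular product's data model.

```python
# Minimal sketch of comments attached to a metric with threaded replies.
class Comment:
    def __init__(self, author: str, text: str):
        self.author = author
        self.text = text
        self.replies: list["Comment"] = []   # nested replies form the thread

    def reply(self, author: str, text: str) -> "Comment":
        child = Comment(author, text)
        self.replies.append(child)
        return child

# Attach a comment thread to a metric, as a dashboard might.
comments = {"churn_rate": []}
root = Comment("manager", "Why did churn spike in March?")
comments["churn_rate"].append(root)
root.reply("analyst", "A pricing change; see the attached report.")
print(len(comments["churn_rate"][0].replies))  # 1
```

An activity workflow (e.g., routing a report for review) would layer task assignment and status on top of the same attachment idea.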
The Balanced Scorecard methodology requires organizations to map strategies and initiatives to metrics, and map causal relationships among metrics. This ensures that you have created metrics that work synergistically to achieve strategic goals. Whether or not your organization uses the methodology, an ADE should support the ability to map causal relationships among metrics and between metrics and strategy.
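Causal relationships among metrics amount to a directed graph, with edges pointing from drivers to outcomes and ultimately to a strategic objective. The metric names and edges below are invented assumptions for illustration.

```python
# Hypothetical strategy map: directed edges capture assumed causal
# influence among metrics and up to a strategic objective.
causes = {
    "employee_training": ["service_quality"],
    "service_quality": ["customer_satisfaction"],
    "customer_satisfaction": ["revenue_growth"],  # strategic objective
    "revenue_growth": [],
}

def downstream(metric: str) -> set:
    """All metrics and objectives a given metric ultimately influences."""
    reached = set()
    stack = list(causes.get(metric, []))
    while stack:
        node = stack.pop()
        if node not in reached:
            reached.add(node)
            stack.extend(causes.get(node, []))
    return reached

print(sorted(downstream("employee_training")))
# ['customer_satisfaction', 'revenue_growth', 'service_quality']
```

Traversing the graph this way lets an ADE show which strategic goals a low-level metric ultimately feeds, which is the synergy check the paragraph describes.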
Customization and personalization.
Every dashboard should be customized to a workgroup or individual so that users need to monitor only the small number of metrics relevant to their roles and tasks. Users should then be able to personalize their views via a portal interface so that they see metrics, documents, and other relevant information in a manner that suits their preferences.
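The customization-then-personalization sequence can be sketched as a role-based default view filtered by individual preferences. The role and metric names are invented for this sketch.

```python
# Sketch: customization assigns metrics by role; personalization then
# applies individual preferences on top. All names are hypothetical.
ROLE_METRICS = {
    "sales_manager": ["pipeline", "quota_attainment", "churn_rate"],
    "support_lead": ["ticket_backlog", "first_response_time"],
}

def build_view(role, hidden=()):
    """Start from the role's metric set, then drop metrics the user hid."""
    return [m for m in ROLE_METRICS[role] if m not in hidden]

print(build_view("sales_manager"))                         # role default
print(build_view("sales_manager", hidden={"churn_rate"}))  # personalized
```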
About the Author
Wayne W. Eckerson is director of education and research for The Data Warehousing Institute, where he oversees TDWI's educational curriculum, member publications, and various research and consulting services. He has published and spoken extensively on data warehousing and business intelligence subjects since 1994.