Best Data Visualization and Dashboard Management Practices in Self-Service BI Programs

I've seen many organizations run into difficulty defining governance and best practices for managing the lifecycle of BI dashboards. Perhaps you have deployed a self-service BI program: data scientists have access to tools, IT has deployed servers to publish completed dashboards, and employees are eager to leverage visualizations and become more data driven. Job done, right?

Not so fast. Flash back 10-20 years to when business users became proficient with MS Excel, MS Access, and more advanced BI solutions. Perhaps too proficient, and you have an organization of spreadsheet jockeys and a data landfill instead of a landscape. Has the organization ended up with an explosion of spreadsheets and reports? Ever have an issue where decisions were made based on a report with buggy calculations? When you open a report, do you have questions about what the columns mean, where the data comes from, or what the logic is behind an aggregation?
This is common, and it occurs because organizations fail to put basic governance practices in place around data, reports, dashboards, visualizations, and analytics.
I stress the word basic: not overbearing. But recognize that a self-service BI program without basic data governance may provide value, yet it will slow down before achieving its full potential.

Best Practices in Self-Service BI Programs

So let me suggest some starting points.

  1. Define a life cycle - I strongly suggest organizations manage dashboards like applications. They are developed, they need to be tested, there needs to be some documentation, users need to be trained, they need to be published, feedback from users needs to be gathered, and enhancements should be prioritized. This doesn't need to be as onerous as it sounds, but having these disciplines ensures that the dashboards developed provide enough value to be worth the effort. Testing, documentation, and training ensure that the dashboards developed are leveraged by a wider audience and hopefully prevent the duplication of effort in creating similar or derivative dashboards.
Implications for IT
    • Data scientists need a central place to store source files: spreadsheets, Tableau workbooks, Qlik applications, etc. Ideally these should be stored somewhere files can be versioned and tagged.
    • Servers should be configured to handle the equivalent of dev, test, and production.
    • IT should consult with data scientists to consider tools, templates, and storage of documentation.
  2. Documentation should focus on data flows, data definitions, calculations, aggregations, and known data quality issues. Data scientists joining the organization need a strong understanding of their starting points, while dashboard consumers need to be able to use the dashboards and interpret the results. Organizations need to define documentation standards for these audiences and when in the lifecycle the documentation should be updated.

  3. Define style guides covering layouts, control choices, common components, visualization types, color palettes, and other design considerations so that there is consistency between dashboards developed by different data scientists.

  4. Testing should focus on insights, calculations, and usability. Do the results look reasonable? Are calculations producing expected values? Are the dashboards usable and intuitive, or complex and clumsy? The lifecycle should be designed for iterative review and feedback. Dare I say agile data practices?

  5. Governance practices should target reviews of the quantity and redundancy of dashboards developed. This is really important for avoiding landfills: if everyone develops their own versions of dashboards that largely perform the same analytics, the sheer quantity will make it more difficult for users to navigate and may produce conflicting results. It will also add complexity when data models or software systems need upgrading.

  6. Measure usage and impact, because data visualizations and dashboards that aren't being used should be phased out.
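To make the versioning discipline above concrete, here is a minimal sketch of archiving a dashboard source file before it is promoted. All paths, file names, and the tagging convention are hypothetical, and a real version-control system (or the BI server's own revision history) is usually the better choice; the point is the discipline of never overwriting a published source file.

```python
import shutil
from datetime import datetime, timezone
from pathlib import Path

def archive_workbook(workbook: Path, archive_root: Path, tag: str) -> Path:
    """Copy a dashboard source file into a versioned archive folder.

    The archived name embeds a tag and a UTC timestamp so older
    versions can be recovered and compared. (Hypothetical convention.)
    """
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    dest_dir = archive_root / workbook.stem
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / f"{workbook.stem}__{tag}__{stamp}{workbook.suffix}"
    shutil.copy2(workbook, dest)  # copy2 preserves file metadata
    return dest
```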
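As one hedged example of testing calculations, a dashboard's headline figure can be recomputed independently from the raw source rows and compared against what the dashboard displays. The field names and tolerance below are hypothetical:

```python
import math

def total_revenue(rows):
    """Independent reference calculation from raw source rows."""
    return sum(r["units"] * r["unit_price"] for r in rows)

def check_dashboard_total(dashboard_value, rows, tolerance=0.01):
    """Return True if the dashboard's headline number matches the
    independently recomputed total within a rounding tolerance."""
    return math.isclose(dashboard_value, total_revenue(rows), abs_tol=tolerance)
```

Checks like this can be rerun whenever the underlying data model changes, catching calculation drift before dashboard consumers see it.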
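Measuring usage can be as simple as flagging dashboards that nobody has opened recently. A sketch, assuming the BI server's audit log has already been reduced to a last-viewed date per dashboard (the 90-day cutoff is an arbitrary assumption):

```python
from datetime import date, timedelta

def stale_dashboards(last_viewed, today=None, cutoff_days=90):
    """Return dashboards with no views within the cutoff window.

    `last_viewed` maps dashboard name -> date of most recent view
    (None if never viewed); stale ones are candidates for retirement.
    """
    today = today or date.today()
    cutoff = today - timedelta(days=cutoff_days)
    return sorted(
        name for name, seen in last_viewed.items()
        if seen is None or seen < cutoff
    )
```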

Evolve the Practice

Not all of this needs to be defined up front. Define these and other practices as needed.


  1. One of the biggest hurdles to this type of ideal and desired structure is the impossible-to-avoid new analysis project, developed on the side and in the shadows. These projects crop up as ad hoc one-offs and are under extreme time constraints. They ultimately deliver the desired result quickly, and then become an "expected" resource. There will always be a need for off-the-cuff results, but the trick is being able to identify when they need to be properly reevaluated and reengineered through the formal process - instead of trying to push them through as-is. They become maintenance hogs and end up costing you more time and money than you thought you saved. To avoid this, you need management buy-in to spend money on something they view as "working" and "already completed". The cycle of the scenario you so accurately described - a growing data landfill, buggy calculations, and undefined data points - comes full circle.

    The process you described is spectacular, and I'm going to compare it to ours and see where we can improve. Meanwhile I'll be thinking of ways to improve communication with decision makers when working with ad hoc projects that carry visions of future reproducibility.

    1. Some very good points. These projects - which could be spreadsheets, databases, or BI dashboards - are all intended to be used once and then get out of hand. As you point out, having a practice to identify these projects isn't trivial, and getting a business owner to fund a transition may be challenging.

      Thanks for the feedback.