What a mature data & analytics maturity model is, and what it isn’t.
President Woodrow Wilson once said, “I not only use all the brains that I have but all that I can borrow.” Maturity models represent a tool somewhat reflective of this sentiment. They typically consolidate a collection of thinking and experience on a particular topic to answer the questions: What are the acknowledged best practices or key capabilities for a given domain? And how far away is your organization from where it wants or needs to be? However, not all maturity models have achieved the same level of … well … maturity.
Throughout my career, I have developed data/information management and analytics maturity models for META Group and Gartner, and have used others, from the Enterprise Data Council’s to DAMA’s to TDWI’s to MIKE 2.0 (a riff on the META Group model), with dozens of clients. In addition, I have analyzed and compared several maturity models for clients. In doing so, I have learned more than a few things about what makes a maturity model effective, useful, and even indispensable to organizations. These include:
1. Not a sales tool
Maturity models, particularly those from data and analytics technology vendors, can be heavily biased toward their solutions. Or worse, they can be little more than clickbait for capturing contact and other data about your organization. Even maturity models from consultancies can slant toward their own key capabilities. Ensure the model you use is balanced and even includes maturity indicators that don’t necessarily align with the vendor’s messaging.
2. What to do, not how to do it
Maturity models should not be confused with methodologies. Methodologies can be highly valuable, but they prescribe and detail particular procedures and approaches. A good maturity model might imply certain best practices, but it is not a work plan for your project.
3. Leading indicators
A good maturity model should include ways of assessing those activities, capabilities, behaviors, or other indicators that typically beget successful projects, not necessarily indicators of current or past success. I have come to believe that all indicators should be positive behaviors or observations, not negative ones. In this way, the model can be used prescriptively (see #7 below) and will help prevent finger-pointing.
4. Benchmarking against peers
A maturity model should be able to capture and compare maturity scores against peer organizations within your industry, geography, and company-size range. Very few, if any, in the data and analytics space do this yet.
5. Longitudinal and pro forma assessments
The maturity model should enable an organization to compare itself period over period (e.g. annually) to identify areas in which it may be improving or degrading. As well, a good maturity model enables an organization to establish desired target levels of maturity over multiple time horizons, e.g. one, three, and five years.
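The period-over-period comparison described above can be reduced to a simple computation. The following sketch is purely illustrative; the dimension names, scores, and function are hypothetical, not taken from any particular maturity model:

```python
# Illustrative sketch: period-over-period maturity comparison.
# Dimension names and scores are hypothetical examples.
def year_over_year(current, previous):
    """Return each dimension's change and whether it is improving or degrading."""
    report = {}
    for dim, score in current.items():
        delta = round(score - previous[dim], 2)
        trend = "improving" if delta > 0 else "degrading" if delta < 0 else "flat"
        report[dim] = (delta, trend)
    return report

current = {"governance": 2.4, "architecture": 3.1, "analytics": 2.0}
previous = {"governance": 2.1, "architecture": 3.3, "analytics": 1.8}
print(year_over_year(current, previous))
# governance improving (+0.3), architecture degrading (-0.2), analytics improving (+0.2)
```

The same structure extends naturally to pro forma targets: compare current scores against a `targets` dictionary keyed by time horizon instead of against last year’s results.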
6. Partial and combined assessments
Expect your maturity model to be able to aggregate assessments from multiple parts of the organization and allow those with knowledge of a particular area (e.g. data governance) to participate in completing that part of the assessment. In addition, the model should enable scoring by individuals or teams. Of course, any good model has indicators and aggregate scoring grouped into a half dozen or more key dimensions.
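Aggregating partial assessments might work along these lines. This is a minimal sketch with made-up team names, dimensions, and scores, assuming simple averaging across whoever completed each section:

```python
# Illustrative sketch: combining partial assessments from different teams.
# Team names, dimensions, and scores are hypothetical.
def aggregate(responses):
    """Average indicator scores per dimension, skipping dimensions a respondent
    did not assess (so partial assessments still count where they apply)."""
    by_dim = {}
    for response in responses:
        for dim, score in response.items():
            by_dim.setdefault(dim, []).append(score)
    return {dim: round(sum(scores) / len(scores), 2) for dim, scores in by_dim.items()}

team_a = {"governance": 3, "data quality": 2}
team_b = {"governance": 4}  # only assessed the area it knows
print(aggregate([team_a, team_b]))  # {'governance': 3.5, 'data quality': 2.0}
```

A real tool would likely weight respondents by role or expertise rather than averaging uniformly; the point here is only that partial inputs can still roll up into dimension-level scores.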
7. A range of analyses
Ideally, the maturity model tool should be able to generate analytic insights, such as disparities in which the organization is highly mature in one dimension but much less so in another. Such imbalances can be as important as, or more important than, overall maturity scores. In addition, a maturity model should include a variety of diagnostic and prescriptive analyses, as well as the ability to do self-service analytics against the assessment data.
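The imbalance idea above comes down to a simple spread calculation. Everything in this sketch (dimension names and scores) is illustrative:

```python
# Illustrative sketch: measuring imbalance across maturity dimensions.
# Dimension names and scores are hypothetical.
def imbalance(scores):
    """Spread between the strongest and weakest dimensions; a wide spread
    can matter as much as the overall average."""
    return max(scores.values()) - min(scores.values())

dims = {"governance": 1.5, "architecture": 4.0, "analytics": 3.0}
print(imbalance(dims))  # 2.5
```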
8. Comprehensive and unified
A good maturity model asks not just a handful of questions but includes dozens or even hundreds of indicators grouped into a range of key dimensions. While these indicators should be exhaustive, they should not be specific to any one kind of organization or environment. Also, using separate maturity models for data management, analytics, and data governance will only confuse your organization. Today, data and analytics go together hand-in-glove.
9. Usable by independent assessors, or self-service
The maturity model should include a process for an independent, in-depth assessment by a consultancy, typically via a series of interviews and other discovery. Just as readily, the maturity model should include a tool wrapper that enables any organization to deploy it internally on its own.
10. Identify perception differences
Deploying the model assessment, along with its built-in analytics capabilities, should enable differences in maturity perception to be highlighted. Often, differences in perception among different parts of the organization (e.g. IT versus business units, or among different business units), or among different levels of the organization (e.g., executive leadership versus management versus other staff) can be very telling and indicate communication or change management issues.
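Surfacing those perception differences could be sketched as follows, assuming each group rates the same dimensions on the same scale. The group names, scores, and the one-point threshold are all hypothetical:

```python
# Illustrative sketch: flagging perception gaps between two groups that
# scored the same dimensions. Names, scores, and threshold are hypothetical.
def perception_gaps(group_a, group_b, threshold=1.0):
    """Return dimensions where the two groups' ratings diverge by at least
    `threshold` -- a possible sign of communication or change-management issues."""
    return {dim: round(abs(group_a[dim] - group_b[dim]), 2)
            for dim in group_a
            if abs(group_a[dim] - group_b[dim]) >= threshold}

it_view = {"governance": 3.5, "data quality": 2.0}
business_view = {"governance": 2.0, "data quality": 2.5}
print(perception_gaps(it_view, business_view))  # {'governance': 1.5}
```

Comparing executive leadership against management and staff works the same way: run the function pairwise across the groups and investigate any dimension that keeps appearing in the output.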
11. Regular updates
Especially in the realm of data and analytics, techniques, technologies, and even roles change quite frequently. Some maturity models, for instance, don’t yet acknowledge the roles of the chief data officer or data scientist, or the emergence of self-service analytics, cloud technology, or blockchain. Ensure that the maturity model you choose is current, and has been and will continue to be updated incrementally over time. Wholesale updates to maturity models, while admirable, make it difficult, if not impossible, to perform longitudinal assessments with any certainty.
Any maturity model will have indicators that leave something to interpretation and a scoring model that attempts to turn subjectivity into objective metrics. However, it’s important that the model be grounded in reality. If the majority of organizations are scoring a 1 or 2 out of 5, then the best-practice indicators are overly ambitious.
So be sure to consider the maturity of the maturity model you select for your organization and rely on the above as a handy checklist.
About the Author:
Doug Laney is the principal data strategist with Caserta and led the creation of the Caserta Data & Analytics Maturity Model (CDAMM) which incorporates over 200 vendor-independent best-practice maturity indicators. For more information on this maturity model or Caserta’s data and analytics strategy and implementation consulting services, contact: firstname.lastname@example.org.