It’s easy to get caught up in the day-to-day work of Digital Analytics, whether working client-side or as a consultant. You are busy ensuring the data is correct, improving the tracking, creating reports, answering questions and performing analysis on the data. While you want to grow the impact of Digital Analytics on the business and do more exciting work, it is difficult to identify the steps required to do so, or even what the end goal should be.
This is where Digital Analytics maturity models are incredibly useful. They allow you to identify where you are now, where you want to get to and what you need to do to close that gap. They are also useful in setting expectations for how long the process can take.
This webinar reviewed these needs, using multiple maturity models as examples. It went into detail on the use cases of maturity models, the need to keep balance across multiple areas and how to progress at a sustainable rate for long-term success.
The recording of this session is below, and the deck used in the presentation can be downloaded via the button at the end of this blog post.
As a bonus, Stephane Hamel has just updated the self-assessment tool for his Maturity Model; try it out for yourself at https://digitalanalyticsmaturity.org/.
Audience Questions
Do you see situations where data maturity models would change depending on the stage a business is in? Do they apply differently for the scope of that business (growth / profitability)?
This question, after clarification, was about whether startups should treat data maturity differently to established large companies. The simple answer is no: it is equally relevant, and the models work for any company.
There is one key difference, and that is that a newer, smaller, agile company can establish good data maturity practices from the start. The general rule of thumb of one year per maturity stage can be overridden by putting technology, processes, training programmes and a data culture in place from day one.
But claiming to be data-driven and having high-end technology will still fail in the long term if the foundations are not there.
Who needs to be involved in an Analytics Maturity Review at a minimum for it to be useful/effective? And how easy have you found it to get senior business stakeholders involved?
To do a data maturity review properly, you need the data team, the heads of each department that will be using the data and an executive sponsor involved. It isn’t a small undertaking for any decent-sized company.
There is scope, though, to do a quicker review just within the data team, even by a single analyst. The benefit there is that it provides the framework to create the business case for more investment in people, technology and change, which hopefully leads to senior stakeholder buy-in and a proper strategy review.
The challenge in getting senior stakeholders involved is getting them to believe in the link between data maturity (the use of data to make smarter decisions and take smarter actions) and business success. If that link is understood, they understand the need to improve data maturity. If someone is resistant to the use of data, preferring to continue basing decisions and actions on instinct (and knowledge/experience), then it can be impossible to get that buy-in.
Thinking organisationally, as we increasingly analyse unified data sets, will the function become more centralised? Might the existence of an Analytics Centre of Excellence be evidence of maturity?
I would expect any company at or near the top levels of data maturity to have an Analytics Centre of Excellence (assuming the company size allows for this). I think a hub-and-spoke structure, with data specialists spread through the company, is the most likely, and this definitely allows for that centralised function.
Where in this model would one place advanced/customized tagging? I can only see basic tagging in the lower left. For me it would be placed before a stable site analysis or site optimization stage.
This question was referring to the second model displayed, and a lack of detail such as this is one reason I worked on my own maturity model. I actually believe site analysis can be performed with core tracking in place, delivering value sooner.
The advanced/customised tracking would then be based on findings from this initial analysis, so that areas of interest can be analysed in more detail. These answers would then feed into optimisation efforts. The tracking would expand over time as changes are made and the focus areas change.
Basically, the level of analytics tracking should evolve as the organisation’s data maturity evolves.
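To make that concrete, here is a minimal sketch of the difference between core and customised tracking, using Google Analytics 4’s gtag.js event API. The event name and parameters are hypothetical examples for illustration, not a recommendation for any particular setup.

```typescript
// gtag is a global injected by the standard gtag.js snippet;
// declared here so the sketch is self-contained.
declare function gtag(
  command: "event",
  eventName: string,
  params?: Record<string, unknown>
): void;

// Core tracking: with the base gtag.js snippet installed, page views
// are collected automatically, which is enough for the initial site
// analysis described above.

// Customised tracking, added once that analysis shows which
// interactions matter. The event name and parameters below are
// hypothetical examples.
gtag("event", "guide_download", {
  guide_name: "data-maturity-checklist", // hypothetical content identifier
  page_section: "blog_footer",           // where the call to action sits
});
```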
Where would you place attribution modelling in the levels of maturity?
Similar to the previous question, I would place marketing optimisation efforts (of which attribution modelling is only one possible approach, and not my recommendation) at multiple levels of maturity. Using my model as the example, it might be (a small sketch of the modelling difference follows the list):
- Level 2 (Data Informed): report on marketing performance by channel
- Level 3 (Data Competent): review marketing performance at channel level on a last click basis
- Level 4 (Data Savvy): review marketing performance at campaign level against multiple success actions
- Level 5 (Data Practitioner): evaluate each marketing campaign against the purpose for that campaign, testing activity to verify findings
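As a rough illustration of what changes between Level 3 and the later levels, here is a minimal sketch comparing last-click attribution with a simple linear multi-touch model. The channel journey and conversion value are made-up sample data.

```typescript
// Illustrative comparison of last-click and linear attribution for a
// single conversion. The journey and conversion value are sample data.
type Credit = Record<string, number>;

const journey = ["organic_search", "email", "paid_search"]; // touchpoints in order
const conversionValue = 100;

// Last click: all credit goes to the final touchpoint (the Level 3 view).
const lastClick: Credit = { [journey[journey.length - 1]]: conversionValue };

// Linear: credit split evenly across every touchpoint, one of several
// multi-touch models a more mature team might compare and test against.
const linear: Credit = {};
for (const channel of journey) {
  linear[channel] = (linear[channel] ?? 0) + conversionValue / journey.length;
}

console.log(lastClick); // { paid_search: 100 }
console.log(linear);    // each channel receives an equal share (33.33...)
```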
Any more questions?
Let me know your thoughts on Digital Data Maturity, the models I shared and the process for using them. Have you gone through the data maturity process for your organisation? How did it go, and has it led to the results you hoped for? I would appreciate any stories shared in the comments.
If the Digital Data Maturity of your organisation is something that needs work and you need help performing the strategic review, please get in touch. I can talk through how I have worked with organisations previously and how I can help you too.