Since I came into the market in 2015, there has been a seemingly perpetual conversation about where the traditional term ‘Business Intelligence’ ends and the shiny new term ‘Analytics’ begins.

Freely admitting, without shame, that I myself am no ‘techie’ (after all, I’m a Recruitment Consultant, not a Solution Architect), and having had to learn very quickly that one is not simply a replacement for the other, I sat down with Matthew Cartwright from The Hackett Group to discuss the changes resulting from new technology and terminology alike.

Matthew is a Director within The Hackett Group’s EPM, BI, and Analytics Transformation Team. His primary focus is on Advanced Analytics, both executing use cases with Hackett’s Analytics Lab and developing blueprints to embed service delivery capability within clients. Previously, he led the UK Benchmarking team, diagnosing performance challenges and shaping recommendations based on best practices across all enabling business functions. Before Hackett, Matthew spent 10 years in consultancy across Finance and EPM Transformation.

Matthew, thanks for sitting down with me to talk about what is a very interesting time within the BI, EPM, and Transformation sector as the market moves into the space of Analytics. Naturally, there are some areas where these two terms overlap and others where they are quite different concepts. It often appears to me that the market is still trying to work out where one finishes and the other begins. So, in your view, what separates modern Analytics from traditional Business Intelligence? And, as you see it, where do genuine overlaps still remain?

I think that’s a good question to be asking, and truthfully a genuine overlap does remain. At Hackett we typically talk about the ‘Analytics Maturity Curve’. This is a way of examining the capability that Analytics delivers to a client. The four stages we refer to are ‘Descriptive’, ‘Diagnostic’, ‘Predictive’ and finally ‘Prescriptive’.

Typically, ‘Business Intelligence’ refers to a tool-set that delivers against the first two of those maturity stages. That is to say, answering ‘What happened?’ and then providing an initial insight into answering ‘Why did it happen?’. The answers to these questions in any given setting are driven by the information model that Business Intelligence supports, often complemented by some visualisation capabilities.

Analytics, conversely, incorporates the entirety of the Maturity Curve, and allows you to start answering ‘What is going to happen?’ or ‘How do we affect what we want to happen?’. At this point, you begin moving into machine-learning techniques, other artificial intelligence techniques or advanced time-series methods, and that for me is when you’ve moved into modern Analytics.
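To make that jump concrete, here is a minimal sketch in Python; the data, libraries and model are purely illustrative assumptions rather than Hackett’s tooling. The same sales series is first summarised to answer ‘What happened?’, then extrapolated to answer ‘What is going to happen?’.

```python
# Contrasting two points on the maturity curve over one invented sales series.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical monthly revenue for a single product line.
months = np.arange(1, 13).reshape(-1, 1)
revenue = np.array([100, 104, 103, 110, 115, 113, 120, 126, 125, 131, 138, 136])

# Descriptive: 'What happened?' - summarise the period just ended.
print(f"Total: {revenue.sum()}, monthly mean: {revenue.mean():.1f}")

# Predictive: 'What is going to happen?' - fit a simple trend and extrapolate
# three months ahead (a stand-in for richer time-series or ML models).
model = LinearRegression().fit(months, revenue)
future = np.arange(13, 16).reshape(-1, 1)
print("Forecast, months 13-15:", model.predict(future).round(1))
```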

The reason I say there is a genuine overlap is because I view ‘Business Intelligence’ as referring to a set of technologies and ‘Analytics’ as referring to a set of techniques. I look at it in a similar way to how I look at Mathematics – there are techniques that make up Mathematics, and some are more or less complex when it comes to delivering core capabilities. It’s like asking what separates ‘mathematics’ from ‘using a calculator’ – one’s a tool-set and one’s a technique that may use that tool-set, and if you’re doing particularly advanced mathematics you’ll need something more advanced than a simple desktop calculator.

So, based on this overlap, BI remains an important aspect of a complete Information Delivery Architecture?

Absolutely. You can’t be disparaging about what is referred to as ‘BI’. It’s no bad thing that so many people are focusing on Predictive and Prescriptive Analytics – they should, to keep up with where competitors will no doubt go – but a lot of value is often left on the table that could be captured just by executing your Business Intelligence well. Your diagnostic capability is always going to be important.

And therefore Analytics is here ultimately to build upon BI rather than to replace it?

Yes. It’s part of a spectrum of capabilities; you need to be able to do it all.

Based on that, how should a firm seek to gain maximum benefit from Analytics now that Analytics is considered a topic in its own right, rather than a subset of BI or other areas of Technology?

Another complex question! At The Hackett Group, we set out our thinking on ‘How to Succeed with Analytics’ in five key elements.

The first is to set a vision. This means understanding how Analytics capability would be executed in terms of the service delivery model within your organisation. You can approach this in a similar fashion to how you would build a service delivery model for any other function within your business – aligning technology, service design, organisation and talent. That vision needs to ask ‘How do we appropriately manage our demand pipeline for Analytics?’, and this ultimately means building out a pipeline of use cases that will benefit from these techniques, but also recognising where business value can be expected and prioritising accordingly. Once you take that capability to the internal customers within an organisation, there may well be more use cases that can be serviced with existing talent that the organisation already has. Just as an IT function has to have very strong demand management for its projects, Analytics has to have the same, and that’s part of the service delivery model.

The second key element is ‘Proving the Potential’. This means using Proof of Concept and minimum viable product development techniques to solve real business challenges and prove that the analytics models can add value to the business situation. To do this successfully, you need lab-like capabilities and rapid, low-cost delivery capabilities. Particularly in the modern business environment, it’s important to prove the value to a firm so it is confident investing in this capability.

The third key element is ‘Focusing the Demand’. At Hackett, we align this to the Performance Driver Tree when guiding clients whose Finance functions are using Analytics to enhance performance management. It’s important to think about the information delivery channels so, depending on the nature of the Analytics, this is where we can often cross back into the BI space and think about the stakeholders and how they are going to receive that information. In simplified terms, we might ask a client, for example, ‘Is this Analytics for an operational tool? Is it Analytics to support a one-off, ad-hoc insight? Is it Analytics to govern an ongoing process?’ This will have an important impact on how that demand is focused and prioritised.

The fourth key element is ‘Democratising Data’. This focuses on making sure that organisations use flexible data platforms that support multiple analytics use cases. Multiple teams of data scientists and engineers need data platforms they can all work from, though in my view without restricting too far the tools they can apply to that data. In addition to democratising the data, it’s important to recognise that not every Analytics use case is necessarily going to be solved within a single centre of excellence. While the centre of excellence model is likely to be a very strong component of most organisations’ Analytics, it’s likely to be combined with strong communities of interest and tools that enable business partners to start delivering some of their own Analytics. So, it is critical to ensure that data is available through different, purpose-specific channels to ensure long-term success.

And the fifth key element is ‘Developing the Talent’, which addresses a key limitation facing organisations now, as there is a shortage of critical talent in this area. How this talent is mixed and teams are assembled is extremely important, too. When we look at the skill-sets required for delivering Analytics use cases, there is a very broad range needed. Trying to hire people who embody all of these skills, ‘Analytics Polymaths’ if you will, is a bit of a red herring. These people don’t exist (beyond exceptional cases), so there’s a need to identify and form multi-disciplinary teams on Analytics projects. This isn’t just limited to IT functions either; it needs to be understood across an entire business, and this means emphasising a strategic workforce plan around Analytics and aligning it with any recruiting activity and internal training and development plans. In our case, through the Hackett Institute we’ve developed a Certified Enterprise Analytics Professional training programme to bring a broad range of stakeholders up to speed on what it means to be an ‘Analytics Professional’.

I think those five key elements offer a really good insight into how a firm can gain maximum benefit from Analytics. Leading on from that, the next natural question is: what do you see as the most important considerations for a firm embarking on its first major analytics project? And, tied in with that, how can consultancies offer the best value to their clients?

It might sound obvious, but you need to bring everything back to the business problem you’re trying to solve. Don’t start with the technology, or even with a particular technique or algorithm – start with the problem and work from there. Daniel Jeavons from Shell spoke at our European Best Practice Conference and I think he summed it up best when he distilled things down to ‘solve the engineering problem last’. 

In terms of offering value to our clients, Hackett’s approach focuses on two major solutions. The first is directly helping clients solve their own advanced analytics use cases; the key value-add here is our highly experienced team of senior data scientists, who deliver a wide variety of advanced Analytics use cases using a wide range of techniques, including machine learning. The second is helping clients design end-state service delivery models for their Analytics capabilities and setting out the roadmap to deliver them. This draws on Hackett’s rich experience in developing and implementing operational strategies across business functions.

Of course, we also combine these offerings to set strategy and prove the potential behind it. That is helped by our Hackett Analytics Lab, which means clients don’t need to over-invest in technology straight out of the gate; instead, they enjoy more of a ‘try-before-you-buy’ approach, with new tools demonstrated to them through application to their own business challenges.

You mention the ability to try different technologies there, which is naturally an advantage of going to a firm that is multi-vendor. You and I have met a few times at Oracle events, though, and I’d be interested in your view on OACS. Although it’s now a more mature product, I see it as a technology which often remains under-utilised. Given many reading this blog will be from the Oracle community, what would you say are the main reasons for firms to invest in OACS in 2019?

I tend to agree with your premise that it is an under-utilised technology and, if we go back to the start of the conversation and return to the Analytics Maturity Curve, there’s a lot of excitement about the Predictive and Prescriptive aspects here, but actually Descriptive and Diagnostic delivery through good BI remains very important. OACS delivers very strongly in terms of that capability, including with Visualisation. My own background includes a long history with Visualisation, and I continue to believe that many firms still under-leverage it as part of Business Analytics. But now OACS is combining that classic Visualisation and BI delivery with machine-learning techniques, both in terms of analytic insight and to accelerate some of the data preparation stages. That’s quite exciting when you think about increasing self-service channels for delivering Analytics.
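As a generic illustration of what ‘machine learning accelerating data preparation’ can look like (a hypothetical sketch, not OACS itself), a model can fill gaps in a dataset by learning from the complete rows, rather than relying on hand-written cleansing rules:

```python
# Hypothetical example of ML-assisted data preparation, not OACS itself:
# estimate missing values from similar rows instead of hand-coded rules.
import numpy as np
from sklearn.impute import KNNImputer

# Toy dataset: columns are [units_sold, unit_price]; two prices are missing.
data = np.array([
    [10, 2.5],
    [12, 2.4],
    [11, np.nan],
    [40, 1.9],
    [38, np.nan],
    [42, 2.0],
])

# KNNImputer fills each gap using the nearest complete rows.
print(KNNImputer(n_neighbors=2).fit_transform(data))
```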

If we bring this back again to some of the earlier points, whilst I think the Analytics Centre of Excellence will be a key engine for delivering advanced techniques, I think it will also be a key engine for community building. Community building, in turn, will require tools that easily enable Citizen Data Scientists, and these may be people like business partners who are doing their own exploratory data analysis and even building localised predictive tools. These tools will need to have intuitive user interfaces and capabilities, which are coming together in OACS. Natural language processing is a good example, as the interaction with the tool starts to behave more like the digital personal assistants we have on our phones.

Ultimately, the key reason to invest is that there will be a host of users in your organisation who can benefit from more direct access to analysis on top of their data, and it’s not always realistic that a Centre of Excellence takes on every case.

I think that’s a really interesting answer and, actually, it leads me into an extra question. You mentioned the uptake of new technology and the inherent shortage there’ll be of people with experience of new tech as it’s released. How should firms, be they consultancies or end-users, go about sourcing for those first projects with new technology, at a time when hardly anyone in the market can rightfully lay claim to those skills on their CV? Or, at the very least, very few can, and certainly not enough to satisfy demand.

That’s a hard question, but I think it comes down to getting the balance right and understanding there’s always a range of criteria to be considered. I think the stakeholders who will leverage the technology need to be involved in the hiring process, and if you’re looking at tools for mass audiences, like OACS or similar, proven vendors will offer a better range of what you need.

When it comes to advanced analytics tools for delivery more in the centre – so, where you’ve got centralised data scientists – I think there needs to be a relationship between your talent pool and your preferred ways of working. In fact, I think it’s important not to take too much of the technology choice out of the hands of the data scientists.