APRIL 30, 2020
This is part 5 of a 6-part series.
In the first episode, we discussed the five most commonly cited challenges in audit data analytics. In this episode, we discuss solutions to the fourth challenge: superficiality.
The five items are:
- Access to data - you can't get all the data, or you can't get it quickly enough.
- Low value - the analysis doesn't provide new insights.
- False positives - too many of them; results are overwhelmingly noisy, distracting your focus.
- Superficiality - the results are not deep enough to properly understand and refine the problems or to provide opportunities for improvement.
- Reporting - the results are not available in time for reporting, or the report is not tailored to the audience.
Welcome to the Assurance Show. This podcast is for internal auditors and performance auditors. We discuss risk and data focused ideas that are relevant to assurance professionals. Your hosts are Conor McGarrity and Yusuf Moolla.
Today we're talking about the fourth item in our solutions to assurance analytics challenges: superficiality.
So this is where the results of the analytics are not deep enough to understand the specific problems that have been identified. We might know, through the analytics work we're doing, that there is some sort of challenge that needs to be overcome or some sort of finding that needs to be addressed, but we don't go deep enough to properly understand and refine the problem down to an item, or set of items, that can be well understood. So we're explaining the problems very broadly, or explaining the situation very broadly, without sufficient context to enable people to really understand what's going on and to properly provide opportunities for improvement.
So does this come back to your planning, and, when you're starting an assurance project, making sure that you clearly define the problem at the outset?
There's a couple of things. One of them is: what data are we actually using as part of the assurance work that we're doing? Are we using high-level data, or are we able to get down to the proper level of granularity to be able to analyse and then explain what the results actually are? So it's probably a combination of what we do in planning, what we then do in execution, and finally a little bit in reporting. But it does come down primarily to execution. Execution, though, is really driven by how well you planned what you're going to be doing. So, of course, everything comes back to planning. But really, it's about how much of the data you're using, and how you're using that data to get to a more granular and accurate set of results.
Yes, so it's always a good idea, I think, to check in with the business as you go along. If we're talking about superficiality of results, then to check in and say: this is what we're seeing; is this real?
As we go, we need to be validating assumptions. We need to ensure that what we're looking at is correct as early as possible, and that means talking to the business as soon as we find a problem. It's not always that straightforward, because sometimes we want to dig in a little to understand exactly what's going on before we go back to the business with: this is a problem we found, can you help us resolve it?
So one of the issues we find is with the purists who suggest that checking in with the business somehow diminishes your independence, or your ability to make an independent decision. My view would be that you can still maintain your independence while also validating what you're seeing.
I really haven't come across that sort of thinking around checking in with the business recently. You want to make sure that what you're seeing makes sense, unless there's a reasonable explanation that can be given up front, or there's some information you just don't have access to. I know that sort of consideration was around a couple of years ago, but I can't see how it makes a difference. In determining what you're looking at, you really do need to speak to the business and work out: is this making sense, or are we just not seeing something? Or do we just not have access to some data that we should have access to?
People these days, particularly management, are time poor, as we all are, and one of the more useful reporting developments has been the growth of dashboards in the past few years. But that also brings risks when you're doing assurance projects.
In terms of assurance work, one of the challenges is that when you're looking to start or accelerate an analytics programme, it's quite easy to say: I want to get a little bit fancy, bring out some cool toys, and look at a dashboard that has some nice visuals on it. There's nothing wrong with that. There are very good uses for visualising your data, both for exploring the data during the audit and for explaining the data afterwards. But if all you're doing is taking a data set, putting it into a visualisation tool, and saying "I've now done analytics", without actually cleansing the underlying data, understanding it, bringing different data sets together, and then analysing it, you really are going to end up with superficial results. So the first solution was to validate assumptions and outcomes as early as we can; if we see problems, we validate those with management. The second solution to the challenge of superficiality is this: if you're emphasising dashboards, or visuals, as the mainstay of your analytics approach, tone that down and start focusing on real analytics work, which is about bringing different data sets together. There are a couple of good examples where just putting something into a dashboard, without thinking about what the data is or what other data might be used in conjunction with it, doesn't get you to the right answer.
We're almost doing a disservice to our audit committees, our boards, or decision makers if we're just grabbing information, not cleaning it, not testing it, and putting it into a dashboard rather than doing any analysis that might actually explain what we're portraying. So in some instances we might be doing more harm than good.
There's a couple of things there. We are definitely doing them a disservice by not giving them the level of analysis that we're able to provide. We're also doing the audit function a disservice by not showing what it is that we're actually able to do.
So the use of dashboards in our assurance activities is very important. The risk is that we rely too much on them to put forward information, rather than drawing on the depth of analysis that we actually do behind the scenes.
The third potential solution is similar to what we explained in a couple of the previous solutions, yesterday and the day before: sharing data and information among the team. Because we're bringing different data sets together, and because we're starting with a data set that we've potentially already used and understood, we're able to start at a deeper place and refine our analysis, rather than starting at zero, saying we don't have enough time, and going down the dashboarding route of just bringing some visualisations together.
Sharing lessons among the team to maximise the benefits of analytics, as we outlined in principle three, is the third way to get over the superficiality challenge. The most contemporary example of that is the data we're seeing now around the pandemic. There's no way to be insensitive about this topic, but just to think about it in terms of data: all of us are seeing mapping-related dashboards at the moment, where data is being brought in from the WHO and a few other places to show how many cases there are and how many deaths we're seeing. Those are important metrics, and we want to be able to track them. But you get a very different perspective when you overlay all of that with the populations of the individual countries, to see how many cases and how many deaths there are per capita. That's just an example of where you see a significant difference compared with looking at raw numbers, which is what a lot of dashboarding is about. When you actually go to the effort of cleansing some of that data and then blending it (you've cleansed it to be able to blend it) with population data, and portraying that population overlay on your pandemic-related data, you then have a very different view: the rate of spread starts looking very different. Now, this is not to say that this is the way health professionals and others are supposed to do this; it's just a recent example where there's one view that is purely the underlying data, and another where you're actually bringing different data sets together to understand the scope of the problem and what it looks like in the context of the individual countries.
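To make the per-capita overlay concrete, here is a minimal sketch in Python with pandas. The country names and figures are hypothetical, purely for illustration; real work would pull case counts from a source such as WHO and population data from a census or statistics agency.

```python
import pandas as pd

# Hypothetical raw case counts - the kind of figures a dashboard shows as-is
cases = pd.DataFrame({
    "country": ["Country A", "Country B"],
    "cases": [100_000, 20_000],
})

# Hypothetical population data to blend in
population = pd.DataFrame({
    "country": ["Country A ", "country b"],
    "population": [300_000_000, 10_000_000],
})

# Cleanse for blending: normalise the join key so the two sets actually match
cases["country"] = cases["country"].str.strip().str.lower()
population["country"] = population["country"].str.strip().str.lower()

# Blend the two data sets on the cleansed key
blended = cases.merge(population, on="country", how="inner")

# The overlay: cases per 100,000 people gives a very different picture
blended["cases_per_100k"] = blended["cases"] / blended["population"] * 100_000

print(blended)
```

On these illustrative numbers, Country A has five times the raw case count, but Country B's per-capita rate (200 per 100,000) is six times Country A's (about 33 per 100,000) - the same reversal the raw-numbers-versus-overlay discussion above describes.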
Yes, so context is really important, and I'll just pick up on something you said there, Yusuf. Sometimes being able to put the results of your data analysis into context takes a little bit of time.
People are understanding more and more how long some of these things take. And yes, some of them do take a little longer than you'd want them to, but there are lots of different techniques and tools you can use nowadays to make those pieces of analysis go faster, so it's not necessarily going to take a long time. Sometimes it does, but quite often it doesn't. If we think about just blending two data sets together, and cleansing them for blending, that's not going to break the bank; we're not talking about lots and lots of effort. Obviously your overall analytics work on a particular audit will take a little bit of time, but adding a few additional things in definitely won't. The challenge you have is if you're purely doing dashboarding as part of your analytics work and you're not following any of the other solutions, so you're not validating any of your assumptions early on, and you're not sharing information between the team. Quite frankly, if all you're doing is dashboarding, there's nothing much that you can share between your audits.
That's right.
It's only when you start using different data sets, and only when you start cleansing data and finding out what the data actually holds, that you have something worthwhile to share. If dashboarding makes up the bulk of your analytics effort, and you're used to spending a small amount of time on what you think is analytics, then that's the benchmark you're going to have. And if you then start needing to do proper analytics and spend more time, you won't understand it. So don't take dashboarding as the thing: spending a couple of days pulling a small data set into a visualisation tool, saying "I've done analytics", and thinking that analytics only takes a few hours, so we should just do this each time. That's a completely false sense, and you're creating a situation where you don't really understand how much effort goes into properly producing analytics that can help provide a better assurance outcome. So that's challenge number four, superficiality. Tomorrow we'll be talking about timing and reporting; look forward to that.
Sounds good, and tomorrow's episode will round out our five solutions to the analytics challenges.
If you enjoyed this podcast, please share it with a friend and rate us in your podcast app. For immediate notification of new episodes, you can subscribe at assuranceshow.com. The link is in the show notes.