APRIL 28, 2020
This is part 3 of a 6-part series.
In the first episode, we discussed the five most commonly cited data analytics challenges in audit.
In this episode, we discuss solutions to the second challenge - low value.
The five items are:
- Access to data - you can't get all the data, or you can't get it quickly enough.
- Low value - the analysis doesn't provide new insights.
- False positives - too many of them; results are overwhelmingly noisy, distracting your focus.
- Superficiality - the results are not deep enough to properly understand and refine the problems or to provide opportunities for improvement.
- Reporting - the results are not available in time for reporting, or the report is not tailored to the audience.
Welcome to the Assurance Show. This podcast is for internal auditors and performance auditors. We discuss risk- and data-focused ideas that are relevant to assurance professionals. Your hosts are Conor McGarrity and Yusuf Moolla.
Now, in recent episodes, we've been talking about five assurance analytics challenges. In the previous episode, we talked about solutions to challenge number one, which was access to data. So today we're going to talk about solutions to number two, which is low value. Then, in subsequent episodes, we'll talk about challenge number three, false positives; number four, which will be around superficiality; and number five, around timing. But today we're going to tackle number two, and that's low value - where the data doesn't provide any new insights.
We're always looking to provide better value to management, right? So if we think we're not able to provide the right level of insight, what can we do about it? The first thing is to think about how you identify your hypotheses upfront. If you're using standard analytics test libraries, they were developed in the nineties or the early two thousands; they're old hat. They can be useful to identify certain common tests that you might run, but if you have particular audit objectives or assurance objectives, their ability to add value is fairly limited. If you've never conducted procurement analytics within your organisation and this is the first time that you're doing it, then maybe you can use those analytics test libraries to help you out. You can sometimes use them as a bit of a bottom-up approach, but really you want to go top-down. What is it that you're trying to achieve from the assurance work that you're doing - the audit topic - and what are you going to use analytics for? See if you can identify those hypotheses upfront and focus them very specifically on your organisation's unique strategic objectives. If you work with the team to come up with what you can use analytics for - hypotheses that relate to those objectives - you're in a much better position than just running a standard set of 15 procurement analytics tests because there's a routine for it. That's a very old way of doing it. If you do it that way, you may come up with some useful insights the first time around, but that value will not be there the second time you try to do something similar - and I don't mean for the same subject. You need to start working out how to identify hypotheses so that you can use analytics on an ongoing basis across audits.
In the performance audit world, you won't really have analytics test libraries. You may have a few, but quite often you need to identify those hypotheses upfront. So low value in that instance is probably a bit of a misnomer for performance audit; this challenge probably applies more to internal auditors. The next thing that you can do is to share lessons among the team. Sharing lessons among the team helps you to start from a different position. If you have worked on an audit before and used a particular set of data, or explored a subject area that may be adjacent to the one that you're now exploring, and you're able to share those lessons and that information among the team, you're in a situation where you can avoid repeating some of those insights - nobody wants to see the same thing twice, and seeing the same thing twice can be seen as low value. You also want to avoid repeating issues that may appear to be valuable at first but don't turn out to be real opportunities.
If it's not causing us to think deeply about how we set up these analytics regimes - if it's not hurting our brains trying to identify what we're trying to achieve - would you say that you then have to really reconsider the value of that particular test?
There are two things, right? The first is that you need to work out what value will mean in terms of conducting the test or conducting that piece of analysis. There are probably two areas of value. One is where you're providing additional value to management - they haven't been able to identify something before, and it's a new insight that they haven't seen. That's the one. The other is where you conduct a particular piece of analysis in order to determine what level of assurance you're able to provide. Sometimes you're just not going to be presenting results, or the results don't show anything. The challenge with that is you don't always know what that's going to look like up front, so you want to go a little bit wider. But there's a bit of a misconception that if you're going to find nothing, there's no value - and that's not true. If you find nothing, then the value that you're providing is in the fact that you found nothing. You're able to show that the particular subject area is being handled well by management, and you can provide assurance on it. However, if you make a big song and dance about the fact that you're using analytics and all you can do is provide assurance, then there may be questions asked. It absolutely isn't a bad thing; just be careful about how you communicate what you're going to be able to provide as value, because different people see value in different ways. For you, value might be gaining assurance; for other people, value might be new insights, so you need to be a bit careful about that. If you're able to share that information about what's been done before amongst the team, you can then de-duplicate the work and not have to go through the full analytics lifecycle to get to the same level of insight that you had before.
The other thing is what you do when, in conducting audits, you identify risks or opportunities that are not relevant to the audit. The second point we talked about was really drawing on risks and opportunities that may have been identified before the audit actually began, but this is relevant because it's almost a continuous cycle that you need to be in. If you identify any risks or opportunities that are not relevant to a particular audit, but that you think might be relevant to a future planned audit or to a separate audit team, then you want to raise those, and that helps in terms of the input into the audit that you're conducting at the time. This is a little bit more advanced, and you probably need to already be in a cycle of audits to be able to use something like that. The other is where you've done work before in a particular area and you think that either that data or that analysis may be relevant to your current audit. If you're sharing information between audits and you've identified a particular audit where certain data was used or certain insights were identified, and you think about how you can link insights from the data in that audit to the audit you're conducting now to provide some sort of new insight, then that again can help increase the level of value that you provide. Quite often, management look at their individual silos or their individual subject areas, and they usually know those very well in most cases. As an auditor, though, you're able to see matters that go across the organisation for internal audit, and across the public sector for performance audit. So if you're able to bring different data sets together to show the relationships between subject areas, you're able to provide a different level of value.
And if you have already used a data set for a particular purpose before, and you can quite easily incorporate it, or the outcomes of that analysis, into the new piece of work that you're doing, you're then able to provide a different level of value.
One of the sources of data that we've used in the past that can deliver exactly those sorts of insights is census data, or any of that sort of data that's collected by a central statistics bureau or the like - and most of this stuff is open and available to anybody. You may have looked at particular statistical data gained through the census for a performance audit, but you see in there that there's a lot more information that might be useful to future performance audits, albeit from a slightly different perspective. So you're analysing some of that data, and potentially joining it with other data, to arrive at a particular conclusion in relation to a performance audit, but while you're in there, you see all the possibilities that exist for using that data from a slightly different angle.
With performance audit it's a little bit easier, because a lot of that open data is relevant to the topics that you're going to be focusing on. It's a little bit harder for internal auditors because, yes, there is value in open data, but the level of value that you're able to generate is a little bit lower than it would be for performance audit. That's because, for performance audit, the open data is often just data that would have been organisational or proprietary, except that it was deemed valuable enough to be put into the public domain. With internal audit - looking particularly outside of the public sector for now - you don't often have that same extent of data in the open that would be directly relevant to the work you're doing for internal audit purposes. There is still some: there's mapping data, and you can still use census data in certain circumstances, particularly where you're dealing with households or individuals in some sort of B2C business. But the level of use is a little bit lower. For performance audit, absolutely, that open data is vital.
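To make the idea of linking open data with organisational data concrete, here's a minimal sketch in Python using pandas. The data sets, regions, and column names are entirely hypothetical - the point is simply that joining an internal measure to an open population figure yields a rate that neither data set shows on its own:

```python
import pandas as pd

# Hypothetical organisational data: claims processed per region
claims = pd.DataFrame({
    "region": ["North", "South", "East"],
    "claims_processed": [1200, 800, 450],
})

# Hypothetical open census data: population per region
census = pd.DataFrame({
    "region": ["North", "South", "East"],
    "population": [60000, 50000, 15000],
})

# Join the two data sets on region, then derive a per-capita rate -
# a view that neither the internal nor the open data provides alone
combined = claims.merge(census, on="region")
combined["claims_per_1000_people"] = (
    combined["claims_processed"] / combined["population"] * 1000
)

print(combined)
```

In this sketch, the "East" region turns out to have the highest rate per 1,000 people despite the lowest raw count - the kind of cross-data-set insight the discussion above is describing.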
Four things on low value. The first is: throw away your analytics test libraries. OK, don't throw them away, but don't rely on them completely.
Don't roll them out as a limousine when in fact they may be something entirely different.
Yeah, make sure you're identifying your hypotheses upfront and linking them very specifically to your organisation's strategic objectives. The second thing is sharing results among the team - sharing the data and the results of analysis that have been conducted. The third is identifying other risks and opportunities and feeding them into subsequent audits. And then the last one is continuous improvement - using previous data and previous analysis to supplement the audit work that you're doing, such that you can link data that's not ordinarily combined for new insight.
And this is a bit of a bonus. The last thing is: value isn't just about providing new insights - value can be in actually providing assurance. So you don't necessarily have to have some spectacular finding. Quite often you can use the analytics that you do to show that everything is working well, and that can be just as valuable as providing new insights. Again, that depends on the way in which you communicate what you're going to be using analytics for and what you hope to achieve. So communication, as in the previous episode on access to data, becomes quite important. How are you communicating what you're doing? And that's how you overcome the low value challenge.
Looking forward to catching up tomorrow, when we'll be discussing some of the challenges around false positives that are derived from our analysis. I'm looking forward to that. Chat then.
If you enjoyed this podcast, please share with a friend and rate us in your podcast app. For immediate notification of new episodes, you can subscribe at assuranceshow.com. The link is in the show notes.