APRIL 30, 2020
This is part 6 of a 6-part series.
In the first episode, we discussed the 5 most commonly cited challenges in audit analytics. In this episode, we discuss the solution to the 5th challenge - reporting (and timing).
The five items are:
- Access to data - you can't get all the data, or you can't get it quickly enough.
- Low value - the analysis doesn't provide new insights.
- False positives - too many of them; results are overwhelmingly noisy, distracting your focus.
- Superficiality - the results are not deep enough to properly understand and refine the problems or to provide opportunities for improvement.
- Reporting - the results are not available in time for reporting, and the report is not tailored to the audience.
Welcome to the assurance show. This podcast is for internal auditors and performance auditors. We discuss risk and data focused ideas that are relevant to assurance professionals. Your hosts are Conor McGarrity and Yusuf Moolla.
We're talking about the fifth item in our Assurance Analytics Challenge Solution series.
Five of five. So we're talking today about timing and reporting of results from Assurance Analytics.
To recap, we spoke about access to data, low value, false positives, and superficiality as the first four assurance analytics challenges, and we covered the solutions to those over the last four days as individual episodes. So this is the fifth one - reporting and timing - and what we're going to do is talk about the solution to the reporting and timing issue that most audit analytics teams face as a significant challenge in conducting assurance analytics.
Reporting is very important. It's the culmination of all our hard work.
Access to data, and reporting and timing, are the two challenges that most organisations will face. The other three really only emerge once you're getting to the point where you're starting to become a little more mature and starting to use analytics more. But reporting and timing is the second of the items where every organisation will face this challenge. Interestingly, access to data is a very well-known challenge; reporting is not as well known, and there's a reason for that.
Can we tease that out a little bit - why do you think it's not as well known?
What's become very evident - and this is across internal audit and performance audit - is that the quality of the reporting that is produced isn't well understood across audit teams. Audit teams don't know how well they're doing in terms of reporting, and that's because the standard is quite low. Compare that to the broader analytics and visualisation world. In most cases there - now we're talking about outside of the audit world - if you haven't considered your audience and haven't built your results for your audience, your result will largely be thought of as having failed.
Is the difficulty that internal audit or performance audit teams don't reflect critically on their own reports, or that they don't consider their audience in how they formulate their reports?
It's three things. The first, which you mentioned second, is that they don't consider the audience in preparing the report. The second is that they don't reflect on their own reports and work to determine whether the report makes sense, is addressing the issue, and is explaining it in a way that is easy to understand while still having the right level of depth. The third is that the recipients of audit reports are used to a particular standard, and unfortunately that standard is low. Quite often - sometimes it's different, but quite often - the expectation of the reporting that comes out of audit is low, because the reporting received from auditors has been of lower quality.
So we're trying to change that, right?
We have to change that. We're in a situation now where audit outcomes are being compared to the reports produced by other teams - other internal teams in the case of internal audit, and other agencies in the case of performance audit. If we're not going to raise our game - thinking about our audience better and producing results that actually explain to the audience what they need to know, in a way that is easy to digest - then we may end up with deliverables that are simply not read any more. That understanding of the audience is important.
The second thing is the amount of time spent producing a deliverable - an audit report, or a dashboard in the case of some performance auditors. As we've been saying, a lot of time, effort, and money is being spent on delivering that, and if it is going to the wrong audience, it's not actually delivering any benefit. Let's think about it from a performance audit perspective. We've seen so many dashboards produced by performance auditors over the last few years, and the question we ask is: who is the audience for these outputs? We can't put our finger on it, because either it's not explained, or when you look at what the data is actually saying, you can't figure out who the audience might be. It seems quite jumbled up - it might address a little bit of what person A needs and a little bit of what person B needs, but it doesn't actually address what any one person is looking for. So what you'd be doing there is creating something that is not usable. And as performance auditors, we evaluate other agencies on what they do as it relates to economy, efficiency, and effectiveness - if we're producing outputs like that, we're not actually delivering on any of those ourselves.
Yes, I was looking at an annual report from one of the Auditors-General in Canada - the Office of the Auditor General of Alberta, in 2018. They had a new auditor general come in, Doug Wylie, and he set out four priorities to demonstrate the value proposition of the office. The fourth of those priorities is to enhance - I'll read it out - "enhancing how we work by evaluating and improving our own operational efficiency, economy and effectiveness by focusing on continuous improvement of how we conduct our work and holding ourselves to the same standards we hold to those we audit."
So that's like a performance audit of the performance auditors.
Exactly. When they're doing performance audits, they think about efficiency, economy, and effectiveness. So they need to enhance their own work by improving their own measures in that regard. If they're doing that - and I imagine every performance audit office, every auditor general, every supreme audit institution wouldn't be averse to doing it, even if it isn't explicitly set out - then when dashboards and reports are created, they need to be asking: is this the most efficient way to do things? Is it catering to your audience and delivering what you should be delivering - something relevant to the audience you have? Take the example we gave earlier of the dashboard created without a clear definition of who the audience was, where, reading into it, you can't really see who the audience might be because it appears to be aimed at a few different types of individuals without hitting any one of them in particular. Then the question is: how much time did you spend on that? How much money? How many resources? And is it actually achieving the objectives you set out to achieve, if anything?
But I wonder, Yusuf, if too much effort and focus is put on the output - one of the outputs being a dashboard - saying, as part of this performance audit or internal audit, that we need to produce a dashboard, instead of focusing on the dashboard as just a vehicle to show what we found.
I agree with you - I do think it's important to set out what you know. There's nothing wrong with producing nice visuals to make things more readable for people, and to make things interactive as well. But if you are going to produce that dashboard, it can't just be "we have some data - let's put something together that looks nice."
More sophisticated readers of dashboards, or decision makers, would say: well, that's fantastic, this dashboard, but what is it telling me? And then the secondary question would be: if the audit or assurance team is not displaying that information in a way that's intelligible, it's not because they haven't done the analysis - it's because they're not displaying it properly. Is that right?
Yeah, so intelligible is one part - it has to actually make sense to the reader. But it also has to be focused on the reader. You can create something that is very easy to read and very easy to understand, but that actually says nothing. There are a couple of dashboards closer to home in Australia that I've seen recently from performance audit teams. If the intent is to share information, fantastic - that sharing of information and the creation of open data is definitely a positive thing, and we don't want to take away from that element of it. However, when you look at what sorts of questions those dashboards enable you to answer, I don't actually know what it is I'm looking at, or what information it provides that I can do anything with.
Are you the audience, Yusuf? As a sophisticated viewer of dashboards, are you the audience for which that is developed?
Maybe I'm not the audience, but in the absence of the definition of who the audience is, I can't make head or tail of it. So I can't even put myself into the shoes of anybody that might be the audience, because I can't figure out who that is. Because it's neither mentioned explicitly, nor can I work out implicitly what it might be.
So the solution sounds like clearly stating who the audience is for the display of that information. Is that right?
That's right. With this particular challenge - reporting - if we all just worked on knowing exactly who our audience is, what they need to know, and what we can say to give them information that enables them to make some sort of decision, or information they didn't have before that would be useful to know or understand, then we'd be in a much better position than we're in right now. That's the one thing I would say is the solution to the reporting problem we have today.
So in both internal audit and performance audit, should we spend more time trying to understand the information needs of the users of our reports at the outset?
That's exactly what we should be doing, yes. Going back to the report from the Auditor General of Alberta that I spoke about: they actually lay out the performance audits they conducted over the course of the previous year, and they explain each one quite simply. For each audit they have a one-pager that says why they did the audit, what they found, and then outlines very clearly why it is important to Albertans - in their case, because Albertans, the public within Alberta, Canada, are their audience. They explain very clearly why that performance audit and its outcome are important to them. So that's an example of where it's being done well. We've spoken about timing a few times, so we'll leave timing out, apart from saying that we want to be able to start our audit work as early as possible. But in terms of reporting, to recap: we need to understand the audience, and we need to understand what their needs are - what does the audience need to see? That, then, is the solution to the reporting problem. You either know that you have a reporting problem, or you don't - and I'm hoping it's more of the former than the latter. But even if you're still working out whether you want to improve your reporting, the first thing is to think about your audience, and the second thing is to work out what they need to be able to see.
So over the last five days, we've spoken about five items. Do you want to recap for us?
Yeah. So the five challenges we looked at over the past five days were: 1. accessing data - if you can't get all the data, or you can't get it quickly enough, what should you do? The 2nd one was low value - value in the sense that we can't provide new insights, although we may be able to provide some assurance - and how to manage expectations around using data to provide insights. Number 3 was false positives, and Yusuf, you gave some great examples there about what to do when we get too many false positives and how to manage them - if there's a deluge of information, don't run away from it - with some practical examples of how we might deal with that. The 4th one was superficiality - not superficiality in what we're trying to achieve, but in the sense that the results we're seeing probably haven't gone deep enough into understanding the problem and its root causes, and how to get around that. And the 5th one, which we just talked about today, is reporting and timing: what are the main things you need to talk about in your reports, how do you focus on your audience, and how do you make sure you're absolutely hitting the nail on the head with the key messages you need to convey?
Fantastic. That wraps up this five-part solutions series, and we'll talk again next time. Conor, thanks, man.
If you enjoyed this podcast, please share with a friend and rate us in your podcast app. For immediate notification of new episodes, you can subscribe at assuranceshow.com - the link is in the show notes.