MARCH 23, 2020
In this episode we explore open access to data used within internal audit and performance audit teams, with a focus on eight exceptions that may necessitate a level of restriction.
In general, access should be as open as possible, as explained in the previous episode.
But there are eight specific exceptions to consider carefully:
- Cross-border concerns.
- Strict organizational data governance regimes.
- Highly sensitive data.
- The need to maintain Chinese walls.
- Very large audit teams.
- Where specific data is sensitive.
- If there is risk that data will be altered.
- Vulnerability - heightened security/cyber risk.
Welcome to the assurance show. This podcast is for internal auditors and performance auditors. We discuss risk and data focused ideas that are relevant to assurance professionals. Your hosts are Conor McGarrity and Yusuf Moolla.
Hi Yusuf. How are you?
Good thanks, and yourself?
Not too bad. Thank you.
So the discussion today is going to be about exploring open access to data used within internal audit and performance audit teams, with specific focus on eight exceptions that may necessitate a level of restriction. In general, access should be open as much as possible, as explained in a previous episode. But there are eight specific exceptions that we need to consider in determining open access.
Can we get a bit more specific there? Can you talk about an internal audit where it was a requirement to keep the data set closed to particular internal audit team members, as opposed to opening it up more broadly?
If we take the example of a health audit, where we're dealing with medical-related data for individuals. Now, this is not about completely keeping the data set separate; this is about particular fields within the data. Personally identifiable information generally needs to be kept private. But when you get into health data, and in particular health claims data, that can be very sensitive. So if you think about health claims type audits, you'll have a range of data. You'll have personal data, so that's about the individual. You'll have premiums data, to make sure that premiums were actually paid before a claim. You'll have the actual claims data, and then you might have some reference data. Reference data is fine. The personal data is okay to an extent, depending on what's in there. Premiums data is usually okay. It's when you get to the actual claims data that you need to keep some of the claim detail restricted.
Just to jump in there - when you say reference data, what are you talking about?
So reference data would be the data that is common across the data set, regardless of the types of claims that are captured. For example, within claims you may have particular claim line items that have descriptions. In your claims data, you may not actually put the descriptions in - you may not have what you would call the lookup data within that table - so you keep that separate. But that's fairly generic. So you'll say, you know, claim type AT is physiotherapy, or claim type AB is clinical nursing, or whatever. There's different things that you'll put in there. So that's generic reference data that you use.
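As a minimal sketch of what's being described - keeping the generic lookup descriptions out of the claims table itself - something like the following could apply. The claim-type codes echo the examples above, but all names, codes and values here are hypothetical:

```python
# Sketch: claims data with a separate reference (lookup) table.
# Codes, descriptions and amounts are illustrative only.

# Reference data: generic lookup, safe to share broadly.
CLAIM_TYPES = {
    "AT": "physiotherapy",
    "AB": "clinical nursing",
}

# Claims data: carries only the type code, not the description.
claims = [
    {"claim_id": 1001, "type_code": "AT", "amount": 120.00},
    {"claim_id": 1002, "type_code": "AB", "amount": 85.50},
]

def claim_description(claim: dict) -> str:
    """Resolve a claim's description via the reference table."""
    return CLAIM_TYPES.get(claim["type_code"], "unknown")

for c in claims:
    print(c["claim_id"], claim_description(c))
```

Keeping the lookup separate means the sensitive claims records can be restricted while the generic reference table stays open to the whole team.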
One of the problems with opening data up - and I know we've been talking about internal audit, so maybe let's switch a little bit to performance audit. What are the challenges that people usually have with sharing data in the performance audit sphere?
So if you think about performance audits - and a lot of internal audits are actually performance audits, looking at the economy, efficiency and effectiveness of things - probably the most sensitive data in that respect would be the efficiency data. So, how well are you doing things with the least amount of resources? Why that's sensitive is because quite often it can demonstrate that there's opportunity to improve how you do things and make them work better with the current resources. And why is that challenging in terms of sharing? Basically because nobody wants to have their performance shared with other people.
So this is in the public sector we're talking about.
Yeah, mostly, primarily.
And what sort of performance do people want to keep to themselves?
I don't know that they want to keep it to themselves. I think it's because things are funded from the public purse, for example, and it's not always beneficial for that information to make it into the public domain. Now, you might say, Yusuf, we're the taxpayers - of course it's important for that to be transparent - and I would totally agree with you. I'm thinking more of the political domain, in that it's sometimes not appetising for politicians who look after certain portfolios to have performance information in the public domain that may suggest their portfolio is not performing as it should be.
I've got a list of eight exceptions - we could call the matter you just brought up a ninth exception if we like. So let's go through those eight and see if any of them resonate, and whether any of them overlap with it.

Sure. So the first one is around cross-border concerns. This is where we have audit functions within multinational organisations. So you've got an audit function in the UK, an audit function in the US, and another audit function, maybe, somewhere in the east. If you have that situation, you may find that data can't move between those jurisdictions. The other cross-border concern you may have is where audit work is outsourced to offshore processing centres. That's been happening more over the last 20 years - somewhere in the late nineties, early two thousands, is when it started, and in the last 10 years it's been accelerating. In fact, some of that has been shifting back to the original processing centres. But that's a potential risk where the data is sensitive and cross-border transfer is a real risk.
I think that outsourcing of the internal audit function is potentially a topic for the future, where maybe we could dig into it in a bit more detail.
Sure, that makes sense. So, cross-border concerns is the first one. If you do have cross-border concerns - where the data is sensitive and cross-border transfer is a real risk, not just something you throw on to justify closing access - then there's rationale for closing access. The next one is where the organisation has a very strict data governance regime. Maybe this one ties in a little to the thinking around political expectations. So there's a very strict data governance regime in place across the organisation more broadly, and the audit team may decide to follow that, to align with overall organisational thinking. Some functions do that legitimately. But the thing to be careful with here is that it may limit your ability to provide a strong, independent function.
So can I put you on the spot here, Yusuf, and ask a question? Do you see any particular industries or sector types that have an overly restrictive data governance regime in their internal audit teams, as opposed to others? It's a very broad question, I know.
Certain public sector entity types - defence, the tax office, revenue offices - will have very strong data governance regimes. A public sector organisation that deals with health, and in fact any private sector organisation that deals with health. Anybody that deals with processing of credit card data - large retailers - but that would purely be in relation to PCI DSS. And then, obviously, financial services firms. Many financial services firms have very strict data governance regimes in place. In particular, what we see is not necessarily retail banking or retail insurance, but where you have high net worth individuals - so, wealth management. That's where you'll have very strict data governance practices in place. There may be a few others; if I think of them, I'll come back to it. So, back to independence. If you follow the data governance regime of the organisation strictly, does that impair your independence?
I don't think I understand the question. How would it impair independence?
So if the reason you are making a decision around sharing data within your team is that the organisation has a very strong data governance regime and you want to follow it. Does that mean that you are following a regime, following a process, following a governance expectation to the detriment of the level of independence you're able to provide?
To be honest, I've never contemplated that previously. Shooting from the hip, I would say the answer is possibly. At the risk of talking vaguely, I would suspect that any internal audit team or performance audit team that's using data would have to operate within the organisation's data framework. As to whether or not that impairs independence, I'm not sure. If a team was doing something that either wasn't spoken about in the data governance regime or potentially went contrary to it, I would suspect that the reason and rationale would have to be very well documented.
Yes. So my view is that if you are restricting access across the team purely in order to comply with the organisational data governance framework, then there may be a threat to independence - in particular, if that restriction of access prevents you from doing a better job overall; for example, from linking matters up between different subject areas appropriately.
But is that an independence thing? Or is that an ability to perform your function?
Well if your ability to perform your function is being degraded by your need to follow a particular framework that the organisation has set, then your independence is impaired.
I think I've got a contrary view to that because to me, independence is about being disassociated from having an interest in what you're reviewing.
Being independent means that you are not subject to restrictions that others are subject to, and that you can go into areas that others won't be able to go into, so that you can provide those charged with governance with a broad view. If you are following an organisational framework or expectation, and that doesn't allow you to go into those areas, you have an impairment to your independence.
But if you want to circumvent, go outside, or work around established entity-wide protocols or data governance architecture, then you need a really good, well-documented decision and rationale.
What you would then do is apply for an exemption from the overall data governance framework. Okay, so that was number two. The third one is where you have highly sensitive data - where all or most of the organisation's data is classified as highly sensitive. Now, all or most: it's got to be more than, say, 75 or 80% of the data that is highly sensitive, and you're not going to find that in traditional sectors. So, a singularly focused entity. We're talking about maybe Uber 10 years ago, if they had existed back then. They would have started off with ride sharing, and that's all they cared about, so all of their data related to that: they'd have some travel-related data, some trip-related data, some passenger-related data. It was very singularly focused. Whereas if you think about your typical public sector organisation or your typical banking organisation, they would have a range of data and so wouldn't meet that threshold. That's the third one. The fourth one is where audit functions regularly provide consulting or advisory services, and there's a need to maintain Chinese walls.
I'm really interested in this one, because this is something we see coming up time and time again in the public sector sphere, in that audit offices around the world, I think it's fair to say, are being called on by their clients - which are public sector departments - to provide advice and insights. How do you define the line between performing an audit function and providing something that's more of an advisory function? That's really interesting.
Okay, so this is definitely a topic for a separate episode and we'll pick that up again. But what we're talking about here is more where individual members of the audit team have been involved in providing advisory services. So where, for example, a member of the Internal Audit team has been seconded to the business to help with control or risk related remediation activity.
Okay, sorry, I misinterpreted.
But again, the risk needs to be clearly articulated here. Even if they were involved in something, that doesn't necessarily preclude them from seeing that data. There may be a few situations where this would be relevant, but we're talking about few and far between, so let's clearly articulate what that risk is.
So presumably somebody who has gone to the business to help implement some controls as a result of an audit can't then come in a year later and be involved in the audit team that may look at remediation of particular findings.
Well, typically you want to avoid that. The timelines vary. Sometimes the business has moved so far beyond that point within a year that some functions will say a year is reasonable, and you can actually go back and audit something you were involved in a year ago - maybe you just can't do it in the first 12 months. It also depends on the level of involvement that person had. So they can't be involved in the audit, yes, but does that mean they shouldn't be able to see the data that is relevant to that audit?
I'd probably take a prudent approach there personally and wouldn't be that comfortable with somebody coming back into the team and auditing something with which they had previous exposure. Even within a 12 month timeframe.
Even if they're not part of the audit team, do you need to prevent them from having access to the data that was used for the audit? So the situation is that John went off to a particular business function. He comes back a year later and an audit is being conducted on that function. He's excluded from the audit because he was involved directly. But does that mean he can't actually see the data that's received?
So it might be beneficial for John, in your example, to see the data, but just not be a decision maker as to how, for example, that audit...
That's the thinking.
Another factor might be - and this is the fifth one - where we have a very large audit team. I was going to use a number, but I'm not.
Come on, give me a benchmark.
So we're talking about where you have more than 80 to 100 FTE - a large team that will potentially be distributed, and where they wouldn't be working very closely together day to day as an entire team. You're not going to have 100 people working closely together day to day. This may be the smallest of these factors, but there are some situations in which larger teams opt to exert higher levels of control. And that's because they're not sure what is happening within some of the smaller teams, whether there are movements among those teams, and other risks that need to be considered. But the thing to be careful with here is that it could just be that management practices are not as optimal as they could be, and that this sort of control - put in because the team is large - is there to overcome some of the sub-optimal management practices that may exist. So, not knowing what your team is doing.
Okay, that's pretty interesting. So are you saying that the bigger the team, the more control over data that potentially exists that should not exist? Let me rephrase that: in your experience, are there restrictive practices in sharing data the larger the audit team?
In my experience, there are restrictive practices even in sharing information, the larger the audit team. And the risk that is usually expressed to back up that restriction - to back up closing access versus opening access - is: we're not sure what people are going to do with this; we're not sure whether people should have access to this. The question with that is: why are you not sure what your people are doing? Regardless of the size of your audit team, you should have the right distribution of management responsibility, accountability and information flow to know what your people are doing and what they might do with data, so that you can articulate that risk clearly.
Okay, so is that more of a management responsibility - having visibility and knowledge of what's going on, even within bigger teams - as opposed to overly restricting how they do their job?
Yes, absolutely. There are three other considerations - we're up to five - that are probably less organisation-focused and more focused on individual audits. The sixth is where you have specific data that's sensitive. You want to be able to open up access generally, but specifically segregate access to that data - we spoke about that before. Number seven is where there's a risk that data will be altered. If we think that the data that's coming in might be altered, we can actually lock down the individual source audit file.
So are you talking there about direct access to systems - where auditors have direct access to data systems? Is that what you mean?
No, not quite, because auditors generally won't have direct access. What I'm talking about is this: let's say we do 25 audits a year. If we execute one audit and maintain the audit file that has all of the source data that was used, all of the analysed data, and the results, and we then provide access to that audit file more broadly amongst the audit team, there's a risk that either the source data or the results will be altered. That could be by mistake, or it could be intentional, by someone who wasn't involved in the audit directly. The simple answer is that you shouldn't give anybody access to make changes. In fact, your audit file should really be locked down in the first place - once your audit is complete, you should be locking it down. But let's say the audit is still live, and you're not locking down all of these files yet. The first thing is that you want to lock down your source data, because nobody should be changing source data; there's no need to do that. So the data that you actually get in from outside, you keep in a read-only form, even for the audit team members who are involved. The second thing is that you can give read access to an audit file to a member of a different audit team, so that there's no risk they will intentionally or unintentionally change that data.
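As one possible sketch of that first step - keeping received source data read-only even for the team - write permissions could simply be stripped from the file after receipt. This is an illustration of the idea, not a prescribed tool, and the file here is a throwaway stand-in for real source data:

```python
# Sketch: strip write permissions from a source-data file after receipt,
# so even the audit team can only read it. The file is illustrative only.
import os
import stat
import tempfile

def lock_down_source(path: str) -> None:
    """Make a source-data file read-only for everyone."""
    mode = os.stat(path).st_mode
    os.chmod(path, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))

# Demonstration: a temporary file stands in for received source data.
fd, source_file = tempfile.mkstemp(suffix=".csv")
os.close(fd)
lock_down_source(source_file)

# After lock-down, the owner's write bit is gone.
print(bool(os.stat(source_file).st_mode & stat.S_IWUSR))  # False
```

In practice this kind of control would usually sit in the document management or audit management system rather than raw file permissions, but the principle is the same: the received copy is never writable.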
I know we're only on number seven, I think. But would your view, Yusuf, be that there should be a custodian of the data that's obtained for a particular audit? Should there be a person in the audit team who's a custodian of that data and makes sure that it's not able to be manipulated by others?
So I think we could do that if it's a smaller audit team, or if the practices are not as consistent between audits. And I'm not saying that in a negative way - quite often you have very dynamic, high-value audit teams where the audits themselves differ in the way they're conducted. That's not a bad thing; in fact, it could be a good thing. If you're in a situation like that, you may want a custodian to look after that particular element. If your audits are structured more consistently, then you really want a process in place rather than an individual custodian. So if you say: this is what we're going to do - we have these five people involved in the audit, and when we have these five people involved, they have direct access to the data to make changes (except for source data, of course), and anybody else is given read-only access - then you don't necessarily need a direct custodian. The manager involved, or the team leader, can determine whether a new person should get read access or not. Should that decision be made on an audit-by-audit basis? I think it's easier to make it more generally - put it in place more generally, and then if there are exceptions on an audit-by-audit basis, you can deal with those exceptions. All right, and then the eighth one is vulnerability. This is a bit of a newer topic; it hadn't really come up a lot before, in my experience anyway. That's where there's heightened risk - security risk, cyber risk, call it what you like - because of the access to the data that the broader team has. If there's knowledge that the broader team has access to data, or if giving access to more people means that you're spreading the risk of the data being breached, then that might be a consideration. But there are other ways to mitigate that risk - other ways to put controls in place.
And one of those is largely strengthening the identification, authentication and protection mechanisms around the audit team's access.
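The process-based rule described a little earlier - named members of an audit get write access to working files, source data stays read-only for everyone, and the rest of the team gets read access - could be sketched roughly like this. All names and file kinds here are hypothetical:

```python
# Rough sketch of a general access rule rather than a per-audit custodian.
# Team member names and file kinds are illustrative only.

AUDIT_TEAM = {"alice", "bob", "carol", "dan", "erin"}  # the people on this audit

def access_level(user: str, file_kind: str) -> str:
    """Return the access a user gets for a given kind of audit file."""
    if file_kind == "source":
        return "read-only"      # nobody changes source data
    if user in AUDIT_TEAM:
        return "read-write"     # involved members work on analysis files
    return "read-only"          # broader team can look, but not alter

print(access_level("alice", "analysis"))  # read-write
print(access_level("frank", "analysis"))  # read-only
print(access_level("alice", "source"))    # read-only
```

Encoding the rule once, as a general policy, is what lets the manager or team leader handle only the exceptions, rather than deciding access afresh for every audit.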
So the traditional posture would be: let's lock down this data set - it's eyes-only for the person or people who need to do some analysis. But it sounds like, based on everything we've discussed today, that that's not necessarily the most efficient or the best way to do it. Is that right?
That's right, yeah. So what we're saying is that there are a number of reasons to restrict access, either broadly or specifically, and we spoke about eight of them just there. Obviously, they'll be in the show notes, so we're not going to repeat the individual items. But yes - unless there's good reason to restrict access, access should be kept open. We spoke last week about why we want to open access up and what the cost of restricting access is, and it's exactly what you said.
Do you see any cultural shift from your observations in how audit teams are treating that data? Are they making it more open across teams now? Have you got any observations to share on that?
So I don't know that there are many audit teams thinking about this, purely because there are many audit teams that are not using data yet. I think the default position is restrict, but in conversations more recently there have been some shifts away from that. I can't say that there's a theme more broadly. What I can say is that conversations we've had recently suggest a shift away from closed access towards open access.
So I'm going to put you on the spot here, in conclusion. Would you say there's more risk to the effectiveness and efficiency of internal audit and performance audit teams in not sharing data than there is in sharing data?
Are we doing similar audits where we've already got information - from last year, the year before, or six months ago - that could inform what we're doing and cut down the audit time or audit resources substantially? We've got to start thinking about some of those things.
That's a good idea - we keep coming up with topics for future episodes, so we're going to have to round it up there before you come up with another one. In summary, everything we've been talking about today is about taking a risk-based approach: not just putting controls in place because that's the easy way, or because that's the way we think is going to reduce risk. We always talk to management about taking risk-based approaches to the exercise of business. We're conducting business too, so we should really be taking some of our own medicine.
It's been really good to get your insights. Yusuf, thank you.
If you enjoyed this podcast, please share with a friend and rate us in your podcast app. For immediate notification of new episodes, you can subscribe at assuranceshow.com - the link is in the show notes.