

Parliamentary Joint Committee on Law Enforcement
09/12/2021
Law enforcement capabilities in relation to child exploitation
ZIRNSAK, Dr Mark, Senior Social Justice Advocate, Uniting Church in Australia, Synod of Victoria and Tasmania [by video link]
Committee met at 10:00
CHAIR ( Mr Simmonds ): I declare open this public hearing of the Joint Committee on Law Enforcement for its inquiry into law enforcement capabilities in relation to child exploitation. The committee's proceedings are public. The committee prefers evidence to be given in public but, under the Senate's resolutions, witnesses have the right to request to be heard in confidence, described as being in camera. It's important that witnesses give the committee notice if they intend to give evidence in camera so we can switch to that mode. I remind all witnesses that in giving evidence to the committee they are protected by parliamentary privilege. It is unlawful for anyone to threaten or disadvantage a witness on account of evidence given to a committee, and such action may be treated as an act of contempt by the Senate. It is also an act of contempt to give false or misleading evidence to the committee. We have a copy of your submission, which we have read thoroughly, and we're keen to ask questions. Would you like to make an opening statement about what you are keen to explore and your takeaways from the submission? We will then leap into questions.
Dr Zirnsak : Thank you for the opportunity to appear before the committee. Without rehashing the entire submission, I will stick to making some high-level points. One is that the sheer volume of child sexual abuse and child exploitation online is so large that the police strategy of simply arresting and prosecuting is inadequate, and law enforcement agencies have repeatedly said that. So our focus has been on the need to look at measures that prevent, deter and disrupt this activity: to get the volume of offending down, to prevent people entering the trajectory of getting into child sexual abuse and to provide a safer online environment for children in the first place so there is less potential for them to be targeted. The eSafety Commissioner's work on Safety by Design is a valuable technological contribution in this space.
I want to emphasise that one of the key measures we have put forward is that technology companies move to identifying their users. Unfortunately, the current public debate around this is exceedingly disappointing: it is framed as though you must have either a completely public identity or a completely anonymous identity. Obviously, you can have an anonymous identity that is outward-facing—people don't necessarily need to know who you are—but the platform provider in all cases should know who you are. The test here is that the platform provider knows, so that if you misuse your profile or your account, law enforcement is able to identify you easily and not waste time doing so.
To highlight the damage done, there is the recent case of Alladin Lanim from Sarawak, Malaysia. He was recently sentenced to 48 years in prison. He started posting child sexual abuse material online in 2007 but, because of his ability to have an anonymous online identity, it was not until this year that law enforcement agencies globally were able to identify him. In that period, he was able to post approximately 10,000 images and videos of child sexual abuse. Australian law enforcement agencies have identified 34 of the children who were the victims of his abuse, but there may have been more beyond those identified. So that is the consequence of allowing people to have these completely anonymous identities rather than a situation where the platform knows who you are, even if you have a public-facing anonymous identity, which is needed for some other reasons. The other thing that I would emphasise here is really a no-brainer—the complaints mechanism that we've suggested. The research by the Canadian Centre for Child Protection indicates clearly that online platforms need to lift their game in the way they provide complaints mechanisms, particularly for survivors to report material that is present on their platforms and also to help identify offenders who are using those platforms.
One measure we didn't cover that I should mention is pop-up messages. I know the Australian Institute of Criminology has just released a paper that talks about pop-up messages, and there was work done at the University of Queensland looking at them. They are used in other jurisdictions. Currently, you can tie them to the requirement that ISPs give people a warning message if they're trying to access a site that is on Interpol's 'worst of the worst' list. The pop-up message warns the person that they're trying to access illegal material. In some jurisdictions, there is the suggestion that you can offer the person a referral: if they have thoughts about child sexual abuse, there are places they can go to get assistance to help them overcome those thoughts and that behaviour. These are things people are experimenting with. I know Facebook announced at the start of 2021 that they were also experimenting with pop-up messages on their platform.
The final thing I would point to is that in this space I am often significantly disappointed with sections of the human rights community who appear to champion the right to privacy over all other human rights. If you read their submissions around online regulation, they will not acknowledge that the abuse of children in the online space is a human rights abuse. I find that truly bizarre. Even when it comes to the right to privacy, which they claim to be upholding, they don't seem to give any acknowledgement to the violation of the privacy of survivors when images of their abuse are being posted online and when platforms are not taking steps to remove the material. So there is more than just the right to privacy of potential offenders; there is also the right to privacy of the victims and survivors of child sexual abuse. I will open it up to questions.
CHAIR: Firstly, thank you for the church's interest. I can tell your submissions are very genuine, practical in nature and well researched too. To pick up on your point about providers having to know the identities of users: obviously we're going that way, or are keen to go that way, with the anti-trolling measures we've recently announced, and you alluded to that debate. I can see how that would have context in the example you used, where somebody is using a social media service to access CAM, but the reality is, I would have thought, that most of them aren't doing it as openly as that; most of them would be in encrypted forums or on the dark web or the like. Does that need to know who a user is apply only to social media accounts, or is it broader than that?
Dr Zirnsak : I would argue it's across all platforms. A recent documentary, Children in the Pictures, made it quite clear—and I'm sure the police witnesses you will have before you will testify more to this; we've known this for quite a while—that police will break into a network and then have to spend a long time trying to figure out who the actual real people are behind the handles used in those child sexual abuse networks. If it were a matter of just going to the provider and obtaining that information—'Here's the handle that's being used. Who is the real person behind that?'—clearly that would speed up detection of offenders and prevent ongoing child sexual abuse. It's really disturbing that that delay in being able to identify an offender means that children can be subjected to ongoing sexual abuse while the offender is being identified.
CHAIR: So it would be a matter of ISPs knowing who each user is, essentially, on top of tech platforms?
Dr Zirnsak : It would be all the technology companies. Effectively, when I open an account of any sort, or when I subscribe, I have to provide my actual identity. None of those systems will be absolutely perfect; we're not naive about that. A really dedicated offender might still go to enormous lengths to conceal their identity and use a false one, but you're going to make it harder. This is the point, the issue of disruption: the more you can do and the harder you make it for people, the more you are going to deter them from doing it and the more you will remove any perceived sense of anonymity that increases their sense that they can get away with this.
CHAIR: That's a good point. I have sympathy for your point that, to a certain extent, penalties can only go so far as a deterrent. Once you have stringent penalties in place, which I think we do, then it comes down to the likelihood of being caught and them being convinced that they will be caught. A number of your recommendations relate to, for example, penalties against tech providers who don't comply with law enforcement—nothing against that in principle. The committee did recent work on the AVM Act, which has a similar framework. I wonder, in putting together its recommendations, how familiar the church was with things like the AVM Act and the new SLAID powers we've given law enforcement.
Dr Zirnsak : Absolutely. I think the parliament has been moving in the right direction on this. What we were highlighting, though—and our suggestion to the committee really is to explore this—is the degree to which the current legislative framework would allow individuals, rather than just the company, to be prosecuted for refusing to comply. Our fear is that if it's just the company that gets hit with a financial penalty, for example, they might still think it's worth their while not to comply with a request, whereas if it's the individuals inside the company making the decision not to comply with a lawful request under the legislation, I think you will find that will motivate compliance more.
I think there is complexity around this—I don't want to paint the big tech companies as all one thing; many of them are neither complete villains nor complete heroes, and they have mixed behaviour in all sorts of ways. I don't doubt the genuineness of some of the measures they are taking to address this. But when I look at the recent exposé in The Wall Street Journal, where you've had insiders and leaked documents from inside Facebook showing enormous delays in complying with basic law—allowing a Mexican drug cartel to recruit hitmen and pay them over their platform and promote their work for five months before taking it down—it suggests that at times there are people inside the organisation who are not highly motivated towards protecting human rights. I think a clear signal that individuals can be held to account would certainly help encourage greater compliance among those who might be resistant.
CHAIR: Who do you think that kind of deterrent legislation should be holding to account? You talk about individuals. Are these, do you think, content managers or, ultimately, CEOs or board chairmen? Who is the individual, in your view, who should be receiving penalties for not acting reasonably enough?
Dr Zirnsak : I think what we said was the decision-maker. I'd be reluctant to see, for example, a contracted Filipino content manager in a subcontracting firm being held to account for following strict instructions about what they should be doing. That would seem inappropriate. The person who needs to be held to account is ultimately the person who's setting the instructions for those employees, so I think who gets held to account is at the level of who's making that decision inside the organisation. That would also then put the CEO and the board potentially on notice because, if they're not sending clear signals to their staff about what the acceptable standard is and the need to comply with the law, then ultimately it will come back to them. If they're sending a signal that refusal to work with law enforcement or obstructing or hindering law enforcement is an acceptable behaviour, then ultimately they would be the ones held to account as well.
CHAIR: Yes. We're going to hear evidence later on today from Victoria Police. They noted in their submissions that their request to, say, Facebook can take anywhere from six months to two years. In these kinds of situations, you would like to see an enforcement regime where the CEO and the board of Facebook are held accountable for those delays.
Dr Zirnsak : If they're the ones creating the delay. If they've sent a clear instruction and it's a middle manager who's not complying with the company's direction, then ultimately that middle manager would be the one. But you'd think that if it were only a middle manager breaking instructions there'd be internal discipline by the company to hold them to account as well. So it does seem to suggest that resistance of that nature goes further up the chain, in terms of who's setting policies that allow that kind of lack of cooperation to occur.
CHAIR: You mentioned this idea of content managers in the Philippines, and I picked up in your submission, too, this concern you have about content managers in third-party countries who might, in their efforts to take down material, be destroying evidence that law enforcement needs. You mentioned the documentary as one example. But, beyond that, have you collected evidence of other examples? How widespread do you think this is? As an issue, it is of real interest to the committee. I think we need to explore it further, and I'd like to know how far you've taken it yourself.
Dr Zirnsak : The private investigators we used have mapped some of the content management firms in the Philippines, and they've not so much found destruction of evidence as yet, but the way in which some of them market themselves is disturbing—'We offer the cheapest possible labour for content management that you'll find anywhere on the globe.' That's not a great sign of how they treat their employees or of expectations. We had actually set aside budget to investigate, and then, unfortunately, COVID hit us. We have an on-the-ground investigator lined up and ready to go, and as soon as restrictions start to ease we will be trying to do some on-the-ground investigation to further verify the kinds of problems we suspect. Also, that documentary, The Cleaners, that I referred to didn't identify which providers were using the content firms where you had the content manager basically alleging that they had destroyed child sexual abuse material without preservation. We did take this up with, for example, Facebook. Facebook gave us an absolute guarantee that their company-wide policy and practice is to preserve evidence and report it to law enforcement agencies or to the National Center for Missing and Exploited Children in the US, and we have no evidence to the contrary to say that that's not what they're doing. It could be other providers using those content managers where evidence is being destroyed. We plan to look at this further, but we also thought it was worth raising with the committee. I come from a church background. The church has failed terribly in dealing with child sexual abuse, where often the allegations were verbal allegations. In this case you've actually got the hard evidence being presented. You're actually seeing the abuse taking place. So to destroy that evidence is, to my mind, far more egregious than someone simply dismissing a complaint being made—not that dismissing a complaint is in any way excusable either; that's completely unacceptable behaviour as well.
CHAIR: Did you say you had undertaken some efforts to map particular companies? Is that a map of providers?
Dr Zirnsak : Yes. We have mapped content management companies in the Philippines that we wish to investigate, and we will look to do that.
CHAIR: So you're not sure whether they are in fact undertaking this practice, but you've made an effort, and had the awareness, to identify them. Is that something that you can provide to the committee, even if it were on a confidential basis? We would keep it confidential.
Dr Zirnsak : We can provide you with a list of content management companies that potentially we'll look to investigate. We're happy to do that. Probably the only issue is that we haven't had the chance to do the investigation yet, because COVID stopped us from getting there.
CHAIR: We understand, and you're not trying to imply guilt just by having them on that list. I understand that.
Dr Zirnsak : Correct. Exactly.
CHAIR: But the list would be helpful, just so the committee can understand the extent to which these companies are used and so we can undertake some more due diligence.
Dr Zirnsak : We're more than happy to provide that.
CHAIR: Finally, before I hand over to the deputy chair—I'm exhausting my time pretty quickly—I wanted to ask about a topic that has been raised by a number of submitters. We're going to be talking about it throughout the day, so I'd like to get your views on the record. It's the idea of putting in place harm minimisation strategies, whether around rehabilitation or similar to drug programs, where offenders, or potential offenders, would hear from victims about the damage that it does. Can I get your view on how you think that would work? I'll be upfront with you and say that I probably have some concerns that we would have to be careful in such programs not to destigmatise the offending too much within the community, because it quite rightly deserves to be stigmatised, but I'm willing to have the discussion with people about how they think those programs would work.
Dr Zirnsak : I haven't kept up entirely with the offender literature in recent years, but we did look at this a while back. It seemed at that stage that it was quite a debated area. One side of the literature said that there are offenders who are non-contact offenders; there's a notion that there are people who simply view material or purchase material online and that they themselves are not contact offenders, so they commit no physical offence against children. The argument has been that, sometimes, when those people get sent, for example, into a treatment course—once they're prosecuted, incarcerated and put in with contact offenders—they have a reaction to contact offenders similar to what any other member of the public would have. They see themselves as different to contact offenders. In their mind, they somehow justify this distorted thinking: 'I didn't do anything wrong. I was just looking at pictures. I didn't actually harm the children.'
Another side of that debate says that those people don't exist; that there aren't non-contact offenders or, if they do exist, they're a very, very small group; and that, in fact, many of those who get classified as non-contact offenders are simply contact offenders who haven't been detected for their contact offences. I've heard that view from law enforcement quite strongly. So there are law enforcement agencies who will very strongly say that offenders—
CHAIR: We certainly understand both sides of the debate. What I'm trying to get at is: where would the church fall? You can offer your own view or whatever you want to do, but do you have an opinion one way or another?
Dr Zirnsak : I think that, if there is an identifiable group of non-contact offenders and there are programs that would help rehabilitate them, we would certainly support those. I think some of the challenges would be—and this is one of the things we've talked about with people—whether they occur in sufficient numbers to justify a program that would target just them. Otherwise, a contact offender who has also engaged in offences online will probably access the standard rehabilitation programs that are available for contact offenders more generally. So I think that's probably the question: does that group exist and, to the extent that they exist, can it be justified to have a program that specifically targets them? There is literature that seems to suggest that programs targeting genuine non-contact offenders are quite effective at reducing offending and recidivism.
CHAIR: Okay. Deputy Chair, I'll hand over to you.
Dr ALY: Thank you, Chair, and thank you, Dr Zirnsak, for appearing today and for your submission. I wanted to start with the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019, which this committee recently reviewed. Some of the evidence that we heard in the review of that act was that the majority—around 90 per cent—of material that's taken down relates to torture and rape and, specifically, child abuse. Can you please comment on the abhorrent violent material act, which compels social media platforms to remove AVM, and its possible effectiveness in addressing CAM?
Dr Zirnsak : We fully support that kind of legislation and see massive utility for it. A proper evaluation would need to be done to measure how well it has performed in addressing child sexual abuse material. As our submission goes into—and this is based on the evidence of the Canadian Centre for Child Protection—you have content hosts who will resist takedown notices and dispute the need to remove material related to child sexual abuse, or who will engage in massive delay in removing that material. Obviously you've got some providers who are very on board and very supportive and will move very quickly to remove material but, as per our submission, there are clearly far too many content hosts who will either delay, not see it as a priority or, in some cases, actively dispute removal of material. That's why a law that requires removal is really necessary to level the playing field, to make sure there is a minimum standard for removal.
Dr ALY: The AVM act requires them to remove material, and if they don't comply, or if they're considered reckless, there are pretty high penalties. One of the things we heard in evidence was that the eSafety Commissioner has never had to issue a notice to any of the big social media companies. Then we heard other evidence that it's usually the smaller service providers who lack the capacity to monitor, to remove content, to do all the things that the larger ones do. Is that also something you found—that it's predominantly the smaller service providers? Or are we also talking about large service providers here—big social media providers?
Dr Zirnsak : I haven't heard it about some of the very large service providers. I think the reputational risk alone in refusing to remove child sexual abuse material would compel them to not be in that space of resisting. I think it's correct to say that it would be the smaller providers. I accept it probably is a capacity issue for some of them, around the speed at which they remove, but, as I indicated, the Canadian Centre for Child Protection has clearly demonstrated there are providers who actively dispute—so it's an ideological resistance to cooperating in that space.
There was some evidence of delay with WhatsApp previously, within the group of companies. Within the large corporations, where they've absorbed other companies, the culture can vary. My understanding is that the culture in WhatsApp has been more towards secrecy, and providing secrecy to their users, than Meta, as the parent company, would have it as their across-the-board policy. I would accept there's variety within that space. Again, that highlights the necessity for that kind of legislation.
Our view would ultimately be: if a company is incapable of meeting such a basic requirement to provide a safe platform and remove material, should they be in this business at all? I don't think we should be tailoring our laws or sacrificing the rights of victims and survivors for the benefit of people who seem incapable of meeting basic requirements, in the same way we wouldn't allow someone to establish a financial institution if they couldn't meet the basic requirements of running that institution appropriately, with the appropriate probity and safeguards over the way they operate. I think the same applies to technology platform providers.
Dr ALY: Following on from that, on page 13 of your submission you state:
In 2019, the Internet Watch Foundation reported that child sexual abuse material was hosted on the following platforms …
I found this quite interesting: 84 per cent of material is on an image host—by far the majority. The next one after that is cyberlocker, at six per cent. Then there are: banner sites; websites; forums; video channels; image boards; search providers; and social networking. Then you go on to give the example of Facebook and WhatsApp. I wanted to ask you about platforms like Snapchat, for example, or platforms where you have small groups of people who can share images with each other. Are you seeing an increase there? Are they being used to distribute and share child abuse material?
Dr Zirnsak : I think there are a couple of things to say about that data. The Internet Watch Foundation data is about more static material. The other thing to remember about that is that it also represents the sample of detected material, so there is a risk that what it's indicating is that material is more easily detected on an image host than on some other platforms. Potentially, if a platform is very good at providing secrecy then detection rates are obviously going to be much lower. We presented the data, but there are a couple of cautions around the interpretation of that data.
Dr ALY: There's always caution around data!
Dr Zirnsak : Then you've got the difference between the hosting of material and images versus platforms that allow networks of people to interact with each other. I think the Virtual Global Taskforce have indicated they have seen an increase in people networking on a variety of platforms. Most of those, I think they have indicated, are in the dark web, but I don't think they're exclusively in that space.
Dr ALY: We've been given some material on this to read. In 2020 the Child Protection Triage Unit received 21,000 reports of online child sexual exploitation, and the AFP charged 191 people with, in total, 1,847 alleged child abuse related crimes in 2020. That's a stark difference—the volume of material that's out there compared to the number of people who are held to account and the volume of material that that applies to. In your opinion, is that largely because of the gaps in legislation that you have mentioned in your submission? Here I predominantly refer to your suggestion of making it an unambiguous offence to hide the identity of an offender.
Dr Zirnsak : The current system is stacked against law enforcement, and investigations end up being slowed down by that inability to identify offenders. This is also why we recommend that the parliament amend, to the degree that's necessary, the international production orders regime. The notion that simply getting data to identify someone can take six to 12 months through a mutual legal assistance request across a border, as currently happens with the existing regime, is just unacceptable. So there is a slowness. Our understanding from law enforcement is that they have many more cases than they can possibly deal with.
On the other side, though, Facebook put out some data at the start of this year. They looked at all the material they'd reported to the National Center for Missing and Exploited Children in October and November 2020, and they found that 90 per cent of it was material that had previously been identified and that half of the complaints they'd reported related to just six pieces of material. So, with those 21,000 reports, what would be interesting to explore is how many were duplicates of the same material—to what degree is that multiple complaints of the same material? It would be more interesting to look at the unique complaints that identified new material that law enforcement was previously unaware of and then look at the gap between that new identification and what law enforcement has actually been able to tackle.
You're right to raise the question. I think, absolutely, the more that this committee can make recommendations that speed up the ability of law enforcement to deal with cases, the more cases they can deal with. But, as I said at the beginning, I think that, even if you did all of that and even if that system was perfect, the sheer volume of offending is still at a level where investigation, arrest and prosecution, in itself, is not going to be enough. Law enforcement itself says that; therefore, we need more measures that prevent and deter right at the beginning so that we're reducing the volume of offending by deterring people from getting into this in the first place. That leaves law enforcement to go after the higher-end offenders, the worst offenders in this space, who won't be deterred or swayed away from this behaviour.
Dr ALY: I have a few questions around deterrence, but I'll stop there and pass on.
CHAIR: Thanks for keeping us on track; that's very kind of you. We'll try to come back to you if we have the time at the end. Pat, do you want to go next?
Mr CONAGHAN: Hopefully, I'm going to follow on from the deputy chair in terms of deterrence. Dr Zirnsak, thank you for your report. It's very detailed and, obviously, you and the Uniting Church have a major concern about child protection, which is good. You've dived deep into the challenge of general deterrence in the online world, at 5.1 of your report. In the legal world, whether it's the states, the territories or the Commonwealth, deterrence is broken down into both general deterrence and specific deterrence. You're nodding, so I see that you understand that there's a difference.
General deterrence, obviously, refers to the general public: penalties are imposed, or consideration is given to general deterrence, to prevent or deter people from committing these offences. That's the first issue. The report seems to conflate general deterrence and specific deterrence—where someone is sentenced and specifically punished as a deterrent to that individual, in terms of jail sentences and ongoing punishment once they're ultimately released. Is there a reason that you didn't separate the two and look at specific deterrence when you were talking about this rehabilitation program, which I don't support?
Dr Zirnsak : Our focus, as I said, is that we're looking to get the volume of offending down in the first place. Our understanding, based on our look at the research, would be that the problem is the number of offenders. If the bulk of the problem is that the same people are offending over and over and over again, then you would argue that the solution is tougher penalties targeting those individuals. If the problem is that there are large numbers of people engaging in this behaviour and the reason they're doing it is that most of them think they can get away with it, our understanding of the criminological literature is that simply punishing more harshly the smallish number who are getting caught doesn't provide a general deterrence to that population who would otherwise offend and think they could get away with it.
We're trying to say that the bigger problem you've got in this space is: how do we create an environment where people think, 'If I engage in this behaviour I will get caught'? I'm postulating to you that, if I'm engaging in this offending behaviour, what's probably going to be more important to my mind is not, 'Will I get five years or 10 years in prison?' but, 'Will I get caught in the first place, and, if I get caught, will I be successfully prosecuted?' I think what the criminological literature suggests is that that is the greater deterrent effect. That's why we talk about general deterrence, because it's getting that volume down.
Then the question you've got is: if you give someone a five-year jail term or a 10-year jail term or a 15-year jail term, to what degree does that deter that individual from reoffending? To what degree does the severity of that penalty affect recidivism for this particular crime type? Our understanding is that that increased severity doesn't do much for recidivism, even among those individuals who are subjected to those sanctions. The Australian Institute of Criminology might have further views on this. They are the people we have spoken to in the past about what works in the recidivism area in this space. Having read the literature previously, my understanding was that harsher penalties beyond those that we already have in place are probably not going to do a lot to reduce recidivism further.
Mr CONAGHAN: I don't know if you have heard the term 'paradox of dispossession'—those who have little to lose don't really care; they will commit an offence anyway because they have nothing to lose. Let's look at the people who have a lot to lose, generally the ones on the boards of the platforms, the service providers, and look at the specific deterrence to them. They are all based overseas; they are not based here. Have you done any research or a deep dive into what international agreements would apply, such as treaty-based penalties for executives who do not comply? For example, could we enact legislation and have a treaty with, let's say, the United States that looked at the executives of these platforms to say, 'If you do not act, then we are coming for you, and here is your specific deterrence in financial penalty, restrictions on your online platform or even jail sentences'?
Dr Zirnsak : I would suggest we already have that for other forms of cross-border crime. If a multinational company has an Australian branch and the Australian branch pays a bribe in an international bribery situation, you would hold the Australian management of that subsidiary to account. These companies have subsidiaries based here in Australia. My guess would be that you would hold the management of the Australian subsidiary to account for their failure to comply with Australian law. That would be my immediate response. How practical that would be might be something to take up with the Australian Federal Police. But I don't think it is uniquely different. Similarly, we have large multinational banks that have branches and subsidiaries here in Australia. We hold the executives in those institutions who are based here in Australia to account, so it is not a situation unique to this space.
Mr CONAGHAN: I am conscious of time. Thank you very much, Dr Zirnsak.
CHAIR: I want to get your view on how you think the public would respond to the recommendation that those under 13 not be allowed onto these platforms without parental consent, because, obviously, it would involve a parent having to give some form of ID or something like that to a service provider. But I suppose we are moving down that path with the anti-trolling legislation. How much attention should we pay to ensuring that under-13s are not on these platforms and to pushing platform providers down that avenue, versus pushing them to provide more parental controls on an ongoing basis and building those into their platforms so that, when people do slip through, as inevitably they will, there is ongoing monitoring parents can do?
Dr Zirnsak : I don't have strong views one way or another on this. I suspect the public might react more negatively to a view from the parliament that, if you are below 13, you can't be on a social media platform at all. I suspect that might create more pushback than the suggestion that parental approval is needed for someone 13 or below to be on the platform, and that was aligned to existing US legislation anyway—
CHAIR: And a lot of that is not enforced.
Dr Zirnsak : The issue, as we raised, is the psychological evidence. My understanding of the child development evidence is that children aged eight to 12 are particularly good at using the technology but do not understand the consequences and the trouble they can get into. I am also swayed by having attended the eSafety Commissioner's conference and listened to people talk about their experience of having their kids on platforms that are supposed to be child-only platforms. The feedback was that child sex predators are prolific on those platforms because of the lack of identification required. You are not creating a child-safe space if you declare it a child-safe space but then anyone can pose as a child and enter it; in fact, you are making it a magnet or an attraction for those who wish to target children. That is the kind of thing we're trying to respond to with that recommendation. If the committee has a better way to deal with it, we are open to seeing other, better solutions to this particular problem. The problem we are trying to address is the current one, where we don't identify people and where children below that age know how to use the technology but do not really understand the consequences of what they get into or how they might be targeted.
CHAIR: I am with you there. Thank you very much. I appreciate you taking the time to give evidence. That was a very beneficial 45 minutes. Again, our thanks for the detailed submission and the passionate way that you and the church are taking this up. It is much appreciated.
Dr Zirnsak : Thank you for the opportunity. I really appreciate it.