Parliamentary Joint Committee on Law Enforcement
Law enforcement capabilities in relation to child exploitation

BROWN, Ms Anne-Louise, Director of Corporate Affairs and Policy, Cyber Security Cooperative Research Centre [by video link]


CHAIR: I did a bit of the formalities at the start, so I won't repeat that, except to say obviously this is a public hearing, so be aware of that. If you want to switch to in camera or confidential mode, please give us a heads-up before you give that evidence. We have read your submission thoroughly, me particularly thoroughly because I'm stuck at home quarantining and there's not much else to do. Obviously, we are not looking for you to rehash the entire submission and we're keen to ask you questions, but you are welcome to give a five-minute opening statement if there are particular things you want to highlight for the committee's benefit.

Ms Brown : I thank the committee for the opportunity to contribute to what is a very important inquiry. It's particularly timely given the exponential increase in the production and distribution of child sexual abuse material, which has been further fuelled by the COVID-19 pandemic. It is essential Australia remains a prosperous digital economy, while also protecting the most vulnerable members of our community, our children, from online sexual exploitation. The Cyber Security Cooperative Research Centre is dedicated to fostering the next generation of Australian cybersecurity talent, developing innovative projects to build our nation's cybersecurity capability. We build effective collaborations between industry, government and researchers, creating real-world solutions for pressing cyber-related problems.

Sexual offences against children are among the most abhorrent crimes in our society, whether committed online or offline. Before the advent of the internet, these crimes generally occurred where an offender could access child victims. However, the proliferation of the internet and encrypted messaging applications has proven a boon for child sexual offending and for the production and distribution of child sex abuse material. While there are perceptions the victims of these crimes are overwhelmingly located in overseas jurisdictions, there are also many Australian children affected. The methods used by abusers have grown ever more sophisticated, with encrypted applications and the dark web providing a cloak of invisibility to their offending. Accessing and viewing child sexual abuse material is not a victimless crime, as many perpetrators believe; it has devastating life-long impacts on the children abused, with abuse images living on unfettered and uncensored in a world of clandestine online child sexual abuse material.

Given the scale of child sexual abuse material and the devastating impacts it has on the millions of exploited victims around the world every year, it is necessary that law enforcement has the appropriate power to disrupt, detect and bring offenders to justice. To this end, the SLA(ID) Bill, passed by the government in August, is a game changer in tackling the scourge of child abuse material. However, this is a shared responsibility. There is a clear role for technology providers to play in helping law enforcement combat child sexual abuse material through monitoring, removal and reporting such content.

Finally, the CRC submits that an absolute right to privacy can never exist, and there must always be exceptions, especially when it comes to maintaining the common good. This is a principle recognised in the International Covenant on Civil and Political Rights, which makes explicit exceptions where privacy can be overridden, including for the protection of national security, public order or public health and morals. The CRC contends that, while privacy is valuable, it must have limitations and that those limitations must correlate with the social contract all members of the community enter into upon which modern democracies like Australia's are built. This is a concept that can no longer be applied just to the physical world. In 2021, it must also incorporate unacceptable behaviour that occurs in the digital domain. Thank you.

CHAIR: Thank you very much. I might kick off the questioning; it's the chair's prerogative. I'm a big fan of what your organisation does. We've previously engaged on other issues. Congratulations, and thank you very much for the submission. Your opening statement is right along the lines of the committee's thinking. Given the CRC's significant policy and technical knowledge, I want to throw some of the more challenging aspects that we face at you, if that's alright. You mentioned the SLA(ID) Bill. I'm pleased you singled that out. I'm with you; I think it's a good step forward. But you indicated challenges, and one of those still outstanding challenges is encryption. We're going to be talking to Facebook tomorrow. But you raised encryption in your submission. Some other submissions raised that there are still some technical things that providers can do—even if they want to maintain their encryption—to check messages and the like. I know it's a really tough challenge, and that's why I warned you in advance. Where do we go to start solving this encryption issue?

Ms Brown : I'll start by saying that I'm not a technical expert, but I do have a high-level understanding of what this means. I know that, within some of the submissions from tech companies, they note, for example, that there are ways they can use metadata associated with encrypted data to track behaviour. I've done some reading on this. The Google submission, in particular, raises that they can use metadata and behavioural analytics to help locate and stop child exploitation material online. The problem with this is that it really does, in my understanding, rely on behavioural aspects. So the metadata can correlate patterns. It's like pattern theory. It will find patterns in a particular behaviour and the anomalies within that behaviour and pinpoint that, which is fantastic until the criminals perpetrating this kind of activity realise that they are being tracked or traced via this form of behavioural analysis, at which point they will change their behaviour. That's the nature of this kind of crime; it's to remain undetected. So, while encryption provides that safety and behavioural analytics can be used to detect it, it's only a matter of time before these criminals pivot again to something new and just go dark again.

CHAIR: When you say—

Ms Brown : I'm sorry.

CHAIR: No, you go.

Ms Brown : I was just going to say: the other problem is that, despite that, they can't actually see what is in the message either. So, while they might have an inkling that it is child sexual exploitation material, there is no way of telling.

CHAIR: So should it be an expectation that, if these platforms are going to set encryption as a default, they do the flip side of the coin, and that is to at least make an effort to monitor, where they can, the behavioural aspects of it? Is that something that you would be looking to government to regulate or to legislate? Are there other jurisdictions that are doing something like that?

Ms Brown : I think that it's an admirable aim. The tech companies will push back against that.

CHAIR: Yes. We've been getting kind of used to that!

Ms Brown : Yes! We know that Facebook are very committed to going end to end, as well as going ephemeral—so, to have disappearing messages. At that point, that content is—

CHAIR: What do you think that will do to the prevalence of CEM being shared?

Ms Brown : I know that the National Center for Missing & Exploited Children in the United States has indicated that, if this does occur, it will reduce the number of reports that they receive by up to 50 per cent. That's a huge amount of material that will go undetected and unreported. We know that, domestically, the AFP rely on open web sources for a lot of referrals, as ACCCE do. So it's a matter of great moral concern.

CHAIR: So do you think it's fair for us to put to Facebook, when we question them tomorrow, that going this way will see up to 50 per cent of the material we are currently discovering potentially lost to law enforcement?

Ms Brown : Certainly, and I think that's noted in our submission, in those stats from the National Center for Missing & Exploited Children.

CHAIR: Can I take you back to the limitations of these AIs, then, because we hear a lot about these, particularly from tech companies and, to a certain extent, law enforcement as well. What I don't want is to see the hype around AI being used by tech companies as an excuse not to put the resources in place in terms of personnel and also better controls on their platforms. So I'd really like to understand: what do you think that AI can actually achieve? And what are its limitations, particularly as we currently understand it?

Ms Brown : AI offers amazing possibilities for a whole range of things, but it also comes with inherent risks. And it is a form of technology that's always developing. It's evolving so quickly, still. One of the great problems with AI—and I know Dr Aly knows a lot about this—is the selection bias problem, because AI programs rely on data, and large amounts of data, to be able to operate. They are sort of trained on data and on using particular data sources. A great example is in the US, where AI has been used in relation to, I think, prediction of crime, and it will always go to a particular demographic because that's how the algorithms that operate the AI have been trained. So there are inherent risks with that, where the AI, if trained in a particular way, will miss anomalies.

CHAIR: So its strength is in finding—I forget the names of them, but they are referenced in other submissions. Its strength is in the ones that can, for example, take a photo and search the internet and find them on other providers?

Ms Brown : Yes.

CHAIR: But you're saying that it's a lot more limited when it comes to actually, say, predicting behaviour or building a profile or finding somebody who may be offending but not necessarily sharing a specific image of which we are already aware?

Ms Brown : It relies on the content and the data that it has. So, when new data is coming in, it essentially has to be trained to be able to understand that new data.

CHAIR: So it's a realistic thing for tech companies and providers to rely upon, in order to try and find material broadly on their platforms that already exists, but not so much to identify new material or to make those kinds of judgement calls about people.

Ms Brown : Child exploitation material is an interesting area as well in this regard, because one of the things that people talk about a lot is dark web crawlers, or web crawlers, which can also offer some sort of solution to this problem, at least in detection. I've spoken to some researchers that are trying to develop a dark web crawler for child exploitation material, and the problem is that, from a research perspective, to train the technology they can only use adult images, legally, so they are relying on adult pornography to train the web crawler. Obviously, when that goes into an agency or to the AFP, they will be then able to use images within their stores to train the web crawler, but, as it stands, it is really difficult to get the ethics approval to be able to do this kind of work. It's difficult, and it's long term. While it can offer some wonderful solutions, it also has its limitations in that it, as you said, can recognise a child in one video, a child in another video and a child in a photo, but it can't tell you much more about that child. It will just find the links, in terms of how much that image has been shared.

CHAIR: Can researchers not partner with the ACCCE to access that kind of material through law enforcement, for example?

Ms Brown : I believe that is happening, but, again, it is a long-term process. The other thing is that they are reliant on the images that they have, which is a great start, of course, but the other daunting thing, especially with child exploitation material, is that they are children: they grow up, and they begin to look different.

CHAIR: Can I take you to your view in your submission about potentially expanding the definition within the AVM act to encompass more CAM material? This is something the committee's looked at recently, and we struggled with trying to shoehorn more into the AVM act. It's trying to fulfil a specific function. Is your point that there needs to be a similar regime of penalties in relation to a broader range of CAM, particularly that relate to individuals in, say, tech companies?

Ms Brown : Yes. While you could try and, as you say, cram it in there, that might not be ideal, but there could be scope to create a new regime that deals specifically with this issue or similar issues in relation to child exploitation material. As we point out in our submission, while some material could be captured currently, there are limitations to the abhorrent material act as it stands: it must be rape; there must be penetration combined with the element of violence.

CHAIR: The committee certainly understands the limitations, and the idea of picking up that kind of regime, which the committee feels has worked well in relation to AVM, and relating it to a broad definition of CAM has some merit in terms of exploring. Do you get the feeling that right now, though, the tech companies—the individuals—are not being held responsible enough or do not have those penalties in place to comply? Are you seeing a lot of examples where they're not complying with law enforcement quickly enough or aren't incentivised to—

Ms Brown : I couldn't comment on that specifically, and I think that's probably where the eSafety Commissioner will be particularly helpful. There is no doubt that tech companies are trying to take steps to deal with this, and it is a huge problem; however, we know that they could do more.

CHAIR: You've seen our new Online Safety Act, which we're looking at. It creates an enforcement regime for the eSafety Commissioner. Do you think we need to go further than that, or do you think the Online Safety Act goes some way to addressing those concerns?

Ms Brown : I think it goes some way, but it is still quite limited in its scope. It's the same with the trolling legislation; while it's a great step in the right direction, it is still limited. With the trolling bill especially, while it might be useful for defamation proceedings and whatnot, when it comes to dealing with this kind of material, it doesn't hold the tech platforms to account.

CHAIR: No, that's quite alright. There are a lot of different things that we're doing in this space, and we're very cautious about not creating duplication or overlap. Neither do we want to have gaps there, so that kind of advice to us is very helpful. Finally, is there enough focus, in your mind, from a tech perspective? Obviously, the eSafety Commissioner and others in law enforcement are dealing a lot with tech companies and social media platforms. Are we neglecting the cloud providers, where a lot of these images are hosted, who perhaps should be playing more of a role, in terms of searching for images or being proactive in looking at what their services are hosting?

Ms Brown : Potentially. Again, it is a very fraught issue, and we've seen what has happened in the US with Apple. They have tried to start scanning images going through to the cloud. That was supposed to happen—it is supposed to be happening now—and it's not happening, because of the backlash.

CHAIR: The Europeans are apparently taking steps towards that. Are they having a similar backlash, do you know?

Ms Brown : I don't know. I could take that on notice and have a look, if that is helpful to you.

CHAIR: Yes. If you could take on notice any further research around those cloud hosting aspects of it, that would be great. I'll hand over to you, Deputy Chair. I've done my 15 minutes. Over to you for a similar period.

Dr ALY: Hello, Anne-Louise. It's lovely to see you again. Thank you for your submission this morning. My questions around AI have already been answered, as well as those around the AVM act, so I'll move on to some of the other questions that I have. The first is in relation to the link between online child abuse material and contact offending. Do you see, currently, that the legislative framework that we have is effective in capturing individuals who are accessing online as well as contact offending, or are there gaps in that space?

Ms Brown : There are no clear links as it stands. I think that the AIC have done some fantastic work on this specific issue, and it's noted within our submission as well. Globally, while there are tentative links between the two, when it comes to online-only offending, it seems like they're a very different class of offender, contact as opposed to non-contact. The AIC have done some excellent work in that space. In terms of the gaps, it's such a fraught issue because, as we all know, when it comes to child contact offending, it's a really underreported crime. We're starting to see, off the back of a lot of work that has been done to shed light on this issue of child sexual abuse, a lot more people coming forward and, especially, a lot more historical cases coming forward. But because the internet and this kind of ability to access these materials is still relatively new, in terms of the history of the world, it is going to take time, I think, and a lot of longitudinal work to be able to work out what those connections are and how they can be addressed, be that via intervention or legislative levers.

One thing strikes me, and this is just a personal anecdote. I was a journalist and a court reporter for a long time. I remember sitting in courtrooms 10 years ago, and the number of carriage service offences that were starting to come through at that time was quite daunting. I haven't been a court reporter for a very long time, but I hate to think what the volume of those cases is now. At that time, it was quite astounding. We did a desktop analysis in terms of our submission, and I think we found 47 mentions within a particular time frame. Again, that's just at a higher court level. A lot of these matters are going through magistrate's courts and local courts around the country as they travel through to higher courts. Off the back of the pandemic and, I am sure, the increased detection of some of these crimes, it will be really interesting to see what that trend looks like. One of the things that I really want to do personally is a bit of a mapping exercise of the increase and the changing nature of carriage service offences.

Dr ALY: Is that tenuous link, if you like, between contact offending and CAM offences because there is a lack of data and empirical evidence?

Ms Brown : Yes, there is a lack of data just because of the fact that this is still a relatively new class of crime. Again, this is just the crimes that are detected. Given encryption and the dark web, the amount of this crime that goes undetected is huge. So, as it goes to contact offending and online offending, it is hard to make those correlations.

What we do know—I think we have seen this through a lot of the dark web forums—is that there is an onus on people, when they go into those forums, to provide material, so that creates a really clear link in those instances around the sharing of child sexual abuse material online and the real-life act of abuse. Obviously the Shannon McCoole case in Adelaide is a wonderful example of that. There is another case mentioned in our submission of a grandfather, Appleby, who was offending against his grandchildren and sharing that material online. Again, it's a very different kind of offender. I think one of the most terrifying or awful things about the Appleby case is the fact that his online handles indicated that he was a grandfather and had access to children.

Dr ALY: I wanted to just probe a little bit further the online offending versus contact offending. I guess the defence by offenders online is that it is a victimless crime and they are not actually doing any contact offending, but obviously they are watching—in some cases we have seen that they are watching live offending, live rapes, sometimes in other countries and perpetrated by people in other countries. Do you find that this kind of defence or this perception that it is a victimless crime, or less of a victim crime than contact offending, is pervasive in the way in which online material is approached?

Ms Brown : I think that all of the research that has been done very much illustrates that. The ACCCE have done some fantastic work in that space as well around the fact that there is a lot of minimisation around online offending: if they are not touching somebody, then they are not offending. It couldn't be further from the truth. One of the things that has been shown through the research is that when it comes to online accessing of this material, there is an escalation of behaviour. It will start off, potentially, as accessing adult material, and then a person will become emboldened or want something different, and they will go down a path to where they start accessing child abuse material. But then they start accessing it more frequently, and the amount of images and videos that they will store is quite often huge. I think that the average number of images has increased now to about 10,000 per offender.

Dr ALY: I want to ask you about the McCoole case, which you put up as a case study. You note that McCoole signed over his online identities to police. But he had to agree to that, didn't he—he wasn't compelled?

Ms Brown : No.

Dr ALY: That was in 2015. Does the suite of laws we have introduced since then rectify that kind of situation, where an offender has to agree to hand over rather than be compelled to hand over?

Ms Brown : It does, with caveats; I think that's the best way to put it. While there is the compulsion now under SLA(ID) and there is the account takeover warrant, that can still essentially be refused if somebody says, 'No, I'm just not going to do that.' Then they get charged under section 3LA, where there is a maximum term of 10 years imprisonment for failure to provide credentials. That increased substantially from one or two years to 10 years; that happened in the last few years. But, in a case like McCoole, if the police had said, 'Okay, we want to take over your account,' and he said, 'No'—and I'm looking at this with SLA(ID) being enacted; it's very hypothetical—and there was no way they could get into those communications another way, then he would have potentially been charged with the offence of failing to comply.

Dr ALY: Which is 10 years.

Ms Brown : Taking that bet and saying: 'If I comply and they can access this material, I'm going to go to prison for 30 years. I'm going to roll the dice and I'm not going to comply, and I'll get 10 years maximum.' That's the caveat. Criminals aren't silly; the amount of encryption and self-destruct functions that some of these kinds of offenders have within their personal devices is quite remarkable. If Shannon McCoole had not complied, it is very unlikely that police would have been able to access his computer. His compliance led to the global shutdown of the site—The Love Zone—and numerous arrests in Australia and globally, and the rescue of children. That's quite a terrifying thought, but it is something that ultimately could still occur. Despite the wonderful changes that SLA(ID) has introduced, criminals can still refuse and potentially get a much less severe sentence.

Dr ALY: In the case of McCoole, he got 35 years despite being the leader of a worldwide dark web child pornography ring with 45,000 members and abusing at least seven children in his care. In my opinion, 35 years is not enough. This might not be anything you've done research on, but are the penalties for the offences suitable?

Ms Brown : I can't comment on that; that is something for courts to make decisions on, in relation to those sentences. In relation to the McCoole case, upon reading it would appear he received a lesser sentence due to his cooperation with authorities and the fact that the information he provided was able to have far-reaching effects. From a personal perspective, sometimes when you read sentences that are handed down for particular kinds of offending it is concerning in that you would sometimes expect there would be more severe penalties. But at the same time we don't necessarily know what information is being provided and what deals are being struck.

Dr ALY: I have one more question, with regard to the comments that you've made about the AVM and SLA(ID) and the trolling legislation. While CAM is kind of included in that, under the AVM it comes predominantly under torture and rape, and in SLA(ID) it's really about compelling and takeover warrants. Do we need some real standalone legislation specifically targeting online child abuse material?

Ms Brown : You can overlegislate these things, certainly. I think that, as much as this kind of offending can be wrapped up within existing legislation or rolled into existing legislation, that's certainly a good thing. That said, it is a unique kind of offending, and it's growing. We know that this is something that's growing rapidly. There is scope, I think, to be able to do that, and that would also potentially help with resourcing policing if there were standalone legislation.

Dr ALY: That's right.

Ms Brown : So there would certainly, in my opinion, be scope for standalone legislation relating to child sexual exploitation material.

Dr ALY: Thank you. That's it from me for now.

Mr CONAGHAN: Following on again from Dr Aly: Ms Brown, you said that you were a journalist over a decade ago and you were a court reporter. Previously I was a prosecutor. You referred to the floodgates opening back then, and now there's so much material and so many charges going through the courts every single week. Can I just get your opinion? Back then, everybody seemed to recoil at these offences and found them abhorrent, and the sentences in the local court and perhaps the district court seemed to be a lot harsher. Do you feel that over the past decade, or perhaps a little bit longer, that abhorrence has sort of been socialised?

Ms Brown : I think that it's almost, in a way, a form of social conditioning. It's something that people have become used to and almost desensitised to. The other problem with this is that it's not really something anybody likes to think about too much. Again, I point to some excellent work that ACCCE have done on that kind of sentiment surrounding this issue. It's not something that parents, or people more generally really want to engage with, because thinking about it is horrible. But it also fails to recognise the sheer scope of this problem, which is huge. As we know from the research, Australians are voracious consumers of this material.

Mr CONAGHAN: With your background and your current position, and just as a layperson, do you think—whilst we have sentences of five, 10 or 15 years—that, because of the proliferation of all this material and the amount that's out there, magistrates and judges are, in fact, giving fairly low sentences?

Ms Brown : I couldn't comment on that, and I wouldn't feel comfortable commenting on that, because obviously every case is unique. I do think that, within the courts, contact offending is taken more seriously, potentially, but I think that's all that I can really say about that. I'm not comfortable about commenting on that.

Mr CONAGHAN: I understand. This is a comment or maybe a statement. I feel, having had 30 years in law and law enforcement, that some of these sentences—ICOs, or CCOs as they're called now—contribute to that proliferation we spoke about, because there's no general or specific deterrence to offenders. They think, 'If I do get caught, I'll get an ICO and I'll stay in the community.' But, again, that's just a statement. Thank you for your evidence.

CHAIR: Just to tie things off, I will come back again to the encryption piece, if that's alright. Is there any other research that the CRC has done around what might be appropriate regulatory or legislative responses from government to what seems to be just this inevitable march of encryption?

Ms Brown : Again, it's such a vexed problem. It's something we have looked into very deeply within existing legislative and regulatory frameworks. The key thing to take away from this is that it has to be a shared responsibility. So, while government can legislate, and politicians and members of the public can jump up and down and say 'this is wrong', what it needs is for the tech companies to come to the table and to agree. Like I said in my opening remarks, privacy isn't a given. If you're a criminal talking on the telephone, you know that there is a likelihood that your communications will be intercepted. Why should that be any different for social media communications? It shouldn't. It essentially creates the perfect ecosystem for this kind of crime to flourish, grow and be profitable.

CHAIR: Realistically, as much as I would want to make it big tech's problem and hold them accountable—believe me, no-one is more enthusiastic about that than me—do we really have any realistic hope of pushing them away from encryption, when it seems to be the way that they're all going? I don't see them turning their back on it any time soon, as much as we point out the dangers.

Ms Brown : No, I agree. Obviously, there are certain steps that could be taken. Facebook is moving to go end to end. The Secretary of Home Affairs, Mike Pezzullo, has spoken very strongly about this. There was a really good submission from an ANU academic about actually taking the legislative step to make them a carriage service provider, which could help. So, if you were to classify Facebook or some of these other providers as carriage service providers, there could then be an onus on them to provide those communications, but, again, it's something that would have to be tested.

CHAIR: It would open up more existing law enforcement tools, would it?

Ms Brown : Potentially it would. Again, this is a hypothetical but it is a potential solution. When these kinds of social media platforms are providing a new form of communication, the new normal of communication, then why shouldn't they then be classified as carriage service providers?

CHAIR: That's great.

Dr ALY: I have come up with another question and it's a bit left-field. I don't know if you can comment on it. It's about law enforcement capabilities where CAM, or CAM offending, comes up in cases of domestic violence or child custody. I'm thinking also about the Hague convention as well and the return of children in cases where there might be online offending. Have you done work on whether the online child abuse material, online offending, comes up substantially in Hague cases, in child custody cases or in other Family Court cases?

Ms Brown : No, we haven't done any work in that space, and it would be something that would be really difficult given the reporting restrictions of the Family Court. That would be hard to glean. That said, we do know that there are strong familial links at times, especially when it comes to live streaming. Obviously, the Philippines is a hot spot for that. But we do know that in Australia this is something that happens as well, that there are parents who will essentially use their children as commodities. But, to your specific question, that's not something that I am familiar with or that we've done research on.

Dr ALY: On the live streaming, if the offence occurs in the Philippines—as you say, it is a hot spot—what are the limitations of law enforcement in being able to hold perpetrators to account?

Ms Brown : Well, it is difficult—

Dr ALY: Because there are perpetrators in that, aren't there?

Ms Brown : It is difficult, and the AFP has done some fantastic work in working with Philippine authorities to tackle this issue within the Philippines. While SLA(ID) does not necessarily offer a solution to prosecution, it offers disruption. The thing with live streaming is that as soon as the live stream stops, it has stopped, so it makes it even harder to detect whether this is happening and how this is happening. The disruption power within SLA(ID) is an excellent thing because at least, at the point that it can be disrupted and stopped, that can help kill off the economy of it, so to speak, but there are limitations. Yes, cross-jurisdictionally, there are so many limitations, and that is where it comes to authorities from nations like Australia going into other nations where there are known to be higher rates of child exploitation and helping police that. That is well established, especially in relation to the AFP's relationship with the Philippine authorities.

Dr ALY: But the demand predominantly comes from the consumer country, which is Australia.

Ms Brown : Yes.

Dr ALY: And while you need the cross-country cooperation to bring perpetrators to justice and to save the children who are involved in this, do you consider that Australian laws at the moment are effective enough in capturing the perpetrators in Australia?

Ms Brown : If they can locate them then they are effective enough, but again, that is the problem. If it is a dark web live stream and you have 10 viewers in Australia, it is really difficult to locate and identify those people. They do it on the dark web and they can do it on a whole range of platforms. AUSTRAC obviously have a really important role to play in this regard, and that is something that they have raised as well. The number of suspicious transactions that they have seen going into countries like the Philippines during the pandemic has risen, and they suspect that that is in relation to live streaming of online child exploitation material.

Dr ALY: On AUSTRAC using suspicious transactions, does cryptocurrency make that more difficult?

Ms Brown : Cryptocurrency is something everyone is interested in at the moment and currently there are huge limitations in cryptocurrency in Australia. In relation to child exploitation, it would be difficult because you have the $10,000 threshold when it is going in and when it is coming back out. So when you are talking about transactions that are significantly lower than that, I would imagine, in accessing this kind of material from overseas, it is probably not that likely to ping.

Dr ALY: Right. That is interesting.

Ms Brown : I am not a cryptocurrency expert.

Dr ALY: No, but it might be something that we could explore further. Thank you so much.

CHAIR: Thank you, Ms Brown. Thank you for your evidence. We will let you go and we will take a short break.

Proceedings suspended from 11:34 to 11:49