

- Title
Parliamentary Joint Committee on Law Enforcement
09/12/2021
Law enforcement capabilities in relation to child exploitation
- Database
Joint Committees
- Date
09-12-2021
- Source
Joint
- Parl No.
46
- Committee Name
Parliamentary Joint Committee on Law Enforcement
- Page
17
- Place
- Questioner
Aly, Anne MP
Conaghan, Pat MP
- Reference
- Responder
CHAIR
Prof. Broadhurst
- Status
- System Id
committees/commjnt/25257/0003
BROADHURST, Emeritus Professor Roderic, Private capacity [by video link]
CHAIR: Professor Broadhurst, thank you very much for joining us today. I went through some formalities at the beginning, which I will repeat—not verbatim—very quickly. You know it's an act of contempt of the Senate to give false or misleading evidence. You know that this is a public hearing, so everything you say is being broadcast live. If you'd like to give evidence in camera, or confidentially, you are quite welcome to. Make sure you request that of me before you say anything so that we can switch broadcasting off. Do you have anything to add to the capacity in which you appear?
Prof. Broadhurst : Thanks very much for giving me the time to talk to you this morning. I'm an emeritus professor at the Australian National University.
CHAIR: We have your submission, so thank you very much. We have all been through it in great detail. We are keen to ask questions. Without rehashing the whole submission, by all means you're welcome to take five minutes for an opening statement if there are particular things you want to draw the committee's attention to or to highlight.
Prof. Broadhurst : I will take a minute or so to highlight a couple of things I want to try to focus on this morning if we can. We don't have a lot of time but we've got enough time to cover the main things.
First of all, once again, thanks very much for the opportunity to meet with you this morning; it's very much appreciated. I want to acknowledge the fact that there has been a huge amount of work done in the child protection arena, particularly the online CSAM arena, which is the part I want to focus on. My main purpose is to suggest that a lot more attention needs to be given to the prevention aspects of what we're trying to accomplish here. I think our law enforcement agencies do a pretty good job. They work very effectively internationally, which is essential, as we no doubt know, and you'll hear more about that later in the day.
My main objective is to suggest that Tor, this large, anonymous platform that's widely used for all kinds of both noble and not-so-noble purposes—that the aggressive anonymity in that environment could be flipped in a way so that we can reach CSAM offenders, online offenders and others, through that medium. I wanted to highlight that. There are lots of other issues we could take on, but it's this idea of trying to develop and deliver effective online treatment for child sex abuse online users and to try to, if you like, engage them in treatment programs. It's one of the really interesting innovations currently underway, and I want to talk a little bit about the work that the Finns are doing in particular around their survey and the amazing results already from the ReDirection program, which is a program associated with their work. I'll give you a little bit more detail about that.
The other thing I wanted to touch on, which is a really demanding area of research but with high potential for better understanding, better intervention and so on, is the role of these child sex abuse material forums. In the policing world they're sometimes called boards because they're image galleries, basically, of offensive material. But there are other elements of these boards or forums which do enable us to understand much more about the behaviour of these—there is a diverse group of child sex abuse users out there. That gives us an insight which is really worth developing.
The third thing I want to touch on if we get time—we did mention a bit in the parliamentary submission that we made—are the challenges around detection generally of child sex abuse material and what we can do in improving the way in which we detect online child sex abuse material. I'm particularly thinking about the role of machine learning—I guess we would call it AI. I don't like to use that term. Machine learning can perhaps help do a lot more than our current technologies, the crawlers and so on that we currently use. They're the three things I'd like to touch on.
To bring you up to date: I've just had some recent correspondence from our colleagues in Finland about the progress of the survey they conducted, this rather nifty online survey which now has 13,000 respondents already. The data that's coming out of that is quite compelling. For example, based on that self-reported data, we know that nearly 40 per cent of people who responded to that survey are seeking or having contact—this is the divide between so-called contact offenders and non-contact offenders. I think the really innovative thing about that is that the survey was on a pretty well-established Tor or Onion service directory—that's a place you go to on the open net and type in what you're looking for: drugs, malware, child sex, or whatever it happens to be. That then flags the survey but also flags this really interesting online program that the Finns developed called ReDirection. What's astonishing about that is that, since that's been running as a pop-up in relation to that survey, 8,000 respondents have clicked onto that ReDirection online program, 5,000 of them through the Tor darknet market.
What that's all suggesting is that there's quite a bit of untapped need, and that we should move further up the crime prevention chain—obviously the national plan has a role for offender treatment and improving evidence as well as the responses to victims. Those numbers suggest that there's a really high level of untapped demand. The question is: can we design an effective online treatment program for these particular types of offenders—we would probably use 'consumers of this product'—because we really need to head them off. That's really promising, those numbers alone. As a small point, the Swedes have been running a very interesting treatment program as well. Rather than being fully online, it's a supported one: a counsellor is online, and there's contact with a trained psychotherapist. That program started back in late 2019. It really got going in May 2020. That's oversubscribed and has now closed.
All of this suggests to me and many of my colleagues that, if we go upstream and try to, if you like, intervene in the earlier stages of child sex abuse material consumption, we might be able to have an impact on the amount of abuse that goes on. We might be able to push people to assistance and that kind of thing, which is the model we really need to be investing in a bit more. Child sex abuse offenders are a global problem, so we have to work with our partners, which, by the way, the Australian government federal agencies seem to be doing a pretty good job of. They do some fantastic transnational policing operations, which we've talked about in other sections. That's the line I'd like to take just to get you up to date.
CHAIR: Great. Let's launch into some questions. If I could pick up your point about de-anonymising Tor or Onion. A noble aim but what are the practicalities of that? How would you actually go about something like that? Is there any jurisdiction around the world that is making inroads into it?
Prof. Broadhurst : A terrific question. Tor is designed as a decentralised service to defend against that very problem. There have been some agencies, particularly examples in the drug area and the CSAM area, where joint operations have been able to exploit some kind of malware, user error and so on. It's a bit haphazard in that sense. There are some techniques available, but the short answer, I'm sorry to say, is that it can't be done all the time. You might be able to do some de-anonymising around particular kinds of products or services. Just recently, for example, the Russians started to cut off servers that were providing relays for Tor users. That caused quite a bit of disruption. The relay system also uses a private bridge model, which enables a user to connect to Tor privately without going through a publicly listed relay. From memory, I think we're well over 8,000 relays now.
CHAIR: There are some other submissions that talk about the effect of trying to shut down Tor, which is at least open source—that it may drive them to even harder models for law enforcement to look into.
Prof. Broadhurst : Yes. That's one of our concerns too, actually. It's a bit of a balancing act, isn't it. One thing that is worth saying is that the people who use Tor just for simple privacy reasons are not accessing hidden services. If I'm in Iran and I want to read the BBC media page, then Tor is the outlet for me to go to, so that other agencies won't know what I'm looking at. A lot of it is used in that context.
CHAIR: Do you think there's a legitimate reason in Australia, in a country with our freedoms, to use a platform that's so dedicated to anonymising what you're searching for?
Prof. Broadhurst : That's a good question. I think there are some circumstances. The classic example would be the whistleblower issue. I can think of examples in Tor where you can report sexual assaults or you can talk about offences by the state and that sort of thing. That would be one of the things you would lose, I guess.
The other thing, too, is that Tor is not unregulated. When we looked at the work done by the transnational policing agencies around fentanyl—it had a very significant impact. Of course, there's a lot of chat on the forums about how we improve our operational security now that we know this is a risk. We found that fentanyl just stopped being sold on the main markets; it completely dropped out of the system. And most of the markets and services—except for the child sex abuse material forums—actually ban that kind of material.
I think it's a balancing act. I don't think there's a straightforward answer. My gut feeling is that there is a risk of pushing them into places like Signal and Telegram, the sorts of things that make it even more difficult for us to track this kind of behaviour.
CHAIR: One of the aspects I'm really interested in is whether we're doing enough to target cloud content-hosting providers, whose services might be used by CAM offenders to host material and then quickly and very easily delete it before law enforcement can get access to it. You mentioned the European example, where they're encouraging scanning and might be moving to mandating that, which has been a very different experience to what Apple had in the US. Could you talk a little bit more about the European example and what they're achieving?
Prof. Broadhurst : Your committee and members of parliament would know there's quite a lot of interest in what we can do to regulate big tech. They're kind of parajurisdictional operators. They really are a bit of a headache. With the Europeans in particular, the Lanzarote Convention, which outlines what the Europeans hope to do to suppress child sexual abuse material online, has driven the policy there. Putting pressure on big tech to do what they can to scan for this sort of material is pretty vital. They are asking industry—and I think Australia has probably tried the same thing: 'Can you self-regulate? What can you do to help us so that we don't have to legislate? If you can't do what needs to be done in terms of regulating the so-called public space here, then lawmakers will step in and make that happen.'
Again, the appetite is there. The response of the big tech companies has been to be pretty wary. The Apple example, which I mentioned in the submission to begin with, just illustrates what those limitations are for big tech and for tech companies in general. There's only so far they can go. Apple's idea of screening for that material on your Apple Watch or your iPhone did cause quite a lot of alarm, but, in the forensic view, we were probably more concerned about how effective it would be and whether it would work in regard to undetected images, the fact that you've got to match an image with one that's already stored in an available database. We know that doesn't work that well—with all due respect to the tremendous effort that's been made in collecting those images and comparing them and trying to trace—
CHAIR: In the European case, where they have pushed big tech to do that scanning, is there any research to suggest there is a reduction in available CAM or in offences? Is there anything numerical we can point to?
Prof. Broadhurst : That's another good question. The problem is that it's growing so fast. So my short answer is that it is probably too early to tell. As I said before, forensically there's a bit of concern or wariness about just how effectively those sorts of systems would work, particularly because they rely on identification of images, matching them and then getting an authority in one of those jurisdictions to act. With the European Union, obviously, mutual legal assistance is pretty straightforward, with Europol and Eurojust and so on. It is a bit like our Australian system, which is clumsy but gets there in the end across the nine jurisdictions of Australia.
So I think the short answer would be: not until there is the technology for identifying, detecting and matching these images. Until we know how to do that better, using multiple methods, I don't see us really clawing back the amount of material we're seeing. We might have an impact on some of the larger markets, but we have this massive problem now with TikTok, which the kids use for image sharing. That's opened up a really alarming vector for grooming children et cetera. The live streaming problem is huge. That's something, again, that we probably need to look at more carefully. It's not a problem on Tor, because Tor doesn't support JavaScript and it's not very good at rendering images and so on in that format, and with live streaming there are lots of delays. So people who want to live-stream don't use that platform. They tend to use platforms like Signal and even WhatsApp and Facebook. There's just no limit to the platforms.
CHAIR: Finally, in terms of the harm minimisation strategies that you're talking about, or trying to deter people from offending before it occurs, I will be honest with you: I remain pretty sceptical about it, and other members of the committee will probably delve into that more deeply in their time. I think it was the Swedish example, was it?
Prof. Broadhurst : Both the Swedes and the Finns have attempted it. The Finns are still running their totally online program.
CHAIR: It would be early days but, again, is there any research showing that people who do self-report, for want of a better term, and put themselves into these programs then change their behaviour? Has there been some quantitative follow-up?
Prof. Broadhurst : It's an excellent question. The evaluation of these programs is a really severe challenge from a recidivism point of view—how to do that properly and how to confirm that. I agree with you. I didn't come to this willingly. I was very sceptical: 'Come on. Give me a break. How is online treatment going to change anybody's behaviour in that sense?' So I would have to sort of step back from that. I think it's obviously worth the effort. How effective it is is a really good question. The only data we have at the present moment is self-report data prior to the commencement of a program and data from eight weeks later or 10 weeks later—a follow-up evaluation of what's been accomplished. The kinds of metrics that are being used are things like: 'How many hours have you spent watching child sex abuse material? Do you have contact?' We note that there's quite a significant raising of awareness and that there are reductions in consumption.
The real challenge that the Finns and the Swedes have found, though, is that, when you want to contact a counsellor—the old confidential-doctor system worked quite well in Europe—everyone's so paranoid in these secret communities that there is the idea that your voice could be recorded—and, given the quality of digital audio forensics and so on, it's probably a fair fear—and you could actually be identified. So there's a very great reluctance to actually have conversations, even if they are anonymous. So we've been relying on, for want of a better word, rather clever chatbots to help that process, and then using texts. That seems to keep the paranoia, if you like, under control—to put it from an offender's perspective, if I can use that idea.
CHAIR: The other concern that I have is—even if it were to work for the individuals—the idea that somebody could self-report and be treated for this. I would be concerned that that would go some way to destigmatising it in the broader community, which would, potentially, be very harmful. Again, with their trials, have they looked into the effect that it has on how the wider community views these offences?
Prof. Broadhurst : That's, again, a question that probably needs a bit more research. But certainly the Protect Children people do quite a lot of work in that area. It seems to me that, to summarise it, what is being balanced here is the harm versus the benefit. One of the interesting things about the data that has come out of those studies is how early some of these offenders report their exposure to pornography. That is a good point, actually: we have tried to educate the public to think about these kinds of offences as child sex abuse, and so the term 'child pornography' is one of the things that we're trying to move away from. So there's early commencement. There are obvious signs that viewing online CSA material does push people towards contact or wanting more. So it's interventions at that level.
Your bigger point, though, is: how acceptable is it for us to sort of reach out to communities like this—secretly, I guess, isn't it, really?—and offer some help, without really knowing what the consequences might be? Look, I'm not sure. I think that, in the discussions we've had at the ethics level, there has been a lot of discussion just to get the ethical protocols through, to get some of the work done. The committees have been quite fierce on that point: 'What's the benefit? Will this kind of work save a child?' If it can reduce offending, the gravity of offending, and the shift from online to contact offending, then their broader answer is: 'We'll take that on board.'
CHAIR: Thank you, Professor. Sorry—I need to hand over to my deputy chair.
Dr ALY: Thank you, Professor Broadhurst, for appearing this morning. I do want to pick up on the issue that the chair raised, particularly around early intervention. I want to contextualise my question by drawing on the area that I know, which is terrorism. If an individual goes online and uses the internet or a carriage service to plan a terrorist attack, to raise funds for a terrorist attack or to recruit for a terrorist organisation, they are considered operative—they are considered a terrorist, right? What I'm trying to get my head around is: how is early intervention even possible in this space, when, in accessing the material, you are already offending?
Prof. Broadhurst : The short answer is that it really depends a bit on which jurisdiction you're in when it comes to child sex abuse material. In some jurisdictions it's not an offence to view the material; it's an offence to produce and distribute it. I think, from memory, a recent German study showed that about 20 per cent of jurisdictions around the world just didn't have those basic laws. So it is not an offence in some jurisdictions to view—
Dr ALY: To view?
Prof. Broadhurst : Yes. We've almost got a seamless kind of global view on the harm of child sex abuse, but it's not entirely seamless. There are a lot of jurisdictions, for example, that don't define 'a child' in the same way as the Australian parliament does. A child, in some jurisdictions, can consent to sex after the age of 13 and that sort of thing. So there are variations across the globe in terms of those sorts of details. Other agencies have talked a lot about the Philippines and the kinds of problems that occur there and how those kinds of developing countries can be exploited et cetera.
The terrorism example is an interesting one. There is a bit of an overlap in terms of forensic techniques and research. Looking at a terrorist forum, apart from the content, is very similar to looking at a forum that discusses drug use, in particular, or child sex abuse material. But there are some interesting distinctions. In the Tor community, or Tor environment, there are lots and lots of these boards or galleries —I mention a few of them in the report, Lolita being a notoriously large one—where images are swapped and shared. It's not much of a commercial business, in that regard. These people share the stuff because that's what they want to do. There is a little bit of money making around the live-streaming area, but it's a small fraction of the business, if you like, of uploading and sharing these kinds of images. Of course, sharing an image is often a requirement to join these boards or forums, so you join the conspiracy.
I get the problem. These are offences, yes, and certainly in Australia they're an offence—whether you're in Australia or abroad, actually. But it's unfortunately not uniform, and a lot of the jurisdictions don't have a great deal of regulation of—I think the chair mentioned the cloud earlier—the services. The ISPs and so on aren't always as regulated as our own. So there is a problem in that sense, but, as I understand it—especially having gone through quite extensive ethical review and so on—we can do this kind of research. We don't collect images, obviously. There are lots of ways you can work around some of those problems.
It brings out a bigger question, really, in a way: how do we get more researchers involved in this kind of work, particularly given the understandable, and quite correct, barriers there are to working in this area? I'm not sure I've answered your question.
Dr ALY: I'm getting the sense from you that international cooperation is key. The fact that there are several jurisdictions where viewing material is not considered an offence appears to me to be a huge barrier in getting some international cooperation and some basic international standards around the sharing of material, the viewing of material and what constitutes an offence. It's a surprise to me that there are jurisdictions that do not consider viewing material to constitute an offence.
It brings me to my next question, about this idea of the victimless crime. There's research that you cite in your submission, and that we've seen in other submissions, about the psychology of online offenders and their having more empathy for victims than contact offenders do. I'm wondering, in relation to your suggestions or the research that you put forward around early intervention and using Tor, if somebody was going in and searching this material, would it not be more effective, instead of redirecting them to a program or having them self-refer to a program, to have a pop-up that says: 'What you are about to do is child abuse. There are victims. It is not a victimless crime'? Would that be a more effective approach in deterring people from going ahead and viewing material and accessing material?
Prof. Broadhurst : Yes. There's been some research. I'm thinking of Jeremy Prichard's work in particular, where they recently set up what I'd call a honeypot, called Barely Legal, and people self-enrolled on that. The deeper they went into the site, or the more potentially offensive the site was, the more warnings popped up: 'If you go to this next step, you'll be traced.' I think we need to do that all over the place, on the open net. The more we can do that, the better, because Pandora's box has already, for want of a better word, opened up. There are all these platforms—Pornhub, or you name it—and all of these sorts of sites on the open net where that would be very valuable to have.
What has been happening, of course, is that the so-called non-contact CSAM users, for want of a better word, are very operational-security savvy. They gravitate to places—even the ReDirection survey shows they prefer to engage aggressively anonymously. So we need to do something else for what I would call the deep end of the pool of potential CSAM consumers and abusers. We need to hit both the open, clear net, as you suggest, Dr Aly, and do something different. There is no point telling a Tor user he's going to be traced if he goes into this board; he knows, or she knows, that's just nonsense.
One of the keys, I think—actually we mentioned it in the report, and I'm probably a bit keener on it now than I was even then—is that we really need to do the hard yards of qualitative work in the forums where we can do that. What's interesting is that these forums provide a protection. They're a carapace. They're like a tortoise shell presenting the cognitions, if you like, or the ideas of some of these users. You've got users that will tell you on these forums—because one of the great things about anonymised forums is that people tend to be more honest, and sometimes even brutally honest. It's quite interesting. It can be rather hard to stomach sometimes. There are people there who talk about child love and say that they are really child lovers. So the cognitive distortions, for want of a better term, are quite significant.
The techniques of neutralisation, as we would call them—you're probably familiar with that in the terrorism realm—also come into play. We do get deeper understandings about how these folks operate and how they think, and that gives us some clues about how we might be able to treat them, particularly since we know that cognitive behavioural treatments, for want of a better word, are not 100 per cent effective. We use those in correctional services, and there is a lot of work done treating real and contact offenders in jail and in prison programs, so we have quite a lot of information about that. What I found fascinating was going onto the forums and reading what they were saying about treatment and about their treatment experiences and so on.
There is still quite a lot of work to be done in that sense. Again, it's qualitative rather than—obviously there are metrics involved, but it is largely qualitative. You have people on these forums who actually relish what they are doing. They see it as a problem with society, not with them—they say children really like sex, and so on. There are massive challenges in that. We just haven't done enough of the heavy duty research, I don't think.
Dr ALY: I was contacted by a group of mothers and wives of offenders who had accessed online material. Their argument to me was: 'He was a good husband. He just looked at this stuff online; he didn't hurt anyone.' I'm wondering, Professor Broadhurst, if this kind of focus on the difference in psychology between offenders who access material online versus contact offenders has actually blurred the lines for a lot of people and whether we need to have a bigger awareness piece here about the fact that online offending is offending. Online material does have victims; it is not a victimless crime. I'm wondering if you have done any research in that space or if you could comment on that space, because I was quite taken aback by these women who were basically defending their partners and saying, 'Well, they never did anything because they didn't have any contact with the child.'
Prof. Broadhurst : Let's just back up a little bit. There is this shift anyway within our own society about trying to turn off the idea that this is child pornography and trying to get people to understand that, while it might have been considered that many years ago, it actually is sexual abuse. It does significant harm. You know about the survivors' accounts being quite distressing in all sorts of different ways, and the long-term impacts can be very significant. So we need to turn around this idea, 'I'm not touching; I'm just watching,' so that people understand about the actual child in that image or that video. If you think about some of the worst examples of live streaming that we know about, that's pretty grim stuff. So we need to break that perception that it's harmless. I think a lot of progress has been made in that regard.
The point you raise is that, when there is a child protection issue, it can have devastating impacts on the family. I'm interested in the treatment arena. We know from bitter experience that we need to have a very compassionate approach. We're not going to change these people by telling them to 'stop it now'. With all due respect to Stop It Now—it's a great program and it works brilliantly in many parts of the world—it hasn't reached out to that group. The group that you might potentially think you want to focus on is the group that is viewing this material. We know that that viewing of the material escalates. In other words, over time they're not satisfied. They have to go up another notch, if you like, in the grimness of it all. So, if we can intervene at that level, we need a compassionate approach to reach out therapeutically. That, of course, is going to be at some odds with our moral and legal codes about what we do about it. But the impact on families in some of these child protection cases can be very long term, and we perhaps underestimate just how much more effort we have to make to support those families.
CHAIR: Sorry, Professor and Deputy Chair. I've just got to hand over. We're running five minutes behind.
Dr ALY: Thank you.
CHAIR: We will go to Mr Conaghan for five or 10 minutes.
Mr CONAGHAN: Thank you, Chair. Professor, you've been talking about treatment and qualitative work. You started off giving your evidence today, and you referred to 'a nifty online survey'. That is unusual terminology when we're talking about CEM. Who drafted that survey, where is it, who's using it, and what are you doing with it?
Prof. Broadhurst : Good question to clarify. I guess the word 'nifty' is not really appropriate; I was referring to the method, not the content. There's a long-running search engine called Ahmia, which has been running since about 2013. It's a very well-established search engine for finding hidden services on Tor, and it operates on the open, clear net. You hit that search engine and type in your search term. You might want to visit a Tor site that's for malware. You type in 'child sex' or 'child' or 'Lolita' or some terminology. That terminology is really important, because it does change a lot—the so-called argot, the language that's used. You type that in, and then it pops up and says: 'Okay, you've tried to type in something to do with child sex abuse. Here's the survey.' That was developed by the Finns at Protect Children, under the WeProtect Global Alliance, which funds this. What pops up is a survey saying: 'How can we help you? Fill out this survey if you want.' By the way, at the same time you get a pop-up message saying there's an online treatment program available through the ReDirection program, which is a well-designed online framework. So you can access that. That's where the numbers are coming from.
When I say it's nifty, I mean it's actually very difficult to find means to recruit people for self-report studies of CSAM or drug use. That's the nifty part: it's using existing hosting and an existing search engine. That search engine has been tweaked in a nifty way—if you don't mind my saying that—to let that sort of process work. As a consequence, it's worked really well. So that's the innovation that I think is interesting, and it could be applied to lots of different fields. When we asked them to join the survey, we went through some really interesting questions, like: Do you have contact? When did you start seeing porn? What kind of children are you interested in? It's quite a detailed 32-question format. Rather than just leaving them—because often it causes a bit of disturbance; they actually start thinking about it, and many of these people will report they've never spoken to anybody, they've never talked to anybody, they've thought about treatment but don't know where to go—that sort of thing. Obviously, it's going to be imperfect, because not everybody is going to want help. In fact, a lot of them, 40 per cent perhaps, just don't want any help at all. They basically relish what they're doing; they get off on what they're doing. We are reaching out to the people who, say, perhaps are treatment-ready or looking for treatment or worried about the impacts of their interest in children. It's a good way to reach very difficult-to-reach populations, and we want to do more of that work—for want of a better word—in the deep end, in the really aggressively anonymised places where these folks tend to—
Mr CONAGHAN: Just very quickly, is there any data on how successful that pop-up survey is with people going through with ongoing treatment?
Prof. Broadhurst : Where it has been successful, in terms of the data, is in the numbers who've actually logged onto the ReDirection program. It looks like it's around—it varies. I haven't got the latest numbers, because the numbers have gone up to around 13,000 self-report users, and we know that about 5,000 of those have gone on to the ReDirection program. We're still waiting for outcome results. Briefly, outcome results tend to be focused on how much time they've spent, what kinds of worries they have and so on. It's early days, but maybe in 12 months' time we'll have more details to give you.
Mr CONAGHAN: Thank you, Professor.
CHAIR: Professor Broadhurst, that brings us to the end of our time. Thank you very much again for taking the time to work with the committee on this. It's an important topic, and we appreciate your passion.
Prof. Broadhurst : Thank you, Chair. Thank you, committee.