Joint Standing Committee on Trade and Investment Growth
10/05/2018
Australia's trade system and the digital economy

MacGIBBON, Mr Alastair, Special Adviser to the Prime Minister, Department of the Prime Minister and Cabinet

Committee met at 08:40

CHAIR ( Mr O'Dowd ): Welcome. I declare open this public hearing of the Joint Standing Committee on Trade and Investment Growth for the inquiry into the trade system and the digital economy. I now call on Mr Alastair MacGibbon. Is there anything you wish to add about the capacity in which you appear before the committee?

Mr MacGibbon : I'm the National Cyber Security Adviser and the Head of the Australian Cyber Security Centre.

CHAIR: As these proceedings are public, are being broadcast and are being recorded by Hansard, if you wish to give evidence in private please let the committee know and we will consider your request. Although the committee does not require you to give evidence under oath, this hearing is a formal proceeding of the parliament. The giving of false or misleading evidence is a serious matter and may be regarded as a contempt of parliament. I now invite you to make an opening statement, Mr MacGibbon.

Mr MacGibbon : I will keep my opening statement brief. I appreciate the opportunity to appear today. As I said, I appear in the capacity as the National Cyber Security Adviser, which is a role within the Department of Home Affairs, and also the Head of the Australian Cyber Security Centre, which is a role within the Defence portfolio and the Australian Signals Directorate. I understand that colleagues from Home Affairs and representatives from other agencies will provide evidence today, so I'll keep my comments around cybersecurity particularly, if that's okay. I'll talk about cybersecurity and cyber resilience sometimes interchangeably, but really what we're aiming for in the Australian government is cyber resilience.

There are a few messages I can leave with the committee. The first is that, in order to succeed from a cybersecurity or a cyber resilience point of view, we have to work closely with industry, with public and government agencies and, most importantly, as I said, with the public more broadly. The pace of technological change means that our efforts constantly have to be renewed and we're constantly challenged.

The Australian government, over the last couple of years, has taken an increased role in direct intervention to protect Australian citizens against malicious cyberactivity and, increasingly, against cybercrime. The Australian Cyber Security Centre, which I head, takes the lead role in that process. This was brought about by the Prime Minister's announcement on 18 July last year that the Department of Home Affairs would be established, which it was on 20 December 2017. With that, the lead role for cybersecurity strategy and policy moved from the Department of the Prime Minister and Cabinet to the Department of Home Affairs, as did my role. Also, the Australian Signals Directorate will become an independent statutory agency as of 1 July 2018, with an enhanced role, as the parliament has legislated. The computer emergency response team that's part of the Australian Attorney-General's Department will also move into the Australian Signals Directorate as of 1 July.

Importantly, as I said, in order for us to succeed we need to work very closely with industry. The most important strategies that we have are about enabling our economy to succeed so, rather than looking at this in a negative sense, it's growing a digital economy and it's allowing Australians and Australian businesses to do business in what can be a reasonably toxic environment. Part of this is actually changing the narrative. I know Ms Sandra Ragg gave evidence before this committee last year. She talked about the concept of changing the narrative. What I say is that we're moving our discussion from talking about security to resilience, and that's important, because you can't always be secure but you can make yourself more resilient. We're moving away from a compliance culture to a risk culture and we're moving away from talking about cybersecurity as a threat to talking about it as an opportunity. That's what the 33 initiatives we're pursuing—and we're just over the two-year mark of this strategy—have led us to do.

Most importantly for this committee, I can focus on a few very quick things before I open myself up to questions. The first is that we have a very successful not-for-profit organisation, AustCyber, that was set up with Department of Industry funding. AustCyber's job is to find Australian scale-ups and start-ups and expose them to business growth in Australia and offshore. Just a few weeks ago I was in the United States, in San Francisco, with the Minister for Law Enforcement and Cyber Security. We were part of an AustCyber delegation, led by the minister, with dozens of Australian companies, in conjunction with Austrade, looking to expose those businesses to the huge US and global markets. It's really pleasing to see how far something like AustCyber, which was part of the Cyber Security Strategy launch in April 2016, has come.

In addition, in the Attorney-General's Department, the Computer Emergency Response Team—soon to be part of the Australian Cyber Security Centre—has opened up joint cybersecurity centres in Brisbane, Sydney, Melbourne, Perth and soon Adelaide. Those joint cybersecurity centres are designed to have Australian businesses and state and territory governments come together with the federal government to solve cybersecurity problems, to share threat intelligence and to come to solutions for wicked cybersecurity problems. Those are centres led by committees of Australian businesses telling the Australian government what they need and how we should cooperate with them. The Computer Emergency Response Team, which is formally becoming part of the Australian Cyber Security Centre as part of the Australian Signals Directorate, has been Australia's main business interaction point between the government and the private sector when it comes to cybersecurity. Their information exchange programs, the joint cybersecurity centres they've built and other initiatives go a long way to helping ensure the resilience of Australian business in this world.

I mentioned at the outset that technological change is always going to be challenging us. An internet-of-things world which presents huge opportunities for us in data and automation and greater opportunities for the Australian economy also poses significant threats to us from a cybersecurity and cyber resilience point of view. We'll work very closely with industry to try to reduce the threat that those types of technologies can present us, and allow the significant opportunities for us from the application of those technologies.

On the issue of R&D: in addition to AustCyber, which finds scale-ups and start-ups, there have been significant efforts to grow Australia's research in this area, most recently the Commonwealth-funded Cyber Security Cooperative Research Centre. There's Data61, which operates out of the CSIRO. They have about one third of all the cybersecurity postdoctoral candidates in Australia working with them. We have funded two Academic Centres of Cyber Security Excellence—universities in Australia that lead the way in cybersecurity education—and, in the last couple of months, a national curriculum for TAFEs on cybersecurity vocational training was agreed by all states and territories; it is the first time anywhere in the world that a national curriculum of this type has been arranged. That, and the defence science and technology capacity and its research into cybersecurity, puts us in a remarkable position as a nation. Adrian Turner, the CEO of Data61, gives an interesting fact on this. He says that Australia ranks fourth globally for patent filings in cybersecurity research and network security. That is indicative of the type of nation that we are and the types of opportunities we have both from a domestic market point of view and a trade point of view. With those opening remarks, I will leave myself open to the committee.

CHAIR: Thank you. When you're dealing with industry, is it easy to get them on board? What opportunities can you offer them as an incentive to work cohesively with you?

Mr MacGibbon : Michelle Price, the new CEO of AustCyber, tells me that people knock down her door and the door to her organisation when they are looking for the support that they can provide to scale and grow those small, unique Australian cybersecurity companies. So there's no need for any more incentive there. Austrade, I think, is really surprised and pleased by how many companies go on their delegations to expose themselves offshore. As I say, I was most recently in San Francisco at the RSA Conference, where there was a huge Australian contingent of businesses.

When it comes to cooperating with us on information-sharing and solving cybersecurity problems, I've been in this place—sadly, Mr Chair—for quite a while and I can say to you that, in those probably nearly two decades, there's never been the level of industry engagement that we see today. When it comes to the Joint Cyber Security Centres, we have very large companies and some smaller companies in them now on a daily basis. I was in our Melbourne Joint Cyber Security Centre yesterday and there were dozens of companies. I was having a meeting with Victorian government health officials, where they brought together their regional cybersecurity personnel from all of their health authorities, and there would have been 20 people around the table. The level of desire for cooperation, the interest in sharing and the huge pressure for us to jointly solve problems are bringing companies in without any need for incentive. They actually want to see us win. They see this as a shared problem and an opportunity for them to realise the advantages of a digital economy and minimise the disadvantages.

Mr HART: I serve on the Public Accounts and Audit Committee, and we've had a conversation recently about the issue of resilience versus cybersecurity. I'm very pleased that the conversation is around shifting towards the issue of resilience, because resilience, I think, is vitally important, given the inevitability of compromise or threats of compromise to systems. One of the challenges that we're facing and the constant message that we're receiving in this committee is the fact that there are not two aspects of trade—there's not digital trade and trade generally; all trade is digital. Has that message infected government? Has government taken on board that every aspect of government should, firstly, be conscious of resilience but, secondly, be conscious of the fact that its processes need to be not just digitally enabled but rethought as digitally native?

Mr MacGibbon : I will answer from my narrow cybersecurity perspective, if I may, on the second part of the question. On the first, I'd say that resilience is important, as you know, because, if we talk about security, it's a very binary thing—you're secure or you're insecure. Any type of connected technology, by its nature, at some point will be rendered insecure. That's just the nature of connected technologies. Our job is to reduce the likelihood of that risk being realised. So, when we talk about resilience, we talk about reducing the likelihood of that risk being realised, reducing the harm that occurs as a result of that risk being realised, reducing the time it takes to recover from the security incident and then, clearly, having ourselves up and running as quickly as possible. And that's not a defeatist attitude; it's a realistic attitude about the complex nature of the devices and systems that we protect, whether they're government, corporate or private citizens' computer systems.

'Resilience' as a term is one that we unashamedly took from the physical emergency management folk and, for some years, that has been the right language for us to use. It's slightly different, of course, in how sometimes the risks can manifest. You often don't know that something's a cybersecurity incident until well after the fact; you just know a computer has stopped working or some type of service has failed. Unlike a bushfire or an earthquake, where it's pretty obvious what the cause is, cyber can be a little bit harder. But I think the term holds. Increasingly in government, there's an understanding of resilience—and, more importantly, in business there's an understanding of resilience and a maturation in the conversations I have around boardrooms and executive tables. So that's important.

When it comes to recognition that there is not a huge divide, if any, between what we do online and what we do offline, I spent a large part of my career trying to convince people that that's the case and that their actions offline need to reflect their actions online and vice versa and that there's no difference between the two. Certainly in the space I work in we would take a digital-first strategy—that is common sense given what we do—and I would say generally that the government systems I deal with are designed primarily to deal with citizens electronically first. That is both to help them—it is more convenient—and it's more efficient for government as well. We are quite an advanced economy in that regard. With that increased digital connectivity comes the risk associated with it. My job is to reduce that risk.

Mr HART: If you are pursuing a digital-first strategy then cybersecurity and resilience need to be a first-order aspect of the planning, because they must exist as part of the risk management of your processes.

Mr MacGibbon : Security by design is always the preference. It's much harder to retrofit security when it comes to IT architecture. It is similar to when you build a building: it is smarter to think about those things at the time. I would say to you that that is a process of maturation in government, just as it is in business. The tech industry generally has taken a much longer time than anyone would have hoped to build security into the foundational products they have. Of course, government uses the same products that corporates do. We use products that are increasingly fit for purpose. But, by their nature, computer systems are open. They are often vulnerable—and then we need to architect in ways that reduce the likelihood.

If your question is particularly around government, I think it is fair to say that there has been quite an awakening in the last couple of years. As you know, I led the review into the e-census incident. That was a huge wake-up call for government. In 2016, when a significant website failed to function, in a very public way, because of very small denial-of-service attacks, it really brought home the concept of resilience—the need for systems to be up and running. The conversations I have with secretaries, agency heads and others have changed a lot in the nearly two years since that time. I'm never going to say I'm satisfied—you have heard my evidence before—but I will say that the expectation of government on its bureaucracy to deliver resilient systems is a message that is being heard very clearly. My office, among others, is actively pursuing that. On my Signals Directorate side, the Cyber Security Centre, the soon-to-be Director-General of ASD, Mike Burgess, made very clear in evidence before a committee recently that we will increasingly be advising agencies where we believe they're deficient in their security.

Mr KHALIL: You mentioned that businesses are very open to working with you—which is great—to jointly solve problems and so on. I presume that, in order for you to successfully do that, businesses are happy to give you access to their digital systems, their data, their records and that kind of thing. But I would think that might sometimes include sensitive commercial information or even the records of consumers and clients and so on. Is that right?

Mr MacGibbon : If we break this into the concepts of prevention and awareness raising, they don't need to share anything with us; there is no consumer data or private data that is shared. There is a lot of work done to talk about better ways to architect systems. We will share real-time data with industry on what the threat environment looks like. We will then share quite bespoke reports with industry that are more detailed about a particular threat. That is all very open and public. That doesn't need to have us on their systems. If an incident occurs, depending on who the victim is—at the moment, under the current construct, it is done by the Computer Emergency Response Team or by the Signals Directorate—by its nature, we would gain access to logs and other things. Generally speaking, that wouldn't be confidential customer information; it would be telemetry around what devices have done, the IP addresses and the types of communication—the types of things we would be looking for in order to understand the incident. So I guess there is always a chance that some data would be seen, but it would be done in a very careful way—and the vast bulk of the time not. I would say that there is a need for us to be much more involved in critical infrastructure systems. That is something I think the public would expect us to be involved in. The only time we would ever be more involved in those critical infrastructure systems would be with the permission of the infrastructure owner—and they are largely private sector companies. Again, the vast bulk of the time, that is information sharing and exchange and conversations; only in extreme cases is it us actually getting access to logs and other such things.

Mr KHALIL: From what I am hearing, the vast majority of the assistance you provide to industry is in that educational advisory type space. It is almost a one-way thing—they give you a bit of a sense that they have got some issues and you give them as much information as possible about how to do things better. Is that what you are saying?

Mr MacGibbon : The saying that prevention is better than cure certainly works in this space. My really strong preference is towards us helping reduce the likelihood of there being an incident. The aim of the Joint Cyber Security Centres is that industry will come together and share with each other sometimes—sometimes without us present, because they will want to share in a way that doesn't include government. That is fine by me. They might be public sector agencies, and they will talk about a common problem they have so that they can commonly solve it. I will use a tangible example. If there were a new critical vulnerability known today or yesterday in a particular type of computer system used by corporates, those corporates might each stand up a team of between six and a dozen technical folk trying to solve it for their own corporate. It might be a large bank. For the next block, it might be a telco. For the one after that, it might be a large retailer—the same teams trying to solve the same problem. What we are providing in the Joint Cyber Security Centres is an opportunity for them to come together and say, 'How about we all throw in a smaller number of people and come to this protective solution?' The concept of us getting access to information is extremely remote, and it would only happen as a result of inadvertently being in those systems to help protect them.

Mr KHALIL: I understand that that is the majority of the work that is done, and it sounds very good in the way you go about trying to do that—trying to consolidate and break down some silos, I guess, so corporates can work together. I get that. That is the general case. But you mentioned that you sometimes help them where there is a very serious incident and they say: 'We're stuffed here, we've been breached.' Are there protocols in the Cyber Security Centre on what is done with the information that you access or what you are able to see? When you have finished helping them resolve the problem, what happens then? Are there internal protocols? Are there departmental protocols, or what?

Mr MacGibbon : The type of information we will access is logs reporting on the nature of interactions. To do a forensic analysis might mean you have to understand more of what is going on deep in the system. I can come back to you, if I may take it on notice, on the types of protocols that we have. If the question is about how we handle information belonging to a victim, the answer is that we take that extraordinarily seriously.

Mr KHALIL: I am sure you do take it very seriously—and you are helping them in the first instance. But, once that process is complete, do you follow protocols in order to ensure that you are not retaining something? Even in your own internal reporting, you are going to have to talk about what you saw and what you did for them and whether that breaches some of the commercial sensitivities or whatever else.

Mr MacGibbon : I will come back to you if I may—only because we have a few organisations that are emerging at the moment and I want to make sure I give you the full answer.

Mr KHALIL: And I suspect that this issue would not be top of mind for many of the corporates, though they would be very appreciative of the support that you give them when they are attacked. On that front, how pervasive are the external attacks on Australian businesses or commercial entities? You don't have to give me exact examples; I'm thinking more about percentages.

Mr MacGibbon : It is still fair to say that probably the major threat vector for business is an insider. The trouble with connected technologies is that an insider can be affected by an outsider. So the concept of a perimeter—your employees and then everyone else—has kind of broken down. We would assess that the threat environment is increasingly risky.

Mr KHALIL: Are you saying the majority of threats are internal?

Mr MacGibbon : Insiders, yes. Most breach studies, and others, would say that the disgruntled employee is the biggest risk to an organisation. That can be greatly facilitated by technology, by the way—given access to files and data that one may not have had in the past. In nearly every breach or security incident you see from the outside, it is the targeting of an employee—or the boss—that would lead to that breach. It is the concept of getting a human to open up the gate to allow the attacker in. In some sense, insiders are the key to all of this.

Mr KHALIL: They are the weak spot.

Mr MacGibbon : Yes. It is a social problem enhanced by technology—cybersecurity. We need to educate end users, at home and at work, on how to reduce the likelihood of them being that victim. Our assessment of the threat environment is that it is certainly not getting any less risky. The attack tools are cheaper; they are more available to criminals of a less educated nature. As we connect more and more technologies, it increases the threat surface that we are trying to protect. Along with the concept that these technologies evolve—and Internet of Things technologies in particular are not necessarily designed with security in mind, which I think goes to one of Mr Hart's questions—some of these things are just not built for security, and yet we will stitch them together into products. That means corporates, governments and citizens need to be increasingly wary on how to protect themselves in that threat environment.

Mr KHALIL: Would most of the industries or businesses come to you when they have problems, or are there instances where you suspect that, for commercial reasons, they are reluctant to talk about the fact that they have been breached? When you find out about it, are you proactive in seeking to assist? How does that work?

Mr MacGibbon : There is no doubt that we wouldn't have a complete picture of the victimisation environment. The Privacy Commissioner will receive more disclosures because of the mandatory reporting regime that has come into play, so, through the Office of the Information Commissioner, we will see more incidents. We get notified of a range of near misses, sometimes of quite minor matters. It could be just a conversation on the telephone between the team and the company or the government agency to help them do a particular thing to shut down the threat—right through to really serious and long-term remediations, which can take an awful lot of effort and time. I don't think we'll ever be in a position to know all of the incidents. People are not obliged to tell us. It is certainly not pleasing to have victims contacting you, but, if people are victimised—government agencies or corporates—it is pleasing that we are contacted.

Mr KHALIL: There is probably another variation of that. I understand that a number of big corporates—banks, for example—will have line items in their end-of-year accounts that are basically in the millions of dollars and they are writing off some of the breaches. They haven't thought about redoing their systems or building them up; that's too expensive, so they're willing to cop the losses from hacks and all the rest of it. It almost becomes part of the cost of business.

Mr MacGibbon : Core banking systems in Australia are pretty good. If you were to look at the critical infrastructure of verticals, of which banking and finance is clearly one, the Australian banks—at least the top tier banks—invest an awful lot of money in the security of their core systems. So I think most professionals would say that the actual banking systems themselves are as resilient and secure as you could possibly hope for. But to say that there aren't cyber incidents that impact banks would be wrong. Primarily, they're at the consumer end. So, whether my credit card is compromised and misused or my bank account is accessed because I gave away my password, those things, at the user end, are where we would possibly see those line items you referred to. And there's no doubt that cybercrime costs this country a huge amount of money. It is money taken out of the economy and from people's pockets, causing significant harm, and that's why, as a centre, we are increasing our efforts when it comes to countering cyber-enabled crime. But we need to distinguish between the core banking systems and, then, how we interact with those systems, which are where the weak point is.

Mr KHALIL: I have one more question. I missed the start; I was late. You have people from all of the different agencies in your team at the centre and under Home Affairs, basically?

Mr MacGibbon : The Cyber Security Centre is an Australian Signals Directorate centre, so we have ASD staff and computer emergency response team staff—and they are mogging, or machinery-of-government changing, into us as of 1 July, so they'll be ASD staff—and Australian Federal Police, the Criminal Intelligence Commission, the Defence Intelligence Organisation and ASIO all contribute staff.

Mr KHALIL: You report up to Home Affairs now.

Mr MacGibbon : This is a role with a couple of hats. I report into Home Affairs as the National Cyber Security Adviser, and, in that role, I'm in charge of the cybersecurity strategy and policy for the Commonwealth. That team is co-located inside the Cyber Security Centre with this ASD asset that has these seconded and attached organisations in it. I would say that that construct—the Cyber Security Centre—as it continues to evolve, is a true national asset. The government's ambition for it—to become a single point of truth, to educate the public, business and governments on cybersecurity, and to be our main response capacity—makes it a national asset. It will only continue to strengthen the very strong linkage between Home Affairs and the Signals Directorate, as manifested at the centre, which will give us the maximum opportunity to match operations to government policy. It is still early days since Home Affairs was set up, but already I've seen quite a shift in the way in which we do our business.

Mr KHALIL: Thank you very much.

Mr HART: There was something that was mentioned in an earlier answer, with respect to your concern about infrastructure and, in particular, security of infrastructure. We sometimes hear about critical security vulnerabilities within home routers—consumer devices. But, of course, there are also the critical infrastructure equivalents of those—that is, large-scale routers that provide for communications, particularly IP networks. Is there a clear line of sight on any issues as they arise with respect to those security vulnerabilities—in other words, open passwords, passwords that haven't been changed from the default password; that sort of thing? What sort of line of sight does your office have to that sort of critical security?

Mr MacGibbon : Routers are an interesting question. As you rightly say, there was a time when, if you bought a router—a home router or even an industrial router, if that's the right term—there was a default password and you could plug it in and it would operate. You will note, of course, that now, when you get a new home router, the first thing you have to do is set a new password. Hopefully, it's not one of the top 10 passwords, like 'password', a simple number sequence or a couple of people's names. Strong passwords matter, but these days you're prompted to change the password on a router. Just a couple of weeks ago, the Australian government, amongst other governments, came out and spoke about a feature inside Cisco routers that was being exploited by Russia. We came out with that information, having already advised last year how to turn off this exploitable feature, primarily in order to help drive change in companies so that they would take notice of the need to turn it off. Large-scale misuse of that particular feature could cause problems for us as a nation and, by attributing it to Russia, we believed we were helping drive the incentive for that change.

We can identify faults. We, along with our allies, as soon as we know of vulnerabilities, will be pretty quick to come out and educate the public on what they are. We publish those openly, because our aim is to secure systems, frankly, all around the world. It's in my interests to see that same Cisco feature that was causing problems turned off in every country, because my job is to help secure the internet. I do it from Australia, but my interest is to reduce victimisation as well as I can, whether it's Australians or others. Obviously, the preponderance is towards Australia. This concept of zero-day threats—that is, zero days between the vulnerability becoming public and your ability to start writing a patch or solution—is not uncommon, sadly. So, we're increasingly seeing vulnerabilities we didn't know about made public, but we are as quick as we can be to advise on how to fix them. Importantly, when we become aware of a vulnerability through the ethical disclosure of vulnerabilities that have been found, we will work with companies and educate the public on how to reduce threats.

Senator COLBECK: I have a couple of questions about some of the earlier stuff. You talked about the top-four listing for IP. Who are the key competitors in that space globally, and do they align with some of the threat sources?

Mr MacGibbon : Largely not. When it comes to cybersecurity, the United States are clearly world leaders in this area. Israel is often touted as the country to look at. That's largely because of their existential threat, and I'm quite happy that we don't have the same environment that leads Israel to need to be as innovative as it is in cybersecurity. I couldn't tell you what the third country is, but I'm going to find out. Usually, it's Israel and the United States that I'd want to benchmark ourselves against. The point that Adrian Turner makes is that we're a really smart country. We often don't think of ourselves as smart. When it comes to cybersecurity, I can say that our industry—nascent as it is in some parts—is actually quite remarkable. We're one of the most connected nations on earth just in terms of our general economy, and so it is a good thing that we have a vibrant, and hopefully increasingly vibrant, cybersecurity industry.

Senator COLBECK: What's the life cycle of IP? Because things move so rapidly in that space, how does it align with, say, our IP laws? It would seem that it has the potential to decay very quickly.

Mr MacGibbon : I'm no expert on the half-life of how long solutions will last for; what I can say is that this industry, which is one of the fastest growing in the world, is seeing huge growth. Sometimes it's a bit goldrush-like. There are people who go out to make their fortune on creating cybersecurity solutions. The good thing is that, amongst all of those companies and all of those ideas—most of which will not succeed, of course, like any area of significant and rapid growth—there are remarkable solutions out there, and we do a pretty good job as a country creating them. The idea of the initiatives that have been supported by the government in the last couple of years is to capture that IP in Australia and maximise its value for those Australian companies and the Australian public more broadly.

Senator COLBECK: Your comment about the Cisco router and the fault or weakness—

Mr MacGibbon : They call it a 'feature'! I don't call it a feature.

Senator COLBECK: Well, that actually aligns with my question, because I was going to ask about threats that are built into systems. So, is there any—

Mr MacGibbon : The concept of back doors built in? Or systems that are vulnerable—

Senator COLBECK: Well, we make decisions about potential participants in builds for us, as a government, for example, or as a nation, because that is a potential issue. So, more broadly within, say, the consumer market or other back doors, if you like—because everything's so connected, and so much is interfaced now—putting an app on your phone opens, effectively, the system, and so much more of the market is happening in that space, and particularly in some jurisdictions.

Mr MacGibbon : I'll answer it—surprisingly I'm sure, for the committee—in a couple of stages, if I may. The first one is the concept of supply chain and third-party risk. In the last year or so we've certainly seen that come to the fore. In complex systems, as you say, that rely upon a whole range of software and suppliers to suppliers to suppliers, the offender doesn't need to be in every part of that system. It could well be that they could be in a third-tier or fourth-tier part of that chain that leads to a vulnerability in an area that you just couldn't imagine. And it is a source of increasing angst, if that's the right word, for my role: just how complex these systems are and how creative some of our threat actors can be. Whether they're nation states or criminals, they are innovative people and they can find where in a system to place themselves in order to create problems for us. So, the supply chain issue is important—you're right. If it's governments building something, then we'll put a lot of effort into those things. But only so far down that chain can you go—a third-party provider who has a particular piece of software that links in with you might cause a problem.

Then I'll shift to this concept of internet-of-things devices, and there will be billions upon billions—tens of billions; some would say hundreds of billions—of these devices added to the global connected economy in a handful of years, literally a couple of years. Those devices are extraordinarily cheap. They're largely not designed with security in mind. They may not be able to be updated. They may ship with default passwords. You may not be able to change the password. And that will bring a complexity to us. For a long time our problem has been the confidentiality of information. We'll increasingly come into this problem of the integrity of information: can I believe what these devices are presenting to me? Can I use that information and interpret it the right way to go about whatever business it is—driving a car with no hands through to things attached to clothing and other such items? So, the internet of things and the lack of security of those devices is a problem.

And then you mentioned the concept of a back door. I'll answer that in two ways. One is unintentional back doors, or vulnerabilities. These are very complex devices. The computer that you have in front of you is designed to do a whole range of things that, generally, you don't want it to do. You just want to maybe type some emails, surf the web and run a few apps. Yet they're designed to be quite open and able to do things that you may not want. That creates sharp edges for us and vulnerabilities that we may not know about. Intentional back doors are different. That goes to the supply chain question—whether or not certain nations or others would build those things into products. They may not need to be back doors. You may just need to be able to remotely update and patch that system, which is a feature of the product you've bought, and that might give someone access into your system.

So, it's complex, and I see the chair looking frustrated. I'm here all day, so it's okay, Mr Chair, but you've probably got other things to do. I'm conscious that there are others behind me.

CHAIR: Thank you very much for your attendance. We could have extended this for another couple of hours, I think. You'll be sent a transcript of your evidence. Thanks again.