Finance and Public Administration References Committee
23/03/2018
Digital delivery of government services

ALEXANDER, Mr Peter, Chief Digital Officer, Digital Division, Digital Transformation Agency

BAMFORD, Mr Daniel, Head of Portfolio Office, Digital Investment Management Office, Digital Transformation Agency

BUNDY, Ms Liz, Acting General Manager, Integrity Modernisation, Department of Human Services

McHARDIE, Mr Charles, Acting Chief Information Officer, Department of Human Services

McNAMARA, Mr Jason, Acting Deputy Secretary, Integrity and Information, Department of Human Services

SEEBECK, Dr Lesley, Chief Investment and Advisory Officer, Digital Investment Management Office, Digital Transformation Agency

CHAIR: Welcome back. In the program we're scheduled now to move to consideration of the robo-debt case study, but I want to ask you about a couple of timing issues with some of the evidence you provided earlier this morning, Mr McHardie, in relation to the other project. You have given evidence that the department has paused implementation of Pluto whilst you undertake this review. When was the decision taken to pause that project?

Mr McHardie : When we say pause, if there are critical defects with Pluto, we will continue to fix those. The ICT teams will continue to do that, but any major enhancements have been put on hold. If we still had Mrs Bridger here, I could check the exact date the business made that decision. We might take that on notice and come back to you.

CHAIR: Was that decision taken on the basis of any external advice or internal advice only?

Mr McHardie : No, it was internal advice. Basically we needed to work out what enhancements needed to be made to the system to make our staff more productive. We needed some clear advice, in the CIO group within ICT, as to what were the must-haves that needed to happen right now and what enhancements we could probably do without in the near term. It was an internal decision.

CHAIR: Taken last year?

Mr McHardie : I'll check the date with business, but this year we have not been doing any enhancements.

CHAIR: If it is possible to locate the date of when that decision was taken before we finish this afternoon, that would be terrific. Was the minister's office involved in that decision at all?

Mr McHardie : I couldn't tell you that. It was looked after by our business colleagues.

CHAIR: Can you take on notice what role, if any, the minister's office played in either initiating a review of that enhancement program or approving cessation or pausing of enhancements. That would be appreciated.

Mr McHardie : Understood.

CHAIR: Do you wish to make any remarks about the OCI robo-debt case study before we begin?

Mr McHardie : No. As with this morning, we'd like to get started.

CHAIR: I will give some guidance as to what I'm interested in. As per the earlier discussion, there has been quite a lot of public scrutiny of the operation of the program, including a full Senate inquiry, which gave lots of the people affected a chance to tell their story. The Commonwealth Ombudsman has undertaken a review of the ways in which the program didn't meet expectations. I don't intend to go over that ground or revisit that, but I am interested in understanding how from a policy development and project management perspective OCI came to be implemented in the way that it did. I am more interested in the project management dimensions of this than in the outcomes for customers and citizens. We've had evidence that, ideally, digital capabilities ought to inform policy development and go beyond mere smoothing of existing transactions. OCI shows some of those attributes, using data matching and automation, but it didn't deliver, so I think it would be useful to understand from that perspective too why it was unable to deliver in the way that was intended. One of the larger questions for this committee is: should digital transformation be best driven from the centre or in the business units? Is it a top-down process coming from the centre of government, from an organisation like DTA, or is it initiated in the operational areas? Where does OCI fit in the spectrum between a very centrally-driven process and something that came out of an operational agency?

Mr McNamara : I think the online system was very much driven by the Department of Human Services.

CHAIR: When was that first proposed? How did that come about?

Mr McNamara : A number of budget measures underlie the income data matching process. The first relevant measure was Strengthening the Integrity of Welfare Payments in the 2015-16 budget. That process then led to an increase in compliance reviews, initially for the 2015-16 financial year. Those reviews were done by officers without the use of an online system. In that financial year the online system was developed and then trialled from July 2016 with a thousand-person pilot before being rolled out more fully from about August-September 2016.

CHAIR: There's a government decision to intensify compliance activity, which is initially rolled out manually by the organisation, but in parallel a decision is taken internally to develop some capacity for automation?

Mr McNamara : The government decision was to undertake these compliance reviews online as well as through a manual process.

CHAIR: What led up to that decision? I'm trying to understand the origin of the idea.

Mr McNamara : It was a budget decision the government took.

CHAIR: However, I'm assuming it doesn't originate exclusively in the minister's office. Someone provides advice at some point that there's an opportunity. Had the department been working on opportunities to better utilise ATO data and automate some part of this process? I'm trying to understand the origin of the idea.

Mr McNamara : We have been data matching for a long period of time with the ATO. As we've explained previously, that data matching hasn't changed. That data matching had showed, though, that our capacity to action the anomalies the data matching was putting up was limited by our resourcing in the area. So we could only do a certain number of reviews relative to the significant number of discrepancies between the two datasets that were being identified. The department had a large log of those discrepancies that weren't being actioned. The compliance measures are really about going through that backlog of discrepancies between the ATO dataset and the Centrelink datasets.
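The discrepancy-identification step Mr McNamara describes, comparing income the ATO holds for a person against income that person declared to Centrelink, can be sketched in simplified form. This is an illustrative example only: the field names, record layout and tolerance threshold are assumptions, not details of the actual DHS data-matching program.

```python
# Illustrative sketch of dataset matching: flag customers whose
# ATO-reported income differs from the income they declared to
# Centrelink by more than a tolerance. All names and figures are
# invented for illustration.

def find_discrepancies(ato_records, centrelink_records, tolerance=100):
    """Return customer IDs whose ATO and Centrelink income figures disagree."""
    declared = {r["customer_id"]: r["declared_income"] for r in centrelink_records}
    flagged = []
    for r in ato_records:
        cid = r["customer_id"]
        if cid in declared and abs(r["ato_income"] - declared[cid]) > tolerance:
            flagged.append(cid)
    return flagged

ato = [{"customer_id": 1, "ato_income": 30000},
       {"customer_id": 2, "ato_income": 12000}]
cl = [{"customer_id": 1, "declared_income": 18000},
      {"customer_id": 2, "declared_income": 12050}]

print(find_discrepancies(ato, cl))  # only customer 1 exceeds the tolerance
```

As the evidence notes, a matching run like this only identifies anomalies; each flagged record still has to be actioned, which is where the backlog described above arose.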

CHAIR: Presumably you started working on that prior to the 2015 budget, to allow announcement in the budget.

Mr McNamara : There was work leading up to the budget announcement, yes. If your question is in relation to the development of the online system, that happened in 2015-16 predominantly.

CHAIR: Let's say there was a headline idea that there might be some way of generating efficiencies in that compliance system by using some sort of additional information. There was a budget allocation made in 2015 and, in the 12 months following that budget allocation, the first detailed scoping and work commenced on the project?

Mr McNamara : Yes. The build of the system happened in 2015-16.

CHAIR: Could you talk me through the procurement process and the project delivery process for that? The scoping was in house originally?

Mr McNamara : The whole project has been done in house. You wanted to talk about project management?

CHAIR: Yes.

Ms Bundy : The department's traditional methodology used was the waterfall methodology—doing joint design sessions with ICT, working through the requirements and then building to those requirements.

CHAIR: Was DTO involved in that process?

Dr Seebeck : No, we were not.

Mr McHardie : No. This build was prior to the Digital Service Standard. Let's look at the key dates. Design and analysis for the system was completed in October 2015. We then completed the build. Our first release happened at the end of October 2015, which was before the DTO's, and then the DTA's, Digital Service Standard.

Senator PATRICK: What date is that?

CHAIR: That is a helpful time line, Mr McHardie. Would you mind repeating it.

Mr McHardie : The complete analysis and design was complete in October 2015.

Senator PATRICK: From a start date of—you have missed the start date.

Mr McNamara : July 2016.

Mr McHardie : Yes. It was when the teams first came together.

Mr McNamara : In terms of starting the online system.

Mr McHardie : And then there was a series of releases.

CHAIR: Okay. So the analysis was completed in—

Mr McHardie : The first piece of analysis and design was in October 2015. The initial build for the first drop of code was at the end of that month. We released at the end of that month and then we went into another analysis and design. Basically, we had completed in July 2016. There is a whole time line of events that sit there in the iterative build that we did here.

CHAIR: Is that a time line that you could provide to the committee?

Mr McHardie : Yes. It is just a series of dates and milestones.

CHAIR: Is it written down in a form that could be photocopied and circulated to us now? That would save a lot of time.

Mr McHardie : It has a lot of other data on it as well. Could we just provide it on notice for you?

CHAIR: No. I think it would be helpful to understand the time line of the project.

Senator PATRICK: Someone could even transcribe it. That would be useful. Thank you.

Mr McHardie : Done.

CHAIR: So the digital service standard wasn't in place. The DTO did exist but the DTO wasn't engaged in the project in any way?

Dr Seebeck : No.

Mr Alexander : We were asked to be engaged later. In January 2017 we were asked just to come and give some advice on how to improve the screens and expert advice on user experience and interaction design. We did that work with DHS in early January '17 and we provided them with a report, which they acted on. Since that date, we've been working with them intermittently on looking at pieces of technology or user research and helping them.

CHAIR: As to the project, there's some initial discussion prior to the budget in 2015. Was it originally seen as an efficiency project—an attempt to free up resources for other activities? Was it conceived of as an innovative approach to utilising automation in the department? What was the frame for that?

Mr McNamara : I think the measures have always been about trying to look through the data-matching backlog and being able to process those. That was the motivation. Yes, we are trying to do that in the most efficient and effective way possible, but that has always been the key motivation—to examine what the differences between the tax office and the Centrelink data are telling us.

CHAIR: Were there other projects on the list? We're resource constrained. There are always things that you'd like to do, in terms of new investment to drive compliance or efficiencies. Was this something that was selected from a menu of projects?

Mr McNamara : Well, there were a number of projects that were part of the '15-16 budget. Strengthening the integrity of welfare payments is not just about data matching, so there were a number of projects in that actual measure that were compliance related. There is the AUSTRAC measure, from memory—I've got a list of them here; there's quite a number of compliance measures in that larger budget decision.

Senator PATRICK: In previous evidence, we heard that your accounting system does break each different project down into different line items.

Mr McNamara : Yes. Well, it does, if we're doing it by project, but there are, essentially, six separate measures that relate to income data matching, in terms of budget announcements, because there were budget announcements in the '15-16 budget, the '15-16 MYEFO and the '16-17 MYEFO. So there are several elements of—

Senator PATRICK: But you will have one project manager looking after this aspect, or maybe a couple of the aspects of it, and they would be assigned an allocation of resources which would include people and money?

Mr McNamara : Yes, broadly, that's a fair way to—

Senator PATRICK: That's typically how project management works.

Mr McNamara : Yes, and that's how these projects were managed. So there was a program board. There was a senior responsible officer. As Ms Bundy said before, there was the normal governance that surrounds our projects.

Senator PATRICK: So, in terms of tracking how much was spent on this particular project, you'd be able to give us the details of how much was actually spent?

Mr McNamara : Yes. We took that on notice in the estimates hearings that have just gone past, so we are preparing those figures for the Senate at the moment.

CHAIR: To go back to the selection of the project in the first place: was there any comparative evaluation of the worth of making an investment in this kind of initiative, relative to other initiatives that might also be useful for the department?

Mr McNamara : This is a government decision within the budget context. Government would have been making that decision on the basis of the information in front of it.

CHAIR: Yes. It's just that I'm interested in how the department selects between IT options before it. There is a limited pool of money. There's an infinite demand for things to be improved or worked on. I'm trying to understand whether there's any methodology that we can know about where the department lines up the menu of 35 things you might like to do and decides which five things you will do.

Mr McHardie : Usually, from an ICT perspective, it's a balance of government-directed initiatives, the capacity that we have to deliver, and also those departmental internal measures that we would like ICT to work on. It could be provision of new desktop devices or rollout of a new printer fleet or replacement of a mainframe computer. We are asked by business, 'What is your capacity to deliver these projects?' We sit down and look at the portfolio throughout the year, and decisions are then made on what we're going to move ahead with. That's basically it.

CHAIR: Who's the decision-maker? Is it your area, Mr McHardie, or is it at the executive level of the organisation generally?

Mr McHardie : There are committees that are run in the department. One of them is the implementation committee that looks at things like supply and demand and capacity to deliver projects. That's probably the most formative body within the department. That looks at supply and demand and capacity issues which ICT reports to.

CHAIR: Was that the committee that recommended the OCI program?

Mr McHardie : From an ICT perspective, we would have been able to say that we had the capacity to do the build in that fiscal year, which was the case.

CHAIR: So you did—not would have but did?

Mr McHardie : We did.

Mr McNamara : But that decision is a decision of government and, if you like, the department is implementing the government's decision rather than the department driving that decision-making. Our interaction with government is more about our capacity to do what's being asked. So we will provide advice, as is the standard and the process, to say to government what our capacity is to undertake a project.

Senator PATRICK: When you say 'government', what do you mean? That's a big organisation. Are you talking about the minister's office or the secretary's office? Government, to me, is not a defined thing. There's got to be a process by which someone comes up with an idea and works it through and it gets presented to somebody.

Mr McNamara : Yes. Part of the standard budget process is that ministers bring proposals to the cabinet. That's a standard budget process.

Senator PATRICK: Yes, they bring proposals from somewhere. The minister normally doesn't think it up and draft a cabinet submission. Someone does that. In this instance, I think where we're trying to get to is: where did the idea come from within government—so not government; that's too broad, because it must have come from somewhere—and what pathway did it go to get to a cabinet submission?

Mr McNamara : I'd have to take that on notice in terms of the origins of that. I wouldn't have any knowledge of the origins, if you're asking in that way.

CHAIR: Mr McNamara, I understand that the content of a cabinet process is not able to be discussed, and people aren't asking you to do that. What we're asking for is an understanding of the processes that occurred in the department in the lead-up to this being included in the budget, any preparatory work that was done, and any process by which the opportunities associated with this project were compared to opportunities associated with other possible projects. We're trying to understand the advice that was provided to the minister or ministers and any other kinds of interactions that might have taken place. Is it in the context of interdepartmental working committees? How did we get to a point where the project was included in the budget? The reason that it matters is that, as it turned out, it wasn't a project that went that well, so it would be useful to understand what scrutiny and checks and balances were able to be provided by the Public Service prior to the point where it's implemented—I'm sure a decision is taken to proceed—and then, as we'll discuss later, also what checks and balances and scrutiny are applied to the project so that it works out as intended during the implementation phase. They're the interests of this committee. We're not trying to re-prosecute blame games about robo-debt, but we are trying to understand how good decisions might get made about ICT. That in part involves understanding what went wrong in projects that didn't go that well, like this one.

Mr McNamara : We wouldn't agree with the proposition that it didn't go that well.

CHAIR: Really? You would not agree?

Mr McNamara : No.

CHAIR: You think it went well?

Mr McNamara : Yes. We've made a submission to the Senate inquiry. We've made it quite clear that we think the project has gone quite well. We've delivered lots of savings. We have quite a number of reviews already undertaken and we have changed some aspects of the system—we've improved aspects of the system. But I don't think we'd agree with the proposition that the project hasn't gone well.

CHAIR: Mr McNamara, it was a disaster. It produced incredible anxiety for a very large number of citizens.

Senator PATRICK: When you say 'well', Mr McNamara, I understand that some elements went well—you're talking from a project perspective. But there are different metrics to a project, and one of them must be in terms of the ultimate objective. You can say a project was timely, so, from that perspective, it went well. It was within budget, so, from a cost perspective, it went well. There are a whole bunch of metrics that are used. So when you say it went well, what metrics are you using?

Mr McNamara : I'm looking at savings.

Senator PATRICK: So you don't look at end-user interface with the community. Was the effect on the community included in your metrics when you made your submission to the Senate inquiry?

Mr McNamara : Yes. I think what we said in our submission to the Senate inquiry was that people have always been required to tell us changes in circumstances. That's always been the case. The fact that some people have not, the fact that our data matching has shown that and the fact that that is some period after which they didn't tell us means we still have a responsibility to assess that and understand that and review that.

Senator PATRICK: No-one doubts that. I think where the chair is coming from is: there are a bunch of users here, and a number of them were affected adversely. Was that in any of your metrics included in your submission to the Senate inquiry?

Mr McNamara : What we're saying—and I think the Ombudsman's report backed this up—is that it was quite reasonable for us to ask people—

Senator PATRICK: You're repeating that. I'm talking about from the customer experience—the people who were adversely affected—not whether or not you think it's reasonable to collect money appropriately, because I get that; that is a reasonable thing. I'm specifically narrowing in on one metric that I might use, which is to look at the constituent experience. Was that metric ever scored in your submission to the Senate inquiry?

Mr McNamara : No.

Senator PATRICK: Then how did you score yourself?

Mr McNamara : We said that we could have improved in terms of the communication aspects. I suppose the key change that we made was in relation to some of the letters that we sent, where, while we did extensive testing of our processes—including our online processes—things could have been clearer in our letter processes. That's why we made a number of changes to things in February 2017, in terms of both letters and the online system, to improve clarity so that people could understand things a bit better than what we'd originally done. But I think it is important that we did have that very much in mind, which is why we did a lot of user testing prior to the system being released. The system wasn't to scale straight away. We did do the thousand-person pilot to test the system from a user perspective—

Senator PATRICK: Sure, but, once again, you're talking about the project process. I'm talking about the constituent experience. I think that's what the chair was trying to get to.

Mr McNamara : That's what we are testing. We are trying to test the user experience through that pilot. It's not just about 'it works' in terms of the IT works, although that is part of the process—we are testing the IT works. But we are also testing the user experience.

CHAIR: I am a little startled by your evidence that you thought it was a good process. The Ombudsman said:

Poor service delivery was a recurring theme in many complaints received by our office. Customers had problems getting a clear explanation about the debt decision and the reasoning behind it. As the compliance helpline number was initially excluded from letters and was not obvious in the system, customers called general customer service lines resulting in long wait times. They could not always get clear information and assistance to use the online system. Service centre staff did not always have sufficient knowledge about how the OCI system works, highlighting a deficiency in DHS' communication and training to staff. In some instances, a more thorough manual intervention by a compliance officer would have saved the customer time and effort.

That is an unusually direct criticism, and that's just one part of the executive summary of the Ombudsman's report. I am startled that you would describe it as a success.

Mr McNamara : We didn't say that we haven't improved the system. We have said—and we said in the context of the Ombudsman's report and we've said it in the Senate inquiry—that we weren't as clear as we needed to be in some of our correspondence. The phone number wasn't on the original letter, and we needed to put that on the letter. That's one of the key changes we made. We introduced registered post to make sure people had been contacted, we made changes to the online screens in February 2017 to make sure there was greater clarity for people and we continue to try and improve the system from a user perspective—that's what we are trying to do at the moment.

Senator PATRICK: Sometimes it's just easier to say, 'You know what? We got that bit wrong', and then the committee can move on. Seriously, as a senior public servant, taking responsibility is part of the job. Most people in our constituencies get greatly frustrated when asked a simple question—it is not a fatal answer to say, 'We did that wrong. We did that poorly. Other parts of the program went well'.

Mr McNamara : I think we've made clear that we—

Senator PATRICK: You've danced around it.

Mr McNamara : Yes. But, in the criticism, I don't think that there has been anything that people have pointed to that's an issue with the online system.

Senator PATRICK: You're talking now or in the statement the Ombudsman referred to?

Mr McNamara : In the statement of the Ombudsman. The criticism that we heard was primarily about clarity and access to the phone number. I think they were things that we did accept that we didn't do. We should have put the phone number up-front in the letters so that people could have contacted us if they were having difficulty. There always was a dedicated compliance line, and we should have made that more transparent in the original letter.

Senator PATRICK: Ms Macleod, at the Sydney hearing, said:

We know that, in our earlier discussions with the department, it was a conscious choice not to have that helpline phone number in the letters and on the OCI, and that came about through the involvement of behavioural economists and nudge factors in the design of the OCI.

So it was a conscious decision at the start not to put the telephone number on the system or in the letters?

Mr McNamara : Yes. And that's what I'm saying. We tested things. We didn't just decide something. We were actively looking at how to make this effective from the user perspective. We accept that, once we'd rolled it out and we'd seen it at scale, it was more useful for people to have access to that number than what we'd previously thought.

Senator PATRICK: Sure, but you would confirm the evidence of the Ombudsman's office that it was a conscious choice not to have the helpline phone number in the letters and on the OCI?

Ms Bundy : Sorry. It was on the OCI; it wasn't in the letters.

Senator PATRICK: It was?

Mr McNamara : Yes.

Senator PATRICK: Going back to the question I asked, would you agree that it was a conscious decision not to have it on the letters?

Mr McNamara : Yes, it was. It was part of our testing and, yes, we did decide that it would be better without it.

Senator PATRICK: Okay.

CHAIR: In terms of the design of the program, DTA has indicated that it wasn't involved. Did you make a conscious decision not to notify DTA or DTO that you were initiating a project of this kind?

Mr McHardie : There was no DTO at the time. When it first kicked off, the DTO didn't exist.

CHAIR: In 2015.

Mr McHardie : DTO was stood up—when did you stand up? Was it in late 2015 or early 2016?

Mr Alexander : Yes.

CHAIR: And you didn't engage at any subsequent point, until things started to go wrong? What about the ATO: were they involved in the design and implementation of the system?

Mr McNamara : No, the ATO weren't involved, to my knowledge.

CHAIR: Why was that?

Mr McNamara : I'm not too sure why we'd involve the ATO.

CHAIR: The data being provided is being provided from the ATO. I'm just interested to understand—

Mr McNamara : As we've given in evidence before, the data matching with the ATO had already been undertaken prior to the measure. The data matching had been a longstanding process with the ATO. So, if you like, the discrepancies had already been identified and we have knowledge of that, so that data matching just hasn't changed. The online compliance system and the compliance measures are—the data matching itself hasn't changed as part of the process. There was no change to the interaction with the ATO as part of this process, or as part of these measures. That hadn't changed.

CHAIR: In the submission from the CPSU, they provided a quote from a staff member, who said:

It was obvious at the briefing when OCI had just commenced that the process had basic flaws which meant nearly all debts would be wrong. There was an audible gasp from the room when we first heard of the averaging concept. It was obvious some would be wrong by a few dollars some were not debts at all. The group of staff I was in clearly knew this would not work. Management minimised this concern and moved forward. I suspect they knew but had no control over things happening in Canberra.
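The "averaging concept" the staff member refers to, spreading an annual ATO income figure evenly across every fortnight of the year, can be illustrated with simple arithmetic. All figures here are invented for illustration:

```python
# Why averaging an annual income figure across fortnights can
# manufacture a debt. Suppose a person earned $26,000 in the first
# half of the year, then earned nothing and received benefits in
# the second half. All figures are invented.

annual_income = 26000
fortnights_in_year = 26

# Averaging spreads the income evenly over every fortnight...
averaged_per_fortnight = annual_income / fortnights_in_year

# ...including the 13 fortnights on benefits, when actual earnings were zero.
actual_income_while_on_benefits = 0
imputed_income_while_on_benefits = averaged_per_fortnight * 13

print(averaged_per_fortnight)             # income assumed for every fortnight
print(imputed_income_while_on_benefits)   # income imputed to a period with no earnings
```

The imputed fortnightly income drives the entitlement recalculation, which is why debts calculated this way could be wrong by a few dollars for some customers and entirely spurious for others, as the staff member describes.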

Another staff member said:

This was always going to happen. I knew this as I used to be one of the human beings that used the data matching information they now have the system calculating automatically.

A third person said:

The OCI program was rolled out without my team in Compliance ever having had the chance to look at it or understand the details—had we been consulted, we could have pointed out many problems (some of which have been addressed in later updates, months down the track).

The final comment was:

Once again, the systems are implemented for "live testing" by staff. We make it work because we have to ensure the customers continue to receive their correct entitlement, or any entitlement, within a timely manner. Systems are constantly "tweaked" as staff feedback problems and issues. The majority of staff genuinely care about the customers and are appalled at the pushing through of changes without real consideration of the impact on both customers and staff.

To what extent were frontline staff involved in the design of the project, and at what point in the project?

Ms Bundy : Staff were involved from May 2015, when the measures started, and there were around 200 staff across five compliance sites that were involved in that activity.

CHAIR: What activity was that?

Ms Bundy : That was the rollout of the measures that started from 1 July 2015, and the manual process that attached to the measures.

CHAIR: One of the principles in the new digital standard is that you involve, early on, people who have deep familiarity with the business process in designing any digital solution to enhance a business process. I'm trying to understand to what extent the staff were involved in designing the business processes that were going to be digitised.

Ms Bundy : Those business processes were part of the underpinning of the online service that commenced from 1 July 2016. The measures started from 1 July 2015. Staff were involved in the design of those processes quite extensively. They were improved over that year. Then the staff themselves were involved in working with ICT to design the system.

CHAIR: Were they? Stop there, because that's what I'm trying to ask you about. How were staff involved in assisting ICT to design the system?

Ms Bundy : It is actually compliance staff. There's a team up in Brisbane within the compliance division that wrote the requirements and worked with some of the operational staff up in Brisbane, and then worked with ICT to design the system. Then during the pilot—

CHAIR: Can we just stop there. So, there are a group of staff—how many are there? There is a whole team? There are three? There's one person?

Ms Bundy : I'd have to take that on notice in terms of the actual numbers of staff. We've got some answers from the Senate inquiry, where we were asked about this, so I can also provide those.

CHAIR: So you've already been asked about it by another group of senators but you can't tell me today?

Ms Bundy : Not specifically the numbers of staff involved but the staff consultation that occurred in the development of the online service—we have been asked about that and answered that.

CHAIR: Have you prepared an answer to that earlier question?

Ms Bundy : In terms of the specific numbers of staff?

CHAIR: Here's another way around it. Why don't you describe to me, in the best way you can, in what ways you leveraged the knowledge and expertise of compliance staff and frontline staff to design the system? Because I'm having difficulty understanding it from the evidence you've given so far.

Ms Bundy : The compliance staff were responsible for designing the requirements. These are staff that sit within the compliance division that sit next to compliance staff, who do compliance reviews.

CHAIR: And, when you say they were involved, what does that mean?

Ms Bundy : They were responsible for the design of the system.

CHAIR: Were they in a project team?

Ms Bundy : Yes.

CHAIR: So they were in a project team. And that was with staff from your area, Mr McHardie?

Mr McHardie : From ICT. We had 136 staff involved in some capacity during the build cycle from the 2015-16 fiscal year—not 136 dedicated throughout, but we had 136 people from ICT involved in the project that year.

CHAIR: From ICT alone?

Mr McHardie : Yes.

CHAIR: And then there were other staff from the compliance area who were involved, Ms Bundy, but you don't know how many?

Ms Bundy : No, not specifically. I think the compliance division has currently got around 2,000 staff; at that time, I'm not sure how many staff it had.

Mr McHardie : We'd have to take that on notice.

CHAIR: Right. Should I take it that you consider that you complied with that element of the digital service standard?

Mr McNamara : The digital service standard didn't exist at the time, but, yes, staff were part of the process—operational staff were part of the design process. They're part of enhancements we make to the process now. The compliance division is a very large division. It's unlikely that the 900-odd people who would have been in the division at the time were each individually consulted. It's not an efficient way to design something.

CHAIR: Sure.

Mr McNamara : But I don't think it's true to say that operational staff weren't involved in the design. I think we would counter that and say operational staff were involved in the design of the system.

CHAIR: On notice, can you please provide a more detailed description for the committee than you've been able to today about how they were involved, what the nature of that involvement was, how many personnel were involved, what kinds of roles they were drawn from and that kind of information.

Senator PATRICK: This time line is relatively short for a project, so I presume that in effect you had an existing system and you inserted some capability or algorithms into that system—is that correct?

Mr McHardie : That's correct. I mentioned the ISIS system, which is the entitlement assessment system, which processes all claims from a legacy perspective in the Centrelink master program, which is what we are focused on. The ISIS system, based on the circumstances a customer tells us, is able to calculate the customer's eligibility for a claim type and then how much they're entitled to, based on their relationships with other claims or other entities in the Centrelink master program, such as relatives, or other claims that they've been on in the past. It's the heart of Centrelink. It's the entitlement assessment. It's got all the rules in it.

Senator PATRICK: And it's developed in-house?

Mr McHardie : Thirty years ago—so it was rolled out in 1989.

Senator PATRICK: You have the source code, the libraries, the compilers—everything. You do it all in-house? How many coders do you have that look after that system?

Mr McHardie : On a day-to-day basis?

Senator PATRICK: Typically. You said you do it in-house. I presume you didn't bring in contractors to do this work?

Mr McHardie : No, we're supplemented by contractors. All of our builds are led by public servants and supplemented by contractors.

Senator PATRICK: So these are software engineers and system engineers and so forth. Is that a contract with one particular company?

Mr McHardie : No, a myriad.

Senator PATRICK: So lots of companies. So this is coming out of your—what did you call it? BAU budget?

Mr McHardie : No, this was a project. So all of these resources here would have been billed against the OCI project.

Senator PATRICK: And that's that activity from 15 October to July 2016?

Mr McHardie : Correct, remembering, at the same time, there would have been people working BAU looking after the standard ISIS system. When you say it was a very short runway to build a capability out as big as this, a lot of it was working on customer-facing and staff-facing screens that were required. We weren't touching the back-end entitlement calculation system at all.

Senator PATRICK: But there's some matching going on here and some algorithm being developed. Was it an in-house algorithm or something new you introduced into ISIS?

Mr McNamara : No, there's no algorithm. The data matching was a process that was something happening prior to this. All it's doing is asking what's in the Centrelink database and the ATO database and coming up with a list of anomalies.

Senator PATRICK: Around some window of error, I presume.

Mr McNamara : Yes. We're not trying to chase small anomalies. We have lots of rules within that system in trying to data match and, once we've got that pool, we then have a selection process to take people out of that pool based on what we think is the probability that they actually have been overpaid. So we're trying to select people out of that pool that we think have the highest probability. All that process hasn't changed.
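The two-stage process Mr McNamara describes here can be sketched roughly as follows. This is a hypothetical illustration only: the field names, the tolerance figure and the ranking rule are invented for exposition and are not the department's actual matching rules, which are not on the record.

```python
# Stage 1: match income declared to Centrelink against ATO records and
# flag anomalies above a tolerance. Stage 2: select from that pool the
# cases with the highest apparent likelihood of a real overpayment --
# approximated here simply by the size of the income gap.

TOLERANCE = 500  # ignore small discrepancies (dollars; illustrative value)

def find_anomalies(centrelink_records, ato_records):
    """Compare annual income declared to Centrelink against ATO data."""
    anomalies = []
    for customer_id, declared in centrelink_records.items():
        ato_income = ato_records.get(customer_id)
        if ato_income is None:
            continue  # no ATO record to match against
        gap = ato_income - declared
        if gap > TOLERANCE:  # earned more than they told Centrelink
            anomalies.append({"customer_id": customer_id, "gap": gap})
    return anomalies

def select_for_review(anomalies, capacity):
    """Take the cases judged most likely to involve an overpayment."""
    ranked = sorted(anomalies, key=lambda a: a["gap"], reverse=True)
    return ranked[:capacity]

centrelink = {"A1": 12000, "A2": 30000, "A3": 18000}
ato = {"A1": 25000, "A2": 30200, "A3": 19000}

pool = find_anomalies(centrelink, ato)          # A1 and A3 exceed the tolerance
selected = select_for_review(pool, capacity=1)  # A1 has the largest gap
```

The point of the second stage is the one made in the testimony: the pool of raw anomalies is much larger than the set of cases actually pursued, and selection is meant to pull out the highest-probability overpayments first.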

Senator PATRICK: So that was already in existence except now someone got an alert on a screen or something happened that flagged that someone had met the threshold in the matching: 'Here's Mr Bloggs and here's the circumstance as to why they were pulled from the matching.'

Mr McNamara : So, prior to the measure, based on the nature of the measure, we would have sent out a small number of letters to say to a customer, 'Can you please explain this anomaly.' The measure allowed us to send out a lot more letters.

Senator PATRICK: Instead of having a human involved, you now had—did it automatically go?

Mr McNamara : No.

Senator PATRICK: Was there any—

Mr McNamara : No, it was the same, Senator. There was no change to the process. To the point where a letter was sent to a customer, it was the same as we'd always done.

Senator PATRICK: Just coded now instead of—

Mr McNamara : No.

Mr McHardie : So probably the easiest way to explain it—

Senator PATRICK: I'm just trying to understand the activity that took place.

Mr McHardie : So contact is made with a customer. Traditionally, a customer would then interact with a compliance officer on the telephone or face to face. We would then extract from the customer their set of circumstances over the period in question. We'd input that into the ISIS system, and we would say: based on the circumstance the customer has just told us, compared to what ISIS knows from what they told us at a time in the past, how do they marry up? That's at the heart of what we're doing here. But, rather than sitting with a compliance officer, you are entering those figures online yourself. The comparison that was then done in the back end between what the customer tells us and what the customer told us previously—those calculations were all done in ISIS. There's no new algorithm; they're the same rule sets.
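The comparison Mr McHardie describes, re-running the entitlement calculation with the circumstances the customer reports now and comparing it against what was actually paid at the time, can be sketched as below. The taper-style means test is purely illustrative; the real ISIS rule set is far more complex and is not reproduced here.

```python
# Toy means test: payment reduces by `taper` cents per dollar of
# fortnightly income above `free_area`, and never goes below zero.
def fortnightly_entitlement(income, max_rate=500, free_area=100, taper=0.5):
    reduction = max(0, income - free_area) * taper
    return max(0.0, max_rate - reduction)

def reassess(reported_income, amount_paid):
    """Compare entitlement under newly reported circumstances with what
    was actually paid; a positive result is an apparent overpayment."""
    entitled = sum(fortnightly_entitlement(i) for i in reported_income)
    paid = sum(amount_paid)
    return paid - entitled

# The customer originally declared no income and was paid the full rate;
# they now report $300 of income in the second fortnight.
overpayment = reassess(reported_income=[0, 300], amount_paid=[500, 500])
```

The key design point in the testimony is that only the data entry moved online: the comparison itself runs through the same ISIS rule sets that a compliance officer's entries would have gone through.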

Senator PATRICK: Sure. I understand what happens there—except that humans are pretty good at picking up some anomaly, where we go, 'The customer's probably got that one wrong.' So there would have been some thresholds where you said, 'It falls outside that threshold, therefore it's first stage. We need to look at it.' And then, if it's so far out that it's got to be an error, you would have another threshold there, I imagine, or something like that.

Mr McNamara : Yes. There are a number of things. First of all, if someone had reasonably complex financial arrangements, we wouldn't send them to the online system, to start with. Second of all, with the online system, if you put in a large amount of income, for instance, in one month, relative to average weekly earnings, the system wouldn't calculate your outcome. It would actually ask you to ring us. That's still the case today. So we have lots of rules in the system. As I've previously said in evidence, the online system is predominantly for people with relatively simple arrangements. If someone has something a bit more complex, the online system isn't the way we want them processed.
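The routing rules Mr McNamara describes, complex cases never going online and unusually large monthly incomes causing the online system to stop and ask the customer to phone, could be sketched like this. The average-weekly-earnings figure and the multiplier are placeholders; the actual thresholds are not stated in the evidence.

```python
# Illustrative routing logic: which channel handles a given case.
AVG_WEEKLY_EARNINGS = 1600                           # assumed figure
MONTHLY_CEILING = AVG_WEEKLY_EARNINGS * 52 / 12 * 2  # ~2x average monthly pay

def route_case(has_complex_finances, monthly_incomes):
    if has_complex_finances:
        return "compliance officer"   # never sent to the online system
    if any(m > MONTHLY_CEILING for m in monthly_incomes):
        return "phone us"             # online system declines to calculate
    return "online system"
```

As described in the testimony, the effect is that the online channel is reserved for relatively simple arrangements, with anything anomalous pushed back to a human.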

CHAIR: Mr McNamara, the Senate committee, in their inquiry, were very critical of one particular aspect of this change, and they argued—they found, in fact—that previously the department enabled collection of data where an anomaly was present, but that the system you implemented shifted the onus far more heavily on to the recipient to provide additional information where there was an anomaly. That's correct, isn't it? That was the nature of the efficiency that you found, in some way?

Mr McNamara : That change had happened outside the system, in a sense. First of all, we've always required the customer to provide us with the information.

CHAIR: Correct.

Mr McNamara : The way we do reviews when we're looking at a change is we have to sort of assess where it sits on the compliance-fraud spectrum. When we were doing a small number of compliance reviews—you could think about it in terms of it being more evidence-gathering—the number of anomalies we were looking at was quite small, and the anomalies tended to be quite large, tending to make us quite certain that the person had been overpaid, and, obviously, there was a chance that that was deliberate, rather than a person making a mistake. Once you move down the pool and you get to a further spectrum, we're actually capturing, generally, people who've just made a mistake. That's generally what we're capturing now.

So what we required was that people give us that information. We always required that, and we still do now. It's the same now in the sense that, if you are dealing with one of my compliance officers and you're able to explain how you haven't been able to obtain the information, or that it would cause you financial detriment to actually do it, we are able to gather that information. So it was always up to the individual compliance officer who was undertaking a review whether they went and essentially got the information for you or whether they required you to do it.

CHAIR: Mr McNamara, you're putting forward, I think, a normative basis for why this is fair and reasonable. But the practical consequence of this new system design was that a whole range of people failed to meet that request for additional information and then had a debt raised against them, and that produced a great deal of confusion and distress in the community. That is the problem around communication that the Ombudsman was referring to. So there were things in the design that produced very different system outcomes from what was happening before under the manual system. Was that intentional? Was that what you expected?

Mr McNamara : What the government has talked about before with that is that it's not the department who has that burden; it's the employers who have that burden. It is an important aspect of red tape. If we don't ask for that information from the person, we have to ask for it from their employer, so there is a burden on the employer if we go to them and say, 'Can your payroll section please provide all the information on this customer over a period of time,' as opposed to the customer, who actually has that information already in some cases, providing that information to us. So I think it's important that any cost shift is between employer and employee and between business and the employee; it isn't between us and the employee.

Senator PATRICK: Is the system working well now?

Mr McNamara : We would say the system is working well now, yes, but I would say we are looking—

Senator PATRICK: In the context of the metric I mentioned before, not your metrics. In terms of customers, from a constituent perspective, do you—

CHAIR: Citizens, we used to call them!

Senator PATRICK: Citizens, yes. Is their experience better than what the Ombudsman described? Has the system improved somewhat?

Mr McNamara : Yes, I think we did improve the system. I think we have improved the customer experience through the changes we made in February 2017. But, as we outlined at estimates, we are looking to improve the system even further and make more changes, because I think we can improve the customer experience. I think one of the things we are very focused on with the customer experience is to continually improve it. It's not a matter of saying, 'Well, we did okay: people are happy at the moment,' because one of the things we find generally in the department is that people's expectation of service continues to go up, so we have to be able to match that going forward as well.

Senator PATRICK: Sure. Where I'm trying to get to—and thank you for describing that—is, firstly, that this project time line only goes up to July 2016. You said that February 2017 was a key date in the improvements.

Mr McNamara : Yes. In January and February 2017, we made a number of enhancements and changes to the letters. Essentially, we clarified a lot of the letters. We simplified the language.

Senator PATRICK: I'm just curious as to why it's not on the time line. That's all.

Mr McHardie : That was the original system build—what we call the OCI, the online compliance intervention system. The follow-on, which is what we now call employment income confirmation, is what Mr McNamara is talking about—the build that was undertaken in January-February.

Senator PATRICK: The same project team but just a new project and a new name?

Mr McHardie : And we involved the DTA.

Senator PATRICK: Okay. There you go. There's some credit to you guys. I think this takes us to where we're trying to get to in that, as you've approached this to do something, there have been some bumps in the road and you've ended up now with a much better system. I come from an engineering and software development background. Obviously there's a bunch of testing that goes along the way. This is the heart of where we're trying to get to in some respects for the committee: how do we make sure, when we roll out services, that we don't have these bumps in the road? I'm suspecting one of the areas you could have gone to was testing. Rather than testing on the public, I'm interested in what test regime you had in place. You said you had a sample, and I get that. How did you go from that sample of 1,000 people to full deployment? When we poll, we typically use those sorts of numbers, so I'm guessing it's a good number to work with. How did we go from a testing regime of 1,000 people and not getting problems to rollout and getting lots of problems? I'm trying to get to the testing aspect because I think that's part of the solution to this, and maybe it becomes a recommendation out of the committee process.

Mr McNamara : I think there are a couple of things. As you can appreciate, a thousand can give you a sample but it can't give you the full spectrum of people. When you roll out, you get a different reaction when you go to different people. One of the things we've found in the compliance review space, I think it's fair to say, is that people who really don't think they've done the wrong thing and who have genuinely tried to comply with the system do find the idea of talking about their previous history quite confronting. It doesn't really make any difference—the nature of our system. They see it as an integrity issue, and I can accept that. I think that's quite appropriate. If you've been giving us information quite often and you've been doing it quite diligently, for someone to turn up and say, 'I want to essentially audit what you've told me,' can be quite confronting.

In this particular case, there's quite a number of examples we have within the system where, for instance, people quite diligently told us their net pay, but they told us they'd put it in gross pay on our online app. They're diligently telling us the wrong information. Therefore, five years later, when we come along because we now have the capacity to look at the data match and say, 'Hang on—you haven't quite told us the right information', people quite rightly become upset. The difficulty for us is they were overpaid at the time, and some people find this quite confronting as a process. As we've rolled that out, we've understood that nature—that some people are in that category, where other people are in the category of: 'I just needed the money at the time. I've told you the wrong thing. How do I pay it back?' My compliance officers deal with a spectrum of people at the moment, and the online system has to deal with that spectrum.
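The net-versus-gross mix-up described here can be shown with a small numeric illustration. A customer diligently reports their net (after-tax) pay in a field asking for gross pay, so the entitlement calculated at the time is too generous, and the data match surfaces the gap years later. All figures and the toy means test are invented for the example.

```python
# Same illustrative taper-style means test as a simple stand-in for the
# real entitlement rules.
def entitlement(gross_fortnightly, max_rate=500, free_area=100, taper=0.5):
    return max(0.0, max_rate - max(0, gross_fortnightly - free_area) * taper)

gross_pay = 1000   # what the employer actually paid (pre-tax)
net_pay = 800      # what landed in the bank account

paid_at_the_time = entitlement(net_pay)   # calculated from the wrong figure
correct_amount = entitlement(gross_pay)
overpaid_per_fortnight = paid_at_the_time - correct_amount
```

The customer never misstated what they received; they misstated which figure they received, which is why, as the testimony notes, people in this category find the later review so confronting.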

Senator PATRICK: Sure. The design of the online system has to deal with that, and the testing of the online system has to deal with that.

Mr McNamara : That's right.

Senator PATRICK: That's what I'm trying to get to. Maybe the solution to this was to have a psychologist as an input to the design process and the testing process.

Mr McNamara : What I'd say, following on from that and probably from your questions earlier, trying to understand how we've improved things, is we've done a lot of user testing, but it's the nature of how user testing interacts with your design. One of the things that we've evolved to—and Mr McHardie talked about the customer division and those sorts of things that we've introduced—is we're a lot more interactive with the customer early on in the elements of the design. One of the things you could say is: you could talk to people, build a whole system and go test it with them. I think the Digital Service Standard is a bit more about an interactive way of doing things as we go on that spectrum.

Senator PATRICK: This looks quite iterative to me anyway.

Mr McNamara : But in terms of the design, I'd say our current process is a lot more interactive at that design stage, and we're hopeful that it delivers a good outcome in terms of clarity from the start.

CHAIR: Maybe a better outcome? We could concede that we're hoping for a better outcome than last time, surely?

Mr McNamara : Yes.

Mr Alexander : And the Digital Service Standard won't solve all the problems of the world. It is a set of 13 processes that, if people follow them, and then they follow the agile design method—which is to do their discovery; understand the problem properly; do the alpha bit of work, as we call it; come up with prototypes and a hypothesis; test them with users as well; build a beta, which is a test version of it; test that; and then go live with the system but also iterate through it—it solves a lot of problems. You avoid some problems that are pretty obvious through the life of the journey. So, we have a government-mandated standard for agencies with the Digital Service Standard.

Senator PATRICK: That's not in your submission to us, is it?

Mr Alexander : The Digital Service Standard will be in our submission.

Senator PATRICK: No, the advice you were giving there. That's the sort of thing that flows out—

Mr Alexander : But that is the Digital Service Standard, which the government has mandated for all agencies to use with all their digital programs and activities. It wasn't there at the start of this program, but it is there now, and agencies doing these programs—in fact, DHS is a great advocate of it. Their senior executive and their secretary have embedded it into their organisation and are doing good practice. I'd say the positive is that we are starting to see way better outcomes, not just from DHS but across government, because agencies know the right questions to ask in thinking about the types of things they need to do.

Dr Seebeck : You can test in alpha and move to beta, but until you thoroughly test something in the wild you are not going to have a full understanding of it.

Senator PATRICK: Sure. That's where the thousand people came on—

Dr Seebeck : This also goes back to understanding your test data, which is the case whether you are automating a service or doing AI—really understanding the data you are building things on, the user group and so on. This is exactly the same problem that is emerging, which is leading into things like human-centred design, which is focused very much on the human, not merely people as a user—it is actually trying to understand the user. But that's further down the track.

CHAIR: Dr Seebeck, do you think that the approach taken in the OCI program would form a model that you'd recommend other departments or projects to use?

Dr Seebeck : The learning over time, yes. What we are seeing here is a good example of working through automation: picking up their errors and mistakes, or what they've understood, and learning from them. I wouldn't say to do exactly what they did to start with. But, again, you can tell from what they did with the testing and so on that everything was done with the best of intent. It is putting it in the wild, learning through the beta, and coming back and speaking to us about user-centred design. Mr Alexander is quite right. DHS has been one of the best advocates of the Digital Service Standard in that user-centred design approach. They've picked up and learnt from the work they've done in the past.

Senator PATRICK: That was all done perfectly, on the evidence we had, so maybe they didn't learn anything!

Mr Alexander : My team did that review with them in early January. It would be our view that the forms—online and letters—weren't great. They were confusing and complex. With the discussions we've had with DHS since then, they've taken that and kind of indoctrinated it through the organisation to say, 'We need to follow the digital service standard, not old waterfall testing models.'

Dr Seebeck : Just listening to the conversation, there are some things here—the importance of the end-to-end service. It is not merely a fire and forget. Keep coming back and getting the feedback as well. The importance of that testing, as you've pointed out, Senator, and the diversity of that test set are really important. Those are the sorts of lessons that come out in these things.

Senator PATRICK: The first thing I'd do when I went to a test is get the test document and put it in the corner and do all the things they didn't want you to do. You'd almost always get a bug within a minute or two.

Dr Seebeck : Yes.

Mr Alexander : And there is a substantial difference between user testing and user research. User testing is putting someone in front of a screen asking, 'Do you like this screen?' User research is asking, 'How do we solve this problem with you?' It is not contextualised to technology; it is about going through the problem and getting an outcome for a person and applying technology potentially as an enabling service.

CHAIR: I suppose that wasn't really the origin of this project, was it, because the origin of this project was budget savings? Given the story that's been told so far about the origin of the project and its implementation, the user experience wasn't the primary purpose for undertaking the project.

Mr McNamara : The integrity of the welfare system is what we're about in the compliance area. That's what our—

Senator PATRICK: We went from a $3.7 billion estimate of savings to $300 million of purported debts to an actual recovery of $24 million. I don't know how that got so wrong. That's unbelievable.

Mr McNamara : Well, to date we've saved $900 million through income matching—through these measures—and we've recovered nearly $270 million.

Senator PATRICK: It might be that the data we have is wrong then.

Mr McNamara : Your data would have been right at that point in time. Those figures sound like they would have been the right figures 12 months ago.

Senator PATRICK: The $3.7 billion was the projected savings. Over what period?

Mr McNamara : The measures run to—

Ms Bundy : 2021.

Mr McNamara : 2021.

CHAIR: Do you think you're on track to make those savings?

Mr McNamara : Yes, we're confident we'll make those types of savings.

CHAIR: I think we are ready to move on to the discussion around WPIT. That may mean a change of personnel at the table. Thanks very much for your time.