Joint Committee of Public Accounts and Audit - 21/09/2011 - National funding agreements

ANDERSSON, Ms Catherine, Research Manager; Secretariat for the Steering Committee for the Review of Government Service Provision, Productivity Commission

McDONALD, Mr Lawrence, Assistant Commissioner, Social Infrastructure Branch; Head of Secretariat for the Steering Committee for the Review of Government Service Provision, Productivity Commission

Committee met at 11:23

CHAIR ( Mr Oakeshott ): Welcome. I declare open today's public hearing of the Joint Committee of Public Accounts and Audit inquiry into the national funding agreements. Given the short time available, statements and comments by witnesses should be relevant and succinct. Although the committee does not require you to give evidence under oath, I advise you that these hearings are formal proceedings of the parliament and warrant the same respect as proceedings of the respective houses. The giving of false or misleading evidence is a serious matter and may be regarded as a contempt of parliament. The evidence given today will be recorded by Hansard and attracts parliamentary privilege.

Before we begin, I welcome Captain Will Martin. I gather you are on secondment to Deb. Enjoy your time watching the Joint Committee of Public Accounts and Audit at work.

Thank you for coming in. This will be really valuable to us. We are getting down to the deliberation stage. This is our final public hearing, so your input will help us a lot. Before we proceed to questions, do either of you want to make an opening statement?

Mr McDonald : I have a very brief opening statement. I note that, although we are Australian government public servants, for the purposes of this role we provide a neutral secretariat service to the steering committee; the Australian government's interests are represented by its own members around the steering committee table. We understand the committee's terms of reference. If it pleases the chair, we propose to focus on the third dot point, which relates to the COAG Reform Council's role in the national performance reporting system, because our role is basically to support the COAG Reform Council.

To give you a little bit of background, the steering committee has responsibility for four streams of reporting for the Council of Australian Governments. The oldest of those is the Report on government services, which compares the performance of the Australian, state and territory governments in providing a range of mainstream services to Australians across health, education, justice and community services. That report has been produced annually since 1993. We also produce the regular Overcoming Indigenous disadvantage report; the fifth OID report was released in August this year. We also produce the Indigenous expenditure report, which estimates expenditure on services to Indigenous Australians. It maps to the areas covered by the Overcoming Indigenous disadvantage report, so you have high-level information about both outcomes and expenditure on services. Finally, we have a role in supporting the COAG Reform Council in the national performance reporting system, and that is what I will focus on in the rest of my statement.

Under the Intergovernmental Agreement on Federal Financial Relations, the steering committee has a specific role in collating the performance data under the six national agreements for the COAG Reform Council. We are currently midway through our third cycle of performance reporting on the national agreements. We completed the reports on the National Education Agreement and the National Agreement for Skills and Workforce Development at the end of June, and we are partway through the National Indigenous Reform Agreement, the National Healthcare Agreement, the National Disability Agreement and the National Affordable Housing Agreement, which are due to the CRC by the end of December this year.

Under the IGA we also potentially have a role in reporting on national partnership agreements, to the extent that they support the objectives in the national agreements. To date, the CRC has not requested the steering committee to include any information about performance under NPs in our national agreement reports. The steering committee also has a role in commenting on the quality of the data reported under the national agreements, using data quality statements provided by the data providers themselves. The data providers provide a data quality statement according to the Australian Bureau of Statistics' data quality framework; the steering committee then summarises that information and adds some commentary of its own, in what are called 'comments on data quality'.
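As a concrete illustration of that process: the ABS framework sets out seven published quality dimensions, and a secretariat-style summary might begin by checking which of those dimensions a provider's statement addresses. The sketch below is a minimal illustration only: the dimension names follow the published ABS Data Quality Framework, but the data structure, function and example entries are invented, not the steering committee's actual method.

```python
# The seven dimensions are those of the published ABS Data Quality
# Framework; the structure, function and example statement are
# invented for illustration, not the steering committee's process.

ABS_DQF_DIMENSIONS = (
    "institutional environment", "relevance", "timeliness",
    "accuracy", "coherence", "interpretability", "accessibility",
)

def unaddressed_dimensions(statement: dict) -> list:
    """Return the framework dimensions that a provider's data
    quality statement does not address."""
    return [d for d in ABS_DQF_DIMENSIONS if d not in statement]

# Hypothetical provider statement covering only three dimensions.
example_statement = {
    "relevance": "measures the agreed indicator population directly",
    "timeliness": "available nine months after the reference period",
    "accuracy": "sample survey; relative standard errors published",
}
print("not addressed:", unaddressed_dimensions(example_statement))
```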

In relation to the national partnership agreements, we do some reporting. We are directly referenced in one national partnership: the National Partnership Agreement on Hospital and Health Workforce Reform. The steering committee has a very limited role there. The jurisdictions are required to provide certain data to the steering committee, but the NP does not say what the steering committee should then do with that data. In the interests of accountability and transparency, the steering committee places those reports on its website with a brief comment about whether or not jurisdictions have met the timetable set out in the NP. We have also been requested by the COAG Reform Council to collate the performance data for four national partnerships that have reward funding attached to them: the elective surgery NP, under which reporting has now finished; the NP on essential vaccines; the NP on youth attainment and transitions; and the forthcoming improving public hospitals NP, which the steering committee has recently agreed to report on at the CRC's request.

In some slightly tangential work under the national performance reporting system, the Heads of Treasuries Committee on Federal Financial Relations back in 2009 asked the steering committee to draw together information on data gaps under the national agreements and provide a report to that committee. We last provided an update on the data gaps report to the HoTs committee in September last year. They have not requested a further update since that time.

The steering committee, and specifically the secretariat, is also referenced in some federal financial circulars that set out the practical procedures for implementing the national performance reporting system. It is recommended that people developing new national partnership agreements with associated reward funding consult the secretariat when developing performance indicators under those NPs.

As you are no doubt aware, following the COAG review of the national agreements in 2010, COAG agreed in February this year to a rolling series of reviews of the performance indicators in the national agreements. The secretariat has been asked to be an observer on the various working groups undertaking those reviews, largely to provide our expertise on the technical issues underlying the data. I am happy to talk in more detail about the indicator and data issues associated with performance reporting if you would like me to, or you might like to ask me specific questions.

CHAIR: I think we will pick that up in questions, because data is one of the issues that we are focusing our attention on, thanks to a lot of evidence in that regard. So we can come back to that in questions, but if there is anything more you wanted to add—

Mr McDonald : Otherwise, that summarises our formal role in the national performance reporting system. As you can see, it is largely a supportive, technical role. We have not really played any part either as a secretariat or as the Productivity Commission separately in the policy development of the new national performance reporting system.

CHAIR: Okay. Catherine, is there anything you wanted to add?

Ms Andersson : No, no further comments.

CHAIR: All right. Before we go to questions, I will just acknowledge that there is a visiting delegation from Indonesia in the room: the very important audit office and our equivalent public accounts committee from Indonesia, whom the committee met with this morning. I am sure we will see you all around the building today. Welcome, and thank you for the work you are doing in Indonesia. We appreciate your attendance. We also have in the gallery two members of the Civil Liberties Australia group, so thank you too for coming along; we look forward to any conversations about parliamentary oversight or scrutiny in regard to COAG processes and national partnership agreements—something I know you have talked about previously.

Moving to questions, you mentioned data-gap work that you have done. Can you provide the committee with any of the detail of the figures in regard to the data gaps that you identified in September last year?

Mr McDonald : I might need to take advice on that, because we produce that report at the request of the Heads of Treasuries committee. In the past they have certainly given us permission to share it with some other groups of people, although they have declined to give us permission to make it publicly available on our website. So I would like to seek their advice first.

CHAIR: That is fine. We have, as I mentioned before, been getting a lot of evidence to suggest data is a significant problem. Those within the beltway of COAG seem to be suggesting that everyone is aware of it and onto it and that a process is now well underway to address that. Would that be a position shared by the Productivity Commission? Is there confidence that decision makers are now sufficiently aware that timely and robust data is a problem and that there needs to be a process to address that for better public policy?

Mr McDonald : If I could make one clarification: I can speak for the secretariat but I am not really in a position to speak for the Productivity Commission as a whole. I certainly would agree that most of the players now are very aware of the nature of the data gaps and where they lie, and the constraints that they place on the performance reporting system, but I would also say that one of the benefits of the reforms has been to highlight those gaps and create very strong incentives for them to be closed. I have been involved in trying to do comparative national performance reporting for almost 10 years as part of the secretariat, and it was often very slow going to get change to data to make it more comparable, to make it more timely, because there was no overriding strong incentive. We relied on consensus and building agreement across jurisdictions. We have seen a great improvement in data quality, the availability of data and the timeliness of data. That has come about because there has been a big financial commitment to improving data systems as well.

CHAIR: So your evidence to us is that we should have confidence that the matter is being addressed in an appropriate way?

Mr McDonald : There is significant progress being made, although the reviews of the performance indicator frameworks are still underway. They are looking at whether the indicators chosen were appropriate and whether they can ever be properly conceptualised and reported against. Some of the data gap issues came about because indicators were chosen that were conceptually flawed or were never going to be measurable. So the first step is actually identifying indicators that are meaningful, in that they measure what you want to measure, but also practical, in that you can measure them in a practical sense. I think a lot of progress has been made in that area.

CHAIR: In your evidence you also mentioned the COAG Reform Council, which is another area the committee is looking at. Do you have any reflections on why some pretty strong recommendations by the CRC were arguably left unaddressed for a long time? They had to come back continually to the question of data before there seemed to be some pick-up by the system.

Mr McDonald : I can only say that I have shared the CRC's frustrations. We are responsible for collating those data. I am always disappointed when I cannot provide a full report to the CRC, but I am afraid I cannot speak as to why.

CHAIR: It is a question of the broader reflection on the teeth of the CRC, the independence of the CRC and the gravitas of the CRC. I would have thought that, on behalf of Australian taxpayers, if the CRC identifies a problem and makes a recommendation, COAG as a process would, in an ideal world, respond with vigour. But that does not seem to have happened on the data front in the way that we would all like.

Mr McDonald : I am afraid I cannot talk to COAG's processes. The CRC sits at the top of a hierarchy, a pyramid. Although we generally agree with their recommendations—and many of their recommendations flow out of the comments on data quality and the statements that the steering committee has included in its reports—sometimes there is a lot of work going on under the surface that people higher up the pyramid may not be aware of. For example, with the National Affordable Housing Agreement, we have a huge conceptual issue just counting the number of homeless people. There is no agreement amongst experts as to how to count the number of homeless people, which means many of the indicators in that report cannot be reported against or can only be reported against using very imperfect proxies. So the CRC is very right to point out that we are meant to be assessing whether jurisdictions are achieving the desired outcomes and objectives of reducing the number of homeless people and people at risk of homelessness. But we cannot do that because we do not have the appropriate data. They are very right to make that statement. But we are involved down at the working group level with people who are working hard at that initial conceptual level to try and develop a robust methodology so that, for future reports, we will have that data. So I think you need both those levels working at once. You do need the CRC to keep the heat on, but people are actually doing the ground work at this other level to try and address the issues.

CHAIR: Related to that is the question of the welcome shift from input management to outcome management. There are some significant differences in what we are looking for and why we are looking for it when it comes to outcomes, but, culturally, is the system aware of those differences and ready to focus on an outcomes strategy in these national partnership agreements? If not, what are we going to do about it?

Mr McDonald : I know less about the national partnership agreements. Sometimes the national partnership agreements by their very nature focus more on outputs and inputs, because they can be quite specific to a particular reform or aim to facilitate a particular project. So sometimes outputs are appropriate there: what people are being paid for is to do a particular thing. The national agreements are meant to be pitched at that higher outcome level, and I think there is increasing awareness of that now. These reviews of the national agreement performance indicator frameworks are taking place against a conceptual framework developed and endorsed by the Heads of Treasuries Committee on Federal Financial Relations. It makes quite clear the distinction between outcomes, outputs and inputs: the focus should be on your high-level outcomes, and sometimes you may need proxy indicators that are outputs but, if you do, those outputs must be clearly linked to the outcomes that you want. There must be a clear evidence base that shows why you are using an output as a proxy. So I think there is growing awareness of the distinction, and these reviews are really attempting to put that into practice.

CHAIR: In recommendations we make, is there any more that we can do to help culturally push for that outcomes focus? Or do you think it is happening over time with the HoTs reviews and other mechanisms that are driving that anyway?

Mr McDonald : I think it always helps to reinforce the message, because there is an understandable desire, often at the line agency level, to be held accountable against things they feel they directly control. Agencies can directly control many outputs, but they feel they have less control over many outcomes—and that is true. Outcomes can be affected by a lot of external contextual factors. An agency can say, 'I don't control the unemployment rate, and the unemployment rate is actually a major factor in homelessness, so you can't hold me responsible for that high-level outcome.' So it does help to continually reinforce the message that, while an agency may not control the entire outcome, it is still what it is trying to influence; we are still going to measure it and then ask the agency to what extent it contributed to changes in that outcome.

Ms O'NEILL: I think perhaps in that answer you have articulated one of the concerns that was raised: at the highest level the outcome drive is very, very strong, and the problem with KPIs is that they do not necessarily bridge the gap between the outputs we might be used to measuring and the outcomes that we seek. The KPIs are one of the things you said you provide advice on. How confident are you that the KPIs you are creating now are shining more light on outcomes and shifting away from outputs? In what way are the KPIs changing to achieve that end?

Mr McDonald : I am a little uncomfortable being too explicit there, because in the current reviews the secretariat is an observer on the working groups undertaking the work; we provide them our best advice, but we are not decision makers in that process. The jurisdictions all have representatives on those working groups, and they make recommendations to a steering committee for those reviews—not the Steering Committee for the Review of Government Service Provision but a different committee; I think it may be called the steering group. Because we are not a decision maker in those processes, I can only say that within the process itself there has been very robust discussion about the distinctions between output measures and outcome measures, and we have made very strong statements about our belief that what is being requested by COAG and by the reviews is a focus on the higher-level outcome measures.

Ms O'NEILL: In a report from the battlefield, who is winning?

Mr McDonald : The fight continues. The reviews are not finished yet, so there are still very robust discussions around the table. I do not think it would be appropriate for me to try and second-guess what the conclusions are going to be.

Ms O'NEILL: Could you explain the driving philosophy behind the sorts of KPIs that you are bringing to the table in those discussions and how they differ from the current ones? If we are comparing KPIs that reflect the best practice you are obviously offering with ones that might be carried over from another regime, what would be the characteristic differences between them?

Mr McDonald : We do not so much bring specific indicators to the discussion, although in some instances we do, if we think there is a better way of measuring something. Many of these indicators had COAG endorsement when the NAs were signed, so there is a limit to the extent to which we would propose very strong changes of direction to a framework endorsed by COAG, beyond applying the conceptual framework to the indicators. That is what the working groups have been asked to do—to take the conceptual framework and run the existing set of indicators against it.

In the key messages that the steering committee has put in its reports and in its data gaps report to the HoTs committee, its highest level principles are: 'First of all, be conscious of why you are doing the performance measurement in the first place—what incentives are you trying to create? What is the objective of the reporting system?—and then, when you start looking at specific indicators that you want to include, you have to place as your highest priority the appropriateness of the indicator and addressing your conceptual issues.' I think that, in the past, we had a bit of a tendency to say, 'We can measure this, so we should put it in—it is one measurable thing that we've got,' rather than ask, 'What does this tell us? How does this help improve our understanding of whether or not we are achieving the objectives?' So first of all you have to do that conceptual work.

The steering committee was also a little bit critical of the first round of NA reports. Many of them seemed to lack a conceptual framework. They were collections of indicators, and you could understand why each indicator was there—you could say, 'Yes, I can understand that is a measure of health'—but it did not fit within a conceptual framework that told you where it sat relative to the other indicators and which objective it was meant to be informing. That varied across the different agreements—some of them were more structured than others—and that is one of the reasons this review is trying to apply a broad, consistent conceptual framework to the indicators. That can also help you cull the number of indicators. The National Healthcare Agreement has around 70 indicators in it, which is just very difficult to make any sense of—you can get lost in the morass of information.

So, once you have addressed those broad conceptual issues, you can come down to the IGA itself, which includes a set of principles that indicators are meant to address—fairly high-level conceptual criteria. Then, below that, you come down to a third level, which is really your data quality issues: have you got the raw data that will inform those indicators? So it is a hierarchy of issues. Firstly, there is the question of why you are doing the performance reporting in the first place and whether you have a conceptual framework to assist you to decide whether or not an indicator should be included. Then you have a second set of principles: here is a proposed indicator—is it meaningful, appropriate, measurable, unambiguous? Those are the IGA-type characteristics. Then, once you have agreed, 'Yes, in principle this is the kind of indicator we want to measure,' you come down to your data issues: are the data available? Are they timely? Can they be disaggregated by the different subpopulations of interest?
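To make that hierarchy concrete, here is a minimal sketch of the three-tier screening as code. The tier names and criteria follow the evidence above; the data structure, function name and example indicator are illustrative assumptions, not an official specification.

```python
# Three-tier screening of a proposed indicator, following the
# hierarchy described above. The criteria lists paraphrase the
# evidence; everything else is an invented illustration.

CONCEPTUAL = ("fits the conceptual framework", "serves the reporting objective")
IGA_PRINCIPLES = ("meaningful", "appropriate", "measurable", "unambiguous")
DATA_QUALITY = ("data available", "timely", "disaggregable by subpopulation")

def screen_indicator(indicator: dict) -> str:
    """Return the first tier at which a proposed indicator fails,
    or 'include' if it passes all three tiers in order."""
    tiers = [("conceptual", CONCEPTUAL),
             ("IGA principles", IGA_PRINCIPLES),
             ("data quality", DATA_QUALITY)]
    for tier_name, criteria in tiers:
        failed = [c for c in criteria if not indicator.get(c, False)]
        if failed:
            return f"reject at {tier_name} tier: fails {failed}"
    return "include"

# Hypothetical indicator: conceptually sound and meets the IGA
# principles, but the data cannot yet be disaggregated.
proposed = dict.fromkeys(CONCEPTUAL + IGA_PRINCIPLES + DATA_QUALITY, True)
proposed["disaggregable by subpopulation"] = False
print(screen_indicator(proposed))
```

Run as written, this flags the indicator at the data quality tier, matching the point that an indicator can be agreed in principle and still fail on data availability.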

Ms O'NEILL: I have two more questions. The first is, as these things progress and the changes are made, would it be possible to have an overview audit to identify whether those principles have been embedded?

Mr McDonald : In our initial data gaps report to the HoTs committee, there was a broad table that assessed the indicators. You could add a couple of additional criteria to that, and it would be a pretty rough but reasonably high-level look at the characteristics of the indicators and how they fit against those various principles. It is certainly doable; it is something that the secretariat and the steering committee did last year for the around 400 indicators in the Report on government services.

Ms O'NEILL: Would you say that that would be one way for us to see how well these changes are being adopted—that we could use that as a tool to have a bit of a look and see?

Mr McDonald : Yes.

Ms O'NEILL: Great. Here is the second question. You mentioned in your opening remarks the improvement in data collection. How advanced is that? Do we have the technologies out there now to gather and process the data, or are there still some impediments to high-quality data collection?

Mr McDonald : It varies across the six NAs—perhaps leaving to one side the NIRA, because there are quite specific issues in getting robust and reliable Indigenous data across the board. In the health area there has been a lot of rapid progress—much more than we had seen, perhaps, in the previous five years of Report on government services reporting. Within two or three years there has been extremely rapid progress both in the quality of the data and in the actual measurement of things that previously were not being measured, and there have been great improvements in the timeliness of data. That has come about both through system changes at the jurisdiction level, where jurisdictions are doing things differently, and through significant changes by the main collector and manager of the health data, the Australian Institute of Health and Welfare. They have done a really good job of making more data available more quickly.

In the education area we were always fairly well served. There have been good collections there, and we have the national testing regime, which is quite good. We do not have good data there on things like retention rates—whether kids who started in year 7 are still there in years 8, 9 and 10.

Ms O'NEILL: And student attendance.

Mr McDonald : And student attendance data. COAG requested that jurisdictions produce nationally consistent attendance data, but we currently cannot compare the attendance data for the government sector with that for the non-government sector. It is very difficult to compare across jurisdictions. We cannot add up a national total. It is a 'point in time' collection for one week, and there are things you cannot work out from the numbers you get. Say you had 10 per cent nonattendance: you do not know whether the same 10 per cent of kids never attend or whether a different 10 per cent of kids are absent on any given day. It is very hard to interpret the data. I do not want to sound too critical, because we have wanted nationally comparable attendance data for a good 15 years. We have finally got it, but it is not perfect data.
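A small simulation makes that ambiguity concrete: two cohorts can show the same point-in-time nonattendance rate with very different underlying patterns. This is a hedged illustration only; all names and numbers below are hypothetical.

```python
import random

random.seed(0)
N_STUDENTS, N_DAYS = 1000, 5  # one snapshot week

# Cohort A: a fixed 10% of students never attend.
chronic = set(random.sample(range(N_STUDENTS), N_STUDENTS // 10))
absent_a = [[s in chronic for s in range(N_STUDENTS)] for _ in range(N_DAYS)]

# Cohort B: each student is independently absent 10% of days.
absent_b = [[random.random() < 0.1 for _ in range(N_STUDENTS)]
            for _ in range(N_DAYS)]

for label, absent in [("chronic", absent_a), ("rotating", absent_b)]:
    daily_rate = sum(map(sum, absent)) / (N_STUDENTS * N_DAYS)
    ever_absent = sum(any(day[s] for day in absent) for s in range(N_STUDENTS))
    print(f"{label}: daily nonattendance {daily_rate:.1%}, "
          f"students absent at least once: {ever_absent}")
```

Both cohorts report roughly 10 per cent daily nonattendance, yet around 100 students are ever absent in the first and around 400 in the second, which is exactly what a one-week snapshot cannot distinguish.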

The disabilities area is fairly poorly served; we do not have very good disabilities data, because we have to rely primarily on the Survey of Disability, Ageing and Carers. Previously that was only a six-yearly survey, so there were very long gaps between survey results. I understand it is now moving to a three-yearly cycle, which will be able to inform reporting much more consistently.

As I said earlier, we have a few issues in the affordable housing area. We do not have good measures of homelessness, and we do not have good measures of the match between housing supply and demand in the private market, which we need to measure the affordability of private sector housing. So it is different across the different areas, but there is work going on in all of those areas as well.

Ms O'NEILL: Towards the same end?

Mr McDonald : In each area, towards addressing the highest priority data gaps in that area.

Ms O'NEILL: Thank you.

CHAIR: To round out this conversation around outputs and outcomes: alongside the outcomes that you have just been talking about, the next logical step is these questions around indices—there are all sorts of variations on the theme. I went this morning to a briefing from the Institute for Economics and Peace on the US Peace Index and the economic implications of a 25 per cent reduction in violence across a number of indicators in the US. There are wellness indicators; there are happiness indicators. Is there any thought or work being done on using these indices, or an Australian equivalent, with an outcomes focus in mind, as tools for better public policy and better outcomes generally?

Mr McDonald : I am certainly aware that there is work going on in a lot of different spaces on alternative measures of welfare, even alternatives to gross domestic product as a measure of national progress. I can only speak personally; it is not something we have been requested to do as a secretariat or as the steering committee. I am always cautious about indices. I know how to interpret a specific indicator: I know what the numerator and denominator were, I can work out the data quality issues and the context, and I can make an informed assessment. When you start putting an index together, you have to be very careful, because the debate then becomes about what is included in the index, what you left out of it and how you weighted the different components. I come from Melbourne. Apparently we are now again the world's most livable city, but if you break that down you find that we got there for some very odd reasons that you would not necessarily associate with livability.

CHAIR: Coffee, coffee, coffee!

Mr McDonald : I think the number of solar panels on roofs contributed to it—do not take that as gospel. It is that sort of thing: if you do not know the components of an index and how it is calculated, to me it does not add much value.
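The weighting concern can be illustrated with a simple worked example: under a weighted-average composite index, the same two cities swap rank purely because the component weights change. The cities, components, scores and weights below are all invented for illustration.

```python
# Composite index as a weighted average: I = sum(w_i * x_i) / sum(w_i).
# All inputs here are invented for illustration.

components = ("health", "culture", "infrastructure")
scores = {
    "City A": {"health": 95, "culture": 70, "infrastructure": 90},
    "City B": {"health": 85, "culture": 95, "infrastructure": 80},
}

def composite(city: str, weights: dict) -> float:
    """Weighted average of a city's component scores."""
    return (sum(weights[c] * scores[city][c] for c in components)
            / sum(weights.values()))

for weights in ({"health": 0.5, "culture": 0.2, "infrastructure": 0.3},
                {"health": 0.2, "culture": 0.6, "infrastructure": 0.2}):
    ranked = sorted(scores, key=lambda city: composite(city, weights),
                    reverse=True)
    print(weights, "->", ranked)
```

Neither city's underlying performance changes between the two runs; only the weights do, which is why the debate over an index so often becomes a debate over its weighting.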

CHAIR: That is valuable.

Ms BRODTMANN: As you can see, we are very interested in these output and outcome measures. You mentioned that you have a framework in place which outlines all the conceptual guidelines. That is all well and good, but I am wondering what is happening at the working level in terms of communication about this framework and how it is being embedded in the culture. Is there training going on, or are you just sending it out with a note saying, 'Please follow and tick a box'? Can you give me some idea of what we are doing to enhance the skills and the understanding of that framework at the working level?

Mr McDonald : I suppose there is a range of levels, and I can only talk down a couple of them. COAG agreed to the reform of federal financial relations, and the HoTs Committee on Federal Financial Relations is trying to implement it. They agreed on the conceptual framework, to be applied by a high-level steering group and then by the specific working groups looking at each agreement. At that level there is good, coherent, consistent knowledge about what is being attempted and how it is being implemented. Once you go a step below that, where you start to need to talk to, for example, the ministerial council technical data working groups—they are the real engine rooms and know the data back to front—you will find their focus is often on the technical data issues. I do not want to disparage them; they do really important and very good work, but their focus is, by necessity, often not on that broader policy area. They can tell you whether you can measure something, how you should measure it and what the limitations of the data are, and sometimes they are even reluctant to comment on whether something is policy relevant or not. As you get further down into the technical aspects of the data and then further down into line departments or service delivery agencies—

Ms BRODTMANN: That is the level I am thinking of.

Mr McDonald : At that level I would only be speculating, because we have very little to do with it as a secretariat; we deal with secondary data that we get from the primary data collectors.

Ms Andersson : Each of the working groups that sit under that COAG implementation steering group has central agency representation and also line agency representation—Commonwealth and state. So the service line agencies are actually involved in those working groups as well.

Mr McDonald : It comes down to a matter of practice within each jurisdiction. Once they have agreed to these indicators and are going to report against them, you hope that you have put in place an incentive so that each jurisdiction wants to improve its performance against those indicators and, in cascading fashion, puts in place incentives within its own structures to achieve the outcomes. I do not know whether it is necessarily expressed to the service delivery people as, 'We must achieve this outcome.' It may be that they try to develop a policy or a program whose implementation is expected to lead to improvements in that outcome.

Ms BRODTMANN: It would be ideal if, from go to whoa, we had a focus on the outcome we are trying to achieve, so that everyone is clear about why they need to look at particular KPIs and why they need to report on them.

Mr McDonald : I certainly agree with you; it is the way to get consistent, improved performance in an organisation. I just do not know to what extent it is happening in each jurisdiction. I think it may happen more closely with the national partnerships, because they are very specific and quite targeted—you have targeted dollars that must be spent in a particular area in pursuit of a particular objective or outcome.

Ms BRODTMANN: Does anyone have visibility of the level of understanding, at that service delivery level, of the broader outcomes, the broader guidelines and the broader framework? It sounds as if you have some understanding of the various layers but, at the grassroots level, so to speak, people are not clear on that.

Mr McDonald : Again I am just speculating, but I think it is taking some time to filter down. They were quite significant reforms to federal financial relations, and they were implemented relatively swiftly for reforms of that scale. I think that, understandably, it is taking some time for that to filter down to the agency level, where agencies see their funding coming directly from their state or territory budgetary process and do not necessarily make the link back to federal financial relations.

Ms SMYTH: Many of my broad questions have been covered in your evidence today; thank you for that. I have one question, really, relating to your comments about the National Disability Agreement and your evidence about the poor disability data. I just wondered how that agreement, the data collection and the gaps and quality issues had been addressed most recently in the context of us embarking on the pursuit of an NDIS, because I would like to think that the starting point for a significant reform such as that would be looking at those gaps, revisiting them and getting your advice about ways of addressing those issues.

Mr McDonald : The introduction of the new NDIS is a challenge and an opportunity. It is going to need a very robust set of data in order to inform the development of policy and the implementation of the scheme, but the implementation of the scheme also provides an opportunity to collect a much richer administrative dataset, which will then inform future policies. So it is a 'chicken and egg' sort of issue: you need the data to set up the system, and one of the things you need to keep in mind when you develop the system is what administrative by-product data you want it to generate to inform continuous improvement of the system.

The issues with disability data come down to the fact that we do not have very good prevalence data. We do not have very good information about what proportion of the population actually have different forms of disability and what sorts of constraints or limitations those disabilities place on people's potential to participate in the community, in paid employment and so on. Then, when you start talking about Indigenous people with disability, it becomes an even greater issue.

Ms Andersson : There are, I suppose, three key developments happening in that space in terms of disability data. The first is that the ABS Survey of Disability, Ageing and Carers has now moved from a six-yearly to a three-yearly cycle, so we had a survey in 2009 and we will have one in 2012, and they are doing a lot of development work on that. In the administrative data space, the AIHW manages the Disability Services National Minimum Data Set, which is going through a significant redevelopment to move to what we did not have before: an actual person-based collection. It will take a long time to get there, but they are on the path. The other thing coming up for us at the moment is COAG's endorsement in February this year of the National Disability Strategy, with its focus on being able to capture information on people with disability in mainstream services. We know through our involvement with the relevant disability subcommittees of the community and disability services ministerial council that they are looking at ways to capture information in mainstream services as well. So there is actually quite a lot going on in the space.

Ms SMYTH: It would be a fairly long-term process, I would imagine.

Mr ADAMS: That is like access to transport?

Mr McDonald : Or people with disabilities accessing education through the mainstream education system, or their access to health services, housing services and a whole range of social services.

Mr ADAMS: Who comes up with the ideas on measurement—on how we get the KPIs and the different methodologies? Earlier you mentioned unemployment and homelessness—you become unemployed, you become homeless et cetera. How do we come up with the measurements? Some of these can get pretty difficult in terms of how you get a measurement for an outcome, because there are a lot of social issues in it. You might be able to give me that. The other one is the analytical approach—who analyses the KPIs at the end? I think we might have had a question over here about how we get decent data coming through to us so that we can look at it and say, 'This is what's happening.'

Mr McDonald : The first is a question about how the measures were developed. The high-level indicators were developed through a series of COAG working groups that were established to start implementing the COAG National Reform Agenda. They developed the indicators, except for the National Disability Agreement, where the relevant ministerial council developed them. They were then endorsed by COAG. But many of those indicators were expressed in quite broad terms: an indicator did not say 'X over Y'; it just said 'housing affordability'. The steering committee was then in the position of having to do the data collation against these fairly broadly stated indicators, so the process it adopted was to consult directly with all of the relevant ministerial council data subcommittees to get their advice on indicator specifications. It also consulted with its own series of review working groups, which are made up of representatives from every state and territory and the Australian government in each of the service areas covered by the Report on government services.

We also consulted directly with all of the data providers that we thought would be potential providers of data for that specific national agreement, and we informally consulted with the CRC in this process, because there was not much point in the steering committee coming up with a technical indicator that the CRC was then not happy with. At the end of that process, because we could not find any other decision maker, the steering committee endorsed the specifications to be included in its report. As the author of the report to the CRC, it is responsible for its content. The steering committee endorsed the technical specifications, and the secretariat then collated the data against those specifications and provided it to the CRC. Over time the CRC can, as part of its role, make recommendations to COAG about changes to indicators and changes to technical specifications. It also gives regular informal feedback to the secretariat—for instance, 'We would like an additional disaggregation' or 'Can you tweak this measure to get us more information about a specific subpopulation or a different conceptual take on an indicator?' As far as practicable, the steering committee then takes those recommendations, consults with all of the groups that I have previously mentioned, comes up with a new set of specifications, considers and endorses them, and we collate data against those revised specifications for the next report. So we have a very transparent and consultative process, and we try to get the input of any relevant body that we think might have an interest in a particular area. But, at the end of the day, the steering committee takes responsibility for the content of its reports and signs off on the technical specifications.
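As a hypothetical illustration of what turning a broadly stated indicator such as 'housing affordability' into an 'X over Y' technical specification might involve: the sketch below invents the field names, the example measure and the disaggregations; it is not the steering committee's actual format.

```python
# A hypothetical 'X over Y' indicator specification. The structure
# and the example values are invented for illustration only.

from dataclasses import dataclass

@dataclass
class IndicatorSpec:
    name: str               # broad indicator as endorsed by COAG
    numerator: str          # the 'X' in 'X over Y'
    denominator: str        # the 'Y'
    data_source: str        # collection supplying the raw data
    disaggregations: tuple  # subpopulations reported separately

spec = IndicatorSpec(
    name="housing affordability",
    numerator="low-income households paying over 30% of income in rent",
    denominator="all low-income rental households",
    data_source="ABS Survey of Income and Housing",
    disaggregations=("state/territory", "remoteness", "Indigenous status"),
)
print(f"{spec.name}: ({spec.numerator}) / ({spec.denominator})")
```

The point of such a specification is that every jurisdiction's data provider computes the same X and the same Y, which is what makes the resulting numbers comparable across jurisdictions.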

In relation to who analyses the data at the end: that is largely the role of the COAG Reform Council; that is its role under the IGA. As part of its analytical process, it is required to consult with the jurisdictions for a period of one month before it provides its report to COAG. So the analysis of the national agreements is completely in the hands of the COAG Reform Council. The COAG Reform Council also has a role in relation to national partnerships that have a reward component. It advises Treasury on whether—

Ms Andersson : Jurisdictions have met a benchmark within the NP.

Mr McDonald : You have to use the right words because the CRC is very careful to say that it does not determine whether or not a jurisdiction receives a payment. It advises Treasury on whether they have met the benchmark in the NP. Treasury or the Treasurer then makes a decision on whether or not a payment will be made.

CHAIR: As Dick has no more questions, a final question from the deputy chair.

Mrs D'ATH: Thank you and my apologies for being late. You talked about gaps in data—disability, attendance, education and homelessness. You said that work is going on in all of these areas. Is that work being done in some formal way? And is it being coordinated between federal and state governments, or are the states just working on it themselves? I am interested in knowing whether the issues of not just data collection but consistency across the jurisdictions are being addressed. Also, are there any time frames for this work that is going on?

Mr McDonald : In virtually all of these areas the work is being done cooperatively across the jurisdictions. Sometimes we have the data in each jurisdiction; it is just not nationally consistent, and the main stumbling block is getting a consistent collection. Nearly all of the data required for the national agreements and the national partnerships comes from a collection managed or held by one of our major national data agencies: the Australian Bureau of Statistics or the Australian Institute of Health and Welfare manage those collections and the data improvement processes.

What is difficult to see is whether there is an overarching direction in prioritising—saying that this gap in the disability agreement is more significant than that gap in the education agreement, and allocating resources in that fashion. Most of this data development work is being done out of service-specific or portfolio-specific resources: education is improving the education data, health is improving the health data and so on. That said, some of the agencies, the AIHW and the ABS, got specific funding for national agreement purposes. I understand that there were elements in their policy proposals that said explicitly what they were going to do in return for those additional funds, but I am not privy to the detail. There does seem to be a bit of a gap at that highest level—someone prioritising which of the gaps are most important.

Mrs D'ATH: Do you have any formal time frames to address it?

Mr McDonald : Again, it varies across the different areas. Some of them are more academic pieces of work. For example, developing a new methodology for counting the homeless is an academic piece of work, and it is taking academic time frames to resolve. It is quite a difficult conceptual issue and you want it done right. A commitment has been made that, when it is completed, the ABS will backcast homelessness data to the 2001 census, so we will have a bit of a time series. So I think it is worth waiting to get it done right.

CHAIR: I am conscious of time; we are running a little bit late now. Data is normally an incredibly dry and, some might say, boring topic. Thank you for helping us try to excite the Australian parliament and the Australian people about the importance of data and its place in better public policy. If you have been asked to provide additional information—and there was that initial question around data gaps—could you get that to the secretariat as quickly as possible? That would help a lot. Also, we have a question on notice that I will just read in so that you have it: if you could change two things in the next year to improve the quality of the data you receive and the work you do, what would those two changes be?

Mr McDonald : Only two?

CHAIR: You can have more if you want. That is for you to think about.

Mrs D'ATH: Feel free to make a longer list.

CHAIR: Thank you for your time today. It has been valuable for us in helping the process.

Resolved (on motion by Ms Smyth):

That this committee authorises publication of the transcript of the evidence given before it at public hearing this day.

Committee adjourned at 12:13