Higher Education Council commissions report on university funding, including performance-based; results of Third International Mathematics and Science Study.

JANE FIGGIS: We're looking first at a report about the way universities are funded. It's a significant report, commissioned by the Higher Education Council and written by respected academics. It says that if you get the funding mechanism wrong you can damage universities. And, the report suggests, that may be happening right now. The funding mechanism under consideration is 'performance-based' funding, which sounds infinitely sensible. Instead of giving universities money on a straight body count - you know, you have this many students, we'll give you this much money - performance-based funding says: 'We'll give you money for teaching students when you've shown you teach well; we'll give you money for doing research when you've proved you're good at research.'

The logic is so clear that governments around the world have jumped at it, most using it, as we do in Australia, only at the margin. So in our case, 95 per cent of the operating grant to universities is based on student numbers. Only the final 5 per cent is tied to the universities' research performance. So my first question to Don Anderson, who led the report project team, was whether such a small fraction could be that important, whether it could drive the whole system.

DON ANDERSON: Experience has shown that you can get an amplified return from a small investment of funds. If it's only a small amount of money at stake, the behaviour changes can be considerable. Putting it crudely, some university administrators will crawl over broken glass to get another 2 per cent or 3 per cent onto their budget. That's the first point. Secondly, in what we call the Scandinavian system, there is a large proportion of the base budget at stake - in Sweden it's 60 per cent, and in Denmark essentially 100 per cent of the base budget for undergraduate teaching is tied to student progress, or student pass rates.

JANE FIGGIS: And what's been the effect of that?

DON ANDERSON: In Scandinavia, no evaluation yet. That's another interesting thing that we found out about performance-based funding: it is very difficult to find a rationale, a published formal statement of why it's being done; and secondly, there are very few evaluations. So what we found out about Sweden and Denmark and the Netherlands, we had to do by probing - asking officials and people in the university system.

The Danes are very comfortable with their system, and they have had a high drop-out rate, and they say this has had the effect of focusing the attention of universities and of university departments on the teaching process, and on counting drop-outs, and finding out why students drop out. And they told me that, previously, a department might not even know how many drop-outs it had - it didn't know that someone had left until the numbers were counted the next year, and until then the students weren't missed. So it's had that effect, because students dropping out means a reduction in funds. The Swedes say something similar to that.

JANE FIGGIS: I was going to say, my impression from reading your report is that if you have an objective that's that targeted, that clear - for example, reducing the number of drop-outs - then this kind of mechanism may work. It seems to me that your concerns arise when the goal of having performance measurement is very unclear.

DON ANDERSON: If, to come back to Australia, if we were to evaluate the research quantum and the associated composite index which is used to allocate that quantum, we'd first have to invent the reasons for having it. There's no public statement in Australia of why we have performance-based distribution of about 5 per cent of base funding according to a research composite index.

The research quantum has certainly helped focus attention on research in a university culture which was already preoccupied with research. The last 50 years have been quite extraordinary in the long history of universities in the focus that's been put on what I call capital 'R' Research - getting grants, getting publications, getting publications counted. This has meant that academics increasingly have had their attention directed away from other things. I think that scholarship has suffered because of this. The reflective, critical view of what's happening in the discipline; the writing of a scholarly book which might sort of take 10 years of thinking and discussion and so forth - that isn't rewarded under this system.

A second thing - and I noticed this when I was chairman of CAUT ...

JANE FIGGIS: Which is the Committee for the Advancement of University Teaching, so it was looking at university teaching.

DON ANDERSON: That's right. And when I went round as chairman, speaking at most universities, I found that those academics - and they were a minority then, a large minority, but a minority whose chief interest really was teaching and being good teachers and the excitement of interacting with students and so forth - they felt that the game was stacked against them, in that the rewards, the status, the promotions, went to those who were productive in research. I believe that the research quantum and the associated competition for it, has contributed to this. So that it is very difficult in universities to advance the status and the importance of teaching to the central place which it should have.

JANE FIGGIS: When you talk to people about research, is there anybody who is questioning the real value of the research itself? I mean, my sense is that the research is valued for its prestige rather than for what it actually delivers.

DON ANDERSON: No, I can't .. I think it is generally accepted, and I think someone needs to say, 'Hey, look at this, the Emperor's got no clothes!' and the pressure to build up a record of published research in recent decades has just led to - in my view, and this is a personal view, my colleagues in the report mightn't agree - but in my view, it's led to a vast pile of trivial articles and books. And I think really we need to stand back and say 'What's happening here? Are we getting the type of research done which we want in universities? Do people choose research topics that they know can be brought to closure within a year, and a few articles out of it, so that they can then apply for more grants, rather than harder research which may not lead to any significant publications for a long time?' Those questions seem to be not being asked, and it is terribly important I think, that we start asking them in Australia.

JANE FIGGIS: Professor Don Anderson. The report, published by the Higher Education Council, is titled 'Performance-based Funding of Universities' and is available from the Australian Government Printing Service. I spoke about it next to John Mullarvey from the Australian Vice-Chancellors' Committee - because it's actually the Vice-Chancellors who have pushed very hard for having their research money awarded on the basis of performance - the so-called research quantum is their idea.

JOHN MULLARVEY: Yes, in the form that it is now and it's been pushed very hard by the Vice-Chancellors, even pushed hard by those Vice-Chancellors whose universities might not perform as well under the new formula, as under the old one.

JANE FIGGIS: Why is that?

JOHN MULLARVEY: Well, they're firmly of the view that this small component of the operating grant, 5 per cent, needed to be based on performance and not history.

JANE FIGGIS: It looked to so many outsiders at least, that this was government bureaucratic hoops that the universities were jumping through to get this $210 million-$220 million, but you're saying that these are hoops that the universities constructed themselves?

JOHN MULLARVEY: Well, I suppose since we were the ones that pushed it very hard, yes, I suppose we did develop it ourselves. I'm conscious that not all academics or even administrators within universities would support what the Vice-Chancellors have done, but I think it has resulted in a more equitable distribution of this portion of the operating grant, and that's what the Vice-Chancellors have been on about.

JANE FIGGIS: But the universities do feel that there's too much paperwork involved in the hoops that have been constructed?

JOHN MULLARVEY: There is, and one of the tasks that we're currently doing is reviewing that, to see whether, having now collected the data for a number of years, we can develop a proxy which represents the overall publications that are produced by universities, and the AVCC will be doing that in conjunction with the ARC and the HEC early in the new year.

JANE FIGGIS: How are you going to go about doing that?

JOHN MULLARVEY: Well, in fact looking at the data so far, we find that there's a very strong correlation between total publications and - if you take into account three or four categories like books, journal articles and conference publications - if you take those three components, it's almost 100 per cent equivalent to the total publications reported by universities. So if we were to move to a proxy for the national collection of data, it would streamline tremendously the amount of work that universities have to do for this collection.

JANE FIGGIS: But it would certainly make it look like it was simply a numbers game, wouldn't it?

JOHN MULLARVEY: No, well, it would at the moment, because one of the problems we've had - when we first developed this index, we wanted a component for quality. Now, we've done some work on it, and the ARC has now got a project out on this whole issue, and I think you'll find once that report comes in early next year, that'll give us something to move forward with, to look at not just quantity, but quality.

JANE FIGGIS: I guess one of the criticisms of course that Don Anderson's committee has made, is that performance-based funding works well if the objectives are very clear. How would you describe the objective of having performance funding for the research quantum? What is it really trying to accomplish?

JOHN MULLARVEY: What it's trying to accomplish through the formula is to come up with a measure of the overall research performance of the university. That's what we're trying to do: by taking into account not only income, but publications as a means of output, and research completions. So it's measuring the overall research performance of the university, and since the research quantum is about providing money to support research performance, it seems to be the appropriate method for distributing those resources.

JANE FIGGIS: When you say research performance, does that mean largely the amount of it, that there's a lot of activity, that lots of people are doing it? I mean, what is actually the aim underneath that, what do you want to see happen?

JOHN MULLARVEY: Well, clearly we want to see the universities' contribution to research in Australia increase.

JANE FIGGIS: You're really saying that every bit of research is valuable, that research is good in its own right.

JOHN MULLARVEY: Research is good in its own right because it expands our knowledge base and that's what's important for universities, and for the broader country.

JANE FIGGIS: John Mullarvey, from the Australian Vice-Chancellors' Committee. Julie Wells is a research officer with the NTEU - the National Tertiary Education Union. It is her members who have to do the performing in performance-based funding. So how are they feeling?

JULIE WELLS: Their performance is largely being judged by their capacity to do research and to generate research which is recognised on the composite index. But if they're employed on short-term contracts, it's quite difficult for them to plan and engage in long-term research projects. It's quite difficult for somebody on a 12-month contract to take on post-graduate research students for example, because they don't know if they're going to be around when those students' work matures. So that's one issue for our members.

In relation to the rating attached to different sorts of research contained in the composite index, we've got a number of issues from members working in different discipline areas. For example, I had some correspondence from a member who is a very well-respected film maker, and working in a faculty of creative arts. This is Ross Gibson, who created 'Camera Natura' which is a history of white Australian landscape. And that film was the product of five years' research, and it's extensively used by schools, by policy makers, by galleries, as well as being screened for the general public. And yet, on the composite index, a major written or recorded work is only afforded a ranking of 0.4 compared with an authored book, which has a rating of 5. We do have to think more carefully about what research is.

JANE FIGGIS: Is there a tension within universities, in that there is all this emphasis on research at the same time that, as you point out, senior management of universities want people on contracts and fewer people on tenure? The two seem almost to be diametrically opposed policies.

JULIE WELLS: Yes, there are many contradictions within our university system in terms of policy objectives and the mechanisms whereby those policies are being implemented. We currently have a case before the Industrial Relations Commission in relation to non-continuing employment, and many of the senior members of university management who are giving evidence and arguing for reduced tenure and greater flexibility within the sector are people who don't know what it's like to be on rolling, short-term contracts, with no assurance of work or funding three to five years down the track.

JANE FIGGIS: Julie Wells. Now, let's turn to the Third International Mathematics and Science Study of school students' achievement, known as the TIMSS survey. It is, to quote the book jacket, 'the largest, most comprehensive and most ambitious comparative study ever undertaken, with half a million students from 45 countries tested'. The cost to Australia will run to $1.2 million.

In every country, three groups of students were tested: 9-year-olds, 13-year-olds and students in their final year of secondary school. Last week, the first results for the 13-year-olds were announced, and most of the headlines were favourable: 'Our Kids Figure Well'; 'WA Students Measure Up'; 'Australian Pupils Make Top Ten in Maths and Science' - which indeed they did. But higher up than us in the top 10 were Singapore, Korea and Japan, and that is what some newspapers chose to feature: 'Our Students Are Lagging Behind Those in Asia'. It is also what the Commonwealth Minister, David Kemp, highlighted in his response to the survey, worrying that we have to raise our standards significantly to keep up with our competitors.

Jan Lokan is from the Australian Council for Educational Research, which was responsible for the Australian part of the testing program - but she was also on the international committee that chose the test questions. What I asked her first was whether 13-year-olds around the planet all studied the same maths and science.

JAN LOKAN: Actually, they don't. There are other parts of the TIMSS study that perhaps don't show up in the tests themselves, particularly the study that was done by Germany, Japan and the USA, where they sent people actually into classrooms and videotaped a lot of proceedings that were happening in math classes. They found from that that the Japanese students were being introduced to topics very much earlier than the Americans and the Germans. So we've actually got some real evidence from that part of it to show that they don't necessarily study the same math and science.

Obviously, when you're only giving students 90 minutes worth of questions, you can't possibly cover everything that everybody does. You can't be equally fair to everybody, in the sense that the test can cover everything that everybody teaches to 13-year-olds, but they took the view that the best way to be fair was to be, if you like, equally unfair.

JANE FIGGIS: When you describe the tests as being equally unfair - well, the two science questions that Australian students answered better than anybody else on the planet were about UV rays from the sun and about introduced species. I mean, that kind of makes sense. You could well imagine that people in the far north of Canada aren't really going to be as keenly aware of some of those things.

JAN LOKAN: That's true. The people in the far north - I don't know how many we had in our sample from the far north of Canada, but we certainly had Canada in the study, and Russia - they didn't do hopelessly on those questions, but of course I think we'd be worried if we weren't up there with the best in the world on that kind of thing.

There were some other questions: one about snowballs, one about snow on the tops of mountains - I guess that's the kind of thing I mean by trying to spread the unfairness around. They had very stringent quality control procedures to make sure that all the students did the same test in all countries. That involved two separate translation verification teams, as they were called; they back-translated all of the translations. In fact, a couple of questions had to be dropped. One example: there was a question about what a herbivore is, and when that question was translated into Swedish, 'herbivore' turned into 'grass-eating', which gives away what the question is all about. So obviously that wasn't a useful question to use around the world.

JANE FIGGIS: Now, if the results in Australia had been different, if we'd come out rather lower than we did, would that have surprised you?

JAN LOKAN: I really didn't have much of a feel for where we were going to come. We know from the earlier science study that was done in the early 1980s that Australia didn't do too badly. We know things about education systems and attitudes towards education in countries like Korea and Singapore, which made me not surprised that they came out as well as they did. I think our teachers really should feel very chuffed about how well Australia has done internationally. If we'd come out on top of countries like Singapore, Korea and Japan, it would have been a very big surprise to everyone. I think we do perhaps more diverse things in our schools - the more that we get concerned with problems in society, we add courses like consumer education, career education and so on, and we haven't really increased the length of the overall school day. Nor, by international standards, do our students spend a lot of time doing homework. That may be a factor that's worth exploring further, because it certainly came out in the TIMSS study that students in those countries did more homework than ours did, and their teachers spent more time marking tests, marking homework, and giving feedback to students - about double the amount of time that our teachers did.

JANE FIGGIS: If we looked at this as not just a comparison of one country with another, or one State with another, but sort of as an absolute standard, were Australian students by and large getting about the amount right that you would have liked? Or would you like them to be doing better - I guess that's silly, we always want students to do better - but do you have a feeling from a test like this whether you're actually satisfied in an absolute sort of way about the maths and science that our students are learning?

JAN LOKAN: The TIMSS tests weren't put together as measures of some kind of absolute benchmark. That wasn't the thinking behind them. In the development .. part of the process was to give them to fairly big field trial samples and then to pitch the level of the tests so that slightly more than half of the students could succeed. The range of questions went from, say, 20 per cent at the most difficult, through to 90 per cent, 95 per cent as the easiest. That's the kind of test construction practice that produces a good test that can discriminate among the people who are taking the tests.

I think if we'd taken the other tack, to measure some kind of absolute benchmark, you have a different philosophy, because what you want there is things that 80 per cent to 85 per cent of the students can succeed on, and that wasn't the view taken in the development of the TIMSS test. It could well be that with the way that education is going now, where people are very interested in outcomes-based education, it could well be that any future study actually used a different kind of test.

JANE FIGGIS: Jan Lokan. There were significant differences between the performance of students in the different Australian States. Western Australia, South Australia, Queensland and the ACT did quite well - WA actually ranked number two in the world in science. Victoria, Tasmania and the Northern Territory were the lowest. New South Wales came out in the middle, so I asked the Minister for Education there, John Aquilina, how he felt about the results, whether they surprised him.

JOHN AQUILINA: Certainly it's something to think about. I'm not quite sure whether I'm surprised or not, because I really didn't know how New South Wales would end up performing. But I have had some concerns over the years about the way in which we have approached, not only maths and science study, but also the range of what I would regard as being core subject activities. And one of the things I'm keen to ensure is that we don't have a crowding out of the curriculum, that we actually do focus more specifically on the traditional subjects, if I could put it in that way.

But on top of that, of course, I need to look very carefully at our teaching methods. And as much as there is a need for contrasting the teaching methods, say, in South Australia and Western Australia with those in New South Wales, I'm also very keen to have a look at the performance of some of the Asian countries who out-performed Australia as a whole, particularly Singapore, which I think in the very first of these surveys ranked number 14, back in about 1983 from memory, and yet on this occasion ended up out-performing all other 44 countries.

I know that in Singapore they do have much more class time devoted to the study of maths and science than we do in New South Wales. They also have a much stronger homework ethos than we do. Those things are important, and we need to see whether, by changing our approach to those areas, we can actually improve our results.

JANE FIGGIS: And that's something that you can control, that you can actually decree hours and how much homework and things like that.

JOHN AQUILINA: Oh, absolutely, yes. A greater focus on homework, perhaps a suggestion that we may go back to some streaming of students in classes, as is the case in Singapore. And I think also the importance of having a higher expectation in the performance of our students, and perhaps look at what impact the starting school age, the entry age into school, may have. There is some feeling that perhaps we may be starting children a bit too early with their formal schooling, and that this is having perhaps an adverse impact later on in their learning.

JANE FIGGIS: You're putting actually a great deal of emphasis on this one test. You're going to need to collect other data, aren't you, before you make any major changes?

JOHN AQUILINA: Oh look, I mean I'm using this as a useful yardstick. I'm not saying that a survey like this, a study like this is the be-all and end-all, but I think it does have its lesson.

JANE FIGGIS: I guess one of the most troubling results from the study is that even though our students in Australia did well by international comparisons - and that would include New South Wales - our teachers are so disheartened that they don't want to be teachers.

JOHN AQUILINA: Yes, I agree with you, and I think a lot of that is more to do with public perception than perhaps with reality. There are claims, for example, that universities have had to lower their tertiary entrance ranks so as to attract more candidates into teaching. Yes, that is true. But if one looks at the average tertiary entrance rank here in New South Wales for teacher candidates, in fact the average hasn't diminished at all. And I think now that we've also .. I mean, we had a very difficult period during the salaries disputes for New South Wales - it was drawn out, eight months of quite, at times, bitter confrontation, but at the end of the day what we've been able to offer teachers in New South Wales is a very substantial salary - a starting teacher on $35,000 a year; the majority of teachers by 1999 will be earning over $50,000 and principals will be able to earn in excess of $81,000. And I'm quite confident that we will continue to attract young people who are top quality candidates for teaching, and that we will inject a lot of enthusiasm, a lot of new blood, into the profession.

JANE FIGGIS: The New South Wales Education Minister, John Aquilina.

How does the President of the Australian Science Teachers' Association feel about the results? Well, here's Debra Smith.

DEBRA SMITH: We're very happy with the results, although .. I mean, if you read it, it actually says that Australia hasn't necessarily advanced from the previous two surveys. That might be a little bit disturbing but, compared internationally, where other countries have actually gone down, it's great to see that we're at least maintaining the level of achievement in both maths and science.

JANE FIGGIS: With the differences between the States, do you have any feeling for what's causing that?

DEBRA SMITH: Well, in actual fact I rang a colleague of mine in Perth last night and said, 'Can you tell me why Western Australia is so far up?' And he said, 'There's a whole variety of reasons'; I mean, Western Australia has put a lot of money and time and effort into science education, but his view was perhaps it's because there are more primary teachers involved in doing science education over there. And the flow-through to the lower secondary years is then that you get students who are able to achieve better in science.

JANE FIGGIS: I guess one of the disturbing results though is that the science teachers and the maths teachers don't really seem to be enjoying their jobs.

DEBRA SMITH: No. I don't think that they're not actually enjoying their jobs, I think personally it's great to be in a classroom with the kids and it's the students themselves that appreciate the teachers. But I think one of the main .. the most disturbing thing is the perception about the societal recognition of the value of teaching - how teachers are actually rated, you know, as professionals within a society. And I think a lot of that contributes to the feeling, 'Well, people don't feel that I'm doing a worthwhile job, what's the point of being here?' Even though the students recognise it.

JANE FIGGIS: The press release put out by the Commonwealth Minister, David Kemp, instead of congratulating teachers, pointed out that we're still quite distinctly behind many of the Asian countries. What's your response to that?

DEBRA SMITH: Well, I mean I think that is just another contributing factor to the teaching results, where teachers felt, you know, 'Why am I here? Why am I doing this if nobody appreciates it?' And I don't think the politicians actually contribute to their feeling of worthwhileness as well. It's easy to say, 'Lift your game,' it's hard to say, 'This is how you do it, and we'll support you in it.'

JANE FIGGIS: Debra Smith. In next week's Education Report, we're looking at vocational education and training - really a review of the changes that have been put into effect this year - quite interesting.