
The Senate

Education, Employment and Workplace Relations References Committee

Effectiveness of the National Assessment Program - Literacy and Numeracy

Interim report

June 2013

© Commonwealth of Australia

ISBN: 978-1-74229-883-2

This document was produced by the Senate Standing Committee on Education, Employment and Workplace Relations and printed by the Senate Printing Unit, Parliament House, Canberra.


MEMBERSHIP OF THE COMMITTEE

Members

Senator Chris Back, Chair, LP, WA

Senator Gavin Marshall, Deputy Chair, ALP, Vic.

Senator Alex Gallacher, ALP, SA

Senator Bridget McKenzie, Nats, Vic. (Substitute for Senator the Hon. Ronald Boswell, Nats, QLD)

Senator Sue Boyce, LP, QLD

Senator Penny Wright, AG, SA (Substitute for Senator Lee Rhiannon, AG, NSW)

Secretariat

Mr Tim Watling, Secretary

Ms Bonnie Allan, Principal Research Officer

Ms Nerissa Stewart, Senior Research Officer

Mr Isaac Overton, Research Officer

Ms Sarah Bainbridge, Administrative Officer

PO Box 6100, Parliament House, Canberra ACT 2600

Ph: 02 6277 3521

Fax: 02 6277 5706

E-mail: eewr.sen@aph.gov.au

TABLE OF CONTENTS

MEMBERSHIP OF THE COMMITTEE

CHAPTER 1

Introduction and overview

Referral

Conduct of the inquiry to date

Background

The need for an interim report

Note on references

Acknowledgements

CHAPTER 2

Key issues

Is NAPLAN achieving its objectives?

Unintended consequences

Publication of results on the My School website

International best practice for standardised testing

Potential improvements to the NAPLAN program

Conclusion

APPENDIX 1

Submissions received

APPENDIX 2

Witnesses who appeared before the committee

CHAPTER 1

Introduction and overview

Referral

1.1 On 15 May 2013 the Senate referred the following matter to the Senate Education, Employment and Workplace Relations References Committee for inquiry and report:

The effectiveness of the National Assessment Program - Literacy and Numeracy (NAPLAN), with specific reference to:

(a) whether the evidence suggests that NAPLAN is achieving its stated objectives;

(b) unintended consequences of NAPLAN's introduction;

(c) NAPLAN's impact on teaching and student learning practices;

(d) the impact on teaching and student learning practices of publishing NAPLAN test results on the My School website;

(e) potential improvements to the program, to improve student learning and assessment;

(f) international best practice for standardised testing, and international case studies about the introduction of standardised testing; and

(g) other relevant matters.1

Conduct of the inquiry to date

1.2 Notice of the inquiry was posted on the committee's website and advertised in The Australian newspaper, calling for submissions by 7 June 2013. The committee also wrote to stakeholders to notify them of the inquiry and invite submissions. The committee published a total of 93 submissions, as listed at Appendix 1. This appendix also includes information on documents tabled by the committee during the course of the hearing. A number of submissions were redacted prior to their publication to protect personal details.

1.3 The committee conducted a public hearing in Melbourne on 21 June 2013. A list of witnesses who gave evidence before the committee is at Appendix 2. Copies of the Hansard transcript from the committee's hearings can be accessed online at http://aph.gov.au/hansard.

1 Journals of the Senate, 2013, pp 3928-3929.


Background

1.4 NAPLAN is an annual assessment of Australian students in years 3, 5, 7 and 9 that tests students in reading, writing, language conventions and numeracy. The test has been conducted in May each year since 2008, and results are available four months later in September. Since 2010, results have been available publicly on the My School website at an individual school level.2

1.5 This committee completed an inquiry into the administration and reporting of NAPLAN testing in November 2010.3 The terms of reference for that inquiry were:

(a) the conflicting claims made by the Government, educational experts and peak bodies in relation to the publication of the National Assessment Program - Literacy and Numeracy (NAPLAN) testing;

(b) the implementation of possible safeguards and protocols around the public presentation of the testing and reporting data;

(c) the impact of the NAPLAN assessment and reporting regime on:

(i) the educational experience and outcomes for Australian students,

(ii) the scope, innovation and quality of teaching practice,

(iii) the quality and value of information about student progress provided to parents and principals, and

(iv) the quality and value of information about individual schools to parents, principals and the general community; and

(d) international approaches to the publication of comparative reporting of the results, i.e. ‘league tables’; and

(e) other related matters.4

1.6 The committee majority made twelve recommendations targeted at reforming the NAPLAN assessment program.5 Recommendations included reforms to the publication and representation of test data, arrangements for students with a disability, provision for students with a language background other than English, measures to ensure the integrity of the testing process, reforms to the My School website and management of publication of league tables in the media. Government Senators and the Australian Greens also appended dissenting and additional comments to the report. The Australian Government, in consultation with the Australian Curriculum, Assessment and Reporting Authority (ACARA) and the relevant COAG council, has since implemented a number of the recommendations and introduced changes to the My School website.6 However, submissions to the committee suggest that a significant number of important recommendations have not yet been implemented by the government.7

2 Department of Education, Employment and Workplace Relations, Submission 69, pp 7-8.

3 Senate Education, Employment and Workplace Relations References Committee, Administration and Reporting of NAPLAN testing, November 2010.

4 Journals of the Senate, 13 May 2010, p. 3490.

5 Senate Education, Employment and Workplace Relations References Committee, Administration and Reporting of NAPLAN testing, November 2010, pp xi-xii.

The need for an interim report

1.7 Since its introduction in May 2008, and the subsequent publication of results on the My School website from 2010, the NAPLAN testing program has been the subject of discussion, research and controversy. This deep interest has been reflected in the current inquiry, which has seen the committee receive a high volume of submissions from individuals and organisations, including parents, teachers, principals, academics, and schools.

1.8 When referring the inquiry to the committee, the Senate set 27 June as the date by which the report should be tabled, and given that this is likely to be among the last sitting days before the general election, it is not practical to extend the reporting date. The evidence provided in submissions, combined with evidence provided by witnesses during the committee's hearing on Friday, 21 June 2013, demonstrates that the committee requires more time to adequately discharge its reference and present a properly considered report on this very important matter of public policy.

1.9 Therefore, Chapter 2 of this Interim Report provides a snapshot of the key issues identified thus far, but does not delve any deeper into the different views put by submitters, or come to any conclusions on them. However, the committee does note the possibility that the Senate may re-adopt this inquiry in the 44th Parliament, following a likely recommendation to that effect from this committee once it is reconstituted. Such a course of action would ensure that the issues raised by submitters could be properly and comprehensively examined and reported on.

Note on references

1.10 References in this report are to the proof Hansard. Page numbers may vary between the proof and the official transcript.

Acknowledgements

1.11 The committee extends its gratitude to the large number of individuals and organisations who made submissions to this inquiry, and to witnesses who offered their time to give evidence at public hearings and provided additional information. Both contributed greatly to shaping the committee's deliberations and report.

6 Australian Government Response to the Senate Education, Employment and Workplace Relations References Committee Report on the Administration and Reporting of NAPLAN Testing, August 2011.

7 For a discussion of the 2010 inquiry and outstanding recommendations see: Australian Education Union, Submission 57, p. 3.

CHAPTER 2

Key issues

2.1 The committee has received a large number of detailed and carefully prepared submissions in relation to this inquiry, and has also received evidence from witnesses at its public hearing on 21 June 2013. It is apparent that the effectiveness of the National Assessment Program - Literacy and Numeracy (NAPLAN) is a matter of some controversy and that there is a full range of views on this issue. This chapter briefly identifies some of the key issues that are apparent to the committee thus far, as well as areas where further research and inquiry are necessary.

Is NAPLAN achieving its objectives?

Objectives of NAPLAN

2.2 The Department of Education, Employment and Workplace Relations (DEEWR) advised that five objectives have informed the development of NAPLAN testing:

1. that the reporting of literacy and numeracy test results is reliable and nationally comparable;

2. that the proposed national literacy and numeracy tests be rigorous;

3. the central aim of national assessment should be finding out what students can or cannot do and lifting the performance of every student in every school;

4. the tests should focus on the diagnosis of each student’s strengths and weaknesses as a means for planning educational interventions; and

5. the development of new standards to cover the full range of student achievement.1

2.3 The Australian Curriculum, Assessment and Reporting Authority (ACARA) submitted that the National Assessment Program 'is the means by which governments, education authorities and schools can determine whether or not young Australians are reaching important educational goals for literacy and numeracy'.2 On its website ACARA advises that the primary objective of NAPLAN is to provide the:

[M]easure through which governments, education authorities, schools, teachers and parents can determine whether or not young Australians have the literacy and numeracy skills that provide the critical foundation for other learning and for their productive and rewarding participation in the community.

The tests provide parents and schools with an understanding of how individual students are performing at the time of the tests. They also provide schools, states and territories with information about how education programs are working and which areas need to be prioritised for improvement.3

1 Department of Education, Employment and Workplace Relations, Submission 69, p. 9.

2 Australian Curriculum, Assessment and Reporting Authority, Submission 58, p. 4.

2.4 ACARA noted that the NAPLAN tests are only one part of the assessment and reporting processes conducted by each school and do not replace 'extensive, ongoing assessments made by teachers about each student's performance'.4

2.5 Submitters to the inquiry varied in their responses to the objectives of NAPLAN. Some submitters supported the objectives in full, but argued that NAPLAN was unable to meet them, while others suggested that the objectives need to be revisited. Still other submitters suggested that NAPLAN was being used for a much broader range of purposes than originally anticipated, a viewpoint elaborated on below.5 Finally, some submitters variously questioned the utility of NAPLAN, as well as its expense and its potential to cause harm to students, the implication being that the NAPLAN regime should be discontinued.6

2.6 The Australian Education Union submitted that it is timely to reconsider the purpose of NAPLAN, and its style of delivery. During the Melbourne hearing the committee was advised:

In our most recent correspondence with the federal minister, we have argued and suggested that it is time for a thorough re-examination of the purpose and processes of delivery of NAPLAN. We do that because we do not believe there is any clarity about the purposes of NAPLAN. It is now starting to become the be-all and end-all for anyone, depending on what their hobby horse is at the time. For example, we are told it is a diagnostic test and then we are told it is not a diagnostic test. We are told you can prepare for it and then we are told you cannot prepare for it.7

2.7 It seems to the committee that over time the purposes of NAPLAN have expanded. This is in part because NAPLAN data is the only nationally consistent data on educational outcomes in Australia. The Australian Education Union noted that:

[O]ne of the great concerns for educators across the country [is] the fact that there is a lack of clarity around the purpose of NAPLAN, and there are increasing references to NAPLAN and the results of NAPLAN being used for a wider and wider variety of purposes…

…Data is important; it is the misuse of that data, or the incorrect use of that data, that concerns us greatly. That, in itself, contributes to growing anxieties, if you like, and the high stakes associated with the NAPLAN program.8

3 ACARA website, Frequently asked questions, http://www.nap.edu.au/information/faqs/naplan--general.html (accessed 28 May 2013).

4 ACARA website, Frequently asked questions, http://www.nap.edu.au/information/faqs/naplan--general.html (accessed 28 May 2013).

5 See for example: Independent Education Union of Australia, Submission 41, paragraphs 53-55 (objectives of NAPLAN are unclear); Queensland Catholic Education Commission, Submission 42, pp 1-2.

6 See, for example, Ms Lorraine Wilson, Submission 15; Mr David Hornsby, Submission 35; Mr Phil Cullen, Submission 83.

7 Mr Angelo Gavrielatos, Federal President, Australian Education Union, Proof Committee Hansard, 21 June 2013, p. 10.

Achievement of objectives

2.8 The committee heard a range of concerns expressed by witnesses and submitters about the ability of the NAPLAN tests to achieve the original objectives. Among other things, NAPLAN was criticised for testing a very narrow field of the Australian curriculum, having extremely limited diagnostic ability, lacking rigour and containing a high margin of error.9 A discussion of the full range of views is beyond the scope of this interim report; however, some examples follow.

2.9 A number of submitters questioned whether NAPLAN testing could achieve its objectives, on the basis that NAPLAN data is limited and at best provides an estimate of each child's literacy and numeracy at a particular point in time.10 The committee heard that for these reasons NAPLAN was not a particularly useful tool to assess student ability and that a more useful assessment could be provided by teachers.11 The committee also heard that the four-month wait before NAPLAN results are released inhibits the usefulness of the test as a diagnostic tool, as does the time of year that the test is conducted.12

2.10 Many submitters and witnesses expressed concern that NAPLAN data was being used for purposes for which it was not originally intended. Dr Gloria Latham observed that while NAPLAN results are described by ACARA as providing a snapshot of children's learning, the results were being used by the Australian government and by schools for other purposes, including as a measurement of learning and to assess the quality of schools and teachers.13

2.11 Dr Suzanne Rice observed that much of the criticism around NAPLAN was 'not necessarily a disagreement with testing per se but rather with the uses to which the testing data might be put that is central to some of the key debates'.14

2.12 Following this theme, submitters were also critical of the decision to use NAPLAN data to determine the Schooling Resource Standard.15 The Australian Primary Principals Association told the committee that NAPLAN data was not fit for this purpose and that the use of the data in this way 'privileges NAPLAN data to an almost unbelievable level'.16

8 Mr Angelo Gavrielatos, Federal President, Australian Education Union, Proof Committee Hansard, 21 June 2013, p. 12.

9 See for example, David Hornsby, Submission 35; Dr Alyson Simpson, Submission 64.

10 See for example: Dr Gloria Latham, Submission 2, p. 1.

11 See for example, Spensley Street Primary School staff and School Council, Submission 76; Dr Gloria Latham, Submission 2, p. 1.

12 See, for example, Association of Heads of Independent Schools Australia, Submission 56, p. 9; Independent Education Union of Australia, Submission 41, paragraphs 56-60.

13 Dr Gloria Latham, Submission 2, p. 2.

14 Dr Suzanne Rice, Private Capacity, Proof Committee Hansard, 21 June 2013, p. 31.

Unintended consequences

2.13 The committee heard that a range of unintended consequences have emerged as a result of NAPLAN testing. These include: adverse impacts on students, narrowing of the curriculum, creation of a NAPLAN preparation industry and the development of NAPLAN into a 'high stakes' test.

Impact on teaching and learning practices

2.14 The committee received evidence that the NAPLAN testing regime is having an adverse impact on teaching and student learning practices.

2.15 The committee heard that the curriculum has become narrowed as teachers teach to the test.17 Submitters reported that a focus on NAPLAN preparation has resulted in limitations on creative learning in classrooms. For example, some school children are spending a disproportionate amount of time learning how to master persuasive writing pieces.18 In response to this particular criticism, ACARA has decided not to give advance notice of the writing style to be tested in future years.19

2.16 These submissions are consistent with international academic studies on the impact of national testing regimes, particularly in the United Kingdom and the United States. Professor John Polesel noted that this literature:

…has found that the testing can have distorting influences on the way in which teachers teach, changing teachers' pedagogy but also changing the way in which schools assign value to different parts of the curriculum. For example, the research has found that things like the arts, drama and music are given less prominence in the curriculum because they are regarded as further away from the main focus of the testing, which is literacy and numeracy. The other thing that came through from the international research is that the focus also had unintended consequences—for example, in the way in which schools recruit kids from different areas of society, and

parents using the results of the testing, because it is high stakes and it is made public, in ways which were unintended.20

15 The Schooling Resource Standard is the new funding approach for schools introduced by the Australian Education Bill 2013.

16 Mr Norm Hart, President, Australian Primary Principals Association, Proof Committee Hansard, 21 June 2013, p. 19. See also, Christian Schools Australia, Submission 37, p. 5.

17 See for example: Ms Jane Hunter, Submission 7, p. 2. See also, Association of Heads of Independent Schools Australia, Submission 56, p. 2; The Whitlam Institute, Submission 26, p. 6; Dr Alyson Simpson, Submission 64.

18 Ms Lorraine Wilson, Submission 11, p. 11; Ms Jane Hunter, Submission 7, p. 2.

19 ACARA, Submission 58, p. 11.

A 'high stakes' test and impact on student well-being

2.17 Numerous submitters provided evidence to support their claim that NAPLAN has become a 'high stakes' test, and that this has led to negative impacts on students, teachers and schools.21 ACARA acknowledged that literacy and numeracy are high stakes, but argued that NAPLAN itself is low stakes for individual students.22 Depending on the perspective of an individual or organisation, NAPLAN testing may be considered high (or low) stakes, either for that individual or organisation or for another stakeholder.

2.18 The committee heard that the test has become high stakes, in part, because of the publication of individual school results on the My School website and the subsequent development of league tables by the media.

2.19 In response to these concerns, the Whitlam Institute, along with its partners at the University of Melbourne, is conducting a project titled The Experience of Education: The impact of high stakes testing on school students and their families. The project approaches this question from the perspective of the best interests of the child. The two objectives are (i) to determine the positive and negative impacts of NAPLAN on children and (ii) to examine the significance of these impacts for students and their learning environment.23 Significantly, the Institute observed that 'almost 90 per cent of teachers reported students talking about feeling stressed prior to NAPLAN testing and significant numbers also reported students being sick, crying or having sleepless nights'.24

2.20 ACARA questioned whether NAPLAN alone is the cause of student stress, suggesting that the way the Whitlam Institute constructed its survey questions may have led to a particular outcome and that a more expansive survey looking at student stress across the entire school year may deliver comparable results to those reported.25

2.21 Mr Phillip Heath, currently a Principal at a Canberra private school, advised that even when schools take a low-key approach to NAPLAN, children can still become stressed:

My school does not talk about NAPLAN at all. We do not publish the results. We keep it very much to what it is designed for—that is, to give feedback to us and to an individual student. But, for about half of those present, their parents are giving them tests at home to prepare for the experience. That really surprised me. That is in a context where we say nothing, as a school, and I would suggest that is a pretty common picture around the country. Parents at home who are used to a testing regime—that is how they grew up—consider this a very high-stakes experience, much higher than, in fact, it was intended ever to be.26

20 Professor John Polesel, Private Capacity, Proof Committee Hansard, 21 June 2013, p. 31. See also, for example, School of Education (University of South Australia), Submission 52, pp 12-15.

21 See for example, Association of Heads of Independent Schools Australia, Submission 56, p. 2.

22 Mr Robert Randall, Chief Executive Officer, Australian Curriculum, Assessment and Reporting Authority, Proof Committee Hansard, 21 June 2013, p. 42.

23 The Whitlam Institute, Submission 26.

24 The Whitlam Institute, Submission 26, p. 7.

25 Mr Robert Randall, Chief Executive Officer, Australian Curriculum, Assessment and Reporting Authority, Proof Committee Hansard, 21 June 2013, p. 43.

2.22 Mr Heath was careful not to judge parents for this response, explaining that the way that NAPLAN is reported encourages some parents to view the test as high stakes:

If you set up a test in which the children get the results and their results are shown across bands with an arrow, an aspirational parent, particularly from a culture that values education very highly and achievement very highly, will want to see their child's arrow at the top and will do whatever it takes.27

2.23 The Independent Education Union of Australia agreed, noting that the very nature of NAPLAN made it high stakes - for children and teachers. During the Melbourne hearing Mr Chris Watt explained:

The fact that it involves people's children immediately makes it high stakes, because every parent wants the best for their child and if they are using these tests and their results for enrolment purposes other than the child's health it is hard to think of anything more high stakes than your child's education. By definition, testing is, whether you like it or not, high stakes. So, yes, there is unquestionably pressure being put on individual teachers, and some do not want to take those classes for that very reason.28

2.24 This evidence suggests that it may not be NAPLAN itself that causes stress, but rather the influence of parents, schools and, in particular, the media. The inclusion of NAPLAN data in the assessment of the Schooling Resource Standard for schools under the new funding arrangements is likely to contribute to the perception of NAPLAN being 'high stakes' for state and territory education jurisdictions.29

Growth of the NAPLAN preparation industry

2.25 ACARA advised that detailed preparation for NAPLAN testing was not necessary. However, evidence to the committee suggests that an industry of NAPLAN preparation has developed in Australia over the past five years. The Australian Education Union pointed to fish oil supplements, study aids and tutoring support all targeted at NAPLAN preparation.30

26 Mr Phillip Heath, Director and Incoming Chair, Association of Heads of Independent Schools of Australia Ltd, Proof Committee Hansard, 21 June 2013, p. 2.

27 Mr Phillip Heath, Director and Incoming Chair, Association of Heads of Independent Schools of Australia Ltd, Proof Committee Hansard, 21 June 2013, p. 4. See also, The Victorian Association for the Teaching of English, Submission 74, p. 3.

28 Mr Chris Watt, Federal Secretary, Independent Education Union of Australia, Proof Committee Hansard, 21 June 2013, p. 26.

29 See for example, Australian Council of Parents & Citizens, Submission 70, p. 2.

2.26 ACARA acknowledged that it was aware of instances of 'excessive test preparation' and wanted to work with principals and teachers to support them in their important role.31 ACARA also acknowledged that feedback from stakeholders indicated 'the need to restate things like the purpose, to counsel and provide information about what is going on'.32

2.27 ACARA also outlined a number of reforms that would be implemented over the next several years, including online delivery of NAPLAN, linking NAPLAN to the National Curriculum, reducing the time gap between testing and results, and introducing flexible delivery of the tests.33

2.28 Further investigation and inquiry are necessary to fully consider the impacts of the NAPLAN testing regime and to determine the appropriate action (if any) that should be taken to address any adverse consequences.

Publication of results on the My School website

2.29 While most submitters, particularly teachers and principals, offered in-principle support for NAPLAN testing and the careful34 use of data, in conjunction with teacher assessment, strong criticism was reserved for the publication of individual school data on the My School website.35 The committee heard that many of the unintended consequences discussed earlier arose not from NAPLAN itself but from the publication of results on the My School website.

2.30 Professor Joy Cumming, an experienced educator and expert consultant, observed that publication on My School 'has made NAPLAN high stakes for schools…Pressure to meet targets is top-down on schools from authorities, from principals to teachers, from teachers to students'.36 Professor Cumming also observed that some students attending schools with a balanced approach to NAPLAN are still reportedly stressed 'due to exposure to media reporting on NAPLAN and My School outcomes'.37 These criticisms of the My School website were repeated by a large number of submitters to the inquiry.

30 Australian Education Union, Submission 57, pp 9-10. See also, Independent Schools Queensland, Submission 73, pp 3-4.

31 Mr Robert Randall, Chief Executive Officer, Australian Curriculum, Assessment and Reporting Authority, Proof Committee Hansard, 21 June 2013, p. 43.

32 Mr Robert Randall, Chief Executive Officer, Australian Curriculum, Assessment and Reporting Authority, Proof Committee Hansard, 21 June 2013, p. 43.

33 Mr Robert Randall, Chief Executive Officer, Australian Curriculum, Assessment and Reporting Authority, Proof Committee Hansard, 21 June 2013, p. 43.

34 The committee has previously discussed the importance of adequate statistical literacy among those who are required to deal with NAPLAN (and other applicable) results: Senate Education, Employment and Workplace Relations Committee, Inquiry into teaching and learning - maximising investment in Australian schools, May 2013, paragraph 3.17.

35 See for example, Australian Council of State School Organisations Inc, Submission 81, p. 2; Mr Robert Hassell, Association of Independent Schools of Western Australia, Submission 46, p. 1.

36 Professor Joy Cumming, Submission 24, p. 24.

2.31 The committee heard that the publication of data on the My School website encouraged the misuse of data. For example, Mr Norm Hart, President of the Australian Primary Principals Association, told the committee:

I have no problem with the data being collected and I have no problem with the proper use of the data. My problem, and the problem of primary school principals, is the misuse and inappropriate use—I think there are two different things here—and sometimes it is almost mischievous use, of the data. In the terms of league tables, that is the case. I think they are just so wrong that they should be stopped because a high NAPLAN score does not equal high efficiency. It does not equal high quality. It could well be an element of both, and probably is, but it is not equal.38

2.32 ACARA has been asked by the COAG Standing Council on School Education and Early Childhood to provide an assessment of any perverse incentives and unintended impacts as a result of the publication of NAPLAN data on My School.39 Further inquiry into the purported benefits and negative impact of the My School website is necessary.

International best practice for standardised testing

2.33 Both DEEWR and ACARA submitted that NAPLAN represents international best practice for standardised testing and cited Organisation for Economic Co-operation and Development (OECD) publications to support this statement.40 However, a number of submitters disagreed.

2.34 The Australian Primary Principals Association praised the Finnish system because of its strong results and because of the trust it places in teacher professionalism.41 Sample testing was praised by others because it makes possible more sophisticated testing of higher order skills, and enables governments to collect data without interfering with the role of the teacher in providing feedback.42

37 Professor Joy Cumming, Submission 24, p. 24 (emphasis in original).

38 Mr Norm Hart, President, Australian Primary Principals Association, Proof Committee Hansard, 21 June 2013, p. 19.

39 Department of Education, Employment and Workplace Relations, Submission 69, p. 15.

40 Department of Education, Employment and Workplace Relations, Submission 69, pp 30-32; Australian Curriculum and Reporting Authority, Submission 58, pp 20-21. See also, Dr Amanda Day, Acting Branch Manager, Proof Committee Hansard, 21 June 2013, p. 44.

41 Australian Primary Principals Association, Submission 19, p. 10.

42 See for example, Ms Lorraine Wilson, Private Capacity, Proof Committee Hansard, 21 June 2013, pp 38-39, 41. For a comparison of OECD PISA and NAPLAN test questions in mathematics see Professor Kaye Stacey, Submission 6, pp 2-3.


2.35 The committee is also aware of recent reforms to census testing in the United Kingdom that allow quicker feedback and greater use of teachers' professional judgement, including the marking of exams by the classroom teacher to allow for prompt feedback.43

2.36 The committee believes that closer consideration of international best practice is necessary in order to properly assess the NAPLAN assessment program.

Potential improvements to the NAPLAN program

2.37 The committee heard about a range of innovative measures which, it was submitted, had the potential to improve the NAPLAN testing program. Chief among these were:

• Timely return of NAPLAN results to teachers and parents, to enable the tests to be used for diagnostic purposes;

• Changing the time of year that the tests are conducted to either the beginning of the year or the end of the year;

• Removal of school level data from the My School website (with data provided to teachers, principals, parents and education authorities);

• Close alignment between the National Curriculum and NAPLAN;

• The ability to disaggregate data on the My School website so that schools can distinguish between different cohorts of students in the same year grouping;44

• Reforms to NAPLAN to ensure that it measures broader learning outcomes, not just narrow fields of literacy and numeracy;

• Use of sample testing instead of census testing, to enable more sophisticated testing regimes and also to avoid some of the unintended consequences discussed earlier;

• Online testing, to enable prompt return of results and more sophisticated testing methodology;

• Introduction of a testing window, so schools can select the best time in their school calendar for students to undertake the test;

• Providing students with the ability to 'pause' a NAPLAN test and return to it the next day;

• Addressing concerns about the suitability of the current NAPLAN testing regime for Indigenous students and students with a language background other than English.

43 Mr Norm Hart, President, Australian Primary Principals Association, Proof Committee Hansard, 21 June 2013, p. 22. See also, Australian Primary Principals Association, Submission 19, p. 8. For a discussion of the research into the impacts of standardised testing in the United Kingdom see Australian Literacy Educators' Association, Submission 66.

44 Association of Heads of Independent Schools, Submission 56, p. 9.


2.38 The suggestions above are a representative sample of the reforms proposed by submitters. The committee has not had sufficient time to closely consider these and other suggestions. The list provided, however, illustrates the broad range of changes that could potentially be made to the administration of NAPLAN to promote its effectiveness. The committee heard that ACARA and DEEWR were already acting on some of these reforms, but the committee was unable to explore which ones and the extent to which their implementation has been effective.

Conclusion

2.39 Every year over a million Australian students complete three separate tests over a week, and four months later the results are published online at a school level as well as being reported in the media. The results are touted as an accountability measure for teachers and principals, and as a useful tool for parents. Given the large numbers of Australian students involved in NAPLAN, and also the numbers of teachers who are often judged by the results of their students, it is of the utmost importance that the Parliament is able to determine the extent to which NAPLAN is achieving its objectives.

2.40 In conducting such an inquiry on behalf of the Senate, the committee requires time to discover, explore and assess all relevant viewpoints as they impact on the Terms of Reference. Given the time constraints on this inquiry, such an analysis will only be possible if the committee is given the opportunity to continue its inquiry in the 44th Parliament.

2.41 In this context, the committee makes no recommendations of substance in relation to NAPLAN, but notes the potential for the committee to recommend to the Senate the re-adoption of this inquiry early in the next Parliament.

Senator Chris Back

Chair, References Committee

APPENDIX 1

Submissions received

1 Dr Van Davy

2 Dr Gloria Latham

3 Mrs Edith Knight

4 Mrs Patricia Buoncristiani

5 Dr John Ridd

6 Prof Kaye Stacey

7 Ms Jane Hunter

8 Aitken College

9 Ms Eunice Bailey

10 Mr Les O'Gorman

11 Ms Lorraine Wilson Attachment 1 Attachment 2

12 Mr Trevor Stockley

13 Mrs Jane Wenlock

14 Mr Leon Voesenek

15 Kingsgrove High School

16 Mr Brian Joye

17 Better Education Pty Ltd

18 St Philip's Christian College Gosford

19 Australian Primary Principals Association Attachment 1

20 Professor Patrick Griffin, University of Melbourne

21 Epping Heights Public School

22 Queensland Association of State School Principals Inc

23 NSW Primary Principals' Association


24 Professor Joy Cumming

25 P and C Federation

26 The Whitlam Institute within the University of Western Sydney Attachment 1 Attachment 2

27 Dr Kerry Hempenstall

28 Ms Maureen Anderson

29 Department of Education, TAS

30 Australian College of Educators

31 Fintona Girls' School

32 Ms Denise McKee, Suzuki Piano Studio

33 Ms Muriel Johnson

34 Mr Noel Bourke

35 Mr David Hornsby Attachment 1

37 Christian Schools Australia Limited

38 Ms Rachael Sowden

39 Westralian Association for the Teaching of English to Speakers of Other Languages

40 Australian Association for the Teaching of English

41 Independent Education Union of Australia

42 Queensland Catholic Education Commission

43 Steiner Education Australia

44 Catholic Education South Australia

45 School of Education, Deakin University

46 Mr Robert Hassell, Association of Independent Schools of Western Australia

47 Mr Mark Ammermann

48 P and Cs Qld

49 Ms Yvonne Meyer


50 Mr Tony Stokes

51 School of Education, The University of Queensland

52 School of Education, University of South Australia

53 Mr. Glyn Parfitt

54 SA Primary Principals Association

55 Australian Parents Council Inc.

56 The Association of Heads of Independent Schools of Australia

57 Australian Education Union

58 Australian Curriculum, Assessment and Reporting Authority

59 Dr Nicole Mockler

60 Multicultural Development Association and Townsville Multicultural Support Group

61 Mr Ken Woolford

62 Mr Brad Ahern

63 MultiLit Pty Ltd

64 Dr Alyson Simpson Attachment 1 Attachment 2 Attachment 3 Attachment 4

65 Mr Ben Zonca

66 Australian Literacy Educators' Association

67 Australian Association of Mathematics Teachers Inc.

68 Ms Remana Dearden

69 Department of Education, Employment and Workplace Relations

70 ACT Council of Parents and Citizens Associations

71 Mr Leonard Freeman

72 Mr Derek Synnott, Yarralumla Primary School

73 Independent Schools Queensland


74 The Victorian Association for the Teaching of English

75 Silkwood School

76 Spensley Street Primary School

77 ACT Government

78 NSW Parents' Council Inc

79 Australian Council of TESOL Associations Attachment 1 Attachment 2

80 Department of Education, Training and Employment

81 Australian Council of State School Organisations

82 Ms Denise Angelo

83 Mr Phil Cullen

84 Name Withheld

85 Name Withheld

86 Name Withheld

87 Name Withheld

88 Name Withheld

89 Name Withheld

90 Name Withheld

91 Name Withheld

92 Name Withheld

93 Name Withheld

APPENDIX 2

Witnesses who appeared before the committee

Melbourne, Friday, 21 June 2013.

CULL, Ms Kim, Chief Executive Officer, Association of Heads of Independent Schools of Australia Ltd

DAY, Dr Amanda, Acting Branch Manager, School Performance and Improvement Branch, Department of Education, Employment and Workplace Relations

DEVEREAUX, Ms Jennifer, Federal Research Officer, Australian Education Union

DULFER, Ms Nicole, Private capacity

GAVRIELATOS, Mr Angelo, Federal President, Australian Education Union

HART, Mr Norm, President, Australian Primary Principals Association

HEATH, Mr Phillip James, Director and Incoming Chair, Association of Heads of Independent Schools of Australia Ltd

MACMILLAN, Ms Robyn, Acting Director, National Assessments and School Evaluation Section, School Performance and Improvement Branch, Department of Education, Employment and Workplace Relations

POLESEL, Professor John, Private capacity

RANDALL, Mr Robert, Chief Executive Officer, Australian Curriculum, Assessment and Reporting Authority

RICE, Dr Suzanne Margaret, Private capacity

WATT, Mr Chris, Federal Secretary, Independent Education Union of Australia

WILSON, Ms Lorraine, Private individual