The pros and cons of NAPLAN
NAPLAN has been used as a benchmark for student performance against national averages since 2008, but just how effective is the test? And what are the repercussions for students and teachers?
As the Federal Government moves to introduce mandatory phonics testing, questions have been raised as to whether another test is needed to lift Australia’s international ranking.
Education Minister Simon Birmingham has called on an expert panel to develop mandatory phonics, maths and literacy testing.
In January, Mr Birmingham said the short assessments of year one students’ literacy and numeracy would provide early identification of students who are behind, allowing them to be targeted with interventions before the achievement gap grows.
“This panel will also consider existing examples from Australia and overseas, such as the Year 1 phonics check used in England that involves children verbally identifying letters and sounds in both real words and made up words to show a child’s understanding of how language works.
“Similar numeracy checks see children undertake tasks such as simple counting, recognising numbers, naming shapes and demonstrating basic measurement knowledge.”
However, the plan has been criticised by the Labor Party and the Australian Education Union, which argue that Gonski funding commitments are the best way to improve educational outcomes.
“Schools don’t need another national test to work out which children need help… they need the resources to cater for those children,” Australian Education Union president Coreena Haythorpe told ABC News in January.
But how have other national tests helped inform decision making in schools?
NAPLAN IN A NUTSHELL
One of the best-known tests in the education space is the National Assessment Program – Literacy and Numeracy (NAPLAN).
Introduced in 2008, NAPLAN was designed to measure whether Australian students are meeting educational outcomes.
NAPLAN is an annual assessment for all students in Years 3, 5, 7 and 9 (NAPLAN, 2016). The tests cover reading, writing, spelling, grammar and punctuation, and numeracy, and are undertaken every year in the second week of May.
The benchmark is used to inform policy development, resource allocation, curriculum planning and intervention by governments, as well as being measured by education authorities, schools and the community.
Harrington (2013) notes that NAPLAN testing and the My School website, which publishes results, have led to concerns that the tests could have a negative effect on students and schools.
Research by Thompson (2013) concluded that a majority of teachers felt pressured to perform and be ranked highly, affecting their pedagogy, the learning opportunities they offered and their curriculum choices.
His research, which looked at a survey of teachers in Western Australia and South Australia in 2012, concluded the effects of NAPLAN could work against the benefits of accountability and transparency in improving outcomes in the Australian education system.
Senate committee inquiries into NAPLAN in 2010 highlighted that the My School website allowed the government to identify disadvantage by discovering 110 struggling schools and addressing funding shortages.
During the hearings, Professor Geoff Masters, CEO of the Australian Council for Educational Research, emphasised that NAPLAN is firmly grounded in 20 years’ experience with state literacy and numeracy testing programs.
A qualitative study of 70 students, 29 teachers and 26 parents by Wyn, Turnbull and Grimshaw (2014) found 70 per cent of parents surveyed believed the information provided by NAPLAN to be useful, and more than 50 per cent were in favour of NAPLAN.
School principals also found NAPLAN data to be useful in various ways, including providing information that would enable teachers to develop more individualised approaches to teaching and identifying students with slow progress.
A 2013 Senate committee looked at a range of adverse consequences emerging from NAPLAN, including a narrowing of the curriculum and the development of a NAPLAN preparation industry – creating the perception that NAPLAN is a ‘high stakes’ test.
The committee concluded that educational authorities need to be aware of this issue when providing support to schools.
The Australian Primary Principals Association found in a survey that two-thirds of respondents believed NAPLAN had a negative impact on student wellbeing.
With that in mind, the committee recommended that the Australian Curriculum, Assessment and Reporting Authority (ACARA) monitor NAPLAN results to ensure they deliver targeted funding to schools and students who require support.
Dr Bronwyn Hinz, Policy Fellow at the Mitchell Institute, Victoria University, told Education Matters that some schools and parents have overemphasised the importance of the NAPLAN tests, which she says are not high stakes.
“NAPLAN is a point-in-time test of just a few, albeit important, subjects, which can be compared to the same data collected at other times and around Australia to help work out, among other things and alongside other data, the effects of different education programs and policies, and the places where additional resources could make the greatest impact.
“NAPLAN does not replace the much deeper, more sophisticated and more frequent formative or summative assessments of student learning done by school teachers, nor does it provide judgement on how “good” a student, teacher or school might be,” Dr Hinz says.
“However, some schools and families add their own high stakes to it, and overemphasise or misunderstand it.”
She says over-inflating NAPLAN’s importance has the potential to lead to over-preparation at the expense of deeper learning of key subject matter and a rich, broad and engaging curriculum.
Dr Hinz says that while standardised testing has existed in Australia for some time, NAPLAN is the first test where the results of schools in different states could be easily compared and were also available to parents and the public.
In the Education and References Committee’s final report in 2014, ACARA argued perceived unintended consequences were a misconception of what NAPLAN was trying to achieve.
“By way of example, the teacher survey undertaken by Murdoch University in 2012 invited participants to respond to statements such as: ‘NAPLAN promotes a socially supportive and positive classroom environment’ and ‘NAPLAN has meant that students have control over the pace, directions and outcomes of lessons in my class’.
“Both of these aspects of classroom environment and curriculum planning are clearly the responsibilities of teachers,” ACARA said.
Dr Suzanne Rice, Senior Lecturer in Education Policy and Leadership at the University of Melbourne, found in her analysis of the national 2016 report that overall Australian achievement had stagnated.
In a state-by-state breakdown, the ACT topped reading results in each of the assessed year levels, Years 3, 5, 7 and 9.
The highest achievers in narrative writing were from Victoria, with the state also taking out the top result in all grades in numeracy, except Year 9.
Dr Rice told Education Matters that the NAPLAN report can’t identify what is happening at a school level, or the reasons behind the general lack of improvement in achievement.
She notes that supporting teacher professional learning, a funding system more tightly targeted to high-needs schools, and policies requiring all schools to take a percentage of disadvantaged students would be a good place to start in improving the system.
“I’m not against standardised testing, it’s a question of how you use it,” Dr Rice says.
She says that NAPLAN should be used to identify areas for improvement at a school or system level in areas such as mathematics and spelling, and not as a tool for measuring teacher performance.
“It is a useful tool for policy makers, but its limitations need to be acknowledged by politicians. This includes not using it to judge who is or isn’t a good teacher; performance pay keeps cycling back into educational discussions at times, and the tests are not designed to evaluate teacher performance,” she says.
Dr Rice expresses frustration at the tendency of educators, the media and governments to compare results by state.
“It’s frustrating at times, because we know ACT will tend to come out on top, because it has the highest proportion of middle class students. NT tends to come down the bottom – this is unsurprising given much higher levels of student disadvantage… we need to be cautious about attributing results to particular policies, it’s easy for states to say we improved because of a certain policy when there is a range of factors influencing why results have come up or down.”
Dr Rice says that in an age of vast information, the mainstream media is tempted to succumb to simplistic headlines drawing attention to fluctuating results.
“I think the difficulty for the media, is that there is always this temptation to slide across something that’s simplistic.
“It’s about trying to maintain fidelity to the complexity of what we’re looking at.”
In terms of improving student achievement, she points to the work of education researcher Professor John Hattie, who notes the importance of teachers becoming self-evaluating.
“It’s not just about looking at how a student learns, but how much of an impact you have had as a teacher on your learners.
“Using it to improve planning and practice – what have students learned?…And switching focus to what’s been learned, over what’s been taught and the impact of this.”
Dr Rice calls for a funding system that’s skewed more towards high needs students and schools.
In an article in The Conversation, Dr Rice highlighted research by Rumberger and Palardy (2005) that found individual student background and the diversity of the student body had an impact on achievement.
Watson and Ryan (2010) also found a higher proportion of public school students came from low socio-economic status backgrounds.
“The other question around funding and the cost of NAPLAN, is whether it is the best use of the money. Would we be better off to do less standardised testing and put more money into education funding?”
Commenting on the idea of compulsory phonics testing, Dr Hinz says phonics testing is already occurring across the country.
“Teachers are already doing these things. I’m not convinced a 7 or 15 minute quiz of phonics in the second year of school is the best use of limited resources, given that schools already have this information from their own assessments. What we need is to put resources into schools to enable them to better support those children that are struggling.”
A NAPLAN ASSESSMENT
However, ACARA CEO Robert Randall told Education Matters many schools are very well placed to interpret assessment data – whether it is NAPLAN or school-based assessment data.
He says NAPLAN provides an external, national reference point to assist teachers to make judgements about student achievement and where they can do better.
“In addition, valuable data are used for forward planning, allocating support and resources and tracking the progress and achievements of individual students, as well as an entire group of students, over the course of their educational journey. All NAPLAN reports come with detailed interpretation guides for school leaders, teachers and parents. Individual Test Administration Authorities supplement this information with additional reports.”
He says Governments and school authorities have an important role in supporting school leaders and teachers to develop their skills in analysing data.
“NAPLAN tests are constructed to give students an opportunity to demonstrate skills they have learned over time through the school curriculum, and NAPLAN test days should be treated as just another routine event on the school calendar,” he says.
On the topic of state-by-state comparisons causing complications, Mr Randall says that by comparing national and school averages against agreed national standards, parents and caregivers can better understand if their child is ‘on track’.
“NAPLAN data and the My School website enables schools to compare the results of students at their school against national results for the overall population and for schools with students from the same socio-educational backgrounds. This enables schools to determine whether they should be satisfied with the results and/or whether they can do better,” Mr Randall says.
“Comparisons can help drive improvement and help ensure all Australian students are achieving important national goals. ACARA encourages stakeholders and policy makers to interpret data beyond simplistic league tables.
“Schools should be compared based on the student population they serve. The Index of Community Socio-Educational Advantage provides a means of allowing fair comparisons. In addition, schools that achieve high levels of gain are identified and should be studied to learn how they achieve their success.”
When asked about targeting disadvantage and funding schools accordingly, Mr Randall says it’s clear that schools are using NAPLAN data to drive improvement.
“At the system level, the National Assessment Program provides school leaders with information about the success of their programs.
“NAPLAN also provides data that can be used to monitor the success of policies aimed at improving the achievement of different student groups, such as Indigenous students or girls and boys. Policy makers are encouraged to look at multiple achievement measures in making decisions. For NAPLAN that means looking at both current year achievement along with growth within and across cohorts.
“The outcomes of the NAPLAN assessments are used to inform future policy development, resource allocation, curriculum planning and, where necessary, intervention programs.”
He says federal, state and territory education ministers have agreed that, from 2017, NAPLAN will move online over a three-year period, from the current paper-based tests to computer-based assessments.
“NAPLAN online will provide better assessment, more precise results and faster turnaround of information.
“NAPLAN online will use ‘tailored testing’, which gives students questions better suited to their achievement level, resulting in better assessment and more precise results. The precision and improved timing of the results will help teachers tailor their teaching more specifically to student needs,” Mr Randall says.
“Once all schools are online, ministers can expect ACARA and state and territory authorities to seek further improvement. Such improvement might include broadening the nature of the questions and the scope of the assessments, resulting in even more authentic assessment of student curriculum knowledge and skills.”
But Mr Birmingham says a mandatory phonics check would complement other resources such as NAPLAN to allow for early intervention.
“Well early on is indeed when you expect a number of the very basic foundational skills to be established.
“So in terms of phonics, the learning of the sounds in the alphabet, the 26 different sounds of the alphabet but then the 42 different phonetic letter sounds that are essential to be able to construct words or deconstruct words, for children to work out how to read them as one of the many skills in relation to developing good, sound literacy skills, well they are important things to be developing in that first year or two of a child at school.
“Left until a child is in year three and getting on to being eight or nine, if there are problems that haven’t been identified until then, intervention becomes so much harder and the likelihood is that child is so much further behind.
“That’s why so many dyslexia advocates in particular have been calling for this type of skills check to be put in place at an early level for many, many years because they know that you can get much, much earlier identification, intervention, and therefore assistance to help ensure that the early years of a child’s education are successful which then enables them to be successful for the rest of their schooling.”
Mr Birmingham says the expert panel, which is due to report at the end of April, consists of dyslexia experts, a speech pathologist, a teacher and a principal who’s already applied phonics in his own Victorian government school.
NAPLAN (2016) Why NAP. Available at: https://www.nap.edu.au/about/why-nap
NAPLAN (2016) About. Available at: https://www.nap.edu.au/about
The Australian Primary Principals Association (2013) Primary Principals: Perspectives on NAPLAN & Assessment. Available at: https://www.appa.asn.au/wp-content/uploads/2015/08/Primary-Principals-Perspectives-NAPLAN.pdf
Parliament of Australia (2015) Chapter 3. Available at: http://www.aph.gov.au/Parliamentary_Business/Committees/Senate/Education_and_Employment/Naplan13/Report/c03 (Accessed: 23 February 2017).
Commonwealth of Australia (2013) Improving school performance. Available at: http://www.aph.gov.au/About_Parliament/Parliamentary_Departments/Parliamentary_Library/pubs/BriefingBook44p/SchoolPerformance
Thompson, G. (2013) ‘NAPLAN, MySchool and accountability: Teacher perceptions of the effects of testing’, International Education Journal: Comparative Perspectives, 12(2).
Parliament of Australia (2015) Government senators’ additional and dissenting comments. Available at: http://www.aph.gov.au/Parliamentary_Business/Committees/Senate/Education_Employment_and_Workplace_Relations/Completed%20inquiries/2010-13/naplan/report/d01#anc1
Rumberger, R.W. and Palardy, G.J. (2005) ‘Does Segregation Still Matter? The Impact of Student Composition on Academic Achievement in High School’, Teachers College Record, 107(9).
Wyn, J., Turnbull, M. and Grimshaw, L. (2014) ‘The Experience of Education: The impacts of high stakes testing on school students and their families – A Qualitative Study’, Whitlam Institute.