The Higher School Certificate has long been considered the jewel in the crown of the NSW education system. It is viewed internationally as a rigorous measure of student achievement, comprising both school-based cumulative assessment and externally-set exams.
Every year, the release of HSC results attracts interest and revives debate over the merits of testing.
The HSC has survived numerous threats and now faces a new one: an increasingly widespread view, echoed by opponents of NAPLAN, that exams are outdated in the digital information age.
The anti-exam argument goes like this: memorising facts is less important than being able to think critically and creatively, and exams are too stressful and contribute to mental health problems.
Yes, exams are stressful. Millions of students over millennia have been anxious about exams. For the majority, it is productive anxiety that provides a motivating force to get the best possible result. For some, however, stress levels around exams have to be managed carefully.
Students who aspire to university have to learn to deal with exam stress because it will continue to be part of their lives. As for the argument that exams are no longer a suitable way to assess academic achievements and capability, there is no evidence to support this. In essence, learning is understanding plus memory.
Students should of course understand what they are learning, but if the information has not been retained in their memory, they haven’t really learned it.
Well-designed exams determine whether students remember the knowledge and skills they have been taught, and if they can independently apply them to an unseen task.
In this way, exams are arguably more equitable than other forms of assessment. Achievement in an exam depends less on a student's home and school resources than an assignment does, and is more closely a product of individual capability and effort. Once you get into an exam room, your parents and teachers can't help you anymore.
Tests are the only way to get an objective measure of system and national performance. The purpose of national assessments like NAPLAN and international assessments like PISA is not to satisfy a “neoliberal fixation with quantification” as New South Wales Education Minister Rob Stokes has put it. It is to get a clearer sense of achievement and progress over time, with improvement being the goal.
Mr Stokes is correct that simple comparisons of PISA performance with countries like Finland are not always very useful. There are significant demographic and cultural differences that affect test scores, above and beyond the influence of schools and teachers. Yet it remains the case that Australia has been backsliding in PISA since 2000 — not just relative to other countries, but relative to our own previous PISA performance.
That 15-year-old students had lower levels of reading literacy in 2015 than they did in 2000 — after billions of dollars spent on school education, and countless hours of teaching and learning — should be of concern to everyone. This sort of information can only be obtained through a test that is carefully standardised, moderated and quantified.
Given that we do have this information, the salient question is what to do about it. On this point, Minister Stokes is also correct. Reviewing the Melbourne Declaration on Educational Goals for Young Australians is unnecessary. It will have no effect whatsoever on ensuring that all of the struggling readers around Australia receive the gift and the right of literacy.
What will have an effect is using the data available and acting on it to ensure that all children are getting the most effective evidence-based teaching methods, and achieving at the highest standard of which they are capable. Hopefully the state and federal ministers can all at least agree on that.
Dr Jennifer Buckingham is a senior research fellow at The Centre for Independent Studies