PISA education ranking challenged by questions over student motivation

There is an annual furore when Australia continues its disturbing decline in one of the key international education measurements.

The Program for International Student Assessment (PISA), conducted every three years, produces a wealth of data that can be used in a variety of ways — some more educationally valid than others.

The most common use of the data is to compare and rank countries based on their average scores, akin to an education World Cup.

However, there are legitimate concerns about this practice and the policy lessons that might be drawn from it. It is interesting to look at country ranks, but given that the number of participating countries changes from cycle to cycle, rankings can be misleading.

It is also inappropriate to ascribe differences in country rankings entirely to differences in their education systems. Cultural, demographic and out-of-school factors are also influential. These include but are not limited to the prevalence of after-school tutoring, and the cultural emphasis on educational achievement.

Cultural differences are related to differences in behaviour in school, motivation to learn, and motivation to do well on tests.

Since the PISA tests are ‘low stakes’ assessments for the students and schools that participate—there are no benefits or consequences for high or low performers—motivation to apply effort in the tests is not assured.

A number of researchers and commentators have speculated in recent years that performance on PISA may be a function of student motivation to do well on the test, and that the level of motivation may differ between countries. Students in countries that place a high value on academic achievement — both personally and as a society — may have higher test performance than students in more individualistic cultures where social obligations are not as strong.

A new study from the National Bureau of Economic Research provides some evidence to support this theory. The authors hypothesised that intrinsic motivation to do well on the test would be higher among students in China — traditionally a more collectivist culture — than in the US — traditionally a more individualistic culture. To test this hypothesis, they conducted an experiment in which a sample of students in each country was offered an extrinsic reward for higher scores on a test constructed of PISA items, and their performance was compared to that of a control group of their peers.

As they had anticipated, the extrinsic reward made little difference to test performance in Shanghai, whereas US students offered an extrinsic reward performed significantly better. The authors' interpretation is that students in Shanghai were already performing at close to their optimum level of effort and achievement, while students in the US attempted more items and were more likely to give correct answers when offered an extrinsic reward for their effort.

While this is only one study involving only two countries, and therefore generalisations to other countries — including Australia — are tenuous, it raises questions about the interpretation of PISA ranks. The authors estimate that the increase in US scores achieved by students in the extrinsic reward condition would translate to a rise of 17 places in the country rankings.

The theory is also supported by studies that find students from Chinese and other East Asian backgrounds in Australian schools score significantly higher in PISA tests than their peers.

This is not to say that country level PISA data is meaningless. Very high performing countries like Singapore demonstrate what is achievable, and it is instructive to investigate what educational policies and reforms might have contributed to their performance, even though they cannot necessarily be adopted wholesale in different countries with different structures of government and governance.

The performance of countries over time is equally important. PISA has now been conducted for 15 years and, in that time, Australia’s performance has declined substantially in real terms. In reading and mathematical literacy, from 2000 to 2015 the mean scores of Australian students have dropped by the equivalent of a year’s worth of learning. Only 61% of students achieved the National Proficient Standard in reading and 55% in maths in 2015.

Students’ care factor might explain some of the variance in PISA country rankings but is unlikely to explain our own decline. For that, we will have to look closer to home.

Dr Jennifer Buckingham is senior research fellow at The Centre for Independent Studies www.cis.org.au