
A close up of a laptop with students studying in the background. Photo: UOW

Are Australian students really falling behind?

It depends which test you look at

Ask anyone about how Australian students are doing in school and they will likely tell you our results are abysmal and, more importantly, getting progressively worse.

This narrative has been reinforced by sustained media reporting in Australia and overseas. It has continued with the release of the 2022 Programme for International Student Assessment (PISA) results on Tuesday evening.

But is this accurate and fair?

This year we each independently published papers looking at Australian students' results. These papers reached the same conclusion: students' scores on the vast majority of standardised assessments were not in decline.

What tests do Australian students do?

Australian students sit multiple standardised assessments. These are tests that are set and scored in a consistent manner. Importantly, scores from one assessment round are statistically "matched" with those from previous rounds, meaning comparisons of average scores over time are possible.

Australian students do NAPLAN in Year 3, Year 5, Year 7 and Year 9. This is a national test that looks at literacy and numeracy skills.

Australian students also sit several international tests. PISA aims to measure 15-year-old students' application of knowledge in maths, science and reading.

They also sit the Progress in International Reading Literacy Study (PIRLS), which looks at Year 4 students' reading comprehension skills, and the Trends in International Mathematics and Science Study (TIMSS), which assesses maths and science knowledge in the curriculum in Year 4 and Year 8.

The National Assessment Program – Science Literacy (NAP-SL) measures students' science literacy in Year 6 and Year 10. NSW students also complete Validation of Assessment for Learning and Individual Development (VALID) assessments in science based on the NSW syllabus in Year 6, Year 8 and Year 10.

Sally’s research

Sally's paper documented average scores in the four major standardised assessments in which Australia's students have participated since 1995.

All but one assessment program (PISA) showed improvements or minimal change in average achievement.

In particular, primary school students’ scores in some of the standardised literacy and numeracy tests, including NAPLAN, PIRLS and TIMSS, have notably improved since the start of testing in each program.

For example, for PIRLS, which tests Year 4 reading skills, the average score for Australian students increased from 527 in 2011 to 544 in 2016 and 540 in 2021 (the difference between 2016 and 2021 is negligible).

Since NAPLAN testing began in 2008, average Year 3 reading achievement has increased by the equivalent of a full year’s progress.

In high school, students’ NAPLAN and TIMSS results have stayed largely the same over the same time span.

Helen’s research

Helen's paper explores the assumption that there is a real and significant decline in Australian students' achievement in science. It looks at assessments of students' science literacy, including PISA, TIMSS, NAP-SL and VALID.

NAP-SL has no historical data, but of the other three assessments, a decline is only evident for PISA.

For both TIMSS and VALID, average scores remain stable, though TIMSS reveals improvements during the period in which PISA scores appreciably declined. Analysis of PISA scores for NSW public school students also reveals no decline.

What does this mean?

So when we talk about a "decline" in Australian results, we are really just talking about a decline in PISA results. While these do indeed show a decline, there are other important factors to consider.

First, PISA is one of many assessments taken by Australian students, each providing important but different information about achievement. As research noted in 2023, PISA receives a lot more attention than other international tests. While there is no definitive reason for this, researchers suggest

the OECD purposefully set out to [give it more attention], branding and marketing the study in such a way to maximise media, public and policy attention.

A 2020 paper also noted the "growing body" of criticism around PISA.

This includes doubts over whether PISA actually measures the quality of education systems and learning, or if it measures something distinct from existing tests.

Comparing scores and ranks is also problematic because countries' scores are not exact. For example, in 2018, Australia's reading literacy score (503) was considered not statistically different from the scores of ten other countries, meaning its rank (16th) could potentially be as high as 11 or as low as 21.

Why we should be cautious

Australia needs to be cautious about an over-reliance on PISA results.

For example, last month a report from educational consultancy Learning First called for an overhaul of Australia's science curriculum. In part, it based its argument on "deeply disturbing trends" of "sliding performance" in PISA results.

So we need to be careful about what these results are used for and how they may be used to justify big changes to policy.

Perhaps most importantly, the decline narrative diminishes and minimises the difficult and amazing work teachers do. While improvement should always be on the agenda, we should also celebrate our wins whenever we can. The Conversation

Helen, Senior Lecturer in Science Education, and Sally, Lecturer

This article is republished from The Conversation under a Creative Commons license. Read the original article.


UOW academics exercise academic freedom by providing expert commentary, opinion and analysis on a range of ongoing social issues and current affairs. This expert commentary reflects the views of those individual academics and does not necessarily reflect the views or policy positions of the University of Wollongong.