Beyond the Top 100: Unpacking Australia’s School Rankings

In the latest NAPLAN results, Tasmania, the Australian Capital Territory, and the Northern Territory found themselves without a single school in this year’s coveted and much-reported “Top 100 schools list”. For many in these communities, this revelation has sparked questions about their education systems, the work of teachers, and broader implications of poor literacy and numeracy skills.

In this post, I delve into the nature of lists like the “Top 100 schools” to shed light on whether education ministers, school leaders, and teachers in some states and territories should be losing sleep over this phenomenon.

What Does It Mean to Be in the “Top 100”?

Being ranked among the highest-scoring 100 primary or secondary schools means a school has outperformed the vast majority of Australian schools when you lump together the scores of all its students across all five domains of NAPLAN (i.e., reading, writing, spelling, grammar and punctuation, and numeracy). This does not say a great deal about individual student performance (you can have high-scoring students at any school) but simply reflects a given school’s average score across all tests for all its students.
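To make the aggregation concrete, here’s a minimal sketch with invented numbers (the school names and scores are made up, not real NAPLAN data; real rankings pool students’ scale scores across all five domains):

```python
# Hypothetical scores (NOT real NAPLAN data): each school maps to the pooled
# scores of all its students across all tested domains.
scores = {
    "School A": [520, 480, 510, 495, 530],
    "School B": [610, 590, 605, 615, 600],  # a selective-style cohort
    "School C": [450, 700, 460, 455, 440],  # one standout student, low average
}

def school_average(pooled_scores):
    """Mean of every student's score on every test for one school."""
    return sum(pooled_scores) / len(pooled_scores)

# "Top schools" are simply those with the highest pooled averages.
ranked = sorted(scores, key=lambda s: school_average(scores[s]), reverse=True)
print(ranked)
```

Note how School C’s single 700 cannot lift its low average: the list rewards uniformly high-scoring cohorts, not the presence of individual high achievers.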

As I’ve discussed in research articles and other blog posts (like this one), it’s worth noting that schools with more female students have tended to fare better statistically when you aggregate all students and scores in this way. This is because four of the five NAPLAN domains test literacy skills, and there are substantial gender gaps in favour of girls for all literacy tests (a phenomenon found in countries across the world). More girls generally mean higher average literacy scores for schools, resulting in higher aggregate NAPLAN scores.

So Which Schools Tend to Make the Cut?

Perhaps unsurprisingly, given their populations of ~8.3 million and ~6.8 million respectively, New South Wales and Victoria dominate every year’s “Top 100 schools list”. For 2023, these states accounted for 86 of the top primary schools and 79 of the top secondary schools. Queensland and Western Australia fought for the scraps (with just 6 of the highest-scoring primary schools each, and Queensland having 12 of the highest-scoring secondary schools compared with 8 in WA). South Australia snuck in with a couple of primary schools and one secondary.

Dominating the list each year are elite private schools and selective public schools. Elite private schools can charge upwards of $50k per child per year in NSW and VIC, and this quite remarkable level of resourcing affords smaller class sizes and (as many of their websites suggest) more differentiated and personalised learning. Selective public schools are free to attend but only admit the highest-achieving children from a given community (hence being selective). This enrolment process results in schools made up of essentially only high-achieving children, leading to higher-than-usual results on NAPLAN for these schools when all students are lumped together.

Meanwhile, Tasmania, the Australian Capital Territory, and the Northern Territory have historically struggled to secure spots in such top school lists. This can be explained by various factors, with the most obvious being their minuscule populations (~572k, ~466k, and ~252k respectively), private schools with comparatively low fees and resourcing, and the absence of selective schools that filter in the “best and brightest”.

Tasmania and the Northern Territory also face several contextual challenges, making their absence from top 100 lists easier to understand. With smaller populations come fewer high-performing students pushing up the topmost extreme. Moreover, the prevalence of students attending schools in regional, rural, and remote communities, who statistically perform lower than their urban counterparts, further impacts the overall NAPLAN outcomes of schools in these areas.

Should We Worry About “Top 100 Schools” Lists?

With all this said, should there be concerns everywhere apart from NSW and VIC about lists like this each year? I don’t think so. For a school to make the grade in a region that does not have the most elite of private schools or selective schools would be quite remarkable when you consider the schools with which they are competing.

Personally, I think ACARA’s list of high-performing schools in each state and territory is more useful than the top 100, since this does take into account socio-educational advantage to highlight the schools around the country that are punching above their weight classes when it comes to NAPLAN outcomes.

Given that many education departments, Catholic Education Offices, and independent schools around the country have recently implemented policies that reflect decades of research into effective literacy and numeracy instruction, it will be interesting to see whether the results of these changes flow through to higher NAPLAN scores from next year.

Analysing the 2023 NAPLAN Test Results: A Whole New World of Testing

The release of NAPLAN test results always sparks conversations and debates about the state of education and student performance. The 2023 results are no exception, but this year’s results come with a twist that makes comparisons even more intriguing. With NAPLAN testing shifting two months earlier in the school year (from May to March), it was expected that student results would be lower than in previous years, when students had two additional months of learning before the tests. Of course, it doesn’t make much sense to compare the 2023 results with any of the previous tests (which began in 2008), since we would be comparing ripe apples with slightly less ripe apples. But I decided to go ahead and compare the 2023 results with the 2022 results anyway, just for fun, split up by gender. Some of the findings were definitely unexpected!
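For what it’s worth, the comparison itself is nothing fancy; here’s a toy sketch with invented means (these numbers are placeholders, not ACARA’s actual figures):

```python
# Invented mean scores (NOT real ACARA figures) to show the year-on-year
# comparison: change = 2023 mean minus 2022 mean, per year level and gender.
means_2022 = {("Year 3", "female"): 428.0, ("Year 3", "male"): 410.0,
              ("Year 5", "female"): 509.0, ("Year 5", "male"): 495.0}
means_2023 = {("Year 3", "female"): 421.5, ("Year 3", "male"): 405.0,
              ("Year 5", "female"): 503.0, ("Year 5", "male"): 498.0}

changes = {group: round(means_2023[group] - means_2022[group], 1)
           for group in means_2022}
for (year, gender), change in sorted(changes.items()):
    trend = "up" if change > 0 else "down"
    print(f"{year} {gender}: {change:+.1f} ({trend})")
```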

Reading and Writing: A Mixed Bag

The 2023 reading results for male and female students in every tested year level (i.e., Years 3, 5, 7, and 9) all showed a downward trend compared to the 2022 results. This can likely be attributed to the earlier testing date, with 2023 students having less time to develop reading skills before the test.

There was a dip in writing results among primary school males and females that mirrored this trend, as was expected. However, in a fascinating twist, writing scores actually improved for male and female students in Year 7 and Year 9. In fact, compared to all NAPLAN writing tests since it was modified in 2011, the 2023 scores were the highest ever for Year 7 males and females and Year 9 males, while for Year 9 females it was their second highest. How is this possible? Could something have changed in the marking process? This is unlikely since nothing like this has been discussed by ACARA. Did secondary school students find the 2023 writing prompt easier to address in the limited test time? This is somewhat plausible. Could secondary school students be feeling more positive about NAPLAN testing in March than in May? Without more information, it’s not possible to know what has driven this marked increase in secondary writing test scores. But it’s certainly odd that students with two fewer months of preparation would perform higher on a test that, for all intents and purposes, seems equivalent to all recent writing tests.

Despite the positive news for secondary students, it should be pointed out that Year 9 males are still performing at a level equivalent to Year 7 females, demonstrating a persistent gender gap that merits further investigation. Year 9 males performed more than two years of equivalent learning behind Year 9 females (i.e., 24.12 months – yikes!).

Spelling and Grammar: Heading Down

Like reading, spelling scores were down for males and females in all tested year levels. Again, this was expected given the shift to earlier testing.

Grammar and punctuation results mostly followed the same downward pattern, with Year 3, Year 5, and Year 9 males and females all achieving lower scores than the 2022 students. Strangely, grammar and punctuation scores for Year 7 students of both genders were higher than for the Year 7 students who sat the test in 2022.

As a noteworthy point from the data, Year 7 females outperformed Year 9 males for the first time in any NAPLAN grammar and punctuation test (or any NAPLAN literacy test). This can be explained by the considerable (but expected) decline in Year 9 male scores, while Year 7 female performance was somehow largely consistent with recent years, even with the earlier testing.

Numeracy: A Glint of Improvement

In terms of numeracy, all primary school males and females somehow scored higher than their 2022 counterparts (except for Year 5 females, whose scores in 2023 were slightly down). Surely the numeracy test and its marking procedures haven’t changed, so it’s unclear why there would be such clear improvements. Year 3 males even managed to achieve their highest mean score of any NAPLAN numeracy test to date. With two fewer months of class time! 🤷‍♂️

On the other hand, secondary school students, regardless of gender, scored lower than the 2022 students. But again, this was expected, so no alarm bells yet.

Final Thoughts: Beyond the Numbers

While the 2023 NAPLAN results might not be directly comparable to previous years due to the changed testing timeline, they offer valuable insights into the dynamics of education and student performance. The interplay of gender, year levels, and subject areas provides a rich tapestry of information that policymakers, educators, and researchers can draw from to tailor interventions and strategies.

It was kind of shocking to see that in specific areas, the earlier 2023 testing procedure resulted in higher scores (i.e., secondary writing and primary numeracy). That said, all students would clearly benefit from the additional two months of learning about reading, spelling, grammar, and punctuation.

The 2023 results highlight the importance of considering the broader context surrounding NAPLAN test scores. As we move forward with this whole new world of NAPLAN testing, complete with four shiny new proficiency standards that replace the previous bands, it will be as intriguing as ever to see the rise and fall of student results across the country. These broad pictures of student achievement would not be possible without NAPLAN testing.

Figuring out figurative language in high-scoring narratives

Recently, I started a new research project with four colleagues to investigate the writing choices made by primary and secondary school students who scored highest of all Queensland students on the three most recent NAPLAN writing tests. I have done this sort of research in the past but always focused on successful persuasive writing across the tested year levels (i.e., 3, 5, 7, and 9). For our new project, named NAPtime, we will investigate the narrative writing choices valued by NAPLAN markers for the first time. The Queensland Curriculum and Assessment Authority (caretakers of completed NAPLAN tests up here) granted us access to the 285 highest-scoring Queensland writing samples written for the 2019, 2021, and 2022 NAPLAN tests (i.e., roughly 20-25 samples per year level for the three years of the test). In the next couple of years, my colleagues and I will use a variety of linguistic and rhetorical frameworks to identify patterns in the students’ writing and communicate our findings to the education community.

My first exploration of the successful writing samples will focus on the students’ use of figurative language to entertain their readers. Figurative language choices are often referred to as figurative devices, rhetorical devices, literary devices or figures of speech, and are commonly associated with poetry and argumentation, but high-quality narratives are also overflowing with artful and playful uses of figurative language. In fact, this is often what makes the stories we read so engaging.

Figurative language has been the focus of research and teaching for (literally) thousands of years. The figurative language choices I’ll be looking for in the NAPLAN writing samples were first identified by Aristotle and other rhetoricians way back in Ancient Greece. Classical rhetoricians went on to outline the ins and outs of five canons of rhetoric—Invention, Arrangement, Style, Memory, and Delivery—which included everything a speaker or writer would need to discover, organise, and communicate compelling ideas through spoken and written texts. Of most relevance to our NAPtime research project is the third canon, Style, which concerns how we put the ideas we have into words that are communicated with others. This is the part of classical rhetoric that dealt with figurative language.

Figurative language in the Australian Curriculum: English

It’s quite amazing to see just how much emphasis is given to figurative language in the Australian Curriculum: English. Even a cursory glance will show that this often-underrated aspect of English teaching is given remarkable prominence. Unlike certain other aspects of English that are only dealt with in niche sub-strands of the curriculum, figurative language can be found across all three strands (i.e., Language, Literature, and Literacy), spread across a full eight sub-strands! While figurative language is taught from Year 1 to Year 10, it becomes especially prominent in the secondary school years, where it’s mentioned directly in six content descriptions for each secondary year level (i.e., 7, 8, 9, and 10). In this sense, teaching students to interpret and use figurative language is likely a regular part of every secondary English teacher’s day job.

Despite the wide reach of figurative language, this aspect of English is, arguably, treated in a fairly disjointed manner in the Australian Curriculum: English. Figurative language pops up here, there, and everywhere. It is described as serving many varied functions in different types of texts, such as enhancing and building up layers of meaning; shaping how readers interpret and react to texts; influencing audience emotions, opinions, and preferences; evaluating phenomena; and conveying information and ideas. At times, it is described as a stylistic tool of poetry, songs, and chants; at other times it’s a persuasive tool of argumentation; at other times it’s a literary tool of storytelling. All these uses make figurative language feel a bit like sand slipping through your fingers; nothing really ties it together.

The Australian Curriculum: English refers to 14 figurative devices explicitly (i.e., metaphor, simile, personification, onomatopoeia, assonance, alliteration, hyperbole, idiom, allegory, metonymy, ellipsis, puns, rhetorical questions, and irony). This might seem like a lot, but more than 200 figurative devices have been identified in the writing of William Shakespeare alone (Joseph, 1947)! It would be interesting to know how and why these 14 figurative devices have been named in the curriculum.

Figurative language in the NAPLAN writing tests

Another place educators come across figurative devices is in the NAPLAN writing marking guides. The persuasive writing version of the test includes a criterion named Persuasive devices, which involves “The use of a range of persuasive devices to enhance the writer’s position and persuade the reader” (ACARA, 2013, p. 6). In the glossary of the persuasive writing marking guide, nine figurative devices are mentioned: alliteration, simile, metaphor, personification, idiom, puns, irony, hyperbole, and rhetorical questions. The guide also includes some descriptions of the effects of other figurative devices (e.g., parallelism, anaphora, epistrophe) without mentioning the technical names (e.g., “Words or phrases at the beginning or end of successive clauses or statements” refers to anaphora and epistrophe).

The NAPLAN narrative writing marking guide (ACARA, 2010) drops the Persuasive devices criterion and replaces it with another named Character and setting, which involves “The portrayal and development of character” and “The development of a sense of place, time and atmosphere” (p. 4). Only metaphor and simile are mentioned in the glossary as part of key vocabulary choices, while ellipsis is mentioned as a key resource for building text cohesion.

What can we take from the emphasis on figurative language in these marking guides? It seems the designers of the NAPLAN writing test expect students to use figurative language in both versions, but only the persuasive marking guide really sets markers up to identify the use of specific figurative devices. There is possibly an assumption here that figurative language is more important in persuasive writing than in narrative writing. When you add the Australian Curriculum’s substantial but disjointed emphasis on figurative language into the mix, it’s quite likely that some Australian teachers would feel unsure about which aspects of figurative language to teach, and in which genres.

Our approach in the NAPtime research

Educators and curriculum designers in contemporary settings might get a better grip on figurative devices if we follow the lead of classical rhetoricians who divided them into two categories: schemes and tropes. Both can be described as fundamental to how we put together sentences in written or spoken texts.

Simply put, a scheme (from the Greek word schēma, meaning form or shape) involves changing the usual pattern or arrangement of words in a sentence. A well-known scheme is alliteration, which involves the repetition of initial phonemes in two or more adjacent words, such as when Professor McGonagall from Harry Potter described students as “behaving like a babbling, bumbling band of baboons!”

A trope (from the Greek word tropos, meaning turn) involves changing the normal meaning of words in a sentence. A well-known trope is metaphor, which involves making a comparison between two different things that have something in common, such as when Mrs Dursley from Harry Potter is compared to a crane (i.e., a long-necked bird): “she spent so much of her time craning over garden fences, spying on the neighbours”.

Dividing the 14 figurative devices mentioned in the Australian Curriculum: English and the nine in the NAPLAN persuasive writing marking guide into schemes and tropes shows that these documents strongly favour tropes (i.e., nine tropes vs. three schemes in the curriculum and eight tropes vs. one scheme in the NAPLAN marking guide). A key interest of my research into high-scoring NAPLAN narratives will be to determine how the students used schemes and tropes to entertain their readers, and how well these key policy documents reflect the choices valued in the NAPLAN writing context.

I will pay close attention to the following 19 schemes and 17 tropes that are particularly useful in contemporary writing (Corbett & Connors, 1999). Clearly, this is more than double the number mentioned in the curriculum and NAPLAN, and some may not have been used much or at all by the high-scoring students. It’s also possible that some devices were only used in certain year levels, so there is potential for interesting findings here. If we discover that NAPLAN markers rewarded students for using figurative devices that do not even appear in the key policy documents guiding our teachers, there will be fascinating implications for the usefulness, equity, and ongoing enhancement of these documents.

Without further ado, here is a table of the schemes and tropes that I will look for in my first NAPtime article, with pronunciations, definitions, and examples:

| Scheme | Definition | Example |
|---|---|---|
| Parallelism | Words, word groups, or clauses in a sequence have a similar structure | He enjoyed studying English, history, and science. |
| Isocolon (ī-sō-cō’-lon) | A type of parallelism in which the parallel elements not only share a similar structure but also have the same length, such as the same number of words or even syllables | In this classroom, minds expand, ideas ignite, and knowledge flourishes. |
| Climax | Works together with parallelism; words, word groups, or clauses are arranged to build up in importance or intensity | By the end of the school year, students will be armed with skills, wisdom, and a burning desire to make their mark on the world. |
| Antithesis (an-tith’-ə-sis) | A type of parallelism in which contrasting ideas are placed side by side | Despite the rules and routines, the class had wild bursts of creativity. They seemed to value both conformity and rebellion. |
| Anastrophe (ə-nas’-trə-fē) | The usual word order of a clause or sentence is inverted | A place of endless possibilities, a school is. |
| Parenthesis (pə-ren’-thə-sis) | The insertion of a verbal unit that interrupts the normal flow of a sentence’s structure | A school—with students hurrying between classrooms and the sound of slamming lockers—is a vibrant and dynamic place. |
| Apposition | Placing two elements side by side, where the second serves as an example or modification of the first | The teacher, a tireless advocate for learning, guides the students with dedication and passion. |
| Ellipsis | The intentional omission of a word or words that can be easily understood from the context | You can enter the Year 5 classroom down the corridor, and Year 6 up the stairs. |
| Asyndeton (a-sin’-də-ton) | The purposeful omission of conjunctions between connected clauses | Books, pencils, notebooks, a backpack filled to the brim—all essentials for a day of learning. |
| Polysyndeton (pol-ē-sin’-də-ton) | The purposeful use of many conjunctions | The young student struggled to carry her books and her pens and her laptop and her calculator and her highlighters to class. |
| Alliteration | The repeated use of the same sound at the start of several consecutive words | A boisterous banter of students blended with the rhythmic rattle of rolling backpacks. |
| Assonance | The repeated use of similar vowel sounds in stressed syllables of consecutive words, with different consonant sounds before and after them | The playful students stayed late to engage in debate. |
| Anaphora (ə-naf’-ə-rə) | The repeated use of the same word or words at the start of several consecutive clauses | In this class we pursue our dreams. In this class we discover our potential. In this class we become who we are meant to be. |
| Epistrophe (ə-pis’-trō-fē) | The repeated use of the same word or words at the ends of consecutive clauses | In the classroom, we learn. In the hallways, we learn. In the library and the gym, we learn. Everywhere in this school, we learn. |
| Epanalepsis (ep-ə-nə-lep’-sis) | The repeated use of a word or words at the end of a clause that also appeared at the beginning of the same clause | Learning to write is the most important part of learning. |
| Anadiplosis (an-ə-di-plō’-sis) | The repeated use of the last word of one clause at the beginning of the next clause | Education is the key to unlocking doors, and doors lead to endless possibilities for a life lived well. |
| Antimetabole (an-tē-mə-tab’-ō-lē) | The repeated use of words in successive clauses, but in reverse grammatical order | In this class you will not only learn to read, but you will read to learn. |
| Chiasmus (kī-az’-mus) | The grammatical structure in successive word groups or clauses is reversed | As teachers, we shape our students, but then our students shape us. |
| Polyptoton (pō-lip’-tə-tahn) | The repeated use of words derived from the same root word | The new learnings of the learners helped them learn most of all. |
| Trope | Definition | Example |
|---|---|---|
| Metaphor | The comparison of two different things by implying a connection between them | Schools are fertile gardens where knowledge takes root and young minds can bloom. |
| Simile | The comparison of two different things using ‘like’ or ‘as’ to make the comparison explicit | The children gathered around the teacher, like bees around a hive. |
| Synecdoche (si-nek’-də-kē) | A part of something is used to represent the whole thing | Many hands helped make the school fair a success. |
| Metonymy (mə-tahn’-ə-mē) | The substitution of a word or word group with another that is closely associated with or suggestive of the intended meaning | The pen is mightier than the sword. |
| Pun: Antanaclasis (an-ta-nak’-la-sis) | The intentional use of one word in two or more different senses | If you never learn the content, you’ll never learn to be content. |
| Pun: Paronomasia (par-ə-nō-mā-zha) | The intentional use of words that sound similar but have different meanings | The teacher plainly explained how the plane’s crash was unplanned. |
| Pun: Syllepsis | The intentional use of a word that modifies two or more other words, with each of those words understanding the original word differently | The teacher did not raise her voice or her hopes. |
| Anthimeria | One part of speech is substituted for another | The student papered the hallway with his artistic skills. |
| Periphrasis (pə-rif’-rə-sis) | The use of a descriptive word or word group instead of a proper noun, or the use of a proper noun to refer to a quality or characteristic associated with it | Sarah was crowned the Queen of Knowledge for her amazing academic results. |
| Personification | Giving human qualities or abilities to things that are not human | The library books whispered enticing stories, beckoning the students to embark on magical adventures. |
| Hyperbole (hī-pur’-bə-lē) | The intentional use of exaggerated terms to emphasise meaning | For maths we were forced to sit and work through a thousand complex equations. |
| Litotes (lī’-tə-tēz) | The intentional use of understated terms to minimise meaning | Jim’s performance in the science fair was not unimpressive. |
| Rhetorical question | Posing a question, not to receive an answer, but to express a point indirectly | Can you deny the importance of education in a child’s life? |
| Irony | The use of words in a way that means the opposite of their literal meaning | The 50-page maths assignment was every student’s idea of a fun-filled holiday. |
| Onomatopoeia | The use of a word that imitates the sound it describes | Over the courtyard she clashed and clattered on the way to the classroom. |
| Oxymoron | The combination of two terms that are usually contradictory or opposite to each other | The silent cacophony of the empty classroom filled the air. |
| Paradox | A statement that seems contradictory but holds some truth | The more you learn, the more you realise you don’t know. |

I look forward to letting you know what we find. My hypothesis is that figurative language plays a much larger role in high-scoring narratives than the narrative marking guide suggests. If you are a teacher, how do you currently teach students to understand and use figurative devices in their own writing? Do you think it’s important for narrative writing?

References

Australian Curriculum, Assessment and Reporting Authority. (2010). National Assessment Program – Literacy and Numeracy: Narrative writing marking guide. https://www.nap.edu.au/_resources/2010_marking_guide.pdf

Australian Curriculum, Assessment and Reporting Authority. (2013). National Assessment Program – Literacy and Numeracy: Persuasive writing marking guide. https://www.nap.edu.au/_resources/amended_2013_persuasive_writing_marking_guide_-with_cover.pdf

Corbett, E. P. J., & Connors, R. J. (1999). Classical rhetoric for the modern student (4th ed.). Oxford University Press. 

Joseph, M. (1947). Shakespeare’s use of the arts of language. Columbia University Press.

3 simple ways to fix NAPLAN writing without scrapping the test

Many education stakeholders and social commentators dislike the NAPLAN writing test. They think it (and the whole suite of annual tests) should be scrapped. NAPLAN undeniably influences classroom practices in a large number of Australian schools, and it has also raised stress levels for at least some groups of students and teachers (Gannon, 2019; Hardy & Lewis, 2018; Ryan et al., 2021). These are valid concerns.

But as Australia’s only large-scale standardised assessment of writing, the test has the potential to provide unique and useful insight into the writing development, strengths, and weaknesses of Australia’s primary and secondary school populations (here’s an example). Added to this, the political value of NAPLAN, and the immense time, energy, and money that’s been poured into the tests since 2008 make it unlikely that the tests will be scrapped anytime soon.

Instead of outright scrapping the tests, or keeping them exactly as they are (warts and all), a third option is to make sure the tests are designed and implemented as well as possible to minimise concerns raised since their introduction in 2008. I’ve given the design of the NAPLAN writing test a great deal of thought over the past decade; I’ve even written a PhD about it (sad but true). In this post, I offer 3 simple fixes ACARA can make to improve the writing test while simultaneously addressing concerns expressed by critics.

1. Fix how the NAPLAN writing test assesses different genres

What’s the problem? At present, the NAPLAN writing test requires students to compose either a narrative or a persuasive text each year, giving them 40 minutes to do so.

Why is this a problem? The singular focus on narrative or persuasive writing is potentially problematic for a test designed to provide valid and reliable comparisons between tests over time. Those who have taught narrative and persuasive writing in classrooms will know these genres often require very different linguistic and structural choices to achieve different social purposes. It’s OK to compare them for some criteria, like spelling, but less so for genre-specific criteria. ACARA know this too, because the marking criteria and guide for the narrative version of the test (ACARA, 2010) are not the same as those for the persuasive version (ACARA, 2013). Yet even though the marking criteria for the two versions are not identical, the results are compared as though all students completed the same writing task each year. There is a risk that randomly shifting between these distinct genres leads us to compare apples and oranges.

Also related to genre is the omission of informative texts (e.g., procedures, reports, and explanations) from NAPLAN testing. Approaches to writing instruction like genre-based pedagogy, The Writing Revolution, and SRSD emphasise the importance of writing to inform, and for good reason: personal, professional, and social success in the adult world relies on being able to clearly inform and explain things to others. It’s not ideal that the significant time spent developing students’ informative writing skills across the school years is not currently assessed as part of NAPLAN.

What’s the solution? A better approach would be to replicate how the National Assessment of Educational Progress (NAEP) in the US deals with genres.

How would this improve NAPLAN? In the NAEP, students write two short texts per test instead of one, with these texts potentially requiring students to persuade (i.e., persuasive), explain (i.e., informative), or convey real or imagined experience (i.e., narrative). The NAEP is administered in Years 4, 8, and 12. Matching the typical development of genre knowledge (Christie & Derewianka, 2008), the Year 4 students are most likely to be asked to write narrative and informative texts, while those in Years 8 and 12 are most likely to write informative and persuasive texts. But students in all tested year levels can still be asked to write to persuade, explain, or convey experience, so knowledge about all the genres is developed in classrooms.

Why couldn’t we do something similar with NAPLAN? Including informative texts in our writing test would incentivise the teaching of a fuller suite of genres in the lead-up to NAPLAN each year. Indeed, not including informative texts in NAPLAN is a little baffling, since a large proportion of the writing students do in classrooms is informative.

2. Fix the NAPLAN writing prompt design

What’s the problem? At the start of every NAPLAN writing test, students receive a basic prompt that provides general information about the topic. Here’s an example from the 2014 test, which required a persuasive response:

NAPLAN prompt

As you can see, some useful information is provided about how children can structure their texts (e.g., Start with an introduction) and the sorts of writing choices they might like to make (e.g., choose your words carefully to convince a reader of your opinion). But how useful is this to a student who doesn’t have a lot of background knowledge about laws and rules?

My younger sister was in Year 5 in 2014 and she completed this writing test. She had previously been on two international school trips, and drew on these (and other) experiences to write an argument about raising Australia’s legal drinking age to 21, as it is in the US, and the many ways this would positively impact our society. Perhaps unsurprisingly, my sister achieved at the Year 9 standard for this persuasive text.

Why is this a problem? Compare my sister’s experience with a child from a lower socioeconomic area who had never been out of their local area, let alone Australia. It’s more challenging to suggest how rules or laws in our context should change if you don’t know about how these rules or laws differ in other places. The information provided in the prompt is far less useful if the child does not have adequate background knowledge about the actual topic.

Keeping the topic/prompt secret until the test is meant to make the test fairer for all students, yet differences in children’s life experiences already make a prompt like this one work better for some students than others. As an aside, in 2014 so many primary school students couldn’t answer this prompt that ACARA decided to write separate primary and secondary school prompts from 2015. This changed the test conditions in a considerable way, which might make it harder to reliably compare pre- and post-2014 NAPLAN writing tests, but I digress.

What’s the solution? A fairer approach, particularly for a prompt requiring a persuasive response, would be to provide students with selected information and evidence for both sides of an issue and give them time to read through these resources. Students could then integrate evidence and expert opinion from their chosen side into their arguments (a fascinating process known as discourse synthesis, which I’d like to write about another time). Students could still argue whatever they liked about the issue at stake, but Johnny, who has never been outside his local area, would at least have some information on which to base his ideas. Plus, we could make the whole experience more efficient by using the same supporting materials to test students’ reading skills in the NAPLAN reading test.

How would this improve NAPLAN? Supporting information for the persuasive writing test (and the informative writing test if we can add that family of genres) would not need to be long: even just a paragraph of evidence on both sides would offer plenty for students to synthesise into their texts. We know that the test conditions and criteria influence what’s taught in classrooms, so there’s an opportunity to promote writing practices that will set students up for success in upper secondary and (for those interested) higher education contexts.

At the moment, students rarely include any evidence in their NAPLAN writing, even high-scoring students. Introducing some supporting information might help our students to get away from forming arguments based on their gut reactions (the kinds of arguments we encounter on social media).

3. Fix how the writing test positions students to address audiences

What’s the problem? Since 2008, NAPLAN writing prompts have had nothing to say about audience. Nothing in the prompt wording positions students to consider or articulate exactly who their audience is. Despite this, students’ capacity to orient, engage, and affect (for narrative) or persuade (for persuasive) the audience is one of the marking criteria. Put another way, we currently assess students’ ability to address the needs of an audience without the marker (or perhaps even the student) explicitly knowing who that audience is.

Why is this a problem? The lack of a specified audience leads many students to just start writing their narratives or persuasive texts without a clear sense of who will (hypothetically speaking) read their work. This isn’t ideal because the writing choices that make texts entertaining or persuasive are dependent on the audience. This has been acknowledged as a key aspect of writing since at least Aristotle way back in Ancient Greece.

Imagine you have to write a narrative on the topic of climate change. Knowing who you are writing for will influence how you write the text. Is the audience younger or older? Are they male or female? Do they like action, romance, mystery, drama, sports-related stories, funny stories, sad stories, or some other kind of story? Do they have a wide and deep vocabulary or a narrow and shallow one? Many other factors could be listed here, and all of them point to the fact that the linguistic and structural choices we make when writing a given genre depend on the audience. Yet the current design of NAPLAN writing prompts offers no guidance about the audience at all.

Others have noticed that this is a problem. In a recent report about student performance on the NAPLAN writing test, the Australian Education Research Organisation (AERO, 2022) described the Audience criterion as one of five that should be prioritised in classroom writing instruction. They argued: “To be able to write to a specific audience needs explicit teaching through modelling, and an understanding of what type of language is most appropriate for the audience” (p. 70). But how can the marker know whether a student’s writing choices are appropriate if no audience is defined?

What’s the solution? A simple fix would be to give students information about the audience they are meant to entertain, persuade, and/or inform. This is, again, how the NAEP in the US handles things, requiring that the “writing task specify or clearly imply an audience” (National Assessment Governing Board, 2010, p. vi). Audiences for the NAEP are specified by the context of the writing task and must be age- and grade-appropriate, familiar to students, and consistent with the purpose identified in the writing task (e.g., to entertain) (see here for more). Another fix would be to ask students to select their own audience and record it somewhere above their response to the prompt.

How would this improve NAPLAN? Having more clarity around the intended audience of a written piece would position students to tailor their writing to suit specific reader needs. This would allow markers to make more accurate judgements about a child’s ability to orient the audience. If this isn’t fixed, markers will continue having to guess at who the student was intending to entertain or persuade.

Boy writing

Righting the writing test

Would these changes make the NAPLAN writing test 100% perfect? Well, no. There would still be questions about the weighting of certain criteria, the benefit/cost ratio of publicly available school results through MySchool, and other perceived issues (if anyone out there finds this interesting, I’d like to write about 3 more possible fixes in a future post). But the simple fixes outlined here would address several concerns that have plagued the writing test since 2008. This would influence the teaching of writing in positive ways and make for a more reliable and meaningful national test. The NAPLAN writing test isn’t going anywhere, so let’s act on what we’ve learnt from over a decade of testing (and from writing tests in other countries) to make it the best writing test it can be.

Seems pretty persuasive to me.

References

Australian Curriculum, Assessment & Reporting Authority. (2010). National Assessment Program – Literacy and Numeracy – Writing: Narrative marking guide. https://www.nap.edu.au/_resources/2010_Marking_Guide.pdf

Australian Curriculum, Assessment & Reporting Authority. (2013). NAPLAN persuasive writing marking guide. https://www.nap.edu.au/resources/Amended_2013_Persuasive_Writing_Marking_Guide-With_cover.pdf

Australian Education Research Organisation. (2022). Writing development: What does a decade of NAPLAN data reveal? https://www.edresearch.edu.au/resources/writing-development-what-does-decade-naplan-data-reveal/writing-development-what-does-decade-naplan-data-reveal-full-summary

Christie, F., & Derewianka, B. (2008). School discourse. Continuum Publishing Group.

Gannon, S. (2019). Teaching writing in the NAPLAN era: The experiences of secondary English teachers. English in Australia, 54(2), 43–56.

Hardy, I., & Lewis, S. (2018). Visibility, invisibility, and visualisation: The danger of school performance data. Pedagogy, Culture & Society, 26(2), 233–248. https://doi.org/10.1080/14681366.2017.1380073

Ryan, M., Khosronejad, M., Barton, G., Kervin, L., & Myhill, D. (2021). A reflexive approach to teaching writing: Enablements and constraints in primary school classrooms. Written Communication. https://doi.org/10.1177/07410883211005558

Gender gaps in literacy and numeracy

This post provides the key points of a journal article I recently had published in The Australian Educational Researcher, co-written with Belinda Hopwood (UTAS), Vesife Hatisaru (ECU), and David Hicks (UTAS). You can read the whole article here.

boy and girl reading
Girls better at literacy, boys better at numeracy?

In recent years, there has been increased attention on gender gaps in literacy and numeracy achievement. This is due in part to international assessments of students’ reading achievement such as PIRLS and PISA (Lynn & Mikk, 2009), which have found that gender differences in reading are universal, with girls from all participating countries significantly and meaningfully outperforming boys. Previous research has shown that girls score higher on reading tests and are more likely to be in advanced reading groups at school (Hek et al., 2019), while students who fall below the minimum standards for reading are more likely to be boys (Reilly et al., 2019). Large-scale assessments of numeracy have produced similarly consistent results, though with boys outperforming girls. So, what’s the situation in Australia?

Recently, three colleagues and I examined what 13 years of NAPLAN reading and numeracy testing shows about boys’ and girls’ performance in the Australian context. What has been lacking from international research is a clear picture of how reading and numeracy gender gaps widen or narrow across the primary and secondary school years. To provide this picture, we drew on publicly available results from the NAPLAN website (ACARA, 2021) and the Grattan Institute’s (Goss & Sonnemann, 2018) equivalent year level technique.

Findings: Gender gap in reading

We looked at the average reading performance of boys and girls across the four tested year levels of NAPLAN (i.e., Years 3, 5, 7, and 9) between 2008 and 2021. Girls improved consistently from Year 3 to Year 9, with approximately two years of progress made between each test. Boys progressed to a similar extent between Years 3 and 5, yet they fell behind the girls at a faster rate between Year 5 and Year 7. Specifically, boys made 1.95 years of progress between Year 3 and Year 5 and 1.92 years between Year 7 and Year 9, but only managed 1.73 years of progress between Year 5 and Year 7 (i.e., in the transition between primary and secondary school).

letters

The average gap between boys and girls widened at each tested year level: Year 3 males were around 4 months of equivalent learning behind Year 3 females, Year 5 males 5 months behind, Year 7 males 7 months behind, and Year 9 males around 10 months behind Year 9 females. While boys made more progress between Year 7 and Year 9, this was also when girls made their most progress. In short, boys keep up with girls reasonably well in the primary school years, but more boys struggle with reading as they transition into secondary school.
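The arithmetic behind these widening gaps can be sketched in a few lines of Python. This is an illustration only: girls’ progress is assumed to be a flat 2.0 equivalent years per interval (the post says “approximately two years”), so the computed gaps won’t exactly match the reported 4/5/7/10-month figures, but the pattern holds, with the sharpest widening between Years 5 and 7.

```python
# Rough sketch of how the reading gender gap accumulates, using the
# approximate progress figures reported above. Girls' progress is
# assumed to be a flat 2.0 equivalent years per interval; boys' figures
# are as reported in the post.
girls_progress = [2.0, 2.0, 2.0]    # Y3->Y5, Y5->Y7, Y7->Y9
boys_progress = [1.95, 1.73, 1.92]

gap_months = 4.0  # starting gap at Year 3 (~4 months of learning)
gaps = [gap_months]
for g, b in zip(girls_progress, boys_progress):
    gap_months += (g - b) * 12  # convert the progress shortfall to months
    gaps.append(round(gap_months, 2))

print(gaps)  # [4.0, 4.6, 7.84, 8.8] -- widens fastest in the Y5->Y7 step
```

Because the inputs are rounded, the outputs undershoot the reported gaps slightly, but the shape of the curve matches the findings: a modest gap through primary school, then a jump across the primary-to-secondary transition.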

Findings: Gender gap in numeracy

The overall picture for numeracy is similar to reading, though with boys outperforming girls and the gender gap increasing at each tested year level. Boys made approximately two years of progress between each numeracy test, while girls consistently made just over 1.8 years, producing a gender gap that widened at a steady rate over time.

What about the writing gender gap?

In 2020, I conducted a similar study that looked into the NAPLAN writing results, finding that on average, boys performed around 8 months of equivalent learning behind girls in Year 3, a full year of learning behind in Year 5, 20 months behind in Year 7, and a little over two years of learning behind in Year 9. Boys fell further behind girls with writing at every tested year level, yet the rate at which girls outperformed boys was greatest between Years 5 and 7. Our study into reading and numeracy has found that similar gaps exist in these domains too, though not to the same extent as writing. For ease of comparison, the following graph shows the extent and development of the gender gaps in numeracy, reading, and writing.

gender gaps
Gender gaps in numeracy, reading, and writing (2008-2021)

Why do more boys struggle with literacy as they transition into secondary school? For most Australian students, Year 7 is when they move physically from a primary school campus to a secondary school campus. This transition has been shown to impact student reading achievement, particularly for boys (Hopwood et al., 2017). For some students, reading achievement stalls in this transition or, in serious cases, declines to levels below those of their primary school years (Hanewald, 2013). Some students entering secondary school have failed to acquire in primary school the basic reading skills required for secondary school learning (Lonsdale & McCurry, 2004), stifling their future reading development (Culican, 2005). The secondary school curriculum is more demanding, and students are expected to be independent readers, able to decode and comprehend a range of complex texts (Duke et al., 2011; Hay, 2014). As Heller and Greenleaf (2007) argued, schools cannot settle for a modest level of reading instruction, given the importance of reading for education, work, and civic engagement. We need to know more about why this stage of schooling is difficult for many boys and how they can be better supported.

The analysis of the numeracy gender gap was quite different from both the reading and writing results. While previous international studies have suggested that the gender gap in numeracy only becomes apparent in secondary school (Heyman & Legare, 2004), this study showed that average scores for boys were higher than those of girls on every NAPLAN numeracy test, though to a lesser extent than the other domains. The widest numeracy gender gap of a little over 6 months of learning in Year 9 was smaller than the smallest writing gender gap of 8 months in Year 3.

Implications of gender gaps in literacy and numeracy

The findings suggest links between reading and writing development, in that more boys struggle with both aspects of literacy in the transition from primary to secondary school. While other researchers have examined the numeracy gap over time using NAPLAN scale scores (e.g., Leder & Forgasz, 2018), using equivalent year level values has allowed us to show how the gender gap widens gradually from roughly 2 months of learning in Year 3 to 6 months in Year 9. This supports the general argument that, on average, boys outperform girls in numeracy and girls outperform boys in literacy tests, but it also shows that the gaps are far from equal in size.

References

Australian Curriculum, Assessment and Reporting Authority. (2021). NAPLAN national report for 2021. https://bit.ly/3q6NaC4

Culican, S. J. (2005). Learning to read: Reading to learn – A middle years literacy intervention research project. Final Report 2003–4. Catholic Education Office.

Goss, P., & Sonnemann, J. (2018). Measuring student progress: A state-by-state report card. Grattan Institute. https://bit.ly/2UVNxy5

Hanewald, R. (2013). Transition between primary and secondary school: Why it is important and how it can be supported. Australian Journal of Teacher Education, 38(1), 62–74.

Hek, M., Buchman, C., & Kraaykamp, G. (2019). Educational systems and gender differences in reading: A comparative multilevel analysis. European Sociological Review, 35(2), 169–186.

Heller, R., & Greenleaf, C. (2007). Literacy instruction in the content areas: Getting to the core of middle and high school improvement. Alliance for Excellent Education.

Heyman, G. D., & Legare, C. H. (2004). Children’s beliefs about gender differences in the academic and social domains. Sex Roles, 50(3/4), 227–236. https://doi.org/10.1023/B:SERS.0000015554.12336.30

Hopwood, B., Hay, I., & Dyment, J. (2017). Students’ reading achievement during the transition from primary to secondary school. Australian Journal of Language and Literacy, 40(1), 46–58.

Leder, G. C., & Forgasz, H. (2018). Measuring who counts: Gender and mathematics assessment. ZDM, 50, 687–697. https://doi.org/10.1007/s11858-018-0939-z

Lonsdale, M., & McCurry, D. (2004). Literacy in the new millennium. Australian Government, Department of Education, Science and Training.

Lynn, R., & Mikk, J. (2009). Sex differences in reading achievement. Trames, 13, 3–13.

Reilly, D., Neuman, D., & Andrews, G. (2019). Gender differences in reading and writing achievement: Evidence from the National Assessment of Educational Progress (NAEP). American Psychologist, 74(4), 445–458.

NAPLAN 2021: Making sense of the reading, numeracy, and writing results

The full NAPLAN results for 2021 were released by ACARA today. There were concerns that student performance would be negatively impacted by COVID, but my analysis of gender differences suggests there is a LOT to be optimistic about, particularly for primary school leaders, teachers, students, and parents.

(NOTE: To calculate months of equivalent learning, I used the Grattan Institute’s Equivalent Year Levels approach, which you can read about here.)

YEAR 3: For reading, Year 3 boys and girls performed better than on any previous NAPLAN test. The gender gap was also the widest ever at 5.16 months of equivalent learning in favour of girls. For numeracy, the Year 3 gender gap was the widest of any test to date in favour of boys at 2.52 months. For writing, boys and girls performed better than on any previous NAPLAN test. The gender gap was the same as last year at 7.2 months in favour of girls.

YEAR 5: For reading, boys had their equal best performance on any test and girls did their best ever. The gender gap was the largest ever for reading at 5.76 months in favour of girls. For numeracy, boys had their equal best performance while girls were similar to 2019, leading to the widest ever gender gap of 4.68 months in favour of boys. For writing, boys had their best performance since 2010 and girls their best since 2015. The gender gap was the lowest ever at 9.72 months in favour of girls.

YEAR 7: For reading, boys and girls were down slightly from last year. The gender gap was 8.04 months in favour of girls. For numeracy, boys had their equal second-best performance while girls were down slightly. The gender gap was 5.52 months in favour of boys. For writing, boys and girls had their best performance since 2011. The gender gap was the second-lowest at 18.12 months in favour of girls.

YEAR 9: For reading, boys and girls performed lower than in 2019. The gender gap was 9.96 months. For numeracy, boys and girls were down from last year. The gender gap was 5.64 months. For writing, boys had their best performance since 2011, and girls performed higher than in 2018 and 2019. The gender gap was the second-lowest ever at 20.52 months.
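For readers wondering where figures like “5.16 months” come from: assuming each gap is simply the difference in Grattan equivalent year levels (EYL) multiplied by 12, the conversion is a one-liner. The mapping from NAPLAN scale scores to EYL is the Grattan Institute’s and isn’t reproduced here; the EYL values below are illustrative, not actual 2021 results.

```python
# Sketch of the months-of-learning conversion, assuming a gap is the
# difference in Grattan equivalent year levels (EYL) multiplied by 12.
# The EYL values passed in below are illustrative, not real 2021 data.
def gap_in_months(eyl_ahead: float, eyl_behind: float) -> float:
    """Convert an equivalent-year-level difference into months of learning."""
    return round((eyl_ahead - eyl_behind) * 12, 2)

# A 0.43 EYL difference corresponds to the 5.16-month Year 3 reading gap
print(gap_in_months(3.43, 3.00))  # 5.16
# A 0.83 EYL difference corresponds to the 9.96-month Year 9 reading gap
print(gap_in_months(9.83, 9.00))  # 9.96
```

The conversion itself is trivial; the substantive work sits in the scale-score-to-EYL mapping, which is why the Grattan report is worth reading in full.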

READING SUMMARY: Outstanding outcomes for primary students with their best ever performances on any NAPLAN reading test. Secondary reading was down from recent tests. With increased performance, the gender gap appears to be widening at the primary end.

NUMERACY SUMMARY: Primary school boys and girls did reasonably well on the numeracy test. Other than Year 7 boys, numeracy performance was down for secondary school students compared to recent tests. The gender gap appears to be widening at the primary end in favour of boys, though the gap is still considerably smaller than for reading and writing.

WRITING SUMMARY: Outstanding outcomes for primary and secondary boys and girls, with notable improvement over previous tests. The gender gap in favour of girls appears to be closing at all year levels but is still considerably wider than in any other NAPLAN domain.

Key messages to take from the 2021 NAPLAN tests

Something is clearly working in Australia’s primary schools, particularly when thinking about reading and writing. At the primary end, the gender gaps are widening for reading and numeracy and closing for writing. As has been the case in all NAPLAN tests, girls are ahead on the literacy tests and boys are ahead on the numeracy test. The widest gender gap is still clearly associated with the writing test, with girls performing 7.2 months ahead in Year 3, 9.72 months in Year 5, 18.12 months in Year 7, and (a still concerning) 20.52 months in Year 9! Boys appear to be struggling to keep up with the increased writing demands in the transition from primary to secondary school.

While secondary students’ writing performance was higher than in previous tests, their reading and numeracy performances were down. In this sense, NAPLAN for 2021 might be a cause for celebration in primary schools and a cause for reflection in secondary schools.