(Disclaimer: AI was not used in the planning, writing, or editing of this blog post)
I spent a couple of days last week working with the LANTITE Expert Group, and it occurred to me that many of you might like to know what happens behind the scenes in the land of LANTITE test preparation. The Expert Group brings together individuals from various educational contexts to review the newest bank of test items preservice teachers will answer.
Below, I briefly introduce LANTITE, explain its significance in the Australian education landscape, and step you through the test experience. I then outline what the review process involves and finish with some personal insights based on my involvement since joining the University of Queensland in late 2021.
What is the LANTITE?
LANTITE stands for Literacy and Numeracy Test for Initial Teacher Education. It is administered by the team at ACER. The test includes a literacy component and a numeracy component, both of which must be passed by every beginning teacher before they can graduate. The objective is to ensure new teachers are within the top 30% of the Australian population for literacy and numeracy skills, and the test is the benchmark used to determine whether that standard is being met. I’ve written once before about LANTITE and its role in ensuring new teachers’ suitability for the profession.
Sounds interesting, but what does the test involve?
The test consists of 130 items (65 each for literacy and numeracy). For literacy, two-thirds of the items test reading comprehension, while the remaining third focuses on technical skills of writing, including spelling, syntax, grammar, word usage, and text organisation. If you’re interested in digging into the nitty-gritty details, the LANTITE Assessment Framework explains all aspects of the test fully.
What’s it like to sit the LANTITE test?
Imagine you’re a preservice teacher in 2026. Alongside your university assignments, you’re sitting this high-stakes test online. Almost all the items are multiple-choice and based on a stimulus text. Most of these texts reflect what you’d actually encounter in a school, like a swimming carnival timetable or a draft letter to parents.
How hard is LANTITE? Well, some of the items are quite simple, requiring you to access and retrieve one or more pieces of information in the stimulus text. Slightly trickier items ask you to integrate and interpret, where you relate more than one part of the text to each other, infer meanings, and understand the text as a whole. The hardest questions require you to reflect and evaluate, relating parts of the text to your external knowledge, ideas, or values.
It would be good to see an example, I hear you say. Well, here’s one that ACER lists in a LANTITE practice test on their website:
Q26. Duty of Care Policy
The sentence that follows relates to a school’s draft duty of care policy.
Be alert and vigilant and intervene immediately if potentially dangerous behaviour is observed in the playground.
The policy writer wants to encourage swift action.
To emphasise this, which word should be bolded?
A. alert
B. dangerous
C. potentially
D. immediately
Reviewing the newest set of LANTITE test questions – Testing the test
There’s a common myth in education that tests like LANTITE or NAPLAN are thrown together over a weekend. After working through the LANTITE review process for the last two years, I can assure you that the validation process is very rigorous.
Step 1. Development: ACER test developers create items based on strict parameters, which are then interrogated in internal panels.
Step 2. Internal audits: Separate teams conduct “fresh eyes” reviews to check for errors, wording, and image accuracy.
Step 3. The Expert Group: This is where people like me come in. A group of practising teachers, government representatives, and academics scrutinise the refined questions. We complete the questions as if we were the students, flagging any item that might be ambiguous or unfair.
Step 4. Final refinement: Following our feedback, the questions undergo further statistical analysis, trial testing, and formal proofreading.
The goal of this multi-step process is to ensure the test remains “relevant, fair, valid and reliable” (ACER, 2026, para. 1).
Some personal thoughts about LANTITE
In my time at UQ, I’ve walked many preservice teachers through these requirements. Most pass the first time without issue, but it’s not smooth sailing for everyone. I’ve worked one-on-one with some who struggle due to test anxiety, learning disabilities like dyslexia, or having a language background other than English. Since LANTITE is a prerequisite for graduation and involves a fee for each attempt, the emotional and financial stakes are high for this minority of students. But, in an era where Generative AI makes it harder to trust the outcomes of university assessments (as this report in The Australian newspaper explained), having a controlled, external assurance of a beginning teacher’s foundational literacy and numeracy skills is arguably more important than ever.
I appreciate the opportunity to play a small role in a much bigger process, and I hope it’s been interesting for you to learn a little more about the inner workings of LANTITE. If you have any questions or views about LANTITE, let me know in the comments below, and I’ll be happy to respond. From where I sit, the rigorous item creation and review process helps LANTITE pass the test, ensuring that the new teachers coming through are prepared for the literacy and numeracy demands of classroom teaching.
The release conveys some clear messages about what AERO thinks are priorities in the teaching of writing. While there are some obvious gaps that the AERO team is quite likely to fill in the future, this sudden release of practical, evidence-informed resources is a significant contribution, and the team who produced the materials and the teachers who helped trial and refine them should be congratulated. I hope the teaching community embraces this opportunity to foster effective writing at the whole-school, classroom, and individual-student levels.
In this post, I summarise five aspects of the release that I found particularly wonderful, and I finish with some aspects that I hope are developed further in the days and weeks ahead.
1. Giving this amount of attention to writing is wonderful
It’s well-known in the education research world that writing has received far less attention than reading (e.g., Graham, 2020; Weekes & Jones, 2021). Perhaps relatedly, most of the recent changes in school literacy have related to reading instruction. Of course, we need to continue focusing on and improving reading, but equally important is how we think about and teach writing. Effective writing instruction is critical at a time when AI is (deep breath) threatening the future of writing development and collective human thought.
AERO’s resources aim to foster understandings about writing that can be shared by all teachers at the same school. Whole-school approaches to the teaching of reading seem to be working well; it’s time we achieved the same in writing. Given the wide variety of distinct approaches to teaching writing, and the relative lack of compelling evidence supporting the popular ones, AERO’s release may be just what we need to make this happen.
2. Emphasising and supporting writing beyond the English classroom is wonderful
Perhaps the main message in AERO’s release (and the surrounding media coverage) has been the importance of fostering writing skills in every teaching area/subject. There is a secondary school flavour to these resources, and this makes sense given that primary school teachers can’t avoid teaching writing across the teaching areas (in the areas they teach, at least).
I can think of a predictable kneejerk reaction to the idea that every secondary teacher should be an expert teacher of writing. This is the obvious though problematic thought that writing instruction is the work of the English teacher, and that since other teachers already have so many things to be teaching, they don’t have time to do the English teacher’s job as well. There is truth to the idea that teachers are overwhelmed with things to do and teach, but as I’ve argued to preservice teachers for more than a decade, writing (i.e., communicating understandings and thoughts through written words) is fundamental to teaching and learning in every area.
There are genres of writing that are only important in teaching areas outside English, and there’s almost no chance that English teachers are preparing students to write those genres. If science teachers and geography teachers and psychology teachers and all the other kinds of teachers don’t teach students to write in the ways that matter in their specific subjects, what and how the students write in these subjects is unlikely to be very impressive. It would be unreasonable to expect students to write in the ways that matter in a given subject without them having clear instructions on how to achieve this. Writing is fundamental to teaching and learning, so it’s great to see the emphasis on building teacher knowledge about language and genre collectively.
Now, of course, there are many teachers in teaching areas other than English who already teach students to write, but I think what the AERO materials do well is promote a future where every teacher is at least familiar with the same basic metalanguage for writing – that is, the same basic language for talking about our writing choices. We don’t want to give students mixed messages about aspects of writing when learning to write is already so challenging for many of them. While I think we need to add more to AERO’s resources over time so that we are teaching the richness of written language needed to communicate in all prominent school-based contexts, the push for this to be something every teacher can own and share and develop together is exactly what we need.
3. Supporting schools to go deeper when unpacking and using NAPLAN results is wonderful
Sad as it sounds, I’ve been researching NAPLAN-related topics for my whole academic career – in fact, my PhD was about understanding the writing choices of students who scored highest on the 2011 NAPLAN writing test (in the same PhD I critiqued the persuasive writing test for several of its issues). Over the years, I’ve taken an interest in asking practising teachers and school leaders how they make use of NAPLAN, and their responses have never been particularly positive. Many schools have NAPLAN scores baked into their school improvement plans, and they do generally consider broad patterns in NAPLAN results (e.g., “We’re down a bit in spelling this year. Let’s do some spelling PD”). But given our country’s immense investment in NAPLAN each year, it’s surprising that we don’t get more out of it for our students.
From the start, NAPLAN has had two key aims: First, to drive school improvement; and second, to increase the accountability of our education systems. Even the most ardent NAPLAN hater must admit that it has achieved its second aim. Essentially everyone in Australia knows what NAPLAN is, and the media makes sure we all hear about what’s happening in NAPLAN results at least a couple of times each year. NAPLAN is definitely used as a political tool and as both a stick and a carrot for schools, but it’s not really used to enhance the education we offer our students in a compelling way. Despite the stated aim of driving school improvement, this is just not what it’s been used for until now (generally speaking – for a great article that shares how certain schools get value from NAPLAN, see Jackson, 2022).
Having a clear focus on how school leaders and teachers can think about and use NAPLAN results to improve writing instruction and student writing outcomes is a great aspect of the AERO release. Fewer people will have issues with NAPLAN if it can be used to meaningfully improve educational outcomes for learners, and this treatment of NAPLAN by AERO suggests some new directions that, in my opinion, are worth a try. In my writing about NAPLAN, my intention has always been for us to make the annual tests as good as they can be, informing where we prioritise interventions and improving learning, but without NAPLAN overstepping its welcome in classrooms. It will be good to hear how schools go with the suggestions AERO is making here.
4. The resources AERO has included in the release are wonderful
I’ve given it a couple of days of thought, and while I’d still like to dig a bit deeper with AERO’s writing instruction model, on the whole, I think it’s a great first attempt.
As a fan of genre-based pedagogy as an approach for teaching writing, and someone who’s promoted the similar teaching learning cycle as an instructional model for writing in my work with preservice teachers, my approval of AERO’s model is perhaps unsurprising when they state that it’s based on established evidence from the field of applied linguistics (AERO cites Callaghan & Rothery, 1988; Derewianka, 2020; and Humphrey & Feez, 2016 – all key promoters of the teaching learning cycle). It’s great to have a clear emphasis on explicit instruction and on the need to get down to the nuts and bolts of written texts (i.e., the language and genre features).
While all of what’s included is, in my view, necessary and exciting, I do feel there’s room for another petal in the model focusing on the more mechanical aspects of writing, like handwriting, keyboarding, and spelling. The team at AERO might argue that these aspects are part of the language skills covered in the first petal. But even without the mechanics, what’s included lines up well with what we know about effective writing instruction, as summarised in AERO’s own writing instruction literature review.
Beyond the model itself, I had fun completing the short courses (yep, I’m a nerd). The sentence structure and grammar course teaches ideas that are commonly taught to preservice teachers in many/most Australian universities. They certainly get this with me at UQ, though I like to include both traditional and functional grammar terms (e.g., noun groups and participants) because I’ve found the functional terms offer great assistance to students who are less familiar with traditional grammar (since they are based on meaning). AERO promotes a functional model of language as underpinning all their writing work, so it’s interesting that they haven’t used more of the functional grammar terminology. I think this is probably OK, though, if you treat traditional grammar terms in functional ways (which kind of smuggles in functional grammar without referring to it – a bit like how the Australian Curriculum: English works). This functional approach (arguably) sets our treatment of language/grammar/sentence structure in Australia apart from other countries, including the US and UK. It’s quite a big thing that the AERO team has taken a functional approach, and it ensures that their offerings are compatible with the treatment of language in the Australian Curriculum: English.
The inclusion of a practice resource for leading writing instruction in schools also shows the thoughtfulness of AERO’s approach. Taken together, this release of materials is clearly intended to be grappled with by teams in schools, not just by individual teachers. This can only increase the impact and sustainability of the work.
5. The lack of focus on AI and writing is wonderful
I’ll keep this section short and sweet! Nothing threatens the future of writing development for students and teachers more than AI. I will leave it to others to argue for the obvious efficiency gains and novel methods of “writing” together with AI. I’ve decided to take the position that we should fight against the integration of AI in our classrooms for the sake of our students’ thinking and writing development. If people want to use AI as adults, they should go for it. But since the human brain continues developing well after students finish school, giving students access to a technology that so easily supplants the need to learn to struggle productively with writing is, to me, a bad move. This is new technology, but there’s increasing evidence that supports my view (e.g., check out the latest studies that have shown heavy AI use massively limits brain activity and problem-solving capacity, or that AI use is making us all think, speak, and write exactly like AI).
I will admit it’s early days with AI; much research could come out in support of AI and writing, and even if AI smashes a person’s ability to write and think, perhaps our future is one where thinking and writing are unimportant (as horrible as that should sound to every teacher).
Anyway, kudos to AERO for sticking with the writing basics!
6. The potential for more is wonderful
I hope this is just the beginning of AERO’s writing release. The materials provided are wonderful in the ways I’ve outlined, and as we move forward, in addition to adding more on writing mechanics, I feel we also need more of an emphasis on the existing genre petal of the model. I did some work with the Department for Education South Australia in the past few years, and they have developed the most amazing genre scopes and sequences. These map out the important school-based genres typically taught across all teaching areas of primary and secondary school, with plenty of suggested discipline-specific topics for teaching those genres and (importantly) guidance on the structural features of genres, which is currently missing from the AERO materials.
In terms of the structure of genres, it’s normal for teachers to teach the key stages that make them up. Less normal is taking the next step to teach about the phases that make up these stages. Let me offer a common primary school example: basic narrative structure includes an orientation stage, a complication stage, a resolution stage, and an (optional) coda stage. These stages are relatively fixed, they are important, and they are useful for students who are first learning to structure narratives. But we can take the next step by also teaching them about the more flexible phases that we use to build these stages. A writer might start a narrative orientation with a character phase (where they introduce the main protagonists). Alternatively, they might start with a setting phase (where they introduce the setting). They might have a phase in their orientation where they allude to the complication that will be the focus of the next stage. The choice of phases is much more flexible than the choice of stages, and it is a big part of what sets different texts in the same genre apart. If teachers in every teaching area knew the genres of writing that matter in their specific discipline, and knew the stages and phases that authors commonly use when writing these texts (and that readers expect), that could definitely improve the situation in schools.
Am I suggesting that genre is completely missing in the AERO materials? No, not exactly. In one of the courses, you eventually come to a point where the sentence-level ideas are distinguished for authors composing imaginative, informative, and persuasive texts (i.e., we learn about sentence-level features that characterise the different text forms). Also, in the paragraph guide, a strong message is conveyed that authors write paragraphs to achieve different purposes, and that if you are explaining or reporting or persuading (or whatever), you will need to change how you write the paragraph.
AERO has seemingly taken a bottom-up approach to genre work. It’s not to say that a bottom-up approach is bad; it’s just different to how genre-based pedagogy often works in classrooms. In genre pedagogy, we often consider the genre first (since this gives you the purpose for writing), and the structural and sentence-level choices are then informed by the purpose (as well as other elements of the context, such as who you are writing for).
But whether a teacher/school wants to go from the bottom up (i.e., sentences, paragraphs, genre), from the top down (i.e., genre, paragraphs, sentences), or to focus on both directions when it makes most sense, we should still hopefully achieve our goals of improved writing outcomes. What matters is that no key aspects of writing are missing. I do think that giving teachers a clear sense of the genres that matter in each teaching area and the structural features (i.e., stages and phases) that make up those genres is the next obvious piece of the AERO writing instruction puzzle. And if anyone from AERO is reading this, why not reach out to our colleagues at the Department for Education South Australia, who have done this exact work recently and effectively?
Let’s all make the chance count
This has become a surprisingly long blog post, so I must have had a lot to say about this release! Overall, we should congratulate the team at AERO and acknowledge that the work has been trialled with teachers and school leaders for at least the last year, so there is some confidence that building teachers’ knowledge of writing in these ways will improve things.
I see a great deal of overlap between what AERO is promoting here and what many/most literacy-focused teacher educators like me already promote in teacher education. This might be in part why Australian teachers felt considerably more prepared by their teacher education to teach writing than teachers in the US, Norway, the Netherlands, Taiwan, and New Zealand (as shown in research by Malpique et al., 2022; Steve Graham and colleagues; and several others).
This might be one of those rare moments where policymakers, researchers, teacher educators, school leaders, teachers, and all education stakeholders can come together to make a difference for our students. Some of us may prefer particular grammar terms over the ones AERO has chosen. Some might want a bigger emphasis on the mechanics of writing or structural features. But it seems like there’s more than enough in common for everyone to get along and push in the same important direction for our students’ sake. This is definitely the right move to make for writing instruction given the undeniable threat posed by AI and the collective (and arguably unique) wisdom we have in Australia on how to teach writing well.
Note: Thanks to Caryn Hellberg (one of my outstanding PhD candidates) who discussed the AERO release with me in the last couple of days. This discussion likely influenced how I’ve thought about the release.
References
Callaghan, M., & Rothery, J. (1988). Teaching factual writing: A genre-based approach. Metropolitan East Disadvantaged Schools Program.
Derewianka, B. (2020). Growing into the complexity of mature academic writing. In H. Chen, D. Myhill & H. Lewis (Eds.), Developing writers across the primary and secondary years (pp. 212–236). Routledge.
Graham, S. (2020). The sciences of reading and writing must become more fully integrated. Reading Research Quarterly, 55(S1), S35–S44. https://doi.org/10.1002/rrq.332
Humphrey, S., & Feez, S. (2016). Direct instruction fit for purpose: Applying a metalinguistic toolkit to enhance creative writing in the early secondary years. Australian Journal of Language and Literacy, 39, 207–219. https://doi.org/10.1007/BF03651974
Jackson, C. J. (2022). The utility of NAPLAN data: Issues of access, use and expertise for teaching and learning. The Australian Journal of Language and Literacy, 45, 141–157. https://doi.org/10.1007/s44020-022-00009-z
Malpique, A., Valcan, D., Pino-Pasternak, D., & Ledger, S. (2022). Teaching writing in primary education (grades 1–6) in Australia: A national survey. Reading and Writing. https://doi.org/10.1007/s11145-022-10294-2
Weekes, T., & Jones, P. (2021). The challenges of mapping literacy development across the years of schooling. Australian Journal of Language and Literacy, 44(2), 11–25.
In a break from the many accreditation jobs I’ve been working through in recent months, I watched a video of David Perell interviewing OpenAI’s Sam Altman. Perell’s YouTube channel is dedicated to adult writing of many varieties, and Altman is the CEO of OpenAI (the company behind ChatGPT) and one of the biggest voices in Generative Artificial Intelligence (GenAI) on the planet. As GenAIs like ChatGPT continue to transform how we write and think in the education world, and how our students demonstrate understanding through writing, how Altman thinks AI will impact writing is relevant for every teacher of writing. You can view the full interview here (it’s very interesting), and in this post, I’ll share some thoughts about GenAI as a person who’s deeply interested in how young people develop writing skills throughout their schooling and beyond.
Altman begins the conversation by describing writing as an extremely important activity because it enables thinking and makes it concrete in a way that speaking cannot. Altman says, “If ChatGPT can allow people to do a writing-like activity and get better quality thinking out of it, that’s wonderful.” This optimistic take on the relationship between people, writing, thinking and GenAI emphasises a key potential affordance of AI for writers: Want to think more deeply and in new ways? Write with GenAI! Sounds pretty appealing, right?
As a positive example, Altman says he recently witnessed a student using ChatGPT to help with their homework. As they worked, the student (apparently) thought, I’m kind of stuck, let me get unblocked and let me generate a bunch of other ideas. In Altman’s words, “The thing that came out was, I think, much better than anything someone would have come up with on their own”. In this example, the possible ideas generated by ChatGPT stepped in to help the student when they were stuck. It prevented them from needing to enter into the struggle of generating their own ideas: a struggle Altman seems to suggest here should be avoided. While it’s possible that “the thing that came out” was in fact better than what this student might have written independently, I’m not sure that this automatically makes the use of GenAI in this scenario a good thing.
Imagine wanting to run a marathon. Instead of training to build up your fitness and stamina, you ask a robot to run the marathon for you. Sounds much easier, and you could even win, but what do you lose in this arrangement? Sure, you might get a medal either way, and the robot might run the race faster for you, but without the struggle, there’s no actual development for you.
A less optimistic take on GenAI and writing
Altman’s positive take on GenAI and writing is at odds with an impressively brief and thought-provoking alternative shared by programmer and essayist Paul Graham in late 2024. For Graham, the impact of GenAI on writing will be profoundly negative, leading to a situation where, in just two decades or so, there won’t be many people who can write. He puts this down to the fundamental difficulty of writing (driven by the fundamental difficulty of thinking clearly): “To write well you have to think clearly, and thinking clearly is hard” (Graham, 2024, para. 3).
Writing well has always been difficult, but the new “escape valve” provided by GenAI means people no longer need to worry about entering into the struggle of writing (and thinking). While every writer you’ve met has likely already experienced the incredible efficiency of GenAI-assisted writing, those who have integrated it deeply into their workflows may not yet have thought deeply about the potential tradeoffs involved. As Graham (2024) argues:
“The result will be a world divided into writes and write-nots. There will still be some people who can write. Some of us like it. But the middle ground between those who are good at writing and those who can’t write at all will disappear. Instead of good writers, ok writers, and people who can’t write, there will just be good writers and people who can’t write” (para. 8).
In a world where many people can’t write, collective human thought would be the major casualty.
During Sam Altman’s interview, I was fascinated to hear him admit to witnessing a much less positive example of a second student completing their homework with ChatGPT:
“One of them basically just put in their thing and (it) wrote the whole essay, and I was like, appalled, because I knew that that was a theoretical thing that people were doing. … To watch someone do that and then get an essay that was, like, bad but passable out of it was, like, a real what have we done moment. It was visceral, in a way, like, I just hadn’t seen someone do it before.”
Teachers are well aware of the risk of students simply using what GenAI spits out as their own work. While there are obvious issues around whose work it is and whether a student like this second one mentioned by Altman actually learned the content, the deeper issue I can see affecting every person who uses GenAI for writing is in its potential to stifle (or downright end) our own writing and thinking development. The struggle of writing that GenAI helps us avoid is a key aspect of learning. Offloading that struggle to a mysterious technological entity is, admittedly, a quick way to get something written, but if we aren’t careful, we may become entirely reliant on it for most of our future writing and thinking.
Here is a final quote from Altman:
“I was reflecting a lot on that (i.e., seeing the two students complete their homework with ChatGPT) and the first question was, like, a bad question, like, if you can just put something in and get a super interesting, or super passable, response, I think we’re just asking people to do the wrong thing. But if it’s something that gets them to want to think about a question differently and use the tool to get somewhere they wouldn’t have got on their own, that’s super interesting.”
We can all make predictions about how this will turn out. Clearly, even the guy running the biggest AI company has a variety of hopes and doubts!
How to protect our writing and thinking skills
From my perspective, the wisest thing to do (no matter who you are) is to keep practising writing without the aid of GenAI. If you use GenAI regularly at work, it might mean setting aside some regular time for your own writing (and maybe some quality reading too, if you’re not reading already). You could start writing a journal of your thoughts, get some ideas down for short stories, play with poetry, or write summaries of your main learning from different events. Like going to the gym or practising for that marathon, this kind of struggle will pay off, and it’s what life is all about.
For me, a key outlet for my own GenAI-free writing is this blog (other than the odd AI-generated image). As you can probably tell from how it’s written, this work does not involve ChatGPT or any other GenAI. Does the writing kind of stink? Sure. And could I write many more posts by offloading the struggle to AI? Right again, but it wouldn’t be the same for you or me. It’s worth thinking about the GenAI-free outlets you can create in your own life.
For those who teach young people to write, I fear we can no longer justify the struggle of learning to write with the argument that, sorry kids, writing is crucial for your future work and there’s no magic get-out-of-jail-free card. There are many advantages of writing (and thinking) that are not easily accomplished in other ways. And there’s a lot at stake for students if they trade thinking and the struggle that generates and enhances it for quick and easy outcomes.
With so many students already finding it challenging to develop writing skills, I don’t think there’s any logical reason to introduce children to GenAI in the primary/elementary years. For those teaching secondary students, if GenAI is already part of your students’ learning, it’s important to consider and communicate with them the significant potential cost involved if they rely on it too much and too often. This is not just about plagiarism and who has done the work; it’s about the development of thinking and learning that is so wonderfully fostered through good old-fashioned writing. Call me a laggard, but the efficiency gains are simply not worth the tradeoff.
The release of NAPLAN test results always sparks conversations and debates about the state of education and student performance. The 2023 results are no exception, but this year’s results come with a twist that makes comparisons even more intriguing. With NAPLAN testing shifting to two months earlier in the school year (from May to March), it was expected that student results would be lower than in previous years (when students had two months of additional learning to influence test preparation). Of course, it doesn’t make much sense to compare the test results in 2023 with any of the previous tests (which began in 2008), since we would be comparing ripe apples with slightly less ripe apples. But I decided to go ahead and compare the 2023 results with the 2022 results anyway, just for fun, split up by gender. Some of the findings were definitely unexpected!
Reading and Writing: A Mixed Bag
The 2023 reading results for male and female students in every tested year level (i.e., Years 3, 5, 7, and 9) all showed a downward trend compared to the 2022 results. This can likely be attributed to the earlier testing date, with 2023 students having less time to develop reading skills before the test.
There was a dip in writing results among primary school males and females that mirrored this trend, as was expected. However, in a fascinating twist, writing scores actually improved for male and female students in Year 7 and Year 9. In fact, compared to all NAPLAN writing tests since it was modified in 2011, the 2023 scores were the highest ever for Year 7 males and females and Year 9 males, while for Year 9 females it was their second highest. How is this possible? Could something have changed in the marking process? This is unlikely since nothing like this has been discussed by ACARA. Did secondary school students find the 2023 writing prompt easier to address in the limited test time? This is somewhat plausible. Could secondary school students be feeling more positive about NAPLAN testing in March than in May? Without more information, it’s not possible to know what has driven this marked increase in secondary writing test scores. But it’s certainly odd that students with two fewer months of preparation would perform higher on a test that, for all intents and purposes, seems equivalent to all recent writing tests.
Despite the positive news for secondary students, it should be pointed out that Year 9 males are still performing at a level equivalent to Year 7 females, demonstrating a persistent gender gap that merits further investigation. Year 9 males performed more than two years of equivalent learning behind Year 9 females (i.e., 24.12 months – yikes!).
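For readers wondering how a scale-score gap turns into "months of equivalent learning", the arithmetic can be sketched as follows. Note the gap and annual-growth figures below are hypothetical placeholders chosen purely for illustration; ACARA derives the actual values from its own scaling of the NAPLAN data.

```python
# Toy conversion of a NAPLAN scale-score gap into "months of equivalent learning".
# Both input values below are hypothetical placeholders, not ACARA's real figures.

def gap_in_months(score_gap: float, annual_growth: float) -> float:
    """Convert a scale-score gap into months, assuming linear growth per school year."""
    return score_gap / annual_growth * 12

# e.g. a hypothetical 40.2-point gap with hypothetical growth of 20 points per year
print(round(gap_in_months(40.2, 20.0), 2))  # → 24.12
```

The same division underlies any "years of learning" claim: the gap is expressed as a fraction of the typical yearly gain, then scaled to months.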
Spelling and Grammar: Heading Down
Like reading, spelling scores were down for males and females in all tested year levels. Again, this was expected given the shift to earlier testing.
Grammar and punctuation results mostly followed the same downward pattern, with Year 3, Year 5, and Year 9 males and females all achieving lower scores than the 2022 students. Strangely, grammar and punctuation scores for Year 7 students of both genders were higher than those of the Year 7 students who sat the test in 2022.
As a noteworthy point from the data, Year 7 females outperformed Year 9 males for the first time in any NAPLAN grammar and punctuation test (or any NAPLAN literacy test). This can be explained by the considerable (but expected) decline in Year 9 male scores, while Year 7 female performance was somehow largely consistent with recent years, even with the earlier testing.
Numeracy: A Glint of Improvement
In terms of numeracy, all primary school males and females somehow scored higher than their 2022 counterparts (except for Year 5 females, whose 2023 scores were slightly down). Surely the numeracy test and its marking procedures haven’t changed, so it’s unclear why there would be clear improvements. Year 3 males managed to achieve their highest mean score of any NAPLAN numeracy test to date. With two fewer months of class time 🤷‍♂️
On the other hand, secondary school students, regardless of gender, scored lower than the 2022 students. But again, this was expected, so no alarm bells yet.
Final Thoughts: Beyond the Numbers
While the 2023 NAPLAN results might not be directly comparable to previous years due to the changed testing timeline, they offer valuable insights into the dynamics of education and student performance. The interplay of gender, year levels, and subject areas provides a rich tapestry of information that policymakers, educators, and researchers can draw from to tailor interventions and strategies.
It was kind of shocking to see that in specific areas, the earlier 2023 testing procedure resulted in higher scores (i.e., secondary writing and primary numeracy). That said, all students would clearly benefit from the additional two months of learning about reading, spelling, grammar, and punctuation.
The 2023 results highlight the importance of considering the broader context surrounding NAPLAN test scores. As we move forward with this whole new world of NAPLAN testing, complete with four shiny new proficiency standards that replace the previous bands, it will be as intriguing as ever to see the rise and fall of student results across the country. These broad pictures of student achievement would not be possible without NAPLAN testing.
Recently, I started a new research project with four colleagues to investigate the writing choices made by primary and secondary school students who scored highest of all Queensland students on the three most recent NAPLAN writing tests. I have done this sort of research in the past but always focused on successful persuasive writing across the tested year levels (i.e., 3, 5, 7, and 9). For our new project, named NAPtime, we will investigate the narrative writing choices valued by NAPLAN markers for the first time. The Queensland Curriculum and Assessment Authority (caretakers of completed NAPLAN tests up here) granted us access to the 285 highest-scoring Queensland writing samples written for the 2019, 2021, and 2022 NAPLAN tests (i.e., roughly 20-25 samples per year level for the three years of the test). In the next couple of years, my colleagues and I will use a variety of linguistic and rhetorical frameworks to identify patterns in the students’ writing and communicate our findings to the education community.
My first exploration of the successful writing samples will focus on the students’ use of figurative language to entertain their readers. Figurative language choices are often referred to as figurative devices, rhetorical devices, literary devices or figures of speech, and are commonly associated with poetry and argumentation, but high-quality narratives are also overflowing with artful and playful uses of figurative language. In fact, this is often what makes the stories we read so engaging.
Figurative language has been the focus of research and teaching for (literally) thousands of years. The figurative language choices I’ll be looking for in the NAPLAN writing samples were identified first by Aristotle and other rhetoricians way back in Ancient Greece. The classical rhetoricians outlined the ins and outs of five canons of rhetoric—Invention, Arrangement, Style, Memory, and Delivery—which included everything a speaker or writer would need to discover, organise, and communicate compelling ideas through spoken and written texts. Of most relevance to our NAPtime research project is the third canon, Style, which concerns how we put the ideas we have into words that are communicated with others. This is the part of classical rhetoric that dealt with figurative language.
Figurative language in the Australian Curriculum: English
It’s quite amazing to see just how much emphasis is given to figurative language in the Australian Curriculum: English. Even a cursory glance will show this is one of the most underrated aspects of English teaching. Unlike certain other aspects of English that are only dealt with in niche sub-strands of the curriculum, figurative language can be found across all three strands (i.e., Language, Literature, and Literacy), spread out across a full eight sub-strands! While figurative language is taught from Year 1 to Year 10, it becomes especially prominent in the secondary school years, where it’s mentioned directly in six content descriptions for each secondary year level (i.e., 7, 8, 9, and 10). In this sense, teaching students to interpret and use figurative language is likely a regular part of every secondary English teacher’s day job.
Despite the wide reach of figurative language, this aspect of English is, arguably, treated in a fairly disjointed manner in the Australian Curriculum: English. Figurative language pops up here, there, and everywhere. It is described as serving many varied functions in different types of texts, such as enhancing and building up layers of meaning; shaping how readers interpret and react to texts; influencing audience emotions, opinions, and preferences; evaluating phenomena; and conveying information and ideas. At times, it is described as a stylistic tool of poetry, songs, and chants; at other times it’s a persuasive tool of argumentation; at other times it’s a literary tool of storytelling. All these uses make figurative language feel a bit like sand slipping through your fingers; nothing really ties it together.
The Australian Curriculum: English refers to 14 figurative devices explicitly (i.e., metaphor, simile, personification, onomatopoeia, assonance, alliteration, hyperbole, idiom, allegory, metonymy, ellipses, puns, rhetorical questions, and irony). This might seem like a lot, but more than 200 figurative devices have been identified in the writing of William Shakespeare alone (Joseph, 1947)! It would be interesting to know how and why these 14 figurative devices have been named in the curriculum.
Figurative language in the NAPLAN writing tests
Another place educators come across figurative devices is in the NAPLAN writing marking guides. The persuasive writing version of the test includes a criterion named Persuasive devices, which involves “The use of a range of persuasive devices to enhance the writer’s position and persuade the reader” (ACARA, 2013, p. 6). In the glossary of the persuasive writing marking guide, nine figurative devices are mentioned: alliteration, simile, metaphor, personification, idiom, puns, irony, hyperbole, and rhetorical questions. The guide also includes some descriptions of the effects of other figurative devices (e.g., parallelism, anaphora, epistrophe) without mentioning the technical names (e.g., “Words or phrases at the beginning or end of successive clauses or statements” refers to anaphora and epistrophe).
The NAPLAN narrative writing marking guide (ACARA, 2010) drops the Persuasive devices criterion and replaces it with another named Character and setting, which involves “The portrayal and development of character” and “The development of a sense of place, time and atmosphere” (p. 4). Only metaphor and simile are mentioned in the glossary as part of key vocabulary choices, while ellipsis is mentioned as a key resource for building text cohesion.
What can we take from the emphasis on figurative language in these marking guides? It seems the designers of the NAPLAN writing test expect students to use figurative language in both versions, but the guides only really set markers up to identify the use of specific figurative devices in the persuasive version. There is possibly an assumption here that figurative language is more important in persuasive writing than in narrative writing. When you add the Australian Curriculum’s substantial but disjointed emphasis on figurative language into the mix, it’s quite likely that some Australian teachers would feel unsure about which aspects of figurative language to teach, and in which genres.
Our approach in the NAPtime research
Educators and curriculum designers in contemporary settings might get a better grip on figurative devices if we follow the lead of classical rhetoricians who divided them into two categories: schemes and tropes. Both can be described as fundamental to how we put together sentences in written or spoken texts.
Simply put, a scheme (from the Greek word schēma, meaning form or shape) involves changing the usual pattern or arrangement of words in a sentence. A well-known scheme is alliteration, which involves the repetition of initial phonemes in two or more adjacent words, such as when Professor McGonagall from Harry Potter described students as “behaving like a babbling, bumbling band of baboons!”
A trope (from the Greek word tropein, meaning to turn) involves changing the normal meaning of words in a sentence. A well-known trope is metaphor, which involves making a comparison between two different things that have something in common, such as when Mrs Dursley from Harry Potter is compared to a crane (i.e., a longnecked bird): “she spent so much of her time craning over garden fences, spying on the neighbours”.
Dividing the 14 figurative devices mentioned in the Australian Curriculum: English and the nine in the NAPLAN persuasive writing marking guide into schemes and tropes shows that these documents strongly favour tropes (i.e., nine tropes vs. three schemes in the curriculum and eight tropes vs. one scheme in the NAPLAN marking guide). A key interest of my research into high-scoring NAPLAN narratives will be to determine how the students used schemes and tropes to entertain their readers, and how well these key policy documents reflect the choices valued in the NAPLAN writing context.
I will pay close attention to the following 19 schemes and 17 tropes that are particularly useful in contemporary writing (Corbett & Connors, 1999). Clearly, this is more than double the number mentioned in the curriculum and NAPLAN, and some may not have been used much or at all by the high-scoring students. It’s also possible that some devices were only used in certain year levels, so there is potential for interesting findings here. If we discover that NAPLAN markers rewarded students for using figurative devices that do not even appear in the key policy documents guiding our teachers, there will be fascinating implications for the usefulness, equity, and ongoing enhancement of these documents.
Without further ado, here is a table of the schemes and tropes that I will look for in my first NAPtime article, with pronunciations, definitions, and examples:
| Scheme | Definition | Example |
| --- | --- | --- |
| Parallelism | Words, word groups, or clauses in a sequence have a similar structure | He enjoyed studying English, history, and science. |
| Isocolon (ī-sō-cō’-lon) | A type of parallelism in which parallel elements not only share a similar structure but also have the same length, such as the same number of words or even syllables | In this classroom, minds expand, ideas ignite, and knowledge flourishes. |
| Climax | Works together with parallelism; occurs when words, word groups, or clauses are arranged to build up in importance or intensity | By the end of the school year, students will be armed with skills, wisdom, and a burning desire to make their mark on the world. |
| Antithesis (an-tith’-ə-sis) | A type of parallelism that places contrasting ideas side by side | Despite the rules and routines, the class had wild bursts of creativity. They seemed to value both conformity and rebellion. |
| Anastrophe (ə-nas’-trə-fē) | Inversion of the usual word order of a clause or sentence | A place of endless possibilities, a school is. |
| Parenthesis (pə-ren’-thə-sis) | The insertion of a verbal unit that interrupts the normal flow of a sentence’s structure | A school—with students hurrying between classrooms and the sound of slamming lockers—is a vibrant and dynamic place. |
| Apposition | Placing two elements side by side, where the second serves as an example or modification of the first | The teacher, a tireless advocate for learning, guides the students with dedication and passion. |
| Ellipsis | The intentional omission of a word or words that can be easily understood from the context | You can enter the Year 5 classroom down the corridor, and Year 6 up the stairs. |
| Asyndeton (a-sin’-də-ton) | The purposeful omission of conjunctions between connected clauses | Books, pencils, notebooks, a backpack filled to the brim—all essentials for a day of learning. |
| Polysyndeton (pol-ē-sin’-də-ton) | The purposeful use of many conjunctions | The young student struggled to carry her books and her pens and her laptop and her calculator and her highlighters to class. |
| Alliteration | The repeated use of the same sound at the start of several consecutive words | A boisterous banter of students blended with the rhythmic rattle of rolling backpacks. |
| Assonance | The repeated use of similar vowel sounds in stressed syllables of consecutive words, with different consonant sounds before and after them | The playful students stayed late to engage in debate. |
| Anaphora (ə-naf’-ə-rə) | The repeated use of the same word or words at the start of several consecutive clauses | In this class we pursue our dreams. In this class we discover our potential. In this class we become who we are meant to be. |
| Epistrophe (ə-pis’-trō-fē) | The repeated use of the same word or words at the ends of consecutive clauses | In the classroom, we learn. In the hallways, we learn. In the library and the gym, we learn. Everywhere in this school, we learn. |
| Epanalepsis (ə-pan-ə-lep’-sis) | The repetition at the end of a clause of a word or words used at the beginning of the same clause | Learning to write is the most important part of learning. |
| Anadiplosis (an-ə-di-plō’-sis) | The repetition of the last word of one clause at the beginning of the next clause | Education is the key to unlocking doors, and doors lead to endless possibilities for a life lived well. |
| Antimetabole (an-tē-mə-tab’-ō-lē) | The repetition of words in successive clauses, but in reverse grammatical order | In this class you will not only learn to read, but you will read to learn. |
| Chiasmus (kī-əz’-mus) | Reversal of the grammatical structure in successive word groups or clauses | As teachers, we shape our students, but then our students shape us. |
| Polyptoton (pō-lip’-tə-tahn) | The repeated use of words derived from the same root word | The new learnings of the learners helped them learn most of all. |
| Trope | Definition | Example |
| --- | --- | --- |
| Metaphor | The comparison of two different things by implying a connection between them | Schools are fertile gardens where knowledge takes root and young minds can bloom. |
| Simile | The comparison of two different things using ‘like’ or ‘as’ to make the comparison explicit | The children gathered around the teacher, like bees around a hive. |
| Synecdoche (si-nek’-də-kē) | When a part of something is used to represent the whole thing | Many hands helped make the school fair a success. |
| Metonymy (mə-tahn’-ə-mē) | The substitution of a word or word group with another that is closely associated with or suggestive of the intended meaning | The pen is mightier than the sword. |
| Pun: Antanaclasis (an-ta-nak’-la-sis) | The intentional use of one word in two or more different senses | If you never learn the content, you’ll never learn to be content. |
| Pun: Paronomasia (par-ə-nō-mā’-zha) | The intentional use of words that sound similar but have different meanings | The teacher plainly explained how the plane’s crash was unplanned. |
| Pun: Syllepsis | The use of a single word to modify two or more other words, with the word understood differently in each case | The teacher did not raise her voice or her hopes. |
| Anthimeria | The substitution of one part of speech for another | The student papered the hallway with his artistic skills. |
| Periphrasis (pə-rif’-rə-sis) | The use of a descriptive word or word group instead of a proper noun, or the use of a proper noun to refer to a quality or characteristic associated with it | Sarah was crowned the Queen of Knowledge for her amazing academic results. |
| Personification | Giving human qualities or abilities to things that are not human | The library books whispered enticing stories, beckoning the students to embark on magical adventures. |
| Hyperbole (hī-pur’-bə-lē) | The intentional use of exaggerated terms to emphasise meaning | For maths we were forced to sit and work through a thousand complex equations. |
| Litotes (lī’-tə-tēz) | The intentional use of understated terms to minimise meaning | Jim’s performance in the science fair was not unimpressive. |
| Rhetorical question | Posing a question, not to receive an answer, but to express a point indirectly | Can you deny the importance of education in a child’s life? |
| Irony | The use of words to mean the opposite of their literal meaning | The 50-page maths assignment was every student’s idea of a fun-filled holiday. |
| Onomatopoeia | The use of a word that imitates the sound it describes | Over the courtyard she clashed and clattered on the way to the classroom. |
| Oxymoron | The combination of two terms that are usually contradictory or opposite to each other | The silent cacophony of the empty classroom filled the air. |
| Paradox | A statement that seems contradictory but holds some truth | The more you learn, the more you realise you don’t know. |
I look forward to letting you know what we find. My hypothesis is that figurative language plays a much larger role in high-scoring narratives than the narrative marking guide suggests. If you are a teacher, how do you currently teach students to understand and use figurative devices in their own writing? Do you think it’s important for narrative writing?
References
Australian Curriculum, Assessment and Reporting Authority. (2010). National Assessment Program – Literacy and Numeracy: Narrative writing marking guide. https://www.nap.edu.au/_resources/2010_marking_guide.pdf
Australian Curriculum, Assessment and Reporting Authority. (2013). National Assessment Program – Literacy and Numeracy: Persuasive writing marking guide. https://www.nap.edu.au/_resources/amended_2013_persuasive_writing_marking_guide_-with_cover.pdf
Corbett, E. P. J., & Connors, R. J. (1999). Classical rhetoric for the modern student (4th ed.). Oxford University Press.
Joseph, M. (1947). Shakespeare’s use of the arts of language. Columbia University Press.
Many education stakeholders and social commentators dislike the NAPLAN writing test. They think it (and the whole suite of annual tests) should be scrapped. NAPLAN undeniably influences classroom practices in a large number of Australian schools, and it’s also raised stress levels for at least some groups of students and teachers (Hardy & Lewis, 2018; Gannon, 2019; Ryan et al., 2021). These are valid concerns.
But as Australia’s only large-scale standardised assessment of writing, the test has the potential to provide unique and useful insight into the writing development, strengths, and weaknesses of Australia’s primary and secondary school populations (here’s an example). Added to this, the political value of NAPLAN, and the immense time, energy, and money that’s been poured into the tests since 2008 make it unlikely that the tests will be scrapped anytime soon.
Instead of outright scrapping the tests, or keeping them exactly as they are (warts and all), a third option is to make sure the tests are designed and implemented as well as possible to minimise concerns raised since their introduction in 2008. I’ve given the design of the NAPLAN writing test a great deal of thought over the past decade; I’ve even written a PhD about it (sad but true). In this post, I offer 3 simple fixes ACARA can make to improve the writing test while simultaneously addressing concerns expressed by critics.
1. Fix how the NAPLAN writing test assesses different genres
What’s the problem? At present, the NAPLAN writing test requires students to compose either a narrative or a persuasive text each year, giving them 40 minutes to do so.
Why is this a problem? The singular focus on narrative or persuasive writing is potentially problematic for a test designed to provide valid and reliable comparisons between tests over time. Those who have taught narrative and persuasive writing in classrooms will know these genres often require very different linguistic and structural choices to achieve different social purposes. It’s OK to compare them on some criteria, like spelling, but less so on genre-specific criteria. ACARA know this too, because the marking criteria and guide for the narrative version of the test (ACARA, 2010) are not the same as those for the persuasive version (ACARA, 2013). Even though the marking criteria for the two versions are not identical, the results are compared as though all students completed the same writing task each year. There is a risk that randomly shifting between these distinct genres leads us to compare apples and oranges.
Also related to genre is the omission of informative texts (e.g., procedures, reports, explanations, etc.) in NAPLAN testing. Approaches to writing instruction like genre-based pedagogy, The Writing Revolution, and SRSD emphasise the importance of writing to inform. This is warranted by the fact that personal, professional, and social success in the adult world relies on being able to clearly inform and explain things to others. It’s not ideal that the significant time spent developing students’ informative writing skills across the school years is not currently assessed as part of NAPLAN.
What’s the solution? A better approach would be to replicate how the National Assessment of Educational Progress (NAEP) in the US deals with genres.
How would this improve NAPLAN? In the NAEP, students write two short texts per test instead of one, with these texts potentially requiring students to persuade (i.e., persuasive), explain (i.e., informative), or convey real or imagined experience (i.e., narrative). The NAEP is administered in Years 4, 8, and 12. Matching the typical development of genre knowledge (Christie & Derewianka, 2008), the Year 4 students are most likely to be asked to write narrative and informative texts, while those in Years 8 and 12 are most likely to write informative and persuasive texts. But students in all tested year levels can still be asked to write to persuade, explain, or convey experience, so knowledge about all the genres is developed in classrooms.
Why couldn’t we do something similar with NAPLAN? Including informative texts in our writing test would incentivise the teaching of a fuller suite of genres in the lead up to NAPLAN each year. Not including informative texts in NAPLAN is a little baffling since a large proportion of student writing in classrooms is informative.
2. Fix the NAPLAN writing prompt design
What’s the problem? At the start of every NAPLAN writing test, students receive a basic prompt that provides general information about the topic. Here’s an example from the 2014 test, which required a persuasive response:
As you can see, some useful information is provided about how children can structure their texts (e.g., Start with an introduction) and the sorts of writing choices they might like to make (e.g., choose your words carefully to convince a reader of your opinion). But how useful is this to a student who doesn’t have a lot of background knowledge about laws and rules?
My younger sister was in Year 5 in 2014 and she completed this writing test. She had previously been on two international school trips, and drew on these (and other) experiences to write an argument about raising Australia’s legal drinking age to 21, as it is in the US, and the many ways this would positively impact our society. Perhaps unsurprisingly, my sister achieved at the Year 9 standard for this persuasive text.
Why is this a problem? Compare my sister’s experience with a child from a lower socioeconomic area who had never been out of their local area, let alone Australia. It’s more challenging to suggest how rules or laws in our context should change if you don’t know about how these rules or laws differ in other places. The information provided in the prompt is far less useful if the child does not have adequate background knowledge about the actual topic.
Keeping the topic/prompt secret until the test is meant to make the test fairer for all students, yet differences in children’s life experiences already make a prompt like this one work better for some students than others. As an aside, in 2014 so many primary school students couldn’t answer this prompt that ACARA decided to write separate primary and secondary school prompts from 2015. This changed the test conditions in a considerable way, which might make it harder to reliably compare pre- and post-2014 NAPLAN writing tests, but I digress.
What’s the solution? A fairer approach, particularly for a prompt requiring a persuasive writing response, would be to provide students with select information and evidence for both sides of an issue and give them time to read through these resources. The students could then integrate evidence and expert opinion from their chosen side into their arguments (this is a fascinating process known as discourse synthesis, which I’d like to write about another time). Students could still freely argue whatever they liked about the issue at stake, but this would mean Johnny who never went out of his local area would at least have some information on which to base his ideas. Plus, we could potentially make the whole experience more efficient by making these supporting materials/evidence the same as those used to test students’ reading skills in the NAPLAN reading test.
How would this improve NAPLAN? Supporting information for the persuasive writing test (and the informative writing test if we can add that family of genres) would not need to be long: even just a paragraph of evidence on both sides would offer plenty for students to synthesise into their texts. We know that the test conditions and criteria influence what’s taught in classrooms, so there’s an opportunity to promote writing practices that will set students up for success in upper secondary and (for those interested) higher education contexts.
At the moment, students rarely include any evidence in their NAPLAN writing, even high-scoring students. Introducing some supporting information might help our students to get away from forming arguments based on their gut reactions (the kinds of arguments we encounter on social media).
3. Fix how the writing test positions students to address audiences
What’s the problem? Since 2008, NAPLAN writing prompts have had nothing to say about audience. Nothing in the prompt wording positions students to consider or articulate exactly who their audience is. Despite this, students’ capacity to orient, engage, and affect (for narrative) or persuade (for persuasive) the audience is one of the marking criteria. Put another way, we currently assess students’ ability to address the needs of an audience without the marker (nor perhaps the student) explicitly knowing who that audience is.
Why is this a problem? The lack of a specified audience leads many students to just start writing their narratives or persuasive texts without a clear sense of who will (hypothetically speaking) read their work. This isn’t ideal because the writing choices that make texts entertaining or persuasive are dependent on the audience. This has been acknowledged as a key aspect of writing since at least Aristotle way back in Ancient Greece.
Imagine you have to write a narrative on the topic of climate change. Knowing who you are writing for will influence how you write the text. Is the audience younger or older? Are they male or female? Do they like action, romance, mystery, drama, sports-related stories, funny stories, sad stories, or some other kind of story? What if they have a wide and deep vocabulary or a narrow and shallow vocabulary? There are many other factors you could list here, and all of these would point to the fact that the linguistic and structural choices we make when writing a given genre are influenced by the audience. The current design of NAPLAN writing prompts offers no guidance on what to do with the audience.
Others have noticed that this is a problem. In a recent report about student performance on the NAPLAN writing test, the Australian Education Research Organisation (AERO) (2022) described the Audience criterion as one of five that should be prioritised in classroom writing instruction. They argued: “To be able to write to a specific audience needs explicit teaching through modelling, and an understanding of what type of language is most appropriate for the audience” (p. 70). How can the marker know if a student’s writing choices are appropriate if an audience isn’t defined?
What’s the solution? A simple fix would be to give students information about the audience they are meant to be entertaining, persuading, and/or informing. This is, again, how the NAEP in the US handles things, requiring that the “writing task specify or clearly imply an audience” (National Assessment Governing Board, 2010, p. vi). Audiences for the NAEP will be specified by the context of the writing task, age- and grade-appropriate, familiar to students, and consistent with the purpose identified in the writing task (e.g., to entertain) (see here for more). Another fix would be to ask students to select their own audience and record this somewhere above their response to the prompt.
How would this improve NAPLAN? Having more clarity around the intended audience of a written piece would position students to tailor their writing to suit specific reader needs. This would allow markers to make more accurate judgements about a child’s ability to orient the audience. If this isn’t fixed, markers will continue having to guess at who the student was intending to entertain or persuade.
Righting the writing test
Would these changes make the NAPLAN writing test 100% perfect? Well, no. There would still be questions about the weighting of certain criteria, the benefit/cost ratio of publicly available school results through MySchool, and other perceived issues (if anyone out there finds this interesting, I’d like to write about 3 more possible fixes in a future post). But the simple fixes outlined here would address several concerns that have plagued the writing test since 2008. This would influence the teaching of writing in positive ways and make for a more reliable and meaningful national test. The NAPLAN writing test isn’t going anywhere, so let’s act on what we’ve learnt from over a decade of testing (and from writing tests in other countries) to make it the best writing test it can be.
Christie, F., & Derewianka, B. (2008). School discourse. Continuum Publishing Group.
Gannon, S. (2019). Teaching writing in the NAPLAN era: The experiences of secondary English teachers. English in Australia, 54(2), 43-56.
Hardy, I., & Lewis, S. (2018). Visibility, invisibility, and visualisation: The danger of school performance data. Pedagogy, Culture & Society, 26(2), 233-248. https://doi.org/10.1080/14681366.2017.1380073
Ryan, M., Khosronejad, M., Barton, G., Kervin, L., & Myhill, D. (2021). A reflexive approach to teaching writing: Enablements and constraints in primary school classrooms. Written Communication. https://doi.org/10.1177/07410883211005558
This post provides the key points of a journal article I recently had published in The Australian Educational Researcher, co-written with Belinda Hopwood (UTAS), Vesife Hatisaru (ECU), and David Hicks (UTAS). You can read the whole article here.
Girls better at literacy, boys better at numeracy?
In recent years, there has been increased attention on gender gaps in literacy and numeracy achievement. This is due in part to international assessments of students’ reading achievement, such as PIRLS and PISA (Lynn & Mikk, 2009), which have found that gender differences in reading are universal, with girls from all participating countries significantly and meaningfully outperforming boys. Previous research has shown that girls score higher on reading tests and are more likely to be in advanced reading groups at school (Hek et al., 2019), while those who fall below the minimum standards for reading are more likely to be boys (Reilly et al., 2019). Large-scale assessments of numeracy have seen similarly consistent results, though with boys outperforming girls. So, what’s the situation in Australia?
Recently, three colleagues and I found out what 13 years of NAPLAN reading and numeracy testing might show about boys’ and girls’ performance in the Australian context. Something that has been lacking from international research is a clear picture of how reading and numeracy gender gaps widen or narrow across the primary and secondary school years. To provide this picture, we drew on publicly available NAPLAN results from the NAPLAN website (ACARA, 2021) and the Grattan Institute’s (Goss & Sonnemann, 2018) equivalent year level technique.
Findings: Gender gap in reading
We looked at the average reading performance of boys and girls across the four tested year levels of NAPLAN (i.e., Years 3, 5, 7, and 9) between 2008 and 2021. Girls improved consistently from Year 3 to Year 9, with approximately two years of progress made between each test. Boys progressed to a similar extent between Years 3 and 5, yet they fell behind the girls at a faster rate between Year 5 and Year 7. Specifically, boys made 1.95 years of progress between Year 3 and Year 5 and 1.92 years between Year 7 and Year 9, but only managed 1.73 years of progress between Year 5 and Year 7 (i.e., in the transition between primary and secondary school).
The average gap between boys and girls widened at each tested year level, with Year 3 males around 4 months of equivalent learning behind Year 3 females, Year 5 males 5 months behind, Year 7 males 7 months behind, and Year 9 males around 10 months behind Year 9 females. Boys made more progress between Year 7 and Year 9 than between Years 5 and 7, but this was also when girls made their most progress. While boys keep up with girls reasonably well in the primary school years, more boys struggle with reading as they transition into secondary school.
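The “months of equivalent learning” figures above come from differences in equivalent year levels (EYL). A minimal sketch of that final conversion step, assuming a gap in EYL translates to months by multiplying by 12; the function name and sample values are illustrative only, and the Grattan Institute’s mapping from NAPLAN scale scores to EYL is not reproduced here:

```python
def gap_in_months(eyl_girls: float, eyl_boys: float) -> float:
    """Convert a difference in equivalent year levels (EYL) to
    months of equivalent learning, assuming 1 EYL = 12 months.

    The EYL values themselves come from mapping NAPLAN scale
    scores onto year levels (Goss & Sonnemann, 2018); that
    mapping is not reproduced in this sketch.
    """
    return (eyl_girls - eyl_boys) * 12

# Illustrative values only: a Year 9 reading gap of 0.83 EYL
# corresponds to roughly 10 months of equivalent learning.
print(round(gap_in_months(9.83, 9.00)))  # → 10
```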
Findings: Gender gap in numeracy
The overall picture for numeracy is similar to reading, though with boys outperforming girls and the gender gap increasing across each tested year level. Boys made approximately two years of progress between each numeracy test, while girls consistently made just over 1.8 years of progress, leading to a gender gap that widened at a steady rate over time.
What about the writing gender gap?
In 2020, I conducted a similar study that looked into the NAPLAN writing results, finding that on average, boys performed around 8 months of equivalent learning behind girls in Year 3, a full year of learning behind in Year 5, 20 months behind in Year 7, and a little over two years of learning behind in Year 9. Boys fell further behind girls with writing at every tested year level, yet the rate at which girls outperformed boys was greatest between Years 5 and 7. Our study into reading and numeracy has found that similar gaps exist in these domains too, though not to the same extent as writing. For ease of comparison, the following graph shows the extent and development of the gender gaps in numeracy, reading, and writing.
Gender gaps in numeracy, reading, and writing (2008-2021)
Why do more boys struggle with literacy as they transition into secondary school? For most Australian students, Year 7 is when they move physically from their primary school campus to a secondary school campus. This transition has been shown to impact student reading achievement, particularly for boys (Hopwood et al., 2017). For some students, reading achievement stalls in this transition, or in serious cases, declines to levels below that of their primary school years (Hanewald, 2013). Some students enter secondary school without the basic reading skills required for secondary school learning (Lonsdale & McCurry, 2004), stifling their future reading development (Culican, 2005). The secondary school curriculum is more demanding, and students are expected to be independent readers, able to decode and comprehend a range of complex texts (Duke et al., 2011; Hay, 2014). As argued by Heller and Greenleaf (2007), schools cannot settle for a modest level of reading instruction, given the importance of reading for education, work, and civic engagement. We need to know more about why this stage of schooling is difficult for many boys and how they can be better supported.
The analysis of the numeracy gender gap was quite different from both the reading and writing results. While previous international studies have suggested that the gender gap in numeracy only becomes apparent in secondary school (Heyman & Legare, 2004), this study showed that average scores for boys were higher than those of girls on every NAPLAN numeracy test, though to a lesser extent than the other domains. The widest numeracy gender gap of a little over 6 months of learning in Year 9 was smaller than the smallest writing gender gap of 8 months in Year 3.
Implications of gender gaps in literacy and numeracy
The findings suggest links between reading and writing development, in that more boys struggle with both aspects of literacy in the transition from primary to secondary school. While other researchers have looked at the numeracy gap over time using NAPLAN scale scores (e.g., Leder & Forgasz, 2018), by using the equivalent year level values, we’ve been able to show how the gender gap widens gradually from roughly 2 months of learning in Year 3 to 6 months of learning in Year 9. While this supports the general argument that, on average, boys outperform girls in numeracy and girls outperform boys in literacy tests, it also shows how the gaps are not equal.
References
Australian Curriculum, Assessment and Reporting Authority. (2021). NAPLAN national report for 2021. https://bit.ly/3q6NaC4
Culican, S. J. (2005). Learning to read: Reading to learn – A middle years literacy intervention research project. Final Report 2003–4. Catholic Education Office.
Goss, P., & Sonnemann, J. (2018). Measuring student progress: A state-by-state report card. https://bit.ly/2UVNxy5
Hanewald, R. (2013). Transition between primary and secondary school: Why it is important and how it can be supported. Australian Journal of Teacher Education, 38(1), 62–74.
Hek, M., Buchman, C., & Kraaykamp, G. (2019). Educational systems and gender differences in reading: A comparative multilevel analysis. European Sociological Review, 35(2), 169-186.
Heller, R., & Greenleaf, C. (2007). Literacy instruction in the content areas: Getting to the core of middle and high school improvement. Alliance for Excellent Education.
Heyman, G. D., & Legare, C. H. (2004). Children’s beliefs about gender differences in the academic and social domains. Sex Roles, 50(3/4), 227-236. https://doi.org/10.1023/B:SERS.0000015554.12336.30
Hopwood, B., Hay, I., & Dyment, J. (2017). Students’ reading achievement during the transition from primary to secondary school. Australian Journal of Language and Literacy, 40(1), 46-58.
Leder, G. C., & Forgasz, H. (2018). Measuring who counts: Gender and mathematics assessment. ZDM, 50, 687–697. https://doi.org/10.1007/s11858-018-0939-z
Lonsdale, M., & McCurry, D. (2004). Literacy in the new millennium. Australian Government, Department of Education, Science and Training.
Lynn, R., & Mikk, J. (2009). Sex differences in reading achievement. Trames, 13, 3-13.
Reilly, D., Neuman, D., & Andrews, G. (2019). Gender differences in reading and writing achievement: Evidence from the National Assessment of Educational Progress (NAEP). American Psychologist, 74(4), 445-458.
In 2017, Judith Hochman and Natalie Wexler published The Writing Revolution (TWR): a book outlining a new way of thinking about and teaching writing. A key feature that sets TWR apart from other approaches is its suggestion that school students should only focus on sentence-level writing until this is mastered (i.e., the purposes and structures of written genres should only be added after a lot of work on sentences).
This is a somewhat controversial idea if you believe that the sentences we write are always influenced by what and why we’re writing. It also introduces the risk that children will spend much of their primary schooling (and even their secondary schooling, depending on when they start) repeating the same set of basic sentence tasks in every subject. But in taking a developmental approach, Hochman and Wexler argue that learning to write is challenging for young learners, and focusing solely on sentences in the beginning greatly reduces the cognitive load. They say you can’t expect a child to write a strong paragraph, let alone a strong text, until they can write strong sentences. A brief document has been published on the TWR website outlining the theoretical ideas that underpin the approach, which you can read about here.
As I mentioned in my last post on TWR, there haven’t been any research studies or reports to verify if teaching the TWR way enables or constrains writing development… until now.
A reader named ‘Rebecca A’ left a comment on that post to say she’d found a report by an independent research and evaluation firm (Metis Associates) into the efficacy of a TWR trial in New York. The firm partnered with TWR in 2017 and spent some years evaluating how it worked with 16 NYC partner schools and their teachers. Partner schools were given curriculum resources, professional development sessions in TWR, and on-site and off-site coaching by TWR staff.
Evaluating TWR
Metis Associates were interested in TWR writing assessment outcomes, outcomes from external standardised writing assessments, and student attendance data. They compared the writing outcomes of students at partner schools with the outcomes of children at other schools. Teacher attitudes were also captured in end-of-year surveys.
This report did not go through a rigorous, peer-reviewed process, but if you want to know whether TWR works, it’s probably the best evidence currently out there. Also, keep in mind that the partner schools were very well supported by the TWR team with resourcing, PD, and ongoing coaching. In that sense, you might consider this a report of TWR under ideal circumstances.
If you work at a school using TWR or if you’re interested in the approach, I’d recommend reading the full report here. I will summarise the key findings of the report in the rest of this blog post.
Key finding 1: Teacher attitudes
Teachers at partner schools reportedly found the TWR training useful for their teaching and got the most value from the online TWR resource library. School leaders liked being able to reach out to the TWR team for support if necessary. Some teachers wanted more independence from the strict sequence and focus of TWR activities. Most, though, found the approach had helped them to teach writing more effectively.
Key finding 2: Impact of TWR on student writing outcomes
But what about the development of students’ writing skills? TWR seems to have made a positive difference at the partner schools. TWR instruction helped students in each grade to advance somewhat beyond the usual levels of achievement. It’s not possible to say much more about this since the presentation of results is quite selective and we only see how the partner schools compared with non-partner schools for certain statistics, like graduation rates and grade promotion rates, which are likely influenced by all sorts of factors. The one writing assessment statistic that does include comparison schools is for the 2019 Regents assessment for students in Years 10, 11, and 12. In this case, those at TWR schools did better in Year 10, results between TWR and comparison schools were similar in Year 11, while comparison schools did better in Year 12. So, a mixed result. Being behind other schools is not really an issue if everyone is doing well, but it’s not immediately clear from this report how these results compare with grade-level expectations or previous results at the same schools.
Figure 4 from Ricciardi et al.’s (2020) evaluation of TWR in partner schools
Something that might explain the mixed outcome for senior secondary students is the tendency for teachers at partner schools to favour the basic sentence-level strategies over paragraph or whole text/genre strategies in their teaching. Partner schools taught TWR from Year 3 through to Year 12, and 81% of teachers reported teaching the “because, but, so” strategy regularly (i.e., more than 2 times per week). By comparison, evidence-based strategies like sentence combining were far less commonly taught (i.e., regularly taught by 22% of teachers). This suggests that schools using TWR should be systematic and intentional about the strategies taught, and should ensure they aren’t spending longer than needed on basic sentence-level activities, so students can access the most valuable elements of what TWR offers, which I would argue are the single and multiple paragraph outlines and the genre work.
When only looking at partner school outcomes, the picture looks positive. The report shows percentages of students performing at Beginning, Developing, Proficient, Skilled, and Exceptional levels at the beginning and end of the year. At each partner school, percentages are all heading in the right direction with many more proficient and skilled writers at the end of the evaluation.
Conclusion
To summarise, in offering select outcomes and comparisons only, and in using metrics that aren’t entirely clear, the report highlights the need for rigorous, peer-reviewed studies to better understand how TWR works for different learners and teachers in different contexts. Despite its limitations, the report points to positive outcomes for the new approach to teaching writing. This is good news for the schools out there that have jumped on board the TWR train.
It also suggests that careful attention should be paid to the specific TWR strategies that dominate classroom instruction if students are to get the most out of it. If you are using the TWR approach, my advice would be not to spend a disproportionate amount of time on basic sentence work from the middle primary years, since well-supported approaches like SRSD and genre pedagogy have shown students can (and should?) be writing simple texts that serve different purposes from a young age.
I remain greatly intrigued by TWR. It turns the writing instruction game on its head and has made me question whether other approaches expect too much from beginning writers. Its approach seems to line up nicely with cognitive load theory, in gradually building the complexity and expectation as learners are prepared for it. There’s a lot at stake though if this specific combination of strategies doesn’t actually prepare students for the considerable challenge of genre writing in the upper primary and secondary school years. You could follow its strategies diligently across the school years but inadvertently limit your students’ writing development (in time, more research will tell us if this is the case).
I realise it’s anecdotal, but my 7-year-old son (just finished Year 1) and I have been talking about argumentative/persuasive writing at home for the last few weeks, and the discussions we’ve had and the writing he’s done as a result have been incredibly satisfying for both of us. After seeing what he’s capable of with encouragement and basic support grounded in a firm knowledge of language and text structures, the idea that he should be limited to basic sentence writing, and not think about and address different purposes of writing (like persuading others about matters of personal significance) for years into his primary schooling, wouldn’t sit well with me.
It’s also possible to see how students who struggle badly with writing could benefit from practice with basic sentence writing before much else. It was in a context filled with struggling writers that TWR was first conceived, and it may be in such contexts that it is most useful.
I’m looking forward to additional research being conducted about TWR. If any schools using the approach are able to comment on its usefulness in your context, that might be helpful for others thinking about giving it a go.
References
Hochman, J. C., & Wexler, N. (2017). The writing revolution: A guide to advancing thinking through writing in all subjects and grades. Jossey-Bass.
Ricciardi, L., Zacharia, J., & Harnett, S. (2020). Evaluation of The Writing Revolution: Year 2 Report. Metis Associates. https://www.guidestar.org/ViewEdoc.aspx?eDocId=1956692
The full NAPLAN results for 2021 were released by ACARA today. There were concerns that student performance would be negatively impacted by COVID, but my analysis of gender differences suggests there is a LOT to be optimistic about, particularly for primary school leaders, teachers, students, and parents.
(NOTE: To calculate months of equivalent learning, I used the Grattan Institute’s Equivalent Year Levels approach, which you can read about here)
YEAR 3: For reading, Year 3 boys and girls did better than ever on any previous NAPLAN test. The gender gap was also the widest ever at 5.16 months of equivalent learning in favour of girls. For numeracy, the Year 3 gender gap was the widest of any previous test in favour of boys at 2.52 months. For writing, boys and girls did better than any previous NAPLAN test. The gender gap was the same as last year at 7.2 months in favour of girls.
YEAR 5: For reading, boys had their equal best performance on any test and girls did their best ever. The gender gap was the largest ever for reading at 5.76 months in favour of girls. For numeracy, boys had their equal best performance while girls were similar to 2019, leading to the widest ever gender gap of 4.68 months in favour of boys. For writing, boys had their best performance since 2010 and girls their best since 2015. The gender gap was the lowest ever at 9.72 months in favour of girls.
YEAR 7: For reading, boys and girls were down slightly from last year. The gender gap was 8.04 months in favour of girls. For numeracy, boys had their equal second-best performance while girls were down slightly. The gender gap was 5.52 months in favour of boys. For writing, boys and girls had their best performance since 2011. The gender gap was the second-lowest at 18.12 months.
YEAR 9: For reading, boys and girls performed lower than in 2019. The gender gap was 9.96 months. For numeracy, boys and girls were down from last year. The gender gap was 5.64 months. For Year 9 writing, males had their best performance since 2011, and females performed higher than in 2018 and 2019. The gender gap was the second-lowest ever at 20.52 months.
READING SUMMARY: Outstanding outcomes for primary students with their best ever performances on any NAPLAN reading test. Secondary reading was down from recent tests. With increased performance, the gender gap appears to be widening at the primary end.
NUMERACY SUMMARY: Primary school boys and girls did reasonably well on the numeracy test. Other than Year 7 males, numeracy performance was down for secondary school students compared to recent tests. The gender gap appears to be widening at the primary end in favour of boys though the gap is still considerably smaller than reading and writing.
WRITING SUMMARY: Outstanding outcomes for primary and secondary males and females with notable improvement over previous tests. The gender gap in favour of girls appears to be closing at all year levels but is still considerably wider than any other NAPLAN test.
Key messages to take from the 2021 NAPLAN tests
Something is clearly working in Australia’s primary schools, particularly when thinking about reading and writing. At the primary end, the gender gaps are widening for reading and numeracy and closing for writing. As has been the case in all NAPLAN tests, girls are ahead on the literacy tests and boys are ahead on the numeracy test. The widest gender gap is still clearly associated with the writing test, with girls performing 7.2 months ahead in Year 3, 9.72 months in Year 5, 18.12 months in Year 7, and (a still concerning) 20.52 months in Year 9! Boys appear to be struggling to keep up with the increased writing demands in the transition from primary to secondary school.
While secondary students’ writing performance was higher than in previous tests, their reading and numeracy performances were down. In this sense, NAPLAN for 2021 might be a cause for celebration in primary schools and a cause for reflection in secondary schools.
Recently, I completed a PETAA short course named Going Meta: Enabling Rich Talk about Writing, delivered by Professor Deb Myhill of the University of Exeter. Of all the approaches to writing I’ve come across, Myhill’s is likely the only one that attempts to integrate ideas from all three theoretical perspectives on writing. Since it doesn’t leave out a critical piece of the writing puzzle, I think that makes it quite special and potentially game-changing.
In this post, I’ve drawn on my learning through the course to outline key terms relevant to Myhill’s approach, discuss its benefits, and explain how you can use it to improve your students’ writing skills.
Key perspectives on writing
Every primary and secondary school teacher wants to help their students become strong writers. There are many specific approaches out there for achieving this, but did you know most are underpinned by one or more of three main theoretical perspectives on writing: cognitive, linguistic, and/or sociocultural?
Briefly, cognitive approaches focus on helping children develop cognitive, metacognitive, and self-regulated learning strategies for managing the processes of writing, such as planning, drafting, revising, editing, and publishing. They are about the thinking processes you engage in while writing.
Linguistic approaches focus mainly on helping children learn to use language features and structures of written texts. They are about your growing mastery of language for writing.
Sociocultural approaches focus on influences of culture and social contexts on what written forms are valued. They are about how you learn to write through collaboration, co-construction, and shared values.
Specific approaches to the teaching of writing tend to draw on ideas from one or more of these perspectives. As an example, self-regulated strategy development (SRSD) can be defined as a sociocognitive approach, since it develops children’s use of cognitive processes and strategies to write different genres for different social purposes. It’s important for teachers to know which perspective underpins their approaches to writing instruction since this will impact what aspects of writing are taught, how they are taught, and what skills and understandings they will help students develop.
Metalinguistic understanding
Deb Myhill’s approach to writing instruction is based largely on helping students develop metalinguistic understanding. Myhill describes metalinguistic understanding as a subcategory of metacognition. While metacognition is about reflecting on your own thinking and learning processes, metalinguistic understanding is about reflecting on how writers use language to achieve social purposes (Myhill et al., 2020). Students with strong metalinguistic understanding are able to identify and reason about how words, sentences, and paragraphs make meaning in texts (Cremin & Myhill, 2012). It enables students to both comprehend and produce written texts (Gombert, 1992).
(Meta)talking the talk
Classroom conversations that foster metalinguistic understanding (i.e., talk about how language is working in a text) are known as metatalk. Through metatalk, a teacher can draw students’ attention to a writer’s authorial intention and the language and structural choices they make to achieve that intention (Myhill, 2021). Teachers can use metatalk as a pedagogical device to check students’ metalinguistic understandings before, during, and/or after a teaching episode.
Also, even if students don’t know the technical grammar terms, they can still talk about different aspects of texts and show metalinguistic understanding with everyday language. That said, across the years of schooling, students should become increasingly capable of learning a specific language (or metalanguage) for referring to given linguistic features, such as noun phrases, verbs, and adverbs. As per the Australian Curriculum: English, Australian teachers are expected to help students learn about these and many other linguistic features from Year 1.
Benefits of developing students’ metalinguistic understandings
According to Myhill, students with strong metalinguistic understanding can look critically at writing and make more informed, intentional writing choices. It reveals the rich possibilities of language and gives writers agency as they create texts (Cremin & Myhill, 2012). It also makes learning visible and encourages students to play, explore, and experiment when making writing choices (Myhill, 2021).
Like many of the important things in literacy, metalinguistic understanding needs to be taught explicitly.
Four ways to build students’ metalinguistic understandings
1. Create opportunities underpinned by teacher knowledge
First, teachers need to create opportunities for investigations into the choices made in texts written by experienced authors, the teachers themselves, and the students. This requires time, as well as sufficient teacher knowledge about the language and structural features of texts. If the teacher can’t articulate which writing choices make a text do its work, they will struggle to build their students’ metalinguistic understandings of it.
2. Use Myhill and colleagues’ LEAD Principles
Deb Myhill and her colleagues at the University of Exeter developed the LEAD Principles to support teachers to scaffold thinking about grammar as being meaningfully linked to writing (Myhill et al., 2020). The LEAD acronym stands for Links, Examples, Authentic Texts, and Discussion.
Links: Teachers make links between a grammatical feature being introduced (e.g., adjectives) and how it works in a focus written genre (e.g., narratives).
Examples: Teachers explain the grammar with examples rather than long explanations.
Authentic texts: By using metatalk to explore the features of authentic model texts, teachers make connections between writers and the broader writing community.
Discussion: Teachers can promote metalinguistic understandings by engaging children in discussions about grammar and the work it does in written texts.
3. Use specific strategies
While the LEAD Principles are relatively broad and flexible, in Myhill’s short course (Myhill, 2021) she suggested the following ten specific strategies for fostering stronger metatalk and metalinguistic understandings in classrooms:
Strategy 1. Fill the Gap: Select an extract of text, probably a paragraph, which allows for students to see the language choice within its surrounding context, and delete the particular language choice you are going to explore. Invite students to discuss what might go in the gap, then reveal what the author chose, and discuss why the author may have made that choice.
Strategy 2. Let’s Compare!: One very effective way to help students see how different language choices can create different effects is to explore two different versions: this can be at the level of a word, a phrase, or a sentence (possibly even two paragraphs?).
Strategy 3. Sort it out: Giving students words, or sequences of words, printed on cards to undertake a card sort activity is helpful because the physical manipulation of the cards to create different possibilities also generates a lot of focused metalinguistic talk about the options. It works particularly well to explore the syntactic structure of a sentence.
Strategy 4. Playing with possibilities: Invite students to generate a list of possibilities for a particular purpose, e.g., a list of noun phrases to describe a character, or a list of sentences to describe an image of an event. Then invite them to choose two of their possibilities which create different effects, and to explain to the class what the effect is and what language choice is shaping it.
Strategy 5. Thinking Questions: Crucial to the quality of the peer metalinguistic talk is how the talking activity is set up. Pay particular attention to the questions you give students to steer the activity into focused, purposeful discussion, but without constraining it with limiting or closed questions.
Strategy 6. Collaborative Composition: Give students a short writing composition task to write together, perhaps just one paragraph. There should be a clear goal for this writing which will guide the talk which will occur during the writing. One real benefit of collaborative writing is that peers have to articulate their choices and reasons for those choices.
Strategy 7. Collaborative Revision: This is similar to collaborative composition but more focused on deliberate decision-making through revision. It works particularly well when students are asked to rewrite together a short piece of text which involves an explicit change, e.g., rewriting a character description to imply that the character is gentle, not aggressive.
Strategy 8. Questioning the Writer: In pairs, students read a text, or section of text, looking at how the writer has crafted a particular aspect of the text, e.g., how an argument has been signposted, how formality or informality has been used, or how a narrative opens. The students create a list of questions for the author about the language choices that they can see in this extract which link to the particular aspect under focus. These questions can then be used for subsequent group and/or whole-class discussion.
Strategy 9. Text-marking: There are many possibilities for asking students to read a text and mark it in some way that highlights the language choices made. It could be highlighting all the prepositional phrases which evoke a setting, underlining any verbs which convey a sense of emotion, or highlighting formal language in blue and informal language in red. As with the card sort and collaborative writing, it is the peer talk which occurs around this activity that is valuable.
Strategy 10. The Author Talks: After a period where students have been composing their own texts, create time for students to explain their own authorial choices to peers. This works best when there is a focused question to consider and when students are asked to choose one example to discuss, e.g., ‘Choose one noun phrase where you are particularly pleased with how strong a visual image it evokes’. The peer talk is much less effective if students are asked to talk about their writing more generally.
4. Engage in metalinguistic modelling
The final way to build metalinguistic understanding should appeal to any fans of explicit instruction. Essentially, when the teacher models their own thinking while writing, this can assist students to understand what linguistic choices are available and how to make stronger linguistic decisions.
Such modelling could focus on what the teacher is writing as they write, or it may involve thinking aloud about connections between linguistic choices and their effects in a written text (Myhill, 2021). From her studies into metalinguistic modelling, Myhill and her colleagues found that teachers need to be clear and focused about what they are modelling. If they are unfocused or try to cover too many aspects of writing, the intended skills will not transfer to students’ writing.
Also, when teachers focus too much on ‘what’ features should be in a written text, students may never even think about ‘why’ these choices make sense for that type of writing. Unfortunately, this prescriptive focus on including a select set of language features without a clear focus on meaning or purpose is promoted in several popular approaches to writing instruction.
Some key takeaways from the course
Three key theoretical perspectives on writing are the linguistic, the sociocultural, and the cognitive.
While metacognitive understanding is about how writers engage in different thinking processes while writing, metalinguistic understanding is about how language works in different written genres.
Developing students’ metalinguistic understandings helps them make more informed, intentional writing choices and to play and experiment with language.
Metalanguage is language for talking about language. Even without metalanguage, children can show metalinguistic understanding of writing choices using everyday language.
Teachers can use several strategies to promote metalinguistic understandings in classroom conversations. There are also general principles for this represented by the LEAD acronym.
Metalinguistic modelling is text-focused and aims to help students think about writing choices, particularly links between grammar and its rhetorical effects.
Key to discussions about metalinguistic understandings is identifying linguistic/structural/literary choices in texts and explaining their effect(s) on audiences.
There does not seem to be a widespread approach to teaching writing that integrates the sociocultural, linguistic, and cognitive perspectives, but Myhill’s approach seems to come closest.
Developing metalinguistic understanding connects language choices with writing purposes and effects (Myhill, 2021). Without it, the teaching of grammar is likely to be disconnected from writing and, therefore, a poor way to build writing skills (e.g., Andrews, 2010; Andrews et al., 2006).
So, what’s missing from this approach?
After finishing Myhill’s course, I was still left pondering many questions that could be the basis of further research in classrooms with teachers and their students. For example:
What counts as strong metalinguistic understandings of different genres in early, middle, and upper primary school? Without this knowledge, it’s hard for teachers to know what to focus on in their classrooms.
Does this approach follow any kind of developmental pattern, or does it simply depend on what teachers happen to teach at a given point? If a teacher chooses to focus on the features of one set of model texts over others, could that change everything their students learn about that kind of writing?
If this is the case, how can there be any consistency in this approach, particularly when aiming to support struggling writers, those in minority groups, or those in low SES areas?
Want to learn more?
If you would like to foster your students’ metalinguistic understandings and help them become more independent writers, it would pay for you to ‘know your stuff’ when it comes to the grammatical, structural, and literary/rhetorical aspects of different written genres. In fact, I would only recommend teachers complete this short course if they already possess an adequate knowledge of language, since it assumes you know your prepositional phrases from your conjunctions and your narrative text structures from your persuasive text structures.
Since I felt Myhill’s course might be most beneficial for teachers more knowledgeable about linguistic features, I asked Associate Professor Pauline Jones (PETAA’s President) and Robyn Topp (PETAA’s Manager of Professional Learning) if they had any introductory offerings to support teachers new to text features. While the following courses are currently not open for registration, they should be again in 2022:
Rod Campbell has an online course named Teaching Knowledge for the Art and Craft of Writing that provides an introduction to English language and sentence grammar, and advanced sentence grammar and cohesion.
Jennifer Asha has a course titled Teaching Grammar with Rich Literature, in which she covers basic skills in functional grammar and how to teach it in the context of quality texts and dialogic pedagogy.
Jo Rossbridge and Kathy Rushton offered a face-to-face course named Grammar and Teaching from April – June 2021. This popular course is likely to be converted into a self-paced, online course in 2022.
PETAA will also soon be launching an open access Early Career Teachers’ Portal with quite a bit of grammar and writing content for new (and old) teachers to upskill in this area.
As a parting comment, the metalinguistic understanding course involved participants planning, writing, and revising a short character description (roughly a paragraph in length) over the five modules. It was such a joy to engage in my own creative writing like this. I believe all teachers of writing should be writers themselves, honing and experimenting with their own writing choices over time just like they expect of their students. Sharing your own writing with a class may feel a bit daunting, but it provides considerable motivation and encouragement for students to write and share their own ideas and understandings with others. After all, isn’t that what it’s all about?
References
Andrews, R. (2010). Teaching sentence-level grammar for writing: The evidence so far. In T. Locke (Ed.), Beyond the grammar wars: A resource for teachers and students on developing language knowledge in the English/literacy classroom (pp. 90-108). Routledge.
Andrews, R., Torgerson, C., Beverton, S., Freeman, A., Locke, T., Low, G., Robinson, A., & Zhu, D. (2006). The effect of grammar teaching on writing development. British Educational Research Journal, 32(1), 39-55.
Cremin, T., & Myhill, D. (2012). Writing voices: Creating communities of writers. Routledge.
Gombert, J. E. (1992). Metalinguistic development. University of Chicago Press.
Myhill, D. (2021). Going meta: Enabling rich talk about writing. Primary English Teaching Association of Australia – PETAA College. https://www.petaa.edu.au/iCore/Events/Event_display.aspx?EventKey=DM181021&WebsiteKey=23011635-8260-4fec-aa27-927df5da6e68
Myhill, D., Watson, A., & Newman, R. (2020). Thinking differently about grammar and metalinguistic understanding in writing. Bellaterra Journal of Teaching & Learning Language & Literature, 13(2). https://doi.org/10.5565/rev/jtl3.870