Bookmarked Learning from disruption: Why we should rethink the place of NAPLAN in our schools by Fiona Longmuir (lens.monash.edu)

In 2020, school communities survived without NAPLAN. They taught, they assessed, they reported, and – most importantly – they supported. From that lesson, we should reimagine the testing regime so that schools and students can be supported to thrive.

Fiona Longmuir, Jane Wilkinson and Amanda Heffernan reflect upon NAPLAN in light of the changes associated with the current crisis. With so much focus on wellbeing, they question why we persist with the same model, especially when other such tests are sample-based. This is elaborated upon in books such as The Global Education Race and National Testing in Schools. In response, they provide six ways to reimagine NAPLAN:

  • Shifting NAPLAN to be a sample assessment, rather than assessing entire student cohorts. This would enable monitoring of system trends over time, and was suggested by the federal review as a possible solution to address some of the negative consequences of NAPLAN.
  • Valuing a rich repertoire of assessments with teachers’ professional judgements being the basis of reporting to parents and students. Sample assessments such as NAPLAN that monitor the education system can be included, but should be “used only by schools and teachers as one piece of evidence contributing to reports to parents/carers, students and local education authorities”. (2020 federal review of NAPLAN, p10).
  • The removal of the My School website. Its detrimental impact of allowing league tables of schools to be assembled has perverted and distorted the fundamental purpose of our education systems.
  • Fair funding. Among OECD nations, Australia has a highly inequitable system of public education funding. A fairer system of funding that provides the resources and support that all students need to maximise their potential would improve achievement and social outcomes for every community across the country.
  • Trust teachers and school leaders. NAPLAN and the associated focus on narrow measures of achievement have resulted in reduced trust in the professionalism and quality of our educators. Believing in their abilities and trusting in their expertise to know their students would lift the learning of all students in the best ways for them, not just for their test scores.
  • Listen to students. With disengagement and mental ill-health at concerning levels, we need to put more time and resources into understanding the experiences of students in our schools.

It is interesting to think about this alongside Peter DeWitt’s discussion of de-implementation. Although I assume DeWitt is talking about focusing at a local level, I wonder whether the real challenge of de-implementation is actually at a systems level?

Bookmarked Too many adjectives, not enough ideas: how NAPLAN forces us to teach bad writing (theconversation.com)

Essentially, students are not thinking of the best ideas, words or strategies to achieve their communication goals. They are thinking of what NAPLAN wants, even if this is bad writing.

Lucinda McKnight talks about the way in which NAPLAN influences and inhibits writing habits. This relates to John Warner’s book on the problems of essay writing.
Replied to NAPLAN to be cancelled for 2020 due to COVID-19 disruption (The Sydney Morning Herald)

Education ministers have decided to cancel NAPLAN for 2020

It will be interesting to see what the ‘new normal’ looks like in 2021. Will it be an opportunity to reset with an upgraded NAPLAN, or a chance to completely overhaul it, as was mooted in the Gonski 2.0 review?
Replied to

That is confusing. It feels like people have forgotten the purpose of its design. Isn’t it only ever indicative?
https://collect.readwriterespond.com/notes-from-national-testing/
Bookmarked ‘Back to basics’ is not our education cure – it’s where we’ve gone wrong (The Sydney Morning Herald)

NAPLAN, by contrast, does test basic literacy and numeracy. However, unlike our declining PISA performance, there has been no downward slide in NAPLAN results. If anything, the year 3 NAPLAN cohort from 2013 did better than their counterparts from five years earlier.


Whatever the reason for the decline in PISA results, it is not mirrored by a corresponding decline in NAPLAN scores for the same cohorts of students. So what is going on?

Richard Holden and Adrian Piccoli discuss the difference between NAPLAN and PISA. They also explain that a ‘return to basics’ is actually counterproductive when it comes to improving PISA results, as PISA is designed to assess workplace skills:

In an increasingly globalised and automated world, problem-solving ability is the scarce skill. It is the skill that will generate the long-run productivity growth required to maintain high standards of living.

Replied to No, minister! Keep NAPLAN results away from student job applications (The Conversation)

The more we conflate learning with NAPLAN performance, the more we risk making misguided decisions on schooling policy and practice. The notion that threatening the future of Year 9 students will “encourage them to give their best efforts while sitting NAPLAN” is dangerous and detracts from the meaningful work occurring every day in classrooms.

I recently started reading Why They Can’t Write: Killing the Five-Paragraph Essay and Other Necessities. For me this captures the problem with such things as NAPLAN, where students are fed formulas to ‘pass the test’ but then struggle to find a voice in their writing. What frustrates me is that the test would achieve the same validity if it were based on a sample group, as PISA is. However, there are some who want it both ways: not only do they want the systemic pulse check, but they want the individual pulse check too.
Bookmarked

Marten Koomen responds to the suggestion of linking Year 9 NAPLAN results to future job applications.

Replied to

If NAPLAN’s process works so well, why doesn’t PISA test every student in the world?
Replied to ‘This has caused significant stress’: NAPLAN computer errors anger teachers, students (ABC News)

Victorian schools are given the option to scrap online NAPLAN tests after computer glitches and broadband problems affected schools across the country.

What is slightly disconcerting is that it is hard to find anyone who was surprised by this, especially those who were around during the days of the Ultranet.
Bookmarked QandA: ‘what works’ in ed with Bob Lingard, Jessica Gerrard, Adrian Piccoli, Rob Randall, Glenn Savage (chair) by Glenn Savage (aare.edu.au)

On November 6th, I hosted a Q&A Forum at the University of Sydney, co-sponsored by the AARE ‘Politics and Policy in Education’ Special Interest Group and the School and Teacher Education Policy Research Network at the University of Sydney.

It featured Adrian Piccoli (Director of the UNSW Gonski Institute for Education), Jessica Gerrard (senior lecturer in education, equity and politics at the University of Melbourne), Bob Lingard (Emeritus Professor at the University of Queensland and Professorial Research Fellow at the Australian Catholic University) and Rob Randall (CEO of the Australian Curriculum, Assessment and Reporting Authority).

Glenn Savage chairs a conversation with a varied group of voices discussing the impact of evidence, think tanks and NAPLAN on education.

Marginalia

We can’t rely on a medical model, where RCTs come from, for something like classroom practice, and you can see this in John Hattie’s very influential book Visible Learning. You just have to look at the Preface where he says that he bracketed out of his study any factor that was out of school … there’s no RCT on the funding of elite private schools, but yet we do these things. (Jessica Gerrard)

The think tank usually has a political-ideological position, it usually takes the policy problem as given rather than thinking about the construction, I think it does research and writes reports which have specific audiences in mind, one the media and two the politicians. (Bob Lingard)

NAPLAN is the King Kong of education policy because it started off relatively harmless on this little island and now it’s ripping down buildings and swatting away airplanes. I mean it’s just become this dominant thing in public discourse around education. (Adrian Piccoli)

Liked Teacher learning, not student test results, should be a national priority for Australia by Ian Hardy (EduResearch Matters)

Useful data can also include teachers’ notes about student academic progress more generally, their level of attentiveness in class, as well as about their well-being and social engagement with their peers, and other adults in the school. Seeking to work productively with a wide and deep array of data, beyond simply standardized measures, is the key to fostering substantive teacher learning for student learning.

Bookmarked How school principals respond to govt policies on NAPLAN. (Be surprised how some are resisting) by Dr Amanda Heffernan (EduResearch Matters)

My study found two main ways that she managed to resist the more performative influences of school improvement policies. Firstly, the school had a collaboratively-developed school vision that focused on valuing individual students and valuing the aspects of education that can’t be easily measured. The power of the vision was that it served as a filter for all policy enactment decisions made at the school. If it didn’t align with their vision, it didn’t happen. There was also agreement in this vision from the staff, students, and community members, who kept that vision at the forefront of their work with the school.

The second key aspect was that Anne had developed a strong ‘track record’ with her supervisors, and this engendered trust in her judgment as a leader. She was given more autonomy to make her policy enactment decisions as a result, because of this sense of trust. It was developed over a long time in the same school and in the same region before that. To develop her track record, Anne worked hard to comply with departmental requirements (deadlines, paperwork, and other basic compliance requirements).

Dr Amanda Heffernan reflects upon a case study investigating ‘policy enactment’: how principals implement, or carry out, policy in their schools.

An example of this is the focus on growth, testing and NAPLAN results. She highlights two methods the principal used to resist this pressure: firstly, a clear, collaboratively developed school vision; and secondly, trust built with her system supervisors.

This continues some of the discussion in the collect National Testing in Schools.

Liked Why the NAPLAN results delay is a storm in a teacup by Jim Tognolini (The Conversation)

The real issue underpinning the controversy is the misuse of NAPLAN data. It was never intended that NAPLAN data would be used for fine-grained comparison of students.

The MySchool website has contributed to the misuse of NAPLAN data. For example, the scores from the site are being used to make comparisons irrespective of the “error bands” that need to be taken into account when making comparisons. People are ascribing a level of precision to the results that was never intended when the tests were developed. The test was never designed to be high-stakes and the results should not be used as such.

When people challenge the “validity” of the NAPLAN test, they should be challenging the validity of the use of the results. NAPLAN has a high degree of validity, but we need to understand it better and use the results in a more judicious and defensible manner. The correct use of NAPLAN data is a major issue and it needs to be addressed as a matter of priority.

Liked The NAPLAN online controversy is about a failure of meaning, and not about a failure of technology by Marten Koomen (Tulip Education)

The inability of NAPLAN to reflect broader developments in society is being exposed by the transition to NAPLAN online. The latest NAPLAN controversy is not the result of a glitch or technical incompetence. Instead, the controversy exposes a broader conceptual problem in Australian education. Australian policy-makers and commentators have been spoiled by Australia coming off a high base of educational performance, and by an abundance of educational data that allows for broad and sweeping policy commentary. However, this approach is leading to a continued decline in Australian educational achievement. NAPLAN online exposes the need to reconnect educational assessment with the world that students experience.

Replied to Building the Windmill (or knocking it down again?) by Darcy Moore (Darcy Moore’s Blog)

It is easy to be wise after the event but it was clear to everyone in education at the time what this kind of standardised testing (soon to be turned into pseudo-league tables) would do to our schools and communities. Now, we are all about to embark on the next iteration of school reform with many of the same players in place and the same kind of flawed, grand educational policy about to start afresh. One can only hope we do not forget the lessons of Animal Farm for those of us who have to carry out the real work of planning for the never-ending rebuilding of The Windmill.

It feels like people are picking and choosing the bits that they like in the new Gonski review. I wonder, though, whether we can have the collaboration without the newfound accountability?

Anyway, off to push the rock to the top of the hill once again.

Bookmarked It’s time to be honest with parents about NAPLAN: your child’s report is misleading, here’s how by Nicole Mockler (EduResearch Matters)

At the national level, however, the story is different. What NAPLAN is good for, and indeed what it was originally designed for, is to provide a national snapshot of student ability, and to conduct comparisons between different groups (for example, students with a language background other than English and students from English-speaking backgrounds) on a national level.

This is important data to have. It tells us where support and resources are needed in particular. But we could collect the data we need by using a rigorous sampling method, where a smaller number of children are tested (a sample) rather than having every student in every school sit tests every few years. This is a move that would be a lot more cost effective, both financially and in terms of other costs to our education system.

Nicole Mockler summarises Margaret Wu’s work on the statistical limitations of NAPLAN. Moving forward, Mockler suggests that NAPLAN should become a sample-based test (like PISA), as it is better suited as a tool for system-wide analysis. To me, there is a strange tension where, on the one hand, many agree that NAPLAN is flawed, yet again and again we return to it as a source of ‘truth’.
Listened TER #111 – Learning and Wellbeing with Helen Street – 29 April 2018 from Teachers’ Education Review

Links and notes coming soon! Timecodes:
  • 00:00:00 Opening Credits
  • 00:01:31 Intro
  • 00:02:28 NAPLAN in the news
  • 00:15:04 Feature Introduction
  • 00:16:32 Off Campus – Dan Haesler
  • 00:18:44 Dr Helen S…

Cameron Malcher provides a useful summary of the recent discussions of NAPLAN in the news: