Liked A Gradgrind ethos is destroying the school system by Simon Jenkins (the Guardian)
Pisa, Whitehall and Ofsted are obsessed with maths not because algebra is the key to happiness, or geometry to great riches, but because it is easy to score globally. Bereft of an ideal of a good education, government, and especially central government, likes anything that yields mass data. It holds the key to control, to the regime of rewards and penalties that underpins modern administration and its funding.
Bookmarked It’s time to be honest with parents about NAPLAN: your child’s report is misleading, here’s how by Nicole Mockler (EduResearch Matters)

At the national level, however, the story is different. What NAPLAN is good for, and indeed what it was originally designed for, is to provide a national snapshot of student ability, and to conduct comparisons between different groups (for example, students with a language background other than English and students from English-speaking backgrounds) at a national level.

This is important data to have. It tells us where support and resources are particularly needed. But we could collect the data we need by using a rigorous sampling method, where a smaller number of children (a sample) are tested, rather than having every student in every school sit tests every few years. This is a move that would be a lot more cost effective, both financially and in terms of other costs to our education system.

Nicole Mockler summarises Margaret Wu’s work around the limitations of NAPLAN with regard to statistical testing. Moving forward, Mockler suggests that NAPLAN should become a sample-based test (like PISA), better suited as a tool for system-wide analysis. To me, there is a strange tension where on the one hand many agree that NAPLAN is flawed, yet again and again we return to it as a source of ‘truth’.
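To get a feel for why a sample can substitute for a census at the national level, here is a minimal sketch with entirely invented numbers (the population size, score distribution and sample size below are illustrative, not real NAPLAN figures):

```python
import random
import statistics

random.seed(1)

# Hypothetical population: a scaled score for every student in a cohort.
# (Illustrative numbers only - not real NAPLAN data.)
population = [random.gauss(500, 70) for _ in range(400_000)]

# Census approach: test everyone.
census_mean = statistics.mean(population)

# Sample approach: test a random sample instead.
sample = random.sample(population, 5_000)
sample_mean = statistics.mean(sample)

# The standard error of the sample mean quantifies the precision we give up.
se = statistics.stdev(sample) / len(sample) ** 0.5

print(f"census mean: {census_mean:.1f}")
print(f"sample mean: {sample_mean:.1f} (within about {1.96 * se:.1f} at 95% confidence)")
```

With a few thousand students sampled, the national estimate sits within a point or two of the full-census figure, which is the trade-off Mockler’s argument rests on: system-level precision survives, while per-student testing (and its costs) does not.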
Liked Testing is not a moral agent
We should stop treating tests like moral agents that can define the future. I agree with David Rutkowski’s point about agency: perhaps we’d be well advised to think about what is enabled, and what we no longer have to do, when we cede our agency to tests, and to ask whether we really breathe a sigh of relief when our responsibility can be explained away. The desire for a testing regime is a symptom, not a cause, and it seems to me that if you better understand the individual and collective desires at work, you may understand why it is that reconciliACTION and social justice remain distractions.

📓 Measuring NAPLAN Performance

In a message to parents, I came across the following explanation of NAPLAN:

The tests provide parents and schools with an understanding of how individual students are performing at the time of tests.

This is such a hard thing to communicate. It is easy to read ‘performing’ as some sort of exact science, as if Johnny got 33 out of 40 in the recent test on whatever. The problem though is that NAPLAN is not ‘exact’, either at the time or as a measurement of growth. This is highlighted by Richard Olsen in his look at the limitations:

In practice, NAPLAN relative growth is so unreliable that I cannot believe that it is a suitable measure, and I would personally discourage anyone from using it. The narrow range of questions that define average growth, compounded by the error inherent in NAPLAN’s testing method, makes it an extremely unreliable measure.

I think that Margaret Wu captures this best when she explains:

In summary, we would say that a NAPLAN test only provides an indicative level of the performance of a student: whether the student is struggling, on track, or performing above average. The NAPLAN tests do not provide fine grading of students by their performance levels because of the large uncertainties associated with the ability measures.
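Wu’s point about large uncertainties can be illustrated with a small simulation (the ability gap and standard error of measurement below are invented, though of a plausible order for a short test): two students whose true abilities differ only slightly will frequently be ranked in the wrong order by any single administration.

```python
import random

random.seed(0)

# Hypothetical true abilities on a logit scale, 0.2 apart,
# with a per-administration standard error of measurement (values invented).
true_a, true_b = 0.0, 0.2
sem = 0.35

flips = 0
trials = 10_000
for _ in range(trials):
    observed_a = random.gauss(true_a, sem)
    observed_b = random.gauss(true_b, sem)
    if observed_a > observed_b:  # the test ranks the weaker student higher
        flips += 1

print(f"ranking reversed in {flips / trials:.0%} of simulated administrations")
```

Roughly a third of the time the weaker student outscores the stronger one, which is why a single test can support Wu’s coarse categories (struggling, on track, above average) but not a fine grading of students.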

Liked Yes, Project-Based Learning Gets Kids Ready for the Test (and so much more) (A.J. JULIANI)
I’d ask anyone who is criticizing PBL in the classroom to talk to the teachers and students who have had this opportunity. I’d ask them to look at what students are creating, making, and building during this time. I’d ask them to talk to the parents about their students’ attitude towards learning. I give two answers to the question above: Try it for a day and see what happens. Start small and build from there. Teach through the project, instead of using the project as an “end-of-unit” assessment that takes more time than a multiple choice test. When kids learn during the project, the time constraint goes away.
Listened TER #109 – How large-scale tests affect school management with Marten Koomen – 04 March 2018 by Cameron Malcher from Teachers' Education Review
Cameron Malcher speaks with Marten Koomen about his research into the process by which large-scale tests like PISA and NAPLAN affect school management and curriculum.

Marten Koomen frames the conversation around a discussion of collectivism, neoliberalism and skepticism. For collectivists, school is the responsibility of the state, whereas neoliberals consider it another product to be consumed; skepticism, without effective governance, ends in tragedy. Our current climate is very much a response to neoliberalism, however:

We are all part collectivist, part individualist neoliberal and part skeptic, so to identify with one corner is disingenuous.

The key question that Koomen tries to address is: How did Victoria go from a state that was a leader in content knowledge and democratic values to the launch of a content-free platform driven by the terror of performativity? As he explains,

They had this idea of the net, but no idea of the content … a complete infatuation with the technology.

Discussing PISA, Koomen provides some background to computer-based testing and the ‘Koomen Model’. The model involved providing schools with standardised devices to ensure consistency of data; it failed under pressure.

In part, Koomen’s model tells us something about the data we collect and what it can and cannot tell us. There are groups out there that want the outcomes without the content or context. Koomen returns again and again to the difference between entity realism and constructivism:

Entity Realism = things are real

Constructivism = things are agreed upon

Realists ignore context, as it is not mapped back to a central curriculum. This also allows for the insult of the human spirit through the comparison of outcomes, ratios and market results. For example, NAPLAN uses Item Response Theory, a format that does not allow any direct recall of, or reference to, learning and development. This leads to the situation where a student can ‘improve’ yet remain on the same score. Margaret Wu explains this in her chapter in National Testing in Schools, while Sam Sellar, Greg Thompson and David Rutkowski elaborate on it in The Global Education Race.
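One way to see how a student can ‘improve’ yet remain on the same score is the Rasch model, the simplest Item Response Theory model: the raw score alone determines the ability estimate, regardless of which items were answered correctly. A minimal sketch with invented item difficulties:

```python
import math

def rasch_p(theta, b):
    """Probability of a correct answer under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(responses, difficulties):
    """Crude maximum-likelihood ability estimate via grid search."""
    best_theta, best_ll = None, -math.inf
    for step in range(-400, 401):
        theta = step / 100
        ll = sum(
            math.log(rasch_p(theta, b)) if x else math.log(1 - rasch_p(theta, b))
            for x, b in zip(responses, difficulties)
        )
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return best_theta

# Hypothetical item difficulties (in logits) for a five-item test.
difficulties = [-1.0, -0.5, 0.0, 0.5, 1.0]

# Two response patterns with the same raw score (3 out of 5).
student_a = [1, 1, 1, 0, 0]  # answered the three easiest items correctly
student_b = [0, 0, 1, 1, 1]  # answered the three hardest items correctly

print(estimate_ability(student_a, difficulties))
print(estimate_ability(student_b, difficulties))
# The two estimates are identical: in the Rasch model the total score is a
# sufficient statistic for ability, so *which* items were answered drops out.
```

Student B has arguably demonstrated something harder than Student A, yet the model assigns both the same ability estimate; any qualitative ‘improvement’ in what a student can do is invisible unless it moves the raw score.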

For Koomen our decline in these scales comes back to a focus on the market:

Neoliberalism considers content as self-evident, real, axiomatic, socially constructed and marketable, in a way that supports the status quo.

This leads to conversations with students about points on a scale, rather than aspects of context and development. For example, it is easier in the media to talk about a change in ratios or job rates than about the collapse of the car industry and its impact on the state. This allows for the rise of education conferences based around data with little reference to local context.

The answer Koomen closes with is to work together through associations to make systemic change.

Listened TER #108 – ACARA’s Literacy & Numeracy Progressions with Hilary Dixon – 18 Feb 2018 from Teachers' Education Review
Main Features: ACARA’s Acting Director, Curriculum, discusses the new literacy and numeracy progressions, their relationship to curriculum, and intended applications in teaching and assessment practices; Annabel Astbury outlines the ABC’s new education initiative. Regular Features: Off Campus, ...

00:00.000 Opening Credits
01:31.214 Intro
01:55.324 Off Campus – Dan Haesler
12:48.141 Education in the News
20:44.068 ABC Education – Annabel Astbury
28:50.180 Feature Introduction
30:52.440 Interview – Hilary Dixon
59:28.218 Announcements
1:01:52.482 Quote & Sign Off

In this edition of the TER Podcast, Cameron Malcher interviews Hilary Dixon about the new Literacy and Numeracy Progressions released earlier this year by ACARA. Although the interview discusses what the progressions are, it also provides critical context to their creation and where they might sit within the wider debate around NAPLAN and back-to-basics curriculum.

Liked PISA for personality testing – the OECD and the psychometric science of social-emotional skills by Ben Williamson (code acts in education)
SSES extends the reach of datafication of education beyond school walls into the surveillance of home contexts and family life, treating them as a ‘home learning environment’ to be assessed on how it enables or impedes students’ development of valuable socio-emotional skills

Ben Williamson provides a (very partial) overview of some of the key features of SSES, raising a few headline points:

SSES extends international-large scale assessment beyond cognitive skills to the measurement of personality and social-emotional skills

SSES will deliver a direct assessment instrument modelled on psychological personality tests

SSES enacts a psychological five-factor model of personality traits for the assessment of students, adopting a psychometric realist assumption that personality test data capture the whole range of cross-cultural human behaviour and emotions in discrete quantifiable categories

SSES extends the reach of datafication of education beyond school walls into the surveillance of home contexts and family life, treating them as a ‘home learning environment’ to be assessed on how it enables or impedes students’ development of valuable socio-emotional skills

SSES normalizes computer-based assessment in schools, with students required to produce direct survey data while also being measured through indirect assessments provided by teachers, parents and leaders

SSES produces increasingly fine-grained, detailed data on students’ behaviours and activities at school and at home that can be used for targeted intervention based on analyses performed at a distance by an international contractor

SSES involves linking data across different datasets, with direct assessment data, indirect assessments, school administrative data, and process metadata generated during assessment as multiple sources for both large-scale macro-analysis and fine-grained micro-analytics–with potential for linking data from other OECD assessments such as PISA

SSES uses digital signals such as response times and keystrokes, captured as process metadata in software log files, as sources for stealth assessment based on assumptions about their correlation with specific social-emotional skills

SSES promotes a therapeutic role for education systems and schools, by identifying ‘success’ factors in SELS provision and encouraging policymakers to develop targeted intervention where such success factors are not evident

SSES treats students’ personalities as malleable, and social-emotional skills as learnable, seeking to produce policy-relevant psychometric knowledge for policymakers to design interventions to target student personalities

SSES exemplifies how policy-relevant knowledge is produced by networks of influential international organizations, connected discursively and organizationally to think tanks, government departments and outsourced contractors

SSES represents a psycho-economic hybridization of psychological and psychometric concepts and personality measurement practices with economic logics relating to the management of labour market behaviours and human resources

Bookmarked 'We must kill this cult of measuring everything that schools do' (Tes)
The idea that the solution to the nefarious effects of constant high-stakes measurement is to bring in more high-stakes measurement – albeit of a different thing – is palpably insane. It is further evidence, if we needed any, that we have surrendered our profession to a cultish scientism whose mantra is measurement.
JT Dutaut wonders about a future where the solution to too much testing in education is more testing.