Student insights from my study indicated that students’ relationship with their teachers is vitally important. This is consistent with previous research, such as the study by psychologist and Director of the Center for Reality Therapy, Bob Wubbolding, who found that “the higher the quality of student-teacher relationship, the higher the level of students’ interest in learning”. A student’s perception of that relationship is fundamental because, as this study found, students view their teacher as either policing or supporting their learning.
Insights from the students in my study also suggested that pedagogy was a key issue in (dis)engaging students. So teachers who practice pedagogies I call Connective, Participatory, and Differentiated, which are founded on connections with students, will have fewer disengaged students.
The glaring contradiction in the report, as I see it, is that it asks for massive changes to an assembly-line reality by advocating for more assessment assembly-lines. Ken Boston in his recent commentary speaks to this by advocating that this is an “evolution not a revolution.” What this argument for learning progressions leaves unexamined is the assumption that learning can be standardized across children. Chunking out a NAPLAN component each day or week turns teachers into test givers and paper pushers rather than gifted learning scientists negotiating each child’s journey through the curriculum so that children are engaged and inspired, not lab rats.
So on a general level, the case for evidence-based practice has a definite value. But let’s not over-extend this general appeal, because we also have plenty of experience of seeing good research turn into zealous advocacy with dubious intent and consequence. The current over-extensions of the empirical appeal have led paradigmatic warriors to push the authority of their work well beyond its actual capacity to inform educational practice. Here, let me name two forms of this over-extension.
Simply ask ‘effect on what?’ and you have a clear idea of just how limited such meta-analyses actually are.
Regarding RCTs, he states:
By definition, RCTs cannot tell us what the effect of an innovation will be simply because that innovation has to already be in place to do an RCT at all. And to be firm on the methodology, we don’t need just one RCT per innovation, but several – so that meta-analyses can be conducted based on replication studies.
Another issue is that research shows what has happened, not what will happen. This is not a rejection of evidence, but a call to be sensible about what we can learn from it.
What it can do is provide a solid basis of knowledge for teachers to know and use in their own professional judgements about what is the best thing to do with their students on any given day. It might help convince schools and teachers to give up on historical practices and debates we are pretty confident won’t work. But what will work depends entirely on the innovation, professional judgement and, as Paul Brock once put it, nous of all educators.
The ALP’s pledge to fund an ‘Evidence Institute for Schools’ lacks attention to what is needed most—funding for schools and classrooms. Further, the effectiveness of this large sum of funding spent on an institute is premised on the notion that it will produce significantly more effective research than is already available.
Here’s what could be done:
- investigating more efficient ways to encourage the uptake of educational research in our schools and universities
- improving overall accessibility of education research to the public