Tag: Teaching Machines
The challenge, I think, is to convey this drudgery to readers not only in contrast to the fantasies of shiny and efficient teaching machines – stories about robot teachers or otherwise – but also as the same sort of drudgery that today’s ed-tech dictates. Calling it “personalized learning” doesn’t make today’s computer-based instruction any more exciting. I promise you.
In the introduction to Teaching Machines and Programmed Learning: A Source Book (1960), a collection of articles penned by some of the best-known theorists and practitioners in the field (including both Skinner and Pressey), A. A. Lumsdaine lists these as the three key properties of “teaching machines”:
First, continuous active student response is required, providing explicit practice and testing of each step of what is to be learned.
Second, a basis is provided for informing the student with minimal delay whether each response he makes is correct, leading him directly or indirectly to correction of his errors.
Third, the student proceeds on an individual basis at his own rate – faster students romping through an instructional sequence very rapidly, slower students being tutored as slowly as necessary, with indefinite patience to meet their special needs.
The devices thus represent a way of providing a pre-programmed study-practice combination which simulates, in partially or fully automated fashion, the functions of a private tutor in recitation and practice, with immediate correction of errors and feedback to the student.
These are “technologies of behavior” that we can trace back to Skinner – perhaps not directly, but certainly indirectly, due to Skinner’s continual engagement with the popular press, his fame, and his notoriety. Behavioral management – specifically through operant conditioning – remains a staple of child rearing and pet training. It is at the core of one of the most popular ed-tech apps currently on the market, ClassDojo. Behaviorism also underpins the idea that data about how we behave when we click can give programmers insight into how to alter their software and into what we’re thinking.
I think there’s a lot to say about machine learning and the push for “personalization” in education. And the historian in me cannot help but add that folks have been trying to “personalize” education using machines for about a century now. The folks building these machines have, for a very long time, believed that collecting the data students generate while using them will help improve their “programmed instruction” – and this decades before Mark Zuckerberg was born.
I think we can talk about the labor issues – how this continues to shift expertise and decision making in the classroom, for starters, but also how students’ data and students’ work are being used for commercial purposes. I think we can talk about privacy and security issues – how sloppily we know these companies, and unfortunately our schools as well, handle student and teacher information.
But I’ll pick two reasons why we should be much more critical of education technologies.
Anytime you hear someone say “personalization” or “AI” or “algorithmic,” I urge you to replace that phrase with “prediction.”
It’s official. I signed the book contract late last night: the MIT Press has agreed to publish Teaching Machines.
That was an interesting comment about behaviorism. It is a reminder that history is always a trace or thread and can never really be a complete recreation.