Bookmarked It’s time to be honest with parents about NAPLAN: your child’s report is misleading, here’s how (EduResearch Matters)

At the national level, however, the story is different. What NAPLAN is good for, and indeed what it was originally designed for, is providing a national snapshot of student ability and enabling comparisons between different groups (for example, students with a language background other than English and students from English-speaking backgrounds) at a national level.

This is important data to have. It tells us where support and resources are particularly needed. But we could collect the data we need by using a rigorous sampling method, where a smaller number of children are tested (a sample) rather than having every student in every school sit tests every few years. This is a move that would be a lot more cost effective, both financially and in terms of other costs to our education system.

Nicole Mockler summarises Margaret Wu’s work around the limitations of NAPLAN with regard to statistical testing. Moving forward, Mockler suggests that NAPLAN should become a sample-based test (like PISA), better suited as a tool for system-wide analysis. To me, there is a strange balance where, on the one hand, many agree that NAPLAN is flawed, yet again and again we return to it as a source of ‘truth’.
Bookmarked What We Talk About When We Talk About Digital Capabilities: Keynote for #udigcap | Donna Lanclos–The Anthropologist in the Stacks (donnalanclos.com)

The history of Anthropology tells us that categorizing people is lesser than understanding them. Colonial practices were all about the describing and categorizing, and ultimately, controlling and exploiting. It was in service of empire, and anthropology facilitated that work.

It shouldn’t any more, and it doesn’t have to now.

You don’t need to compile a typology of students or staff. You need to engage with them.

In a keynote at the UCISA Digital Capabilities event at Warwick University, Donna Lanclos unpacks the effects of analytics and the problems of profiling when trying to identify improvements. A skills-based approach becomes a problem when decisions get made on your behalf based on the results of a pre-conceived checklist:

I want to draw a line from quiz-type testing that offers people an opportunity to profile themselves and the problems inherent in reducing knowledge work to a list of skills. And I also want to draw attention to the risks to which we expose our students and staff, if we use these “profiles” to predict, limit, or otherwise determine what might be possible for them in the future.

Lanclos suggests that we need to go beyond the inherent judgments contained within metaphors and deficit models, and instead start with context:

We need to start with people’s practices, and recognize their practice as effective for them in certain contexts.

And then ask them questions. Ask them what they want to do. Don’t give them categories, labels are barriers. Who they are isn’t what they can do.

Please, let’s not profile people.

When you are asking your students and staff questions, perhaps it should not be in a survey. When you are trying to figure out how to help people, why not assume that the resources you provide should be seen as available to all, not just the ones with “identifiable need?”

The reason deficit models persist is not a pedagogical one, it’s a political one.

She closes with the remark:

When we ask students questions, it shouldn’t be in a survey.

This reminds me of coaching and the fluidity of the conversation. It also touches on my concern with emotional intelligence as a conversational tool.

The interior of the triangle is where people map the practices that are bounded by their institution and the work they do in institutional digital platforms and places. The exterior of the triangle is where they can map everything else–what they do that is not bounded by the institution. This can be their personal lives, or their work that does not take place in official channels, but rather on the open web, in self-hosted or commercial platforms.

There is also a recording of this presentation.

Listened Golden State Killer: the end of DNA privacy? Chips with Everything podcast (the Guardian)

US investigators recently tracked down the suspect in a 40-year-old murder case after uploading DNA to a genealogy website. Jordan Erica Webber weighs up the pros of finding ancestors against the cons of selling privacy.

Jordan Erica Webber talks to Prof Charles Tumosa of the University of Baltimore, Prof Denise Syndercombe-Court of King’s College and Lee Rainie of the Pew Research Center. This is a challenging conversation and comes back to notions of ‘informed consent’.

Maggie Koerth-Baker discusses changes in data arguing that we need to stop seeing privacy as a ‘personal’ thing:

Experts say these examples show that we need to think about online privacy less as a personal issue and more as a systemic one. Our digital commons is set up to encourage companies and governments to violate your privacy. If you live in a swamp and an alligator attacks you, do you blame yourself for being a slow swimmer? Or do you blame the swamp for forcing you to hang out with alligators?

Bookmarked I am a data factory (and so are you) (ROUGH TYPE)

The shift of data ownership from the private to the public sector may well succeed in reducing the economic power of Silicon Valley, but what it would also do is reinforce and indeed institutionalize Silicon Valley’s computationalist ideology, with its foundational, Taylorist belief that, at a personal and collective level, humanity can and should be optimized through better programming.

Nicholas Carr reflects on the metaphors that we use and demonstrates some of their flaws, particularly when they are inadvertently used against us. This is something brought to the fore with Google’s effort to support wellbeing. As Arielle Pardes explains:

While Google says “digital wellness” is now part of the company’s ethos, not once during the Google I/O keynote did anyone mention “privacy.”

Liked To work for society, data scientists need a hippocratic oath with teeth (WIRED UK)

The first question is, are the algorithms that we deploy going to improve the human processes that they are replacing? Far too often we have algorithms that are thrown in with the assumptions that they’re going to work perfectly, because after all they’re algorithms, but they actually end up working much worse than the system that they’re replacing. For example in Australia they implemented an algorithm that sent a bunch of ridiculously threatening letters to people saying that they had defrauded the Australian Government. That’s a great example where they actually just never tested it to make sure it worked.

The second question is to ask, for whom is the algorithm failing? We need to be asking, “Does it fail more often for women than for men? Does it fail more often for minorities than for whites? Does it fail more often for old people than for young people?” Every single class should get a question and an answer. The big example I have for this one is the facial recognition software that the MIT Media Lab found worked much better for white men than black women. That is a no-brainer test that every single facial recognition software company should have done and it’s embarrassing that they didn’t do it.

The third category of question is simply, is this working for society? Are we tracking the mistakes of the system? Are we inputting these mistakes back into the algorithm so that it’ll work better? Is it causing some other third unintended consequence? Is it destroying democracy? Is it making people worse off?
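The second question above, for whom does the algorithm fail, amounts to disaggregating the error rate by group rather than reporting one overall number. A minimal sketch of that audit, using hypothetical data and a hypothetical function name (this is illustrative, not code from the article):

```python
# Illustrative sketch: compute a separate failure rate for every group,
# as the quoted questions suggest ("Every single class should get a
# question and an answer"). Data and names here are hypothetical.
from collections import defaultdict

def failure_rates_by_group(records):
    """records: iterable of (group, predicted, actual) tuples."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

records = [
    ("group_a", 1, 1), ("group_a", 0, 1), ("group_a", 1, 1), ("group_a", 1, 1),
    ("group_b", 0, 1), ("group_b", 1, 0), ("group_b", 1, 1), ("group_b", 0, 0),
]
rates = failure_rates_by_group(records)
print(rates)  # each group's share of misclassified records
```

A group whose rate is markedly higher than the others, as in the facial recognition example, is exactly the failure the article says should have been caught before deployment.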

Bookmarked Comments on ClassDojo controversy (code acts in education)

The educational app ClassDojo has been the target of articles in several British newspapers. The Times reported on data privacy risks raised by the offshoring of UK student data to the US company–a story The Daily Mail re-reported. The Guardian then focused on ClassDojo promoting competition in classrooms. All three pieces have generated a stream of public comments. At the current time, there are 56 comments on the Mail piece, 78 at The Times, and 162 on The Guardian. I’ve been researching and writing about ClassDojo for a couple of years, on and off, and was asked some questions by The Times and The Guardian. So the content of the articles and the comments and tweets about them raise issues and questions worth their own commentary–a response to key points of controversy that also speak to wider issues with the current expansion of educational technology across public education, policy and practice. ClassDojo has also now released its own response and reaffirmation of its privacy policy.

Ben Williamson addresses a number of questions levelled at ClassDojo, especially in light of the current concern around data. One of the points that really stuck out was the notion of ‘sensitive data’. Often this is defined in terms of privacy; however, as Williamson explains, the collection of data over time has the potential to turn the seemingly arbitrary into the sensitive.

ClassDojo has been dealing with privacy concerns since its inception, and it has well-rehearsed responses. Its reply to The Times was: ‘No part of our mission requires the collection of sensitive information, so we don’t collect any. … We don’t ask for or receive any other information [such as] gender, no email, no phone number, no home address.’ But this possibly misses the point. The ‘sensitive information’ contained in ClassDojo is the behavioural record built up from teachers tapping reward points into the app.

Williamson does, however, close with a warning that, with GDPR coming into force, ‘data danger’ is quickly becoming its own genre:

The risks of ‘data-danger’ for children reported in the articles about ClassDojo doubtless need to be viewed through the wider lens of media interest in social media data misuses following the Facebook/Cambridge Analytica scandal. This presents opportunities and challenges. It’s an opportunity to raise awareness and perhaps prompt efforts to tighten up student privacy and data protection, where necessary, as GDPR comes into force. ClassDojo’s response to the controversy raised by the press confirmed it was working on GDPR compliance and would update its privacy policy accordingly. Certainly 2018 is shaping up as a year of public awareness about uses and misuses of personal data. It’s a challenge too, though, as media coverage tends to stir up overblown fears that risk obscuring the reality, and that may then easily be dismissed as paranoid conspiracy theorizing. It’s important to approach ed-tech apps like ClassDojo–and all the rest–cautiously and critically, but to be careful not to get swept up in media-enhanced public outrage.

Bookmarked Here’s what is going wrong with ‘evidence-based’ policies and practices in schools in Australia (EduResearch Matters)

So on a general level, the case for evidence-based practice has a definite value. But let’s not over-extend this general appeal, because we also have plenty of experience of seeing good research turn into zealous advocacy with dubious intent and consequence. The current over-extensions of the empirical appeal have led paradigmatic warriors to push the authority of their work well beyond its actual capacity to inform educational practice. Here, let me name two forms of this over-extension.

James Ladwig unpacks evidence-based approaches. In response to ‘synthetic reviews’, he suggests:

Simply ask ‘effect on what?’ and you have a clear idea of just how limited such meta-analyses actually are.

While with regard to RCTs, he states:

By definition, RCTs cannot tell us what the effect of an innovation will be simply because that innovation has to already be in place to do an RCT at all. And to be firm on the methodology, we don’t need just one RCT per innovation, but several – so that meta-analyses can be conducted based on replication studies.

Another issue is that research shows what has happened, not what will happen. This is not to say no to evidence, but rather a call to be sensible about what we think we can learn from it.

What it can do is provide a solid basis of knowledge for teachers to know and use in their own professional judgements about what is the best thing to do with their students on any given day. It might help convince schools and teachers to give up on historical practices and debates we are pretty confident won’t work. But what will work depends entirely on the innovation, professional judgement and, as Paul Brock once put it, nous of all educators.

This touches on Mark Esner’s argument that a great teacher will make anything work to a degree, as well as Deborah Netolicky’s observations about evidence.

Bookmarked Switching up my signals (W. Ian O’Byrne)

This is not easy. This is not normal. This is a bit challenging as I’m forcing myself to redirect the streams that the social networks have made super simple for me (and others) to use over time. This is not easy as general users are conditioned to the sorts of signals, environments, and features that are rolled out over time. What I’m trying to do here will not make sense to most people who I interact with. This will confuse and possibly anger some of my followers. This may also cause many users to unfollow me, or (better yet) the algorithms on the social networks will just filter me out of the discussions.

Ian O’Byrne reflects on the signals that he shares online and his efforts to reclaim them through the creation of his own digital commonplace book. I too have gone down this path, exploring the many possibilities of the #IndieWeb. It is not a simple solution; however, there is something about engaging in what Clay describes as ‘awkward workflows’. I like how O’Byrne’s post captures some of the wicked questions that this all poses, such as how we share and which links we use.

Liked Facebook warns investors to expect bigger and worse scandals than Cambridge Analytica (Boing Boing)

In reality, Facebook is designed to allow its partners to violate its users’ privacy, so the fact that Cambridge Analytica got caught with its hand in 80 million of our cookie-jars is an indication of how incompetent they were (they were the easiest to detect, in part because of their public boasting about their wrongdoing), and that means there are much worse scammers who are using Facebook to steal our data in ways that makes CA look like the petty grifters they are.

Liked From Student Agency to Dating Agency: Hiring Teachers by Algorithm (maelstrom)

There is probably little doubt that the analysis of data will play an increasing role in teacher recruitment. I am sure that among the companies involved in the development of such platforms there are many good people with solid beliefs and values, individuals who will want to see these systems used in conjunction with personal connections, interviews, and relationships. In other words, in very humane ways, using the algorithm as a guide, not a decision-maker, and this is where biometric data may prove initially attractive. The question, of course, with all “data-driven” initiatives lies not so much with the intent or even the veracity of the data collected, but with how it is used. Data can too easily become the decision-making tool of lazy convenience and ends up being used in ways never intended. When I consider my teaching colleagues, I recoil at the prospect of viewing them as data points. Someone needs to shout stop.

Liked Testing is not a moral agent (drbeardface.net)

We should stop treating tests like moral agents that can define the future. I agree with David Rutkowski’s point about agency: perhaps we’d be well-advised to think about what is enabled, and what we don’t have to do, when we cede our agency to tests, and ask whether we really breathe a sigh of relief that it is our responsibility we can explain away. The desire for a testing regime is a symptom, not a cause, and it seems to me that if you better understand those individual and collective desires at work, you may understand why it is that reconciliACTION and social justice remain distractable.

Replied to 🎞 The Circle (2017) by Chris Aldrich (Chris Aldrich | BoffoSocko)

Watched The Circle (2017) from STX Entertainment. Directed by James Ponsoldt. With Emma Watson, Tom Hanks, John Boyega, Ellar Coltrane.
A woman lands a dream job at a powerful tec…

I was disappointed with the movie too. I highly recommend the book. It really builds things up gradually, whereas the film rushes straight to the point and fails to set a deep foundation.

Liked Google’s Facebook Copycat Moves Leave It More Exposed to Privacy Backlash (Bloomberg.com)

Google is already buttoning up its data policies in anticipation of Europe’s General Data Protection Regulation, or GDPR, which kicks in next month. The company restricted the number of third-party companies that can serve and track ads through its advertising exchange and on YouTube. Google is also requiring publishers to get user consent for targeted ads to comply with GDPR.

Bookmarked Personalized precision education and intimate data analytics (code acts in education)

Precision education represents a shift from the collection of assessment-type data about educational outcomes, to the generation of data about the intimate interior details of students’ genetic make-up, their psychological characteristics, and their neural functioning.

Ben Williamson breaks down the idea of precision through the use of data and how it might apply to education.

Liked Cambridge Analytica: the data analytics industry is already in full swing (The Conversation)

If we want a full and comprehensive debate about the role of data in our lives, we need to first appreciate that the analysis and use of our data is not restricted to the types of figures that we have been reading about in these recent stories – it is deeply embedded in the structures in which we live.

Bookmarked 10 definitions of datafication (in education) by Ben Williamson (code acts in education)

In simple terms, datafication can be said to refer to ways of seeing, understanding and engaging with the world through digital data. This definition draws attention to how data makes things visible, knowable, and explainable, and thus amenable to some form of action or intervention. However, to be a bit more specific, there are at least ten ways of defining datafication.

Ben Williamson documents ten ways of defining ‘datafication’:

  • Historically
  • Technically
  • Epistemologically
  • Ontologically
  • Socially
  • Politically
  • Culturally
  • Imaginatively
  • Dystopically
  • Legally & ethically

This is a good introduction to his book Big Data in Education.

Bookmarked Personality Tests and the Downfall of Democracy (word.weid.io)

Facebook has been designed to be an information-gathering engine in order to more effectively sell personalized advertising. Its algorithm also attempts to deeply understand your interests in order to “optimize for engagement”: keep you using the site, and therefore viewing those personalized ads, for as long as possible. Its users access Facebook for 50 minutes a day.

In order to gather the most information it can, Facebook has been engineered to be the world’s most efficient peer pressure engine. Users on the platform are constantly being persuaded to stay; those who try and leave report being relentlessly emailed with personalized, emotional content to try and get them to come back.

Werdmuller explains how personality quizzes can be, and have been, used by Facebook to develop a complex profile. In light of the Facebook breach and concerns around Cambridge Analytica, Werdmuller explains that none of this should surprise anyone; it is how the platform has been designed.

Tantek Çelik explains this in the IndieWeb Chat:

The big reveal (IMO) of the FB/CA disclosures is that nothing you post to FB is actually “private”, in practice it is silently shared with random apps (that you happen to use your FB ID to sign into), which then are sharing it with other orgs via acquisition or just outright selling your data.

Liked ClassDojo poses data protection concerns for parents (Parenting for a Digital Future)

It is time to support parents and teachers to ask critical questions about ClassDojo. As the owners and controllers of a vast global database of children’s behavioural information and a global social media site for schools, its entrepreneurial founders need to be more transparent about what they intend to do with that data, how they intend to generate income from it, and how they want ClassDojo to play a part in interactions between children.