the goal of disinformation isn’t really about these individual transactions. The goal of disinformation is to, over time, change our psychological set-points. To the researcher looking at individuals at specific points in time, the homeostasis looks protective — fire up Mechanical Turk, see what people believe, give them information or disinformation, see what changes. What you’ll find is that nothing changes — set-points are remarkably resilient. But underneath that, from year to year, is drift. And it’s the drift that matters.
But whatever your take, I encourage you to think of disinformation in this way, at least for a bit — not as the spread of false information, but as the hacking of the simulated reality which we all must necessarily inhabit. As something that does not just change knowledge, but which produces new life experiences as real as the Iraq War, your neighbor’s fight with cancer, or your child’s illness. To see it in this way is perhaps more terrifying, but ultimately necessary as we attempt to address the problem.
Truly tackling the problem of hateful misinformation online requires rejecting the false choice between leaving it alone and censoring it outright. The real solution is one that has been entertained by neither Zuckerberg nor his critics: counter-programming hateful or misleading speech with better speech.