Many Americans — especially conservative Americans — do not trust contemporary news organizations. This “crisis” is well-trod territory, but the focus on fact-checking, media literacy, and business models tends to obscure three features of the contemporary information landscape that I think are poorly understood:
- Differences in worldview are being weaponized to polarize society.
- We cannot trust organizations, institutions, or professions when they’re abstracted away from us.
- Economic structures built on value extraction cannot enable healthy information ecosystems.
Doctorow creates these oppositional positions to make a point and to highlight that there is a war over epistemology, or the way in which we produce knowledge. The reality is much messier, because what’s at stake isn’t simply a matter of resolving two competing worldviews. Rather, what’s at stake is that there is no universal way of knowing, and we have reached a stage in our political climate where there is more power in seeding doubt, destabilizing knowledge, and encouraging people to distrust other systems of knowledge production.
As the institutional construction of news media becomes more and more divorced from the everyday lives of the vast majority of people in the United States, we can and should expect trust in news to decline. No amount of fact-checking will make up for a widespread feeling that coverage is biased. No amount of articulated ethical commitments will make up for the feeling that you are being fed clickbait headlines.
It doesn’t take a quasi-documentary to realize that McDonald’s is not a fast-food franchise; it’s a real estate business that uses a franchise structure to extract capital from naive entrepreneurs.
No amount of innovative new business models will make up for the fact that you can’t sustain responsible journalism within a business structure that requires newsrooms to make more money quarter over quarter to appease investors. This does not mean that you can’t build a sustainable news business, but if the news is beholden to investors trying to extract value, it’s going to be impossible. And if news companies have no assets to rely on (such as their now-sold real estate), they are fundamentally unstable and likely to engage in unhealthy business practices out of economic desperation.
ROI capitalism isn’t the only version of capitalism out there. We take it for granted and tacitly accept its weaknesses by creating binaries, as though the only alternative is Cold War Soviet Union–styled communism. We’re all frogs in an ocean that’s quickly getting warmer. Two degrees will affect a lot more than oceanfront properties.
There are three key higher-order next steps, all of which are at the scale of the New Deal.
- Create a sustainable business structure for information intermediaries (like news organizations) that allows them to be profitable without the pressure of ROI.
- Actively and strategically rebuild the social networks of America.
- Find new ways of holding those who are struggling.
Trust cannot be demanded. It’s only earned by being there at critical junctures when people are in crisis and need help. You don’t earn trust when things are going well; you earn trust by being a rock during a tornado.
That presentation is worth watching (or listening to) as well.
How is it that it’s not necessarily [technologies’] intentions, but the structuring configuration, that causes the pain?
danah boyd continues her investigation of algorithms and the way in which our data is being manipulated. This is very much a wicked problem with no clear answer. Data & Society have also published a primer on the topic. I wonder if it starts by being aware of the systemic nature of it all? Alternatively, Jamie Williams and Lena Gunn provide five questions to consider when using algorithms.
- “Actively taking things out of context can be helpful for analysis”
- “help students truly appreciate epistemological differences”
- “help students see how they fill in gaps when the information presented to them is sparse and how hard it is to overcome priors [confirmation bias and selective attention]”
Benjamin Doxtdator raises a concern about this focus on the individual:
Would boyd’s cognitive strength training exercises have helped here? No. Turning inwards to psychology, rather than outwards to the political context, is precisely what gives us ‘lone wolf’ analyses of white supremacy.
Instead Doxtdator suggests considering the technical infrastructure. Interestingly, she does touch on platforms in the Q&A at the end:
One of the things that is funny is that these technologies get designed for a very particular idea of what they could be used for and then they twist in different ways.
I’ve never been one to feel the need to put on a lot of makeup in order to leave the house and I haven’t been someone who felt the need to buy bots to appear cool online. But I find it deeply hypocritical to listen to journalists and politicians wring their hands about fake followers and bots given that they’ve been playing at that game for a long time. Who among them is really innocent of trying to garner attention through any means possible?
Many people have unhealthy habits and dynamics in their life. Some are rooted in physical addiction. Others are habitual or psychological crutches. But across that spectrum, most people are aware of when something that they’re doing isn’t healthy. They may not be able to stop. Or they may not want to stop. Untangling that is part of the challenge. When you feel as though your child has an unhealthy relationship with technology (or anything else in their life), you need to start by asking if they see this the same way you do. When parents feel as though what their child is doing is unhealthy for them, but the child does not, the intervention has to be quite different than when the child is also concerned about the issue.
Parents don’t like to see that they’re part of the problem or that their efforts to protect and help their children might backfire.
In response, she suggests two things for parents to do:
- Verbalize what you’re doing with your phone
- Create a household contract
After reading this, I tried verbalising my actions, and it soon became apparent when the phone could probably go away.
Rather than thinking of AI as “artificial intelligence,” Eubanks effectively builds the case that, in practice, AI often means “automating inequality.”
Published in DLTV Journal 1.2 December 2014