The “Star Wars Kid” video is a classic example of a widely viewed video that was shared online to embarrass a teen. In 2002, a fourteen-year-old heavyset boy created a home video of himself swinging a golf ball retriever as though it were a lightsaber from Star Wars. A year later, a classmate of his found this home video, digitized it, and put it up online. Others edited the video, setting the action to music and dubbing in sound effects, graphics, and other special effects. The resulting “Star Wars Kid” video spread rapidly and received extensive media attention. It became the source of new memes and mocking video spin-offs. Even comedians like Weird Al Yankovic and Stephen Colbert produced their own renditions. Although people gained attention for spreading the video or creating their own versions, the cost of this mass attention was devastating to the teenager in the video. His family sued his classmates for emotional duress because of the ongoing harassment he faced.
The “Star Wars Kid” video exemplifies how mass public shaming is a byproduct of widespread internet attention and networked distribution. Teenagers commonly face a lesser version of this when they receive unexpected and unwanted attention, when they become the target of a rumor, or when others share their content beyond its intended audience. Social media complicates the dynamics of social sharing and gossip because it provides a platform for information to spread far and wide, and people are often motivated to spread embarrassing content because others find it interesting. Spreadable media can be used to drum up productive attention, but it can also be used to shame.
The goal shouldn’t be to avoid being evil; it should be to actively do good. But it’s not enough to say that we’re going to do good; we need to collectively define — and hold each other to — shared values and standards.
People can change. Institutions can change. But doing so requires all who harmed — and all who benefited from harm — to come forward, admit their mistakes, and actively take steps to change the power dynamics. It requires everyone to hold each other accountable, but also to aim for reconciliation not simply retribution. So as we leave here tonight, let’s stop designing the technologies envisioned in dystopian novels. We need to heed the warnings of artists, not race head-on into their nightmares. Let’s focus on hearing the voices and experiences of those who have been harmed because of the technologies that made this industry so powerful. And let’s collaborate with and design alongside those communities to fix these wrongs, to build just and empowering technologies rather than those that reify the status quo.
Let me be clear — this is deeply destabilizing for me. I am here today in no small part because I benefited from the generosity of men who tolerated and, in effect, enabled unethical, immoral, and criminal men. And because of that privilege, I managed to keep moving forward even as the collateral damage of patriarchy stifled the voices of so many others around me. I am angry and sad, horrified and disturbed because I know all too well that this world is not meritocratic. I am also complicit in helping uphold these systems.
I am here today because I learned how to survive and thrive in a man’s world, to use my tongue wisely, watch my back, and dodge bullets. I am being honored because I figured out how to remove a few bricks in those fortified walls so that others could look in. But this isn’t enough.
This all comes in light of revelations about those who benefited from their ties with Jeffrey Epstein.
boyd explains that we are now faced with a challenge to build, rather than break, a better web.
The Great Reckoning is in front of us. How we respond to the calls for justice will shape the future of technology and society. We must hold accountable all who perpetuate, amplify, and enable hate, harm, and cruelty. But accountability without transformation is simply spectacle. We owe it to ourselves and to all of those who have been hurt to focus on the root of the problem. We also owe it to them to actively seek to not build certain technologies because the human cost is too great.
Our emotions are being manipulated, hacked, and shared like never before. So what does this mean for their future, for our relationships, and for the technology that’s reading them?
The social media tools that teens use are direct descendants of the hangouts and other public places in which teens have been congregating for decades. What the drive-in was to teens in the 1950s and the mall in the 1980s, Facebook, texting, Twitter, instant messaging, and other social media are to teens now. Teens flock to them knowing they can socialize with friends and become better acquainted with classmates and peers they don’t know as well. They embrace social media for roughly the same reasons earlier generations of teens attended sock hops, congregated in parking lots, colonized people’s front stoops, or tied up the phone lines for hours on end. Teens want to gossip, flirt, complain, compare notes, share passions, emote, and joke around. They want to be able to talk among themselves—even if that means going online. (pp. 20–21)
This episode also raises the question about the internet of things and the potential to gather emotional data. This is a topic touched upon by Ben Williamson in his book Big Data in Education.
Epistemology is the term that describes how we know what we know. Most people who think about knowledge think about the processes of obtaining it. Ignorance is often assumed to be not-yet-knowledgeable. But what if ignorance is strategically manufactured? What if the tools of knowledge production are perverted to enable ignorance?
What’s at stake right now is not simply about hate speech vs. free speech or the role of state-sponsored bots in political activity. It’s much more basic. It’s about purposefully and intentionally seeding doubt to fragment society. To fragment epistemologies. This is a tactic that was well-honed by propagandists.
One of the best ways to seed agnotology is to make sure that doubtful and conspiratorial content is easier to reach than scientific material. And then to make sure that whatever scientific information is available is undermined. One tactic is to exploit “data voids.” These are areas within a search ecosystem where there’s no relevant data; those who want to manipulate media purposefully exploit these. Breaking news is one example of this. Another is to co-opt a term that was left behind, like “social justice.”
Playing for Team Human today, technology and social media scholar, founder of Data & Society Research Institute, and author of It’s Complicated: The Social Lives of Networked Teens, danah boyd.
if we don’t support young people in building out a strategically rich graph, they will reinforce the worst segments of our society (1.10)
For those who may not have kept up with boyd’s work since It’s Complicated, this is a really good introduction.
You are not algorithms. But you are also not neutral. And because you have the power to amplify messages, people also want to manipulate you. That’s just par for the course. And in today’s day and age, it’s not just corporations, governments, and PR shops that have your number. Just as the US military needed to change tactics to grapple with a tribal, networked, and distributed adversary, so must you. Focus on networks — help connect people to information. Build networks across information and across people. Be an embedded part of the social fabric of this country.
Democracy depends on you.
Many Americans — especially conservative Americans — do not trust contemporary news organizations. This “crisis” is well-trod territory, but the focus on fact-checking, media literacy, and business models tends to obscure three features of the contemporary information landscape that I think are poorly understood:
- Differences in worldview are being weaponized to polarize society.
- We cannot trust organizations, institutions, or professions when they’re abstracted away from us.
- Economic structures built on value extraction cannot enable healthy information ecosystems.
Doctorow creates these oppositional positions to make a point and to highlight that there is a war over epistemology, or the way in which we produce knowledge. The reality is much messier, because what’s at stake isn’t simply about resolving two competing worldviews. Rather, what’s at stake is that there is no universal way of knowing, and we have reached a stage in our political climate where there is more power in seeding doubt, destabilizing knowledge, and encouraging others to distrust other systems of knowledge production.
As the institutional construction of news media becomes more and more proximately divorced from the vast majority of people in the United States, we can and should expect trust in news to decline. No amount of fact-checking will make up for a widespread feeling that coverage is biased. No amount of articulated ethical commitments will make up for the feeling that you are being fed clickbait headlines.
It doesn’t take a quasi-documentary to realize that McDonald’s is not a fast-food franchise; it’s a real estate business that uses a franchise structure to extract capital from naive entrepreneurs.
No amount of innovative new business models will make up for the fact that you can’t sustain responsible journalism within a business structure that requires newsrooms to make more money quarter over quarter to appease investors. This does not mean that you can’t build a sustainable news business, but if the news is beholden to investors trying to extract value, it’s going to be impossible. And if news companies have no assets to rely on (such as their now-sold real estate), they are fundamentally unstable and likely to engage in unhealthy business practices out of economic desperation.
ROI capitalism isn’t the only version of capitalism out there. We take it for granted and tacitly accept its weaknesses by creating binaries, as though the only alternative is Cold War Soviet Union–styled communism. We’re all frogs in an ocean that’s quickly getting warmer. Two degrees will affect a lot more than oceanfront properties.
There are three key higher-order next steps, all of which are at the scale of the New Deal.
- Create a sustainable business structure for information intermediaries (like news organizations) that allows them to be profitable without the pressure of ROI.
- Actively and strategically rebuild the social networks of America.
- Find new ways of holding those who are struggling.
Trust cannot be demanded. It’s only earned by being there at critical junctures when people are in crisis and need help. You don’t earn trust when things are going well; you earn trust by being a rock during a tornado.
That presentation is worth watching (or listening to) as well.
How is it that it’s not necessarily [the technologies’] intentions, but the structuring configuration, that causes the pain?
danah boyd continues her investigation of algorithms and the way in which our data is being manipulated. This is very much a wicked problem with no clear answer. Data & Society have also published a primer on the topic. I wonder if it starts by being aware of the systemic nature of it all? Alternatively, Jamie Williams and Lena Gunn provide five questions to consider when using algorithms.
- “Actively taking things out of context can be helpful for analysis”
- “help students truly appreciate epistemological differences”
- “help students see how they fill in gaps when the information presented to them is sparse and how hard it is to overcome priors [confirmation bias and selective attention]”
Benjamin Doxtdator raises a concern about this focus on the individual:
Would boyd’s cognitive strength training exercises have helped here? No. Turning inwards to psychology, rather than outwards to the political context, is precisely what gives us ‘lone wolf’ analyses of white supremacy.
Instead, Doxtdator suggests considering the technical infrastructure. Interestingly, boyd does touch on platforms in the Q&A at the end:
One of the things that is funny is that these technologies get designed for a very particular idea of what they could be used for and then they twist in different ways.
I’ve never been one to feel the need to put on a lot of makeup in order to leave the house and I haven’t been someone who felt the need to buy bots to appear cool online. But I find it deeply hypocritical to listen to journalists and politicians wring their hands about fake followers and bots given that they’ve been playing at that game for a long time. Who among them is really innocent of trying to garner attention through any means possible?
Many people have unhealthy habits and dynamics in their life. Some are rooted in physical addiction. Others are habitual or psychological crutches. But across that spectrum, most people are aware of when something that they’re doing isn’t healthy. They may not be able to stop. Or they may not want to stop. Untangling that is part of the challenge. When you feel as though your child has an unhealthy relationship with technology (or anything else in their life), you need to start by asking if they see this the same way you do. When parents feel as though what their child is doing is unhealthy for them, but the child does not, the intervention has to be quite different than when the child is also concerned about the issue.
Parents don’t like to see that they’re part of the problem or that their efforts to protect and help their children might backfire.
In response, she suggests two things for parents to do:
- Verbalize what you’re doing with your phone
- Create a household contract
After reading this, I tried verbalising my actions, and it soon became apparent when perhaps the phone could go away.
Rather than thinking of AI as “artificial intelligence,” Eubanks effectively builds the case that, in practice, AI often means “automating inequality.”
Published in DLTV Journal 1.2, December 2014