Bookmarked Interoperable Facebook (Electronic Frontier Foundation)

What if Facebook – and the other tech giants – were simply less important to your life? What if you had lots of choices about how you and the people you care about could communicate with each other?

In a video and paper, Cory Doctorow unpacks how an interoperable Facebook might work by working through four scenarios:

– Notifying contacts that you are leaving Facebook
– Blocking content from particular federated servers
– Blocking material that Facebook allows but that you or your network consider objectionable
– Posting material that Facebook prohibits
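
As a sketch of the second and third scenarios, here is a minimal user-side filter. It assumes a hypothetical federated setup where each post carries an `author@server` handle; the field names and rules are illustrative only, not any real protocol:

```python
def visible_posts(posts, blocked_servers, blocked_keywords):
    """Apply user-side moderation to a merged federated timeline:
    drop every post from a blocked server, then drop posts matching
    the user's own keyword rules -- including material the host
    platform itself would have allowed."""
    visible = []
    for post in posts:
        # the server is whatever follows the last "@" in the handle
        server = post["author"].rsplit("@", 1)[-1]
        if server in blocked_servers:
            continue
        text = post["text"].lower()
        if any(word in text for word in blocked_keywords):
            continue
        visible.append(post)
    return visible
```

The point of the sketch is that both rules live with the user, not the platform: your client decides which servers and which material to see.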

What excites me about the world that Doctorow imagines in this paper is the control and nuance over such things as feeds. Personally, I would just be grateful to be able to follow updates from sites such as Facebook, Instagram and LinkedIn via a feed, rather than having to log in.
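
The feed-first reading this imagines already works for the open web. A minimal sketch using only the Python standard library, parsing generic RSS 2.0 fields (nothing Facebook actually exposes):

```python
import xml.etree.ElementTree as ET

def feed_items(rss_xml: str):
    """Parse an RSS 2.0 feed document and return (title, link)
    pairs in the order the feed lists them."""
    root = ET.fromstring(rss_xml)
    items = []
    for item in root.iter("item"):
        title = item.findtext("title", default="(untitled)")
        link = item.findtext("link", default="")
        items.append((title, link))
    return items
```

A feed reader built on this follows updates without logging in anywhere, which is exactly the relationship the walled gardens withhold.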

Liked Facebook Understands the Metaverse All Too Well by Ian Bogost

If those dreams become realized, you’ll probably end up buying crap and yelling at people through a head-mounted display, instead of through your smartphone. Sure, calling that a metaverse probably sounds better. Just like “the cloud” sounds better than, you know, a server farm where people and companies rent disk space.

It’s absurd but telling that the inspiration for the metaverse was meant as satire. Just as OZY Media misinterprets Shelley, so Zuck and crew misconstrue metaverse fiction.

Liked The Metaverse Was Lame Even Before Facebook by Ethan Zuckerman

Facebook can claim originality in at least one thing. Its combination of scale and irresponsibility has unleashed a set of diverse and fascinating sociopolitical challenges that it will take lawmakers, scholars, and activists at least a generation to fix. If Facebook has learned anything from 17 years of avoiding mediating those conflicts, it’s not apparent from the vision for the metaverse, where the power of human connection is celebrated as uncritically as it was before Macedonian fake-news brokers worked to sway the 2016 election.

Liked Instagram for Kids and What Facebook Knows About the Effects of Social Media by Sue Halpern (The New Yorker)

As reported by the Journal, the documents show that the company is fully aware that Instagram has deleterious effects on teens. A PowerPoint slide created by Facebook researchers in 2019, for example, states that Instagram makes body-image issues worse for one in three teen-age girls. Another research presentation, from March, 2020, which was published on Facebook’s internal message board and was viewed by the reporters, noted that “thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.” Teens also told Facebook’s researchers that the app contributed to their depression and anxiety, a complaint that a company document from 2019 noted “was unprompted and consistent across all groups.” Young Instagram users also indicated that they felt addicted to the app and lacked the wherewithal to limit their use of it.

Liked Pluralistic: 28 Aug 2021 by Cory Doctorow (Pluralistic)

Network effects are how Facebook attracts users, but switching costs are how it holds them hostage.

The higher the switching costs, the bigger the shit sandwich Facebook can force you to eat before you leave.

That’s why interoperability is such a big deal – because it lowers the switching costs. If you can take your apps or friends or files or media with you when you leave a service, then the service has to treat you better, lest you depart.

Replied to What If Regulating Facebook Fails? by Siva Vaidhyanathan (WIRED)

Beyond the GDPR, an even more radical and useful approach would be to throttle Facebook’s (or any company’s) ability to track everything we do and say, and limit the ways it can use our data to influence our social connections and political activities. We could limit the reach and power of Facebook without infringing speech rights. We could make Facebook matter less.

Imagine if we kept our focus on how Facebook actually works and why it’s as rich and powerful as it is. If we did that, instead of fluttering our attention to the latest example of bad content flowing across the platform and reaching some small fraction of users, we might have a chance. As Marshall McLuhan taught us 56 years ago, it’s the medium, not the message, that ultimately matters.

Liked Pluralistic: 19 Apr 2021 by Cory Doctorow (Pluralistic)

The anonymous author of the leaked memo calls themself and their colleagues “the tonsils of the internet, a constantly bombarded first line of defense against potential trauma to the userbase.”

FB is a company that says it can do everything – operate local offices in more than 100 countries, field a major VR platform, issue a currency. But when it comes to moderation, it is rendered helpless before the enormity of the task.

The “we must outsource” explanation grows ever thinner, while the “tonsils” hypothesis has enormous explanatory power.

Liked How Facebook got addicted to spreading misinformation by Karen Hao (MIT Technology Review)

Misinformation and hate speech constantly evolve. New falsehoods spring up; new people and groups become targets. To catch things before they go viral, content-moderation models must be able to identify new unwanted content with high accuracy. But machine-learning models do not work that way. An algorithm that has learned to recognize Holocaust denial can’t immediately spot, say, Rohingya genocide denial. It must be trained on thousands, often even millions, of examples of a new type of content before learning to filter it out. Even then, users can quickly learn to outwit the model by doing things like changing the wording of a post or replacing incendiary phrases with euphemisms, making their message illegible to the AI while still obvious to a human. This is why new conspiracy theories can rapidly spiral out of control, and partly why, even after such content is banned, forms of it can persist on the platform.
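
The evasion dynamic Hao describes shows up even in a toy filter. The following keyword matcher is purely illustrative and nothing like Facebook's actual models, but the bypass pattern is the same: a message stays obvious to a human while becoming illegible to the system.

```python
def make_filter(banned_phrases):
    """A deliberately naive content filter: flag any text that
    contains a banned phrase verbatim."""
    banned = [p.lower() for p in banned_phrases]

    def flag(text: str) -> bool:
        t = text.lower()
        return any(phrase in t for phrase in banned)

    return flag

flag = make_filter(["miracle cure"])
flag("Get your miracle cure today")   # flagged: exact phrase
flag("Get your m1racle cure today")   # missed: trivial respelling
flag("Get your wonder remedy today")  # missed: euphemism
```

Real classifiers generalise far better than string matching, but the same gap remains: each new respelling, euphemism or conspiracy framing needs fresh training data before the model can catch it.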

Listened Facebook’s Mark Zuckerberg and Treasurer Josh Frydenberg continue talks as Australia stands firm on proposed media laws from ABC News

The Prime Minister says the government will not respond to Facebook’s threats, urging the company to “come back to the table” and continue negotiations over the news media code.

In response to Facebook’s decision to temporarily remove all news in Australia, Waleed Aly, Scott Stevens and Belinda Barnet investigate whether it is even right for news organisations to depend upon Facebook as the modern form of distribution in the first place. Aly actually praised Facebook’s decision as a ‘brief relief from the tyranny of pragmatism’. The problem raised is that Facebook is not a moral model. As a platform, it showcases all media together, which subsequently ends up lowering the value of everything on there. They ask whether it would be a better outcome for people to seek news elsewhere, rather than be at the mercy of algorithms and the shareability of content. The concern is that all of this is beyond regulation when the platform capitalists get the data and content for free.

In other reporting on the situation, Nicholas Stuart suggests that this decision only confirms Facebook’s dominant role:

Facebook wanted a deal, but only one that left it in control. Sure, they preferred not to gift money to anyone, particularly slow media behemoths that can’t even get their distribution model right. But Zuckerberg can live with this, because the government’s cemented his role. Facebook’s now driving the media jalopy.

Adding to this, Alex Hern suggests that until the technology sector changes its approach, we are going to hear more calls of “regulate us, just not like that”:

Like all industries, tech has its shadow lobbying groups, it secretly funds its think tanks and so on. But, unlike others, the tech industry has focused firmly on that core pro-regulation message. “We want to be regulated; we want new laws that cover us; we’re not like the Other Industries, we’re cool and likeable”.

Except that message fails, because it’s coming from an industry that simply cannot accept that regulation is driven, first and foremost, by a desire to limit the harms caused by that industry – harms which the industry can’t even be convinced exist.

In other words, we’re going to see this more in the future. Until tech changes its view of regulation from something that offloads blame to something that prevents harm, that cycle – “Regulate us! No not like that” – will continue.

Meanwhile, Cory Doctorow argues that the real focus is not links, but the ad market.

This vertical integration is the source of confusion about whether this is a link-tax. The goal of the regulation is to clean up the ad markets, but Googbook use links as a stick to beat up publishers when they don’t submit to corrupt ad practices, so links get implicated.

Doctorow suggests that another approach to the problem is adversarial interoperability: adjusting the control that companies like Facebook have over our data and attention.

Bookmarked The flight from WhatsApp

Not surprisingly, Signal has been staggering under the load of refugees from WhatsApp following Facebook’s ultimatum about sharing their data with other companies in its group. According to data from Sensor Tower, Signal was downloaded 8.8m times worldwide in the week after the WhatsApp changes were first announced on January 4. Compare that with 246,000 downloads the week before and you get some idea of the step-change. I guess the tweet — “Use Signal” — from Elon Musk on January 7 probably also added a spike.

John Naughton talks about the flight from WhatsApp in response to news that data will soon be incorporated within the wider Facebook ecosystem. As Alex Hern reported:

If you’re comfortable with Facebook’s use of data (or that of its much closer subsidiary Instagram), it might be difficult to care about this. The company was recently forced by Apple to provide a privacy “nutritional label” on its iOS app, revealing how it works with user data. The labels disclosed more than 100 different pieces of data that may be collected, many of which are directly linked to user profiles, including health and fitness data, “sensitive info” and search histories. For the typical user, who has an account on both services, adding in the small amount of information WhatsApp has is a drop in a bucket by comparison.

But the change does start to eat away at the idea that you can be on WhatsApp without a Facebook footprint. The two apps’ very different histories and intended uses have led to a split in demographics among their users, and a small but significant proportion of WhatsApp users, drawn by the encryption, ad-free nature and no-frills interface, avoid Facebook itself while still using the chat app it owns.

In response, Facebook has paused this change. For Charles Arthur, the fact that Facebook could act so swiftly says a lot.

The irony is so thick you could spread it on toast. Misinformation spread on WhatsApp has been blamed for deaths in India and election distortion in Brazil, but the company slow-walked complaints there. But when people start defecting, that’s a different matter: it acts like it’s on fire.

Replied to

It is intriguing to see the efforts required to crack open the black box that is the Facebook algorithm.

Bookmarked Facebook Is a Doomsday Machine by Adrienne LaFrance (The Atlantic)

Andrew Bosworth, one of Facebook’s longtime executives, has compared Facebook to sugar—in that it is “delicious” but best enjoyed in moderation. In a memo originally posted to Facebook’s internal network last year, he argued for a philosophy of personal responsibility. “My grandfather took such a stance towards bacon and I admired him for it,” Bosworth wrote. “And social media is likely much less fatal than bacon.” But viewing Facebook merely as a vehicle for individual consumption ignores the fact of what it is—a network. Facebook is also a business, and a place where people spend time with one another. Put it this way: If you owned a store and someone walked in and started shouting Nazi propaganda or recruiting terrorists near the cash register, would you, as the shop owner, tell all of the other customers you couldn’t possibly intervene?

Adrienne LaFrance draws a comparison between the nuclear threat of the Doomsday Machine of the Cold War era and the position Facebook occupies today in connecting so many people.

Limitations to the Doomsday Machine comparison are obvious: Facebook cannot in an instant reduce a city to ruins the way a nuclear bomb can. And whereas the Doomsday Machine was conceived of as a world-ending device so as to forestall the end of the world, Facebook started because a semi-inebriated Harvard undergrad was bored one night. But the stakes are still life-and-death. Megascale is nearly the existential threat that megadeath is. No single machine should be able to control the fate of the world’s population—and that’s what both the Doomsday Machine and Facebook are built to do.

In his own commentary on this piece, John Gruber highlights how the black-boxed nature of Facebook makes it far worse than 8kun:

We instinctively think that 8kun is “worse” than Facebook because its users are free to post the worst content imaginable, and because they are terribly imaginative, do. It feels like 8kun must be “worse” because its content is worse — what is permitted, and what actually is posted. But Facebook is in fact far worse, because by its nature we, as a whole, can’t even see what “Facebook” is because everyone’s feed is unique. 8kun, at least, is a knowable product. You could print it out and say, “Here is what 8kun was on December 29, 2020.” How could you ever say what Facebook is at any given moment, let alone for a given day, let alone as an omnipresent daily presence in billions of people’s lives?

Bookmarked Facebook, QAnon and the world’s slackening grip on reality

The coronavirus pandemic has left us living more and more of our lives online. But the place where we chat with friends, get our news and form our opinions is full of vile and dangerous conspiracy theories. Is the world’s biggest social network doing enough to combat them?

Alex Hern unpacks the ramifications of COVID and the move online. He explains how the current circumstances have forged new ground for what BuzzFeed describes as ‘collective delusion’:

Bullying, sexual abuse, political polarisation and conspiracy theorists all existed before the social network, but all took on new contours as they moved online.

One of the common threads within ‘QAnonCasualties’ is that others seem to live in a different reality. The challenge with all of this is moderation. Part of the problem is the role celebrities play in amplifying stories.

Liked What Facebook Fed the Baby Boomers by Charlie Warzel

Like most of us, they gave little thought to the connections they made. Mr. Young added friends he hadn’t spoken to in decades. When Ms. Pierce joined a nonprofit organization she accepted dozens of friend requests — some from people she’d met only in passing. “I meet people on airplanes all the time and we exchange Facebook handles,” she told me.

But as Facebook evolved, these weak connections became unlikely information nodes. Mr. Young and Ms. Pierce were now getting their commentary from people they hardly knew, whose politics had once been unknown or illegible.

Liked Why Twitter is (Epistemically) Better Than Facebook

Design can’t solve all of our problems. Users still need to decide to make good use of the tools available to them. But as we spend more time on social media platforms, it’s important to recognize the strengths of their effects on us. Platforms can mimic and codify the limitations of our offline epistemic environments by only connecting us with people we already know (and giving us the option to filter them out when it’s uncomfortable). Or they can encourage productive epistemic friction by pushing us to consider who to engage with, what topics to avoid, and for how long, so that we consciously shape our own epistemic environments.

Liked Facebook Cannot Separate Itself From the Hate It Spreads (Medium)

As we consider Facebook’s place in our lives and in our society, particularly during a revolutionary moment, when the abolition of technologies and institutions is now a serious discussion after being dismissed as impossible for so long, we should ask ourselves: How much white supremacy and hate are we willing to tolerate in exchange for whatever “good” one thinks Facebook does? It’s similar to asking “how much lead do you want in your water?”, “How much E. coli do you want in your food?”, or “How many heavy metals would you like in your farmland?”

For what it’s worth, my answer is “none.” A company whose business model necessitates that it consistently discharge poison into the environment should be dismantled.

How much toxic waste is Facebook willing to spill into the environment? Its answer seems to have been — and to remain — “as much as we can get away with.”