Bookmarked The Moral Rot of the MIT Media Lab (Slate Magazine)

Over the course of the past century, MIT became one of the best brands in the world, a name that confers instant credibility and stature on all who are associated with it. Rather than protect the inherent specialness of this brand, the Media Lab soiled it again and again by selling its prestige to banks, drug companies, petroleum companies, carmakers, multinational retailers, at least one serial sexual predator, and others who hoped to camouflage their avarice with the sheen of innovation. There is a big difference between taking money from someone like Epstein and taking it from Nike or the Department of Defense, but the latter choices pave the way for the former. It is easy to understand why Jeffrey Epstein wanted to get involved with the Media Lab. Unfortunately, it is also easy to understand why Joi Ito got involved with Jeffrey Epstein. The only bad donations were the ones that weren’t received.

Justin Peters discusses the history of MIT, the birth of the Media Lab, and the institution's choice to soil its specialness rather than support activists like Aaron Swartz.

According to the Abelson Report, MIT had chosen not to aid Swartz in part because doing so could have sent the wrong message to its institutional partners, which might have interpreted the gesture as MIT coming out as soft on content piracy. And then Swartz died, and the Media Lab was the site of an ice-cream social in his honor. The Media Lab and MIT were capable of anything, it seemed, except meaningful self-reflection.

In a Twitter thread, James Bridle questions the ethics of MIT’s Media Lab and its history of building products to improve people’s lives, only to then pivot them into marketable products.

Bookmarked Clouds and networks: reflections on James Bridle’s New Dark Age (Bryan Alexander)

I started reading James Bridle’s New Dark Age thinking it was another entry in the recent spate of “techlash” books. The subtitle, Technology and the End of the Future, is a hint.…

Bryan Alexander provides a breakdown of James Bridle’s New Dark Age. He summarises some of the arguments and points out a few of its flaws:

There are also some curiously too-quick dismissals. Bridle slams geoengineering and new developments in material science in less than a sentence, without citation (64). Hollywood is paranoid, but it’s not clear what that means (130). The charge that tech companies “are still predominantly white” (143-4) manages to ignore the large numbers of Asians in those firms, disproportionate to their representation in the general population. An early chapter makes good use of an 1884 Ruskin lecture, but then mistakingly sees it describing, rather than anticipating, World War I’s battlefields, a generation later (195).

Clive Thompson discusses the power of big data to support clearer decision-making around climate change. In New Dark Age, James Bridle argues that there is a certain irony in using technology to solve the problems of technology.

Thinking about climate change is degraded by climate change itself, just as communications networks are undermined by the softening ground, just as our ability to debate and act on entangled environmental and technological change is diminished by our inability to conceptualise complex systems. And yet at the heart of our current crisis is the hyperobject of the network: the internet and the modes of life and ways of thinking it weaves together (Page 79)

The other problem is when the data gets manipulated to support vested interests.

Bookmarked New Ways of Seeing: can John Berger’s classic decode our baffling digital age? (the Guardian)

From ‘the cloud’ to invisible beams carrying billions of dollars, our world can often feel like a neverland of terrifying tech. A new radio series is here to help

James Bridle discusses some of the ideas relating to cloud computing and seeing the world around us in a BBC podcast series. This is associated with his book New Dark Age.

Marginalia

As Berger wrote in 1972: “We only see what we look at. To look is an act of choice.” Beneath the surface of the street, and behind the screens of our computers, hide powerful forces that shape all of our lives. To reckon with them requires a new way of seeing: an understanding of the connections between infrastructure and code, state surveillance and corporate power, social prejudice and algorithmic bias, and the environment and computation. It’s a form of seeing vital to understanding the times we live in, and as at every previous time in history, it’s artists who are helping us to forge it.

Replied to Episode 112: Running to Bangkok by an author (Tide Podcast)

This week, Doug and Dai discuss what’s been going on over the last couple of weeks, MozFest, MoodleMoot US, Universal Basic Income, humane technology, Creative Commons, success, decentralised apps, and more!

Dai, I was interested in your discussion of James Bridle’s book. I agree with your concern about such a dystopian portrayal. That was something that I tried to get across in my post on becoming informed. Is the answer that it takes a range of voices to get to this stage?

With this in mind I am intrigued by the video you mention from Vox featuring Tristan Harris:

I liked his suggestions, but for me it is like going to McDonald’s to buy a salad. The issue is not the salad, but McDonald’s and their push for non-human consumption. I wonder about a sustainable smartphone, one that is built ethically from the outset, not one that puts the blame back on the user. As Audrey Watters argues:

I don’t want to suggest that this is something the consumer alone is responsible for – blaming consumers, for example, for looking at their phone when it vibrates or beeps or for downloading Candy Crush and trying to get all their friends to play along. The whole modus operandi of the tech industry has been to create apps that are as engaging and compelling and viral as Candy Crush. The industry views its users as highly manipulable, their behaviors as something that can be easily shaped and nudged and controlled. Maybe it’s time to rethink and regulate and restrict how that happens?

Discussing the work of Harris and other such apologists, Watters asks why we should trust them:

Why should we trust these revelations (or revelators) to guide us moving forward? Why not trust those of us who knew it was bullshit all along and who can tell you the whole history of a bad idea?

I wonder, then, why we should trust Harris over Bridle, and whether in the end they both have a particular place at the table.

Bookmarked
James Bridle takes a dive into the algorithmic nightmares lurking within YouTube. This is taken from his book and elaborates on a post he wrote exploring the topic. He ends with an explanation that this is a problem that we all must grapple with:

The thing, though, I think most about these systems is that this isn’t, as I hope I’ve explained, really about YouTube. It’s about everything. These issues of accountability and agency, of opacity and complexity, of the violence and exploitation that inherently results from the concentration of power in a few hands — these are much, much larger issues. And they’re issues not just of YouTube and not just of technology in general, and they’re not even new. They’ve been with us for ages. But we finally built this system, this global system, the internet, that’s actually showing them to us in this extraordinary way, making them undeniable. Technology has this extraordinary capacity to both instantiate and continue all of our most extraordinary, often hidden desires and biases and encoding them into the world, but it also writes them down so that we can see them, so that we can’t pretend they don’t exist anymore. We need to stop thinking about technology as a solution to all of our problems, but think of it as a guide to what those problems actually are, so we can start thinking about them properly and start to address them.

Bookmarked Rise of the machines: has technology evolved beyond our control? by James Bridle (the Guardian)

Technology is starting to behave in intelligent and unpredictable ways that even its creators don’t understand. As machines increasingly shape global events, how can we regain control?

In an extract from his new book New Dark Age: Technology and the End of the Future, James Bridle discusses the evolution of the machine. This includes the place of the cloud, algorithmic interactions within the stock market, the corruption of the internet of things, and the incomprehensibility of machine learning. Bridle believes that we need to reimagine how we think about technology:

Our technologies are extensions of ourselves, codified in machines and infrastructures, in frameworks of knowledge and action. Computers are not here to give us all the answers, but to allow us to put new questions, in new ways, to the universe

This is part of a few posts from Bridle going around at the moment, including a reflection on technology whistleblowers and YouTube’s response to last year’s exposé. Some of these ideas remind me of concerns raised in Martin Ford’s Rise of the Robots and Cathy O’Neil’s Weapons of Math Destruction.

Listened Hey! Algorithms, leave them kids alone: Chips with Everything podcast by an author from the Guardian

Jordan Erica Webber looks into reports that YouTube Kids might create an algorithm-free platform
This is an interesting discussion of YT Kids and the role of algorithms. This is an issue that came to light through James Bridle’s post last year.

I must admit that I still use the YT Kids app sometimes. For example, the other day my daughter wanted to watch a song from The Little Mermaid. I used the app, and what I found was interesting:

A response from the YT Kids algorithm

It made me think about how that result may have been produced. I listened to the song. It was fine. It was basically a song inspired by The Little Mermaid. I just wonder why horror was allowed through.