re:publica 2018 – danah boyd: Opening Keynote: How an Algorithmic World Can Be Undermined

How is it that it’s not necessarily [technology’s] intentions, but the structuring configuration that causes the pain?

danah boyd continues her investigation of algorithms and the ways in which our data is being manipulated. This is very much a wicked problem with no clear answer. Data & Society have also published a primer on the topic. I wonder if addressing it starts with being aware of the systemic nature of it all? Alternatively, Jamie Williams and Lena Gunn provide five questions to consider when using algorithms.

via Jenny Mackness and Ian O’Byrne.

Listened Hey! Algorithms, leave them kids alone: Chips with Everything podcast by Jordan Erica Webber from the Guardian
Jordan Erica Webber looks into reports that YouTube Kids might create an algorithm-free platform

This is an interesting discussion of YT Kids and the role of algorithms. This is an issue that came to light through James Bridle’s post last year.

I must admit that I still use the YT Kids app sometimes. For example, the other day my daughter wanted to watch a song from The Little Mermaid. I used the app and what I found was interesting:

A response from the YT Kids algorithm

It made me think about how that result may have been produced. I listened to the song. It was fine. It was basically a song inspired by The Little Mermaid. I just wonder why horror was allowed through.

Liked How Netflix works: the (hugely simplified) complex stuff that happens every time you hit Play by Mayukh Nair (Medium)
This is what happens when you hit that Play button: Hundreds of microservices, or tiny independent programs, work together to make one large Netflix service. Content legally acquired or licensed is converted into a size that fits your screen, and protected from being copied. Servers across the world make a copy of it and store it so that the closest one to you delivers it at max quality and speed. When you select a show, your Netflix app cherry picks which of these servers will it load the video from. You are now gripped by Frank Underwood’s chilling tactics, given depression by BoJack Horseman’s rollercoaster life, tickled by Dev in Master of None and made phobic to the future of technology by the stories in Black Mirror. And your lifespan decreases as your binge watching turns you into a couch potato.
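The server-selection step described above – copies of a title cached on servers around the world, with the app choosing which one to stream from – can be sketched roughly like this. The server names and latency figures are invented for illustration; this is a sketch of the idea, not Netflix’s actual logic:

```python
# Hypothetical edge servers with measured round-trip times in ms
# (names and numbers are made up for illustration).
edge_servers = {
    "sydney-01": 12,
    "singapore-02": 95,
    "us-west-03": 160,
}

def pick_edge_server(servers):
    """Return the name of the server with the lowest measured latency."""
    return min(servers, key=servers.get)

print(pick_edge_server(edge_servers))  # sydney-01
```

In practice the choice would also weigh server load and available bandwidth, but "closest and fastest wins" is the core of it.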
Bookmarked Opinion | YouTube, the Great Radicalizer by Zeynep Tufekci (nytimes.com)
In effect, YouTube has created a restaurant that serves us increasingly sugary, fatty foods, loading up our plates as soon as we are finished with the last meal. Over time, our tastes adjust, and we seek even more sugary, fatty foods, which the restaurant dutifully provides. When confronted about this by the health department and concerned citizens, the restaurant managers reply that they are merely serving us what we want.
Zeynep Tufekci highlights the problems with YouTube’s algorithm. There is a bias built in to support inflammatory content. In response to the post, Clive Thompson explains it this way:

It’s not that YouTube radicalizes politics specifically. It radicalizes everything, and politics just gets swept along in the slurry of zomg.

Bookmarked Social Inequality Will Not Be Solved By an App by Safiya Umoja Noble (WIRED)
The entire experiment of the internet is now with us, yet we do not have enough intense scrutiny at the level of public policy on its psychological and social impact on the public.
In an excerpt from Algorithms of Oppression: How Search Engines Reinforce Racism, Safiya Umoja Noble highlights the part that technology plays in reinstating inequality and oppression. This is a topic that Cathy O’Neil touches on in her book Weapons of Math Destruction. One point that stood out was the ability to use algorithms to find the ‘best’ school:

These data-intensive applications that work across vast data sets do not show the microlevel interventions that are being made to racially and economically integrate schools to foster educational equity. They simply make it easy to take for granted data about “good schools” that almost exclusively map to affluent, White neighborhoods. We need more intense attention on how these types of artificial intelligence, under the auspices of individual freedom to make choices, forestall the ability to see what kinds of choices we are making and the collective impact of these choices in reversing decades of struggle for social, political, and economic equality. Digital technologies are implicated in these struggles.

Another introduction to Noble’s book is her video, found here.

Listened IRL Podcast Episode 12: Algorisky from irlpodcast.org
On this week's episode of IRL, we sit down with Luke Dormehl, author of Thinking Machines and The Formula, to explore the impact of algorithms, on and offline. Staci Burns and James Bridle, author of "Something is wrong on the internet," investigate the human cost of gaming YouTube recommendations. Anthropologist Nick Seaver talks about the danger of automating the status quo. And researcher Safiya Noble looks at how to prevent racial bias from seeping into code.
Bookmarked Why Less News on Facebook Is Good News for Everyone by Will Oremus (Slate Magazine)
To what extent Facebook’s disruption of the media facilitated the political upheaval and polarization we’ve seen over the past several years is a question that researchers will be debating and investigating for some time. But it seems clear they’re related. And it was Facebook’s takeover of the news that gave Russian agents the tools to influence elections and civil discourse in democracies around the world.
Will Oremus discusses Facebook’s decision to prioritise the personal over the corporate. This will have a significant impact on the way that news is portrayed on the site. It comes on the back of a series of changes through which Facebook has broken the back of digital news coverage:

First, by encouraging people to get news from all different sources in the same place, Facebook leveled the playing field among publishers.

Second, whereas human editors used to be trained to select and emphasize stories based on their news value, Facebook’s news feed algorithm optimized for clicks, views, likes, and shares.

This move isn’t to repair the damage done to democracy, but rather to limit the damage done to its users.

Listened Ep. 74 Damien Williams from shows.pippa.io
Technology philosopher Damien Williams on how the algorithms running society are embedded with the same biases as the people who program them.
Douglas Rushkoff and Damien Williams discuss the biases that we build into our technology through their design and the problems that this creates for machine learning. In some respects, this touches on the work of Cathy O’Neil.

Rushkoff also begins with a reflection on the use of social media by schools. He wonders why it is so easy for people to lose sight of the design and purpose behind these platforms. He argues that, other than for teaching media, social media (Twitter, Facebook, Instagram etc.) should never be used by schools. Use blogs or a space you manage yourself to tell your story – something that I have touched upon in the past – but feeding the ad algorithms is the wrong approach.

Liked A Hitchhiker’s Guide to Consensus Algorithms
Behind every great cryptocurrency, there’s a great consensus algorithm. No consensus algorithm is perfect, but they each have their strengths. In the world of crypto, consensus algorithms exist to prevent double spending. Here’s a quick rundown on some of the most popular consensus algorithms to date, from Blockchains to DAGs and everything in-between.
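As a rough illustration of how one such consensus algorithm – proof-of-work – makes double spending expensive, here is a minimal mining loop. This is a sketch of the idea behind Nakamoto-style consensus, not a real cryptocurrency implementation:

```python
import hashlib

def mine(block_data: str, difficulty: int = 3) -> int:
    """Find a nonce so that sha256(block_data + nonce) starts with
    `difficulty` zero hex digits. Rewriting history means redoing this
    work for every subsequent block, which is what deters double spends."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine("alice pays bob 5")
```

Raising `difficulty` by one makes each block roughly sixteen times harder to find, which is how the cost of attacking the chain is tuned.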