Replied to Too Long; Didn’t Read #171 by Ian O'Byrne (W. Ian O'Byrne)
I’m currently reading Twitter and Tear Gas by Zeynep Tufekci. It’s a fascinating read that is making me question a lot of my thinking about these digital, social spaces.
I too have started reading Twitter and Tear Gas, and I too am being challenged by it. I somehow thought that it wouldn’t be applicable in the field of EdTech. What it has me thinking is that in ‘networked publics’ there is no imaginary line where EdTech (whatever that actually means) starts and stops.

Thank you too for the shoutout. It has definitely sparked some interesting conversation. I read a post today about mindfulness apps, yet it overlooked the collection of data associated with the completion of the various activities. We are asked to be conscious of our breathing, yet ignore the data that we share on a daily basis.

RSVPed Interested in Attending https://bryanalexander.org/book-club/our-next-book-club-reading-is-zeynep-tufekcis-twitter-and-tear-gas-the-power-and-fragility-of-networked-protest/
Our next book club reading has been decided! After a furious polling, the winner is… …Zeynep Tufekci, Twitter and Tear Gas: The Power and Fragility of Networked Protest.
Liked An Avalanche of Speech Can Bury Democracy (POLITICO Magazine)
It’s not speech per se that allows democracies to function, but the ability to agree—eventually, at least some of the time—on what is true, what is important and what serves the public good. This doesn’t mean everyone must agree on every fact, or that our priorities are necessarily uniform. But democracy can’t operate completely unmoored from a common ground, and certainly not in a sea of distractions.
via Mike Caulfield
Bookmarked How social media took us from Tahrir Square to Donald Trump by Zeynep Tufekci (MIT Technology Review)
To understand how digital technologies went from instruments for spreading democracy to weapons for attacking it, you have to look beyond the technologies themselves.
Zeynep Tufekci captures some of the complexities associated with fixing big tech. One thing that stands out is that the answer is neither splitting up big tech nor simply responding to the threat of Russia. As she explains:

Russia did not instigate the moves that have reduced Americans’ trust in health authorities, environmental agencies, and other regulators. Russia did not create the revolving door between Congress and the lobbying firms that employ ex-politicians at handsome salaries. Russia did not defund higher education in the United States. Russia did not create the global network of tax havens in which big corporations and the rich can pile up enormous wealth while basic government services get cut.

Instead we need to:

Figure out how our institutions, our checks and balances, and our societal safeguards should function in the 21st century.

Liked Opinion | What Elon Musk Should Learn From the Thailand Cave Rescue (nytimes.com)
Just because you’re a successful tech mogul doesn’t mean you know how to rescue kids trapped underground.
Tufekci also takes this discussion further on Twitter.

Bookmarked Why Zuckerberg’s 14-Year Apology Tour Hasn’t Fixed Facebook (WIRED)
At a minimum, Facebook has long needed an ombudsman’s office with real teeth and power: an institution within the company that can act as a check on its worst impulses and to protect its users. And it needs a lot more employees whose task is to keep the platform healthier. But what would truly be disruptive and innovative would be for Facebook to alter its business model. Such a change could come from within, or it could be driven by regulations on data retention and opaque, surveillance-based targeting—regulations that would make such practices less profitable or even forbidden.
It is a little disconcerting that Facebook only ever seems to do something positive for the ‘user’ in response to complaints. What is worse, Tufekci highlights how some of the changes being promised now were already promised years ago:

But the backlash wouldn’t die down. Attempting to respond to the growing outrage, Facebook announced changes. “It’s Time to Make Our Privacy Tools Easier to Find”, the company announced without a hint of irony—or any other kind of hint—that Zuckerberg had promised to do just that in the “coming few weeks” eight full years ago. On the company blog, Facebook’s chief privacy editor wrote that instead of being “spread across nearly 20 different screens” (why were they ever spread all over the place?), the controls would now finally be in one place.

Sadly, this has nothing to do with users or community:

As far as I can tell, not once in his apology tour was Zuckerberg asked what on earth he means when he refers to Facebook’s 2 billion-plus users as “a community” or “the Facebook community.” A community is a set of people with reciprocal rights, powers, and responsibilities. If Facebook really were a community, Zuckerberg would not be able to make so many statements about unilateral decisions he has made—often, as he boasts in many interviews, in defiance of Facebook’s shareholders and various factions of the company’s workforce. Zuckerberg’s decisions are final, since he controls all the voting stock in Facebook, and always will until he decides not to—it’s just the way he has structured the company.

Tim Wu argues that we need to replace Facebook with a trustworthy platform not driven by surveillance and advertising:

If today’s privacy scandals lead us merely to install Facebook as a regulated monopolist, insulated from competition, we will have failed completely. The world does not need an established church of social media.

Bookmarked Opinion | YouTube, the Great Radicalizer by Zeynep Tufekci (nytimes.com)
In effect, YouTube has created a restaurant that serves us increasingly sugary, fatty foods, loading up our plates as soon as we are finished with the last meal. Over time, our tastes adjust, and we seek even more sugary, fatty foods, which the restaurant dutifully provides. When confronted about this by the health department and concerned citizens, the restaurant managers reply that they are merely serving us what we want.
Zeynep Tufekci highlights the problems with YouTube’s algorithm, which has a built-in bias towards promoting inflammatory content. In response to the post, Clive Thompson explains it this way:

It’s not that Youtube radicalizes politics specifically. It radicalizes everything, and politics just gets swept along in the slurry of zomg.