Bookmarked

Zeynep Tufekci provides a thread documenting her experience of the Hong Kong Protests. Not only does she include various observations, but she also curates a number of other resources.
Bookmarked What Game of Thrones can teach us about technology: It’s changing the game that matters, not picking the winner by Zeynep Tufekci (Zeynep’s Eclectics)

As it stands, machine intelligence functions as an extension of corporations and power.

And that’s why all the stories are interlinked: from Wall Street to venture capital; from ridiculous startups to the Uber/Lyft model of burning VC money till (the company hopes) it becomes a monopoly; from stagnation in wages to automation in the workplace.

Machine intelligence isn’t only an extension of power, and it doesn’t even have to be mostly that. But it is mostly that where we are.

That’s a story much bigger than Zuckerberg, Dorsey, Schmidt, Sandberg, Brin, who-have-you. It’s also a story of Wall Street and increasing financialization of the world; it’s a story of what people are calling neoliberalism that’s been underway for decades. It is also a technical story: of machine learning and data surveillance, and our current inability to deal with the implications of the whole technological stack as it is composed: hardware and firmware mostly manufactured in China. Software everywhere that I’ve previously compared to building skyscrapers on swampy land. Our fundamentally insecure designs. Perhaps more importantly, our lack of functioning, sustainable alternatives that respect us, rather than act as extensions of their true owners.

Zeynep Tufekci elaborates on her post explaining the problems with Game of Thrones. She explains how technology extends the human. In this sense, technology is a system.
Bookmarked The Real Reason Fans Hate the Last Season of Game of Thrones by Zeynep Tufekci (Scientific American Blog Network)

The show did indeed take a turn for the worse, but the reasons for that downturn go way deeper than the usual suspects that have been identified (new and inferior writers, shortened season, too many plot holes). It’s not that these are incorrect, but they’re just superficial shifts. In fact, the souring of Game of Thrones exposes a fundamental shortcoming of our storytelling culture in general: we don’t really know how to tell sociological stories.

Zeynep Tufekci argues that the reason why so many fans are complaining about the last season of Game of Thrones is because the storytelling style changed from sociological to psychological.

The overly personal mode of storytelling or analysis leaves us bereft of deeper comprehension of events and history. Understanding Hitler’s personality alone will not tell us much about the rise of fascism, for example. Not that it didn’t matter, but a different demagogue would probably have appeared to take his place in Germany in between the two bloody world wars in the 20th century. Hence, the answer to “would you kill baby Hitler?,” sometimes presented as an ethical time-travel challenge, should be “no,” because it would very likely not matter much. It is not a true dilemma.

Tufekci explains that this is the same reason we have problems talking about our historic technological transition.

In my own area of research and writing, the impact of digital technology and machine intelligence on society, I encounter this obstacle all the time. There are a significant number of stories, books, narratives and journalistic accounts that focus on the personalities of key players such as Mark Zuckerberg, Sheryl Sandberg, Jack Dorsey and Jeff Bezos. Of course, their personalities matter, but only in the context of business models, technological advances, the political environment, (lack of) meaningful regulation, the existing economic and political forces that fuel wealth inequality and lack of accountability for powerful actors, geopolitical dynamics, societal characteristics and more.

Maybe this is a part of what Douglas Rushkoff touches on in his criticism of storytelling.

Liked Maria Ressa, Zeynep Tufekci, and others on the growing disinformation war (Columbia Journalism Review)

On one panel, Ressa; Emily Bell, of the Tow Center; and Zeynep Tufekci, a techno-sociologist who writes for The New York Times and Wired, discussed the overwhelming effect of junk information on our public sphere, and the role of social media platforms in disseminating it. Tufekci argued that, in the 21st century, a surfeit of information, rather than its absence, poses the biggest problem. “When I was growing up in Turkey, the way censorship occurred was there was one TV channel and they wouldn’t show you stuff. That was it,” she said. “Currently, in my conceptualization, the way censorship occurs is by information glut. It’s not that the relevant information isn’t out there. But it is buried in so much information of suspect credibility that it doesn’t mean anything.” Tufekci cited the frenzied reporting, during the 2016 election, on WikiLeaks’s dump of hacked Democratic Party emails—much of which lacked crucial context—as a malign example of the trend. “I don’t think traditional journalism has caught up on this,” she said.

Bookmarked Opinion | Think You’re Discreet Online? Think Again (nytimes.com)

What is to be done? Designing phones and other devices to be more privacy-protected would be a start, and government regulation of the collection and flow of data would help slow things down. But this is not the complete solution. We also need to start passing laws that directly regulate the use of computational inference: What will we allow to be inferred, and under what conditions, and subject to what kinds of accountability, disclosure, controls and penalties for misuse?

Zeynep Tufekci explains that, with the use of computational inference, the purchasing of data and the creation of shadow profiles, companies know more about us than what we may be explicitly sharing online.
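To make “computational inference” concrete, here is a minimal sketch of the idea, assuming an entirely made-up dataset and a generic off-the-shelf classifier (nothing here reflects any company’s actual models or data): a handful of users disclose a sensitive attribute somewhere, and a model trained on their other, mundane signals then infers that attribute for everyone else.

```python
# Toy illustration of computational inference: predicting an attribute a
# user never disclosed from signals they did share. All data is invented.
from sklearn.linear_model import LogisticRegression

# Each row is one user: [likes_running_pages, buys_baby_products, late_night_activity]
shared_signals = [
    [1, 0, 0],
    [0, 1, 1],
    [1, 1, 0],
    [0, 0, 1],
    [1, 0, 1],
    [0, 1, 0],
]
# The sensitive attribute was disclosed by these users elsewhere --
# enough labelled examples to train a model that scores everyone else.
disclosed_attribute = [0, 1, 1, 0, 0, 1]

model = LogisticRegression().fit(shared_signals, disclosed_attribute)

# A new user who shared only mundane signals still gets an inferred score.
new_user = [[0, 1, 1]]
print(model.predict_proba(new_user)[0][1])  # estimated probability of the attribute
```

The specific model is beside the point; what matters, and what Tufekci argues regulation should address, is that the output is an attribute the user never chose to share.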
Bookmarked Reporting a massacre: Why the ABC didn’t share the shooter’s ‘manifesto’ (ABC News)

Social media platforms have made some changes to tackle hate speech and violent behaviour, but they could choose to do more. They could set higher standards for removing offensive video and messages.

Free speech is unimaginable without the right to dissent — but commentators, opinion writers and politicians also have choices to make in the example they set.

In the end though it’s on all of us — in the news sources we rely on, the social networks we join and what we choose to watch and share.

Craig McMurtrie unpacks the decision by the ABC not to publish extracts of the Christchurch shooter’s ‘manifesto’. Every move made seems to have been orchestrated to grab attention. As Robert Evans from Bellingcat explains, it is an example of ‘shitposting’:

The act of throwing out huge amounts of content, most of it ironic, low-quality trolling, for the purpose of provoking an emotional reaction in less Internet-savvy viewers.

Zeynep Tufekci backed this stance on Twitter.

Tufekci linked to a couple of posts she wrote in response to the Sandy Hook massacre and the Virginia shooter, explaining the dangers of feeding copycat scenarios.

This focus on media manipulation also reminded me of danah boyd’s discussion of 4chan’s association with fake news.

Bookmarked Shouldn’t We All Have Seamless Micropayments By Now? (WIRED)

The web’s founders fully expected some form of digital payment to be integral to its functioning. But nearly three decades later, we’re still waiting.

Zeynep Tufekci discusses the problems with current online payment systems. She suggests that micropayments offer potential for innovation and opportunity.

Marginalia

For all the talk of disruption, today’s internet is still young and hugely underinnovated. While it’s difficult to predict all the details—that’s the point of disruption!—I have little doubt that it’s technically possible to build a digital infrastructure that rewards creativity at many scales and protects our privacy. Bitcoin is not the answer, for a variety of reasons, but a blockchain scheme, along with a mixture of more conventional systems and cryptographic tools, might play a part. Whatever the solution is, we just need a combination of vision, smart regulation, and true innovation to advance it.

Right now, we’re stuck where the automobile industry was when cars were still “horseless carriages,” wagon-wheeled monstrosities with high centers of gravity and buggy seats. We’re still letting an older technology—credit cards, designed for in-­person transactions, with high fees and financial surveillance baked in—determine the shape of a new technological paradigm. As a result, that paradigm has become twisted and monopolized by its biggest players. This is one of the modern internet’s greatest errors; it’s past time that we encounter “402 Payment Required” for real.
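The “402 Payment Required” status Tufekci refers to has been reserved in the HTTP specification since the web’s early days but was never wired up to real payment infrastructure. As a rough sketch only (a hypothetical toy server with an invented X-Payment-Token header, not a proposal for how web micropayments should actually work), a paywalled resource could answer like this:

```python
# Toy HTTP server that answers 402 Payment Required unless the request
# carries a (hypothetical) payment token. Illustration only.
from http.server import BaseHTTPRequestHandler, HTTPServer

class PaywalledHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.headers.get("X-Payment-Token"):  # hypothetical header
            body = b"Here is the article you paid a few cents for.\n"
            self.send_response(200)
        else:
            body = b"402 Payment Required: attach a micropayment to read this.\n"
            self.send_response(402)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), PaywalledHandler).serve_forever()
```

Her point is that nothing like this ever became shared infrastructure, so credit-card rails, with their fees and surveillance, filled the gap instead.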

Replied to Too Long; Didn’t Read #171 by Ian O’Byrne (W. Ian O’Byrne)

I’m currently reading Twitter and Tear Gas by Zeynep Tufekci. It’s a fascinating read that is making me question a lot of my thinking about these digital, social spaces.

I too have started reading Twitter and Tear Gas. I too am being challenged by it. I somehow thought that it wouldn’t be applicable in the field of EdTech. What it has me thinking is that in ‘networked publics’ there is no imaginary line where EdTech (whatever that actually means) starts and stops.

Thank you too for the shoutout. It definitely has sparked some interesting conversation. I read a post today about mindfulness apps, yet it overlooked the collection of data associated with the completion of the various activities. We are asked to be conscious of our breathing, yet ignore the data that we share on a daily basis.

RSVPed Interested in Attending https://bryanalexander.org/book-club/our-next-book-club-reading-is-zeynep-tufekcis-twitter-and-tear-gas-the-power-and-fragility-of-networked-protest/

Our next book club reading has been decided! After a furious polling, the winner is…

…Zeynep Tufekci, Twitter and Tear Gas: The Power and Fragility of Networked Protest.

Liked An Avalanche of Speech Can Bury Democracy (POLITICO Magazine)

It’s not speech per se that allows democracies to function, but the ability to agree—eventually, at least some of the time—on what is true, what is important and what serves the public good. This doesn’t mean everyone must agree on every fact, or that our priorities are necessarily uniform. But democracy can’t operate completely unmoored from a common ground, and certainly not in a sea of distractions.

via Mike Caulfield
Bookmarked How social media took us from Tahrir Square to Donald Trump (MIT Technology Review)

To understand how digital technologies went from instruments for spreading democracy to weapons for attacking it, you have to look beyond the technologies themselves.

Zeynep Tufekci captures some of the complexities associated with fixing up big tech. A few things stand out: the answer is not splitting up big tech, nor simply responding to the threat of Russia. As she explains:

Russia did not instigate the moves that have reduced Americans’ trust in health authorities, environmental agencies, and other regulators. Russia did not create the revolving door between Congress and the lobbying firms that employ ex-politicians at handsome salaries. Russia did not defund higher education in the United States. Russia did not create the global network of tax havens in which big corporations and the rich can pile up enormous wealth while basic government services get cut.

Instead we need to:

Figure out how our institutions, our checks and balances, and our societal safeguards should function in the 21st century.

Liked Opinion | What Elon Musk Should Learn From the Thailand Cave Rescue (nytimes.com)

Just because you’re a successful tech mogul doesn’t mean you know how to rescue kids trapped underground.

Tufekci also takes this discussion further on Twitter.

Bookmarked Why Zuckerberg’s 14-Year Apology Tour Hasn’t Fixed Facebook (WIRED)

At a minimum, Facebook has long needed an ombudsman’s office with real teeth and power: an institution within the company that can act as a check on its worst impulses and to protect its users. And it needs a lot more employees whose task is to keep the platform healthier. But what would truly be disruptive and innovative would be for Facebook to alter its business model. Such a change could come from within, or it could be driven by regulations on data retention and opaque, surveillance-based targeting—regulations that would make such practices less profitable or even forbidden.

It is a little disconcerting that Facebook only ever seems to do something positive for the ‘user’ in response to complaints. What is worse, Tufekci highlights how some of the changes they are promising now were promised years ago.

But the backlash wouldn’t die down. Attempting to respond to the growing outrage, Facebook announced changes. “It’s Time to Make Our Privacy Tools Easier to Find”, the company announced without a hint of irony—or any other kind of hint—that Zuckerberg had promised to do just that in the “coming few weeks” eight full years ago. On the company blog, Facebook’s chief privacy editor wrote that instead of being “spread across nearly 20 different screens” (why were they ever spread all over the place?), the controls would now finally be in one place.

Sadly, this has nothing to do with users or community:

As far as I can tell, not once in his apology tour was Zuckerberg asked what on earth he means when he refers to Facebook’s 2 billion-plus users as “a community” or “the Facebook community.” A community is a set of people with reciprocal rights, powers, and responsibilities. If Facebook really were a community, Zuckerberg would not be able to make so many statements about unilateral decisions he has made—often, as he boasts in many interviews, in defiance of Facebook’s shareholders and various factions of the company’s workforce. Zuckerberg’s decisions are final, since he controls all the voting stock in Facebook, and always will until he decides not to—it’s just the way he has structured the company.

Tim Wu argues that we need to replace Facebook with a trustworthy platform not driven by surveillance and advertising:

If today’s privacy scandals lead us merely to install Facebook as a regulated monopolist, insulated from competition, we will have failed completely. The world does not need an established church of social media.

Bookmarked Opinion | YouTube, the Great Radicalizer by Zeynep Tufekci (nytimes.com)

In effect, YouTube has created a restaurant that serves us increasingly sugary, fatty foods, loading up our plates as soon as we are finished with the last meal. Over time, our tastes adjust, and we seek even more sugary, fatty foods, which the restaurant dutifully provides. When confronted about this by the health department and concerned citizens, the restaurant managers reply that they are merely serving us what we want.

Zeynep Tufekci highlights the problems with YouTube’s recommendation algorithm: it has a built-in bias toward inflammatory content. In response to the post, Clive Thompson explains it this way:

It’s not that Youtube radicalizes politics specifically. It radicalizes everything, and politics just gets swept along in the slurry of zomg.
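Tufekci’s restaurant metaphor describes a feedback loop: recommend whatever maximizes engagement, which nudges tastes, which shifts what gets recommended next. The following deliberately crude simulation is only a sketch of that loop under invented assumptions (the ‘intensity’ scores, the engagement function and the taste-drift rule are all made up, and none of it reflects YouTube’s actual system):

```python
# Crude simulation of an engagement-maximizing recommendation loop.
# Items carry an invented "intensity" score; the user's taste drifts toward
# whatever was just watched, and the recommender always serves the item the
# current taste finds most engaging. Purely illustrative.
catalog = [round(i * 0.1, 1) for i in range(11)]    # intensities 0.0 .. 1.0
taste = 0.2                                         # the user starts out mild

def engagement(item_intensity, current_taste):
    # Assume engagement peaks slightly *above* the user's current taste.
    return -abs(item_intensity - (current_taste + 0.15))

for step in range(10):
    recommended = max(catalog, key=lambda item: engagement(item, taste))
    taste = 0.7 * taste + 0.3 * recommended         # taste drifts toward what was watched
    print(f"step {step}: served intensity {recommended}, taste now {taste:.2f}")
```

Run for a few steps, the recommendations ratchet upward regardless of what ‘intensity’ stands for, which is Thompson’s point: the escalation is content-agnostic.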