Liked Back to the Blog (Dan Cohen)
It is psychological gravity, not technical inertia, however, that is the bigger antagonist of the open web. Human beings are social animals and centralized social media like Twitter and Facebook provide a powerful sense of ambient humanity—that feeling that “others are here”—that is often missing when one writes on one’s own site. Facebook has a whole team of Ph.D.s in social psychology finding ways to increase that feeling of ambient humanity and thus increase your usage of their service.
Bookmarked Mark Zuckerberg Is Doubly Wrong About Holocaust Denial by Yair Rosenberg (The Atlantic)
Truly tackling the problem of hateful misinformation online requires rejecting the false choice between leaving it alone or censoring it outright. The real solution is one that has not been entertained by either Zuckerberg or his critics: counter-programming hateful or misleading speech with better speech.
Yair Rosenberg touches on the dangers of simply suppressing disinformation. He explains that the only way to respond is to correct it. This continues some of the conversation associated with danah boyd’s keynote at SXSW.

via HEWN by Audrey Watters

Bookmarked Cory Doctorow: Zuck’s Empire of Oily Rags (Locus Online)
For 20 years, privacy advocates
Cory Doctorow provides a commentary on the current state of affairs involving Facebook and Cambridge Analytica. Rather than blame the citizens of the web, he argues that the fault lies with the mechanics in the garage and the corruption they have engaged in. The question that remains: if this is so, and we still want our car fixed, where do we go?

Marginalia

Cambridge Analytica are like stage mentalists: they’re doing something labor-intensive and pretending that it’s something supernatural. A stage mentalist will train for years to learn to quickly memorize a deck of cards and then claim that they can name your card thanks to their psychic powers. You never see the unglamorous, unimpressive memorization practice. source

The comparison between Cambridge Analytica (and big data in general) with the stage mentalist is intriguing. I am left wondering about the disappointment and disbelief in the truth. Sometimes there is a part of us that oddly wants to be mesmerised and to believe.


It’s fashionable to treat the dysfunctions of social media as the result of the naivete of early technologists, who failed to foresee these outcomes. The truth is that the ability to build Facebook-like services is relatively common. What was rare was the moral recklessness necessary to go through with it. source

Facebook and Cambridge Analytica raise the question: just because we can, does that mean we should?


Facebook doesn’t have a mind-control problem, it has a corruption problem. Cambridge Analytica didn’t convince decent people to become racists; they convinced racists to become voters. source

In relation to the question of mind-control versus corruption, I wonder where the difference lies. Does corruption involve some element of ‘mind-control’ to convince somebody that this is the answer?

Replied to Too Long; Didn’t Read #158 (W. Ian O'Byrne)
Each week when I write this newsletter, it is always interesting to me to see stories that suggest that social media is downright bad for us. For people that are hooked, it is like a drug. For people that don’t use social media and networks, they don’t understand why people care, or use these tools.
Ian, the irony of the JSON change is that I downloaded my content and cleaned it up months ago. I am really hoping that someone develops an easy-to-use parser one day so that I can store all my statuses and check-ins on my site, even if they are private.
Bookmarked Facebook’s Push for Facial Recognition Prompts Privacy Alarms by Natasha Singer (nytimes.com)
Facebook is working to spread its face-matching tools even as it faces heightened scrutiny from regulators and legislators in Europe and North America.
Natasha Singer discusses Facebook’s continual push for facial recognition. She discusses some of the history associated with Facebook’s push into this area, including various roadblocks such as GDPR. She also looks at some of the patent applications, such as:

A system that could detect consumers within stores and match those shoppers’ faces with their social networking profiles. Then it could analyze the characteristics of their friends, and other details, using the information to determine a “trust level” for each shopper.

And:

Cameras near checkout counters could capture shoppers’ faces, match them with their social networking profiles and then send purchase confirmation messages to their phones.

This made me wonder: how many patents actually come to fruition, and how many are a form of indirect marketing?

Replied to 👓 Turning off Facebook for Bridgy | snarfed.org by Chris Aldrich (Chris Aldrich | BoffoSocko)
Read Turning off Facebook for Bridgy by Ryan Barrett (snarfed.org) I announced recently that Bridgy Publish for Facebook would shut down soon. Facebook’s moves to restrict its API to improve privacy and security are laudable, and arguably ... This is so disappointing. Facebook is lite...
Now all we need is a tool to easily take a Facebook archive and back it up in our own space. I recently deleted all my posts and messages in FB; I just wish I could integrate it with my Read Write Collect site.
Liked Facebook Gave Device Makers Deep Access to Data on Users and Friends by Gabriel J.X. Dance (nytimes.com)
In the furor that followed, Facebook’s leaders said that the kind of access exploited by Cambridge in 2014 was cut off by the next year, when Facebook prohibited developers from collecting information from users’ friends. But the company officials did not disclose that Facebook had exempted the makers of cellphones, tablets and other hardware from such restrictions.
Bookmarked The “They Had Their Minds Made Up Anyway” Excuse by Mike Caulfield (Hapgood)
If Facebook was a tool for confirmation bias, that would kind of suck. It would. But that is not the claim. The claim is that Facebook is quite literally training us to be conspiracy theorists. And given the history of what happens when conspiracy theory and white supremacy mix, that should scare the hell out of you. I’m petrified. Mark Zuckerberg should be too.
Mike Caulfield explains the dangers of fake news and the way in which the repetition and familiarity with such lies can lead to an odd sense of truth.

People exposed themselves to Facebook multiple times a day, every single day, seeing headlines making all sorts of crazy claims, and filed them in their famil-o-meter for future reference.