Bookmarked Why Zuckerberg’s 14-Year Apology Tour Hasn’t Fixed Facebook (WIRED)
At a minimum, Facebook has long needed an ombudsman’s office with real teeth and power: an institution within the company that can act as a check on its worst impulses and protect its users. And it needs a lot more employees whose task is to keep the platform healthier. But what would truly be disruptive and innovative would be for Facebook to alter its business model. Such a change could come from within, or it could be driven by regulations on data retention and opaque, surveillance-based targeting—regulations that would make such practices less profitable or even forbidden.
It is a little disconcerting that Facebook only ever seems to do something positive for the ‘user’ in response to complaints. What is worse, Tufekci highlights that some of the changes being promised now were promised years ago.

But the backlash wouldn’t die down. Attempting to respond to the growing outrage, Facebook announced changes. “It’s Time to Make Our Privacy Tools Easier to Find”, the company announced without a hint of irony—or any other kind of hint—that Zuckerberg had promised to do just that in the “coming few weeks” eight full years ago. On the company blog, Facebook’s chief privacy officer wrote that instead of being “spread across nearly 20 different screens” (why were they ever spread all over the place?), the controls would now finally be in one place.

Sadly, this has nothing to do with users or community:

As far as I can tell, not once in his apology tour was Zuckerberg asked what on earth he means when he refers to Facebook’s 2 billion-plus users as “a community” or “the Facebook community.” A community is a set of people with reciprocal rights, powers, and responsibilities. If Facebook really were a community, Zuckerberg would not be able to make so many statements about unilateral decisions he has made—often, as he boasts in many interviews, in defiance of Facebook’s shareholders and various factions of the company’s workforce. Zuckerberg’s decisions are final, since he controls a majority of the voting stock in Facebook, and always will until he decides not to—it’s just the way he has structured the company.

Tim Wu argues that we need to replace Facebook with a trustworthy platform not driven by surveillance and advertising:

If today’s privacy scandals lead us merely to install Facebook as a regulated monopolist, insulated from competition, we will have failed completely. The world does not need an established church of social media.

Liked It's Time For an RSS Revival (WIRED)
The lasting appeal of RSS remains the parts that haven't changed: the unfiltered view of the open web, and the chance to make your own decisions about what you find there.
Chris Aldrich has written a useful response to this piece, outlining a number of ideas it overlooks if we are truly to move forward with RSS.
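Part of RSS’s lasting appeal is how plain the format is: a feed is just XML, and you can make your own decisions about what to do with it. As a rough sketch (the feed content below is invented for illustration), Python’s standard library is enough to pull titles and links out of an RSS 2.0 feed:

```python
import xml.etree.ElementTree as ET

# A minimal, invented RSS 2.0 feed used purely for illustration.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item>
      <title>First post</title>
      <link>https://example.com/first</link>
    </item>
    <item>
      <title>Second post</title>
      <link>https://example.com/second</link>
    </item>
  </channel>
</rss>"""

def parse_items(xml_text):
    """Return (title, link) pairs for each <item> in an RSS 2.0 feed."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in parse_items(FEED):
    print(title, "->", link)
```

A real reader would fetch the XML over HTTP and handle Atom and namespaced extensions too, but the unfiltered core really is this simple.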
Liked Theranos and Silicon Valley's 'Fake It Till You Make It' Culture (WIRED)
The scale of Theranos’ alleged fraud is unusual, but the forces behind it are not. Startup culture venerates the kind of “fake it till you make it” hustling that Holmes deployed. When Theranos was first exposed, tech industry leaders defended the company. As more reporting about its wrongdoing emerged, industry leaders characterized Theranos as an outlier, not indicative of the broader startup culture. A music video made by a venture firm even included the line, “Theranos doesn’t represent us, we are better.” But scores of minor scandals and lawsuits, combined with 2017’s series of scandals at the country’s most valuable private startup, Uber (former motto: “Always be hustlin’”), make it clear that faking it is more common than just Theranos.
via HEWN
Bookmarked Inside Facebook's Two Years of Hell (WIRED)
When social media started becoming driven by images, he bought Instagram. When messaging took off, he bought WhatsApp. When Snapchat became a threat, he copied it. Now, with all his talk of “time well spent,” it seems as if he’s trying to co-opt Tristan Harris too.
Nicholas Thompson and Fred Vogelstein disentangle the last two years of Facebook’s rise, with a particular focus on the way it has embraced news. As with Google+, they paint a picture of how Facebook ‘copied, then crushed’ Twitter and its hold on distributing news:

Back in 2012, the most exciting social network for distributing news online wasn’t Facebook, it was Twitter. The latter’s 140-character posts accelerated the speed at which news could spread, allowing its influence in the news industry to grow much faster than Facebook’s. “Twitter was this massive, massive threat,” says a former Facebook executive heavily involved in the decision making at the time.

So Zuckerberg pursued a strategy he has often deployed against competitors he cannot buy: He copied, then crushed. He adjusted Facebook’s News Feed to fully incorporate news (despite its name, the feed was originally tilted toward personal news) and adjusted the product so that it showed author bylines and headlines. Then Facebook’s emissaries fanned out to talk with journalists and explain how to best reach readers through the platform.

The catch with this change is that the focus was merely on being THE platform, which meant overlooking the multitude of complexities associated with ‘news’:

Facebook hired few journalists and spent little time discussing the big questions that bedevil the media industry. What is fair? What is a fact? How do you signal the difference between news, analysis, satire, and opinion? Facebook has long seemed to think it has immunity from those debates because it is just a technology company—one that has built a “platform for all ideas.”

The problem with this stance, to “never favour one kind of news”, is that “neutrality is a choice in itself.” This choice is one that can then be cajoled and manipulated:

While Facebook grappled internally with what it was becoming—a company that dominated media but didn’t want to be a media company—Donald Trump’s presidential campaign staff faced no such confusion. To them Facebook’s use was obvious. Twitter was a tool for communicating directly with supporters and yelling at the media. Facebook was the way to run the most effective direct-marketing political operation in history.

In response to Trump’s use of the platform, the purchasing of ads, and criticism from people such as Tristan Harris, Zuckerberg set out this year to right the wrongs:

One of the many things Zuckerberg seemed not to grasp when he wrote his manifesto was that his platform had empowered an enemy far more sophisticated than Macedonian teenagers and assorted low-rent purveyors of bull.

Ironically, he has now turned to the community to act as curators.

Along with investigations into the links between Facebook funding and research, these posts help highlight the tangled mess that we have gotten ourselves into.

Watched

Blockchain, the key technology behind Bitcoin, is a new network that helps decentralize trade, and allows for more peer-to-peer transactions. WIRED challenged political scientist and blockchain researcher Bettina Warburg to explain blockchain technology to 5 different people: a child, a teen, a college student, a grad student, and an expert.

This is a useful video as much for thinking about how we share ideas as it is for understanding the blockchain.

Liked It's the (Democracy-Poisoning) Golden Age of Free Speech by Zeynep Tufekci (WIRED)
Discussing the democratic problems with YouTube and Facebook, Zeynep Tufekci argues that we can decide how we want to handle digital surveillance, attention-channeling, harassment, data collection, and algorithmic decision-making; we just need to start the discussion.
Zeynep Tufekci explains that, although we can all create a social media account in seconds, this supposed ‘democracy’ is a phantom public. It may seem that we can all ‘connect the world’, but each of the platforms is controlled by algorithms designed to keep the prosumer engaged and exposed to advertising. This is something Tufekci also discusses in her TED Talk. The change needed is systemic:

We don’t have to be resigned to the status quo. Facebook is only 13 years old, Twitter 11, and even Google is but 19. At this moment in the evolution of the auto industry, there were still no seat belts, airbags, emission controls, or mandatory crumple zones. The rules and incentive structures underlying how attention and surveillance work on the internet need to change. But in fairness to Facebook and Google and Twitter, while there’s a lot they could do better, the public outcry demanding that they fix all these problems is fundamentally mistaken. There are few solutions to the problems of digital discourse that don’t involve huge trade-offs—and those are not choices for Mark Zuckerberg alone to make. These are deeply political decisions. In the 20th century, the US passed laws that outlawed lead in paint and gasoline, that defined how much privacy a landlord needs to give his tenants, and that determined how much a phone company can surveil its customers. We can decide how we want to handle digital surveillance, attention-channeling, harassment, data collection, and algorithmic decision-making. We just need to start the discussion. Now.

Liked When It Comes to Gorillas, Google Photos Remains Blind (WIRED)
Google’s caution around images of gorillas illustrates a shortcoming of existing machine-learning technology. With enough data and computing power, software can be trained to categorize images or transcribe speech to a high level of accuracy. But it can’t easily go beyond the experience of that training. And even the very best algorithms lack the ability to use common sense, or abstract concepts, to refine their interpretation of the world as humans do.