There’s clearly no Russian Plan B for Ukraine. If that is indeed the case, then we know what’s likely to happen.
When Chechnya was being obliterated in 1999, most of us paid little attention. After all, it wasn’t a European country. But Ukraine is.
Our complacent post-1945 holiday has really come to an end.
One of the most striking things about Bellingcat’s success is that — at least up to this stage — its investigative methodology is (to use a cliché) not rocket science. It’s a combination of determination, stamina, cooperation, Internet-savviness, geolocation (where did something happen?), chronolocation (when did it happen?) and an inexhaustible appetite for social-media-trawling.
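To make that point concrete, here is a minimal sketch, not Bellingcat’s actual tooling, of one common chronolocation check: testing whether the shadows visible in a photo are consistent with a claimed place and time. It assumes the third-party Python library `astral`, and the coordinates and timestamp are purely illustrative.

```python
# A rough sketch of a shadow-based chronolocation check, assuming the
# third-party `astral` library (pip install astral). The location and
# timestamp below are illustrative placeholders, not a real case.
from datetime import datetime, timezone
from math import radians, tan

from astral import Observer
from astral.sun import azimuth, elevation

claimed_place = Observer(latitude=50.4501, longitude=30.5234)  # hypothetical spot
claimed_time = datetime(2022, 3, 1, 12, 30, tzinfo=timezone.utc)

sun_azimuth = azimuth(claimed_place, claimed_time)      # degrees clockwise from north
sun_elevation = elevation(claimed_place, claimed_time)  # degrees above the horizon

# Shadows point directly away from the sun, and their length relative to an
# object's height depends on how high the sun sits (only meaningful in daylight).
shadow_bearing = (sun_azimuth + 180) % 360
shadow_to_height_ratio = 1 / tan(radians(sun_elevation)) if sun_elevation > 0 else None

print(f"Sun azimuth {sun_azimuth:.1f} deg, elevation {sun_elevation:.1f} deg")
print(f"Expected shadow bearing {shadow_bearing:.1f} deg, "
      f"length ratio {shadow_to_height_ratio}")
```

If the shadow direction or length measured in the image disagrees with these figures, the claimed time or location becomes suspect. No rocket science, just patient cross-checking.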
Not surprisingly, Signal has been staggering under the load of refugees from WhatsApp following Facebook’s ultimatum about sharing their data with other companies in its group. According to data from Sensor Tower, Signal was downloaded 8.8m times worldwide in the week after the WhatsApp changes were first announced on January 4. Compare that with 246,000 downloads the week before — roughly a 36-fold jump — and you get some idea of the step-change. I guess the tweet — “Use Signal” — from Elon Musk on January 7 probably also added a spike.
If you’re comfortable with Facebook’s use of data (or that of its much closer subsidiary Instagram), it might be difficult to care about this. The company was recently forced by Apple to provide a privacy “nutritional label” on its iOS app, revealing how it works with user data. The labels disclosed more than 100 different pieces of data that may be collected, many of which are directly linked to user profiles, including health and fitness data, “sensitive info” and search histories. For the typical user, who has an account on both services, adding in the small amount of information WhatsApp has is a drop in the bucket by comparison.
But the change does start to eat away at the idea that you can be on WhatsApp without a Facebook footprint. The two apps’ very different histories and intended uses have led to a split in demographics among their users, and a small but significant proportion of WhatsApp users, drawn by the encryption, ad-free nature and no-frills interface, avoid Facebook itself while still using the chat app it owns.
In response, Facebook has paused this change. For Charles Arthur, the fact that Facebook could act so swiftly says a lot.
The irony is so thick you could spread it on toast. Misinformation spread on WhatsApp has been blamed for deaths in India and election distortion in Brazil, but the company slow-walked complaints there. But when people start defecting, that’s a different matter: it acts like it’s on fire.
I feel that the hardest thing with implementing such restrictions is having the goodwill of the people and a willingness to cop the criticism. Dan Andrews was labelled a ‘dictator’ because of it.
So the US electorate may have felt safe in taking a punt on Trump, on the grounds that if it turned out badly, well, then, the system would take care of them. In a way, that was also the thinking of my many liberal American friends who told me that, while Trump would be terrible, “we are a Republic of Laws” and the Constitution, the separation of powers and the court system would keep him under control and limit the damage, pending restoration of normalcy.
I was not really sure what it would be like, but I did not think that it would be like this.
I’m picking up a distinct impression that the novelty of WFH has begun to wear thin as we realise that the pandemic might turn out to be a very long haul indeed. And the more we are obliged to interact with the technology at home, the more acute our perceptions of its implications and downsides are becoming.
Personally, I won’t believe that Trump has been defeated until the Marines drag him out of the White House on January 20th.
I started the day reading Peter Oborne’s piece on whether China will replace Islam as the West’s new enemy — and then got sucked into the rabbit-hole of whether we are sliding into a new Cold War, with China playing the role that the Soviet Union played in the old days. This is all about geopolitics, of course, about which I know little. But if you write about digital technology, as I do, this emerging Cold War is a perennial puzzle that pops up everywhere.
The economic challenges facing China have possible implications for U.S. policy. Rather than worrying so much about what Beijing is up to, Washington might be better off focusing on the home front and enhancing American advantages over China, by, for instance, strengthening the education system and investing in research and development.
What is the point of building this surveillance architecture if we can’t use it to save lives in a scary emergency like this one?
Of course, all of this would come at an enormous cost to our privacy. This is usually the point in an essay where I’d break out the old Ben Franklin quote: “those who would give up essential liberty to purchase a little temporary safety deserve neither.”
But this proposal doesn’t require us to give up any liberty that we didn’t already sacrifice long ago, on the altar of convenience. The terrifying surveillance infrastructure this project requires exists and is maintained in good working order in the hands of private industry, where it is entirely unregulated and is currently being used to try to sell people skin cream. Why not use it to save lives?
This is a wicked question. John Naughton raises the concern that such a decision would constitute ‘crossing the Rubicon’:
If we use the technology for this purpose we will have crossed the Rubicon into nightmare territory. And if we do cross, there’s unlikely to be a way back — because once states have acquired access to this technology, they rarely give it up. So will we do it?
I guess Ceglowski’s point is that the genie is already out of the bottle; the challenge is using such powers for good.
I continue to believe that living in a surveillance society is incompatible in the long term with liberty. But a prerequisite of liberty is physical safety. If temporarily conscripting surveillance capitalism as a public health measure offers us a way out of this crisis, then we should take it, and make full use of it. At the same time, we should reflect on why such a powerful surveillance tool was instantly at hand in this crisis, and what its continuing existence means for our long-term future as a free people.
In a lecture in 1997, Nathan Myhrvold, who was once Bill Gates’s chief technology officer, set out his Four Laws of Software. 1: software is like a gas – it expands to fill its container. 2: software grows until it is limited by Moore’s law. 3: software growth makes Moore’s law possible – people buy new hardware because the software requires it. And, finally, 4: software is only limited by human ambition and expectation.
The blogosphere continues to be one of our greatest information resources. So why not log off social media, get yourself an RSS reader and wise up?
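For the curious, here is a minimal sketch of what an RSS reader does under the hood, using Python and the third-party `feedparser` library. The feed URL is an assumption (the standard WordPress-style feed address for Naughton’s Memex 1.1 blog); any blog’s RSS or Atom feed would do.

```python
# A bare-bones RSS fetch using the third-party `feedparser` library
# (pip install feedparser). The URL is an assumed WordPress-style feed
# address; substitute any blog's RSS or Atom feed.
import feedparser

feed = feedparser.parse("https://memex.naughtons.org/feed/")

print(feed.feed.get("title", "Untitled feed"))
for entry in feed.entries[:5]:
    # Each entry carries at least a title and a link; most feeds add a date too.
    print(f"- {entry.title}\n  {entry.link}")
```

A dedicated reader app simply runs this kind of fetch on a schedule across all your subscriptions and keeps track of what you have already read, with no algorithmic feed in between.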
Shoshana Zuboff’s new book is a chilling exposé of the business model that underpins the digital world. Observer tech columnist John Naughton explains the importance of Zuboff’s work and asks the author 10 key questions
We’re living through the most profound transformation in our information environment since Johannes Gutenberg’s invention of printing around 1439. And the problem with living through a revolution is that it’s impossible to take the long view of what’s happening. Hindsight is the only exact science in this business, and in that long run we’re all dead.
So our contemporary state of awareness is – as Manuel Castells, the great scholar of cyberspace once put it – one of “informed bewilderment”.
“Surveillance capitalism,” she writes, “unilaterally claims human experience as free raw material for translation into behavioural data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioural surplus, fed into advanced manufacturing processes known as ‘machine intelligence’, and fabricated into prediction products that anticipate what you will do now, soon, and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioural futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behaviour.”
Surveillance capitalism was invented around 2001 as the solution to a financial emergency in the teeth of the dotcom bust, when the fledgling Google faced the loss of investor confidence.
Nearly every product or service that begins with the word “smart” or “personalised”, every internet-enabled device, every “digital assistant”, is simply a supply-chain interface for the unobstructed flow of behavioural data on its way to predicting our futures in a surveillance economy.
Once we searched Google, but now Google searches us. Once we thought of digital services as free, but now surveillance capitalists think of us as free.
It is no longer enough to automate information flows about us; the goal now is to automate us. These processes are meticulously designed to produce ignorance by circumventing individual awareness and thus eliminate any possibility of self-determination. As one data scientist explained to me, “We can engineer the context around a particular behaviour and force change that way… We are learning how to write the music, and then we let the music make them dance.”
Surveillance capitalism is a human-made phenomenon and it is in the realm of politics that it must be confronted. The resources of our democratic institutions must be mobilised, including our elected officials.
For example, the idea of “data ownership” is often championed as a solution. But what is the point of owning data that should not exist in the first place? All that does is further institutionalise and legitimate data capture.
Users might get “ownership” of the data that they give to surveillance capitalists in the first place, but they will not get ownership of the surplus or the predictions gleaned from it – not without new legal concepts built on an understanding of these operations … In any confrontation with the unprecedented, the first work begins with naming.
Have we become so subtly conditioned by digital technology that we don’t see what’s been happening to us? Have we been conditioned to accept a world governed by “smart” tech, trading convenience and cheap bliss to the point where we become a bit like machines ourselves?