2019 was the “I Told You So” year for privacy advocates and voice assistants: the year in which every company that wanted you to trust them to put an always-on mic in the most intimate places in your home was revealed to have allowed thousands of low-waged contractors to listen in on millions of clips, many of them accidentally recorded: first it was Amazon (and again!), then Google, then Apple, then Microsoft.
Many of us are resigned to – and perhaps even fine with – the idea that our employer can scan our emails or keep track of how much time we waste on social media. But we are entering a new world of workplace surveillance in which we are watched 24/7 and every move is scrutinised. And things are only going to get more intrusive as corporations treat us less like human beings and more like machines. Last year, for example, Amazon patented an “ultrasonic bracelet” to be worn by workers to “monitor performance of assigned tasks”. Meanwhile, companies are implanting chips under workers’ skin and China is monitoring employees’ brain waves. It won’t be long until we have all been implanted with chips that keep track of our productivity and trigger a self-combustion protocol when we are no longer deemed useful to our AI overlords. But, hey, while the future may look bleak, at least there is consistently prepared pizza to look forward to.
One of the challenges that really intrigues me is when someone else gives consent on your behalf, without asking or often even realising. In some ways shadow profiles touch upon this, but the worst is DNA tests:
What do we do when other family members list you as related when they do family tree ancestry stuff? My large extended family lists us and I literally don't even have their phone numbers. Never talk to them.
— RelentlessNRecovery (@RelentlessNReco) September 23, 2019
It is also kind of funny how in education the discussion seems to be about banning smartphones. However, as you touch upon with microphones and wearables, we will not even know what is and is not being captured. A part of me thinks that as a teacher you need to be mindful of this.
What concerns me most are those who feel that we should make the capturing of biometric data standard.
We live in wicked times.
Google has big plans to build a Jetsonian smart city on the waterfront, and Torontonians have strong opinions about it: is it the solution to all our problems or the end of the world as we know it? We asked 18 super-smart people to tell us what they think
Essays By Joe Berridge, Michael Bryant, Ann Cavoukian, Jan De Silva, Dan Doctoroff, Cory Doctorow, Richard Florida, Ken Greenberg, Alexander Josephson, Jennifer Keesmaat, Bruce Kuwabara, Mohamed Lachemi, Kwame Mckenzie, Gord Perks, Robert Prichard, Yung Wu, Bianca Wylie And Shoshana Zuboff
Our emotions are being manipulated, hacked and shared like never before. So what does this mean for their future, our relationships and the technology that’s reading them?
The social media tools that teens use are direct descendants of the hangouts and other public places in which teens have been congregating for decades. What the drive-in was to teens in the 1950s and the mall in the 1980s, Facebook, texting, Twitter, instant messaging, and other social media are to teens now. Teens flock to them knowing they can socialize with friends and become better acquainted with classmates and peers they don’t know as well. They embrace social media for roughly the same reasons earlier generations of teens attended sock hops, congregated in parking lots, colonized people’s front stoops, or tied up the phone lines for hours on end. Teens want to gossip, flirt, complain, compare notes, share passions, emote, and joke around. They want to be able to talk among themselves—even if that means going online. (Pages 20-21)
This episode also raises the question about the internet of things and the potential to gather emotional data. This is a topic touched upon by Ben Williamson in his book Big Data in Education.
As organisations gather huge stockpiles of data, they seem to grow increasingly tightfisted with their data and insights. They’ve found a gold mine – why share? The problem with this line of reasoning is that it quickly dead-ends in a world where the only conceivable use of data is as zero-sum competitive advantage: “I know something you don’t.”
Amazon’s Ring doorbells are surveillance devices that conduct round-the-clock video surveillance of your neighborhood, automatically flagging “suspicious” faces and bombarding you and your neighbors with alerts using an app called “Neighbors”; it’s a marriage of Amazon’s Internet of Things platform with its “Rekognition” facial recognition tool, which it has marketed aggressively to cities, law enforcement, ICE, businesses and everyday customers as a security measure that can help ID bad guys, despite the absence of a database identifying which faces belong to good people and which faces belong to bad people.
Our guide recommends some tools you can use to spot internet-connected devices. But keep an eye out for random bottles of Mountain Dew, too.
Of course, foldable displays won’t be limited to devices we carry in our pockets. We’re going to see them pretty much everywhere — round our wrists, as part of our clothes, and eventually as ‘wallpaper’ in our houses. Eventually there won’t be a surface on the planet that won’t also potentially be a screen.
- What if everyone was talking at once? What would that look and sound like?
- What about the conversations that may not be appropriate for speaking out loud, whether in public or in private?
My other question is about uses beyond the novelty. Yeah, I can ask Google a question or play a track from The National, but what else? I am really interested in the particular workflows you develop in conjunction with your smart things.
NOTE: I have written this response on the open web and respect your desire to restrict such conversations to paying subscribers, of which I am not one. Sorry.
Just this month, the insurance company United Healthcare began partnering with employers to offer free Apple Watches to those who hit certain fitness goals. Insurers might also offer benefits to residents whose homes prove their fitness or brand loyalty—and punish those who don’t. Health insurers could use data from the kitchen as a proxy for eating habits, and adjust their rates accordingly. Landlords could use occupancy sensors to see who comes and goes, or watch for photo evidence of pets. Life-insurance companies could penalize smokers caught on camera. Online and in person, consumers are often asked to weigh privacy against convenience and personalization: A kickback on utilities or insurance payments may thumb the scales in Google’s favor.
To badly paraphrase Tolstoy: Secure products are all alike; every not-secure product is not secure in its own way.
Facebook – sure, we may have sold your most intimate data to the Russkies, installed a cryptofascist in the whitehouse, engendered genocide in Myanmar and the slaughter of hundreds of innocent people across the developing world, and (just this last week) got caught leaking user data of at least 50,000,000 people, but you should totally allow our always-on microphone and camera into your home! Trust us!
Although it uses incredibly imprecise language, it can be reasonably inferred that the directive targets large service providers like Google and Facebook. It doesn’t target small communities or people who are independently hosting their content.
All of which means that peer-to-peer decentralized social networks are exempt, if you’re hosting your profile yourself. Nobody on the indie web is going to need to implement upload filters. Similarly, nobody on the federated social web, or using decentralized apps, will either. In these architectures, there are no service providers that store or provide access to large amounts of work. It’s in the ether, being hosted from individual servers, which could sit in datacenters or could sit in your living room.
I remember Ross Halliday focusing on what might be deemed ‘IoT for education’ at GTASyd. It is an interesting space. I can see the potential for it in education, but at what cost? For what impact? Here I am reminded of Marshall McLuhan’s tetrad:
- What does the medium enhance?
- What does the medium make obsolete?
- What does the medium retrieve that had been obsolesced earlier?
- What does the medium reverse or flip into when pushed to extremes?
I recently finished reading Ben Williamson’s book on Big Data in Education. Although it is not solely on this topic, it definitely relates and is worth reading.
The only way to challenge surveillance is through counter-surveillance. Source
It is interesting to juxtapose this with a comment that Mark Burden recently made that it is the Internet of Data Collection Instruments.
In terms of the device collectors, in some ways they are delighted about this passivity because it reveals behaviours that we wouldn’t necessarily reveal if we knew data about us was being recorded. So in that sense when you think about what is now called the internet of things, the very label ‘the internet of things’ is a misleading label, in fact it’s a label that I think should be put in a wastepaper basket. What we are really talking about is the internet of data collection instruments. And these instruments rely on our passive behaviours in order to collect the data from the environment and about us in relation to what we do in those environments. And what we are now starting to see is that the smart home, or what is becoming increasingly the smart home, is being packed with these devices. Source