The students arrange the sensors into a “public art piece” in the lobby – a table covered in sensors spelling out “NO!,” surrounded by Sharpie annotations decrying the program.
Meanwhile, students are still furious. It’s not just that the sensors are invasive, nor that they are scientifically incoherent, nor that they cost more than a year’s salary – they also emit lots of RF noise that interferes with the students’ own research.
A Black woman who feared for her safety creates a system. A white guy develops an iteration of this system later because he is annoyed that people are ringing his doorbell too often. This becomes a tool to manage Amazon’s loss prevention. Eventually, it leads to a boom not only in home security products like the Amazon suite and Google’s security cameras, along with a variety of others, but also in increasing measures to make the home, the neighborhood, and all public and private spaces a 24/7 watched fortress, complete with cameras, drones, security robots, and automated license plate readers. But amid this escalation, one urgent question arises: What are we defending ourselves against?
Prior to this week’s Pegasus Project, a global reporting effort by major newspapers to expose the fatal consequences of the NSO Group—the new private-sector face of an out-of-control Insecurity Industry—most smartphone manufacturers along with much of the world press collectively rolled their eyes at me whenever I publicly identified a fresh-out-of-the-box iPhone as a potentially lethal threat.
Pegasus is the hacking software – or spyware – that is developed, marketed and licensed to governments around the world by the Israeli company NSO Group. It has the capability to infect billions of phones running either iOS or Android operating systems.
The earliest version of Pegasus discovered, which was captured by researchers in 2016, infected phones through what is called spear-phishing – text messages or emails that trick a target into clicking on a malicious link.
Snowden argues that this is validation of what he has been saying for a long time: that the “phone in your hand exists in a state of perpetual insecurity, open to infection by anyone willing to put money in the hand of this new Insecurity Industry.” The challenge ahead is incentivising change and introducing a level of liability. A part of this is getting governments to understand that subsidising such organisations does not serve their interests.
If we don’t do anything to stop the sale of this technology, it’s not just going to be 50,000 targets: It’s going to be 50 million targets, and it’s going to happen much more quickly than any of us expect.
The Guardian have also shared a number of posts and podcasts unpacking the topic further.
Weaponized care is not a monolith, and we must be attentive to how it can be wielded in different directions and for different purposes. Most insidiously, it seizes upon how care is necessary and essential for our social lives. But it can be weaponized in a different way: As Audre Lorde wrote, “Caring for myself is not self-indulgence, it is self-preservation, and that is an act of political warfare.” Here Lorde rejects gendered ideas of care and posits a different approach to its weaponization: not as a way to sell harmful surveillance technology but to protect herself from overextension and despair in the face of disease and the stigmas attached to several overlapping marginalized identities. Realizing and recognizing that care can be used as a weapon against the interests of our communities, our loved ones, and even ourselves is a step toward respecting this powerful construct.
Join us for a teach-in about surveillance, educational technologies, academic freedom, and student care — for an important cause!
Judges already requisition Fitbit data to solve crimes. No matter what Facebook says its intentions for Project Aria are, this data will end up in the hands of law enforcement, too.
It would be nice to live in a world with fewer impositions on privacy, one in which law enforcement did not use small quadcopters and the Department of Homeland Security did not redeploy large Predator drones to surveil protesters. And, for people in some parts of the world, it would be nice not to associate the sound of a drone with impending missile fire. But given that those eyes are in the sky, it’s good to know how to hide.
We talk a lot about the rise of surveillance capitalism and ponder the grim future to which that Orwellian path leads. But for students? That future is now, as they try to act dutiful in front of their glowing webcams.
The rise of proctoring software is a symptom of a deeper mistake, one that we keep making in the internet age: using tech to manage a problem that is fundamentally economic.
Rather than multiple-choice tests, students could be given projects or essays. However, these assessments are much more time-intensive to mark.
Cory Doctorow also discusses the rise in proctoring software in light of Ian Linkletter being sued for tweeting links to internal training videos explaining how Proctorio works.
High-stakes tests are garbage, pedagogically bankrupt assessment tools that act as a form of empirical facewash for “meritocracy.”
They primarily serve as a way for wealthy parents to buy good grades for their kids, since expensive test-prep services can turn even the dimmest, inbred plute into a genius-on-paper.
All of this was true before the pandemic. Now it’s worse.
It is interesting to consider this alongside other strategies, such as plagiarism software and cameras being used for attendance purposes. As Audrey Watters suggests:
We have to change the culture of schools not just adopt kinder ed-tech. We have to stop the policing of classrooms in all its forms and support other models for our digital and analog educational practices.
The power to end it is in your hands.
While the system itself was not substantially changed—as a rule, governments are less interested in reforming their own behavior than in restricting the behavior and rights of their citizens—what did change was the public consciousness.
This is something that Doug Belshaw discusses in his mapping of the internet.
In response, Snowden discusses the power of language to challenge.
You have heard that when all you have is a hammer, every problem looks like a nail. Herein lies the folly of every system of rule whose future relies more heavily on the omnipotence of its methods than the popularity of its mandate. There were times when empires were won by bronze and boats and powder. None survive. What outlasts each forgotten flag is our greatest technology, language: the empire of the mind.
It is interesting to consider this alongside Audrey Watters’ discussion of Luddite pedagogy.
I don’t think that ed-tech created “cop shit” in the classroom or created a culture of surveillance in schools by any means. But it has facilitated it. It has streamlined it. It has polished it and handed out badges for those who comply with it and handed out ClassDojo demerits for those who haven’t.
Marginalia
Chances are, if you want to focus on the tech because it’s tech, you’re selling “cop shit.”
Mainway’s purpose, in other words, was neither storage nor preparation of a simple list. Constant, complex, and demanding operations fed another database called the Graph-in-Memory.
Double a penny once a day and you reach $1 million in less than a month. That is what exponential growth looks like with a base of two. As contact chaining steps through its hops, the social graph grows much faster. If the average person calls or is called by 10 other people a year, then each hop produces a tenfold increase in the population of the NSA’s contact map. Most of us talk on the phone with a lot more than 10 others. Whatever that number, dozens or hundreds, you multiply it by itself to measure the growth at each hop.
…
Contact chaining on a scale as grand as a whole nation’s phone records was a prodigious computational task, even for Mainway. It called for mapping dots and clusters of calls as dense as a star field, each linked to others by webs of intricate lines. Mainway’s analytic engine traced hidden paths across the map, looking for relationships that human analysts could not detect. Mainway had to produce that map on demand, under pressure of time, whenever its operators asked for a new contact chain. No one could predict the name or telephone number of the next Tsarnaev. From a data scientist’s point of view, the logical remedy was clear. If anyone could become an intelligence target, Mainway should try to get a head start on everyone.
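To get a feel for the arithmetic in this passage, a minimal sketch helps. The contacts-per-person figures below are my own illustrative assumptions (the passage itself suggests anywhere from ten to hundreds per person), and the function simply compounds the contacts at each hop:

```python
# Back-of-the-envelope contact chaining: how many people a chain can
# reach after a given number of hops from one target. The
# contacts-per-person figures are illustrative assumptions only.

def chained_population(contacts_per_person: int, hops: int) -> int:
    """Upper bound on people swept in after `hops` hops from a single
    target, assuming every contact at each hop is someone new."""
    total = 1      # the original target
    frontier = 1   # people added at the current hop
    for _ in range(hops):
        frontier *= contacts_per_person
        total += frontier
    return total

if __name__ == "__main__":
    for contacts in (10, 100):
        for hops in (1, 2, 3):
            print(f"{contacts:>3} contacts, {hops} hops -> "
                  f"up to {chained_population(contacts, hops):,} people")
```

At ten contacts per person, three hops from a single target already sweep in more than a thousand people; at a hundred contacts, more than a million.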
Instead of watching pandemic movies, my family started watching movies where bureaucracies fail to honestly account for contrary expert opinions, due to which the fallout is more widespread and dangerous.
You can’t teach someone to swim while they’re drowning.
Instead, he suggests we should be engaging with whatever is at hand.
What does kitchen math look like in an emergency online learning space? It is an engagement of the tools common to your environment and based fully in pedagogical principles. The technology informs the pedagogy. If the goal is instructor presence, why not film a short video on a mobile device reflecting on the relationship of the course and this time in life, and share it with students? If the concern is the validity of an examination, why not think about a way in which students could use those same cameras not to film or photograph themselves taking a multiple choice exam but constructing knowledge by building a manipulative at home that shows the relationship of the individual to instruction? If there’s a concern about some of the reading, use the telecommunications tools for discussion or even a read-aloud session. Working with what’s available not only eases the faculty burden, it grounds the learning in the environment of the learner. Everyone is dealing with the same emergency; the best tools to get through this are the ones we have regular interaction with, not those brought in as a panic buy with a significant instructional manual and learning curve.
In a separate take on edtech’s response, Audrey Watters shares her concerns about the surveillance associated with the move to online learning.
One of my greatest fears right now is that this pandemic strengthens this surveillance culture in school. And the new technologies, adopted to ease the “pivot to digital,” will exacerbate existing educational inequalities, will put vulnerable students at even more risk. These technologies will foreclose possibilities for students and for teachers alike, shutting down dissent and discussion and curiosity and community.
Adding to this concern around surveillance, David White highlights the importance of remembering trust and care when engaging in situations of negotiated risk.
As education moves online we are going to have to get better at stating, and upholding, our values around trust and care with the concomitant acknowledgment of the risk we are accepting to protect certain freedoms. If not, then education will continue to merge with the corporate/civic surveillance state we are now only too aware of. To avoid sleepwalking into this new normal there will be times where we must deliberately refuse to use aspects of the data and control which technology offers, even when there are demands framed in terms of fairness or reduction of risk.
Neil Selwyn captures this sentiment by highlighting the need for educators to remember the human aspects involved within technology.
Teachers need to have good awareness of the social, emotional and affective aspects of technology-based education, and feel confident in their capacity to respond appropriately. Teaching of any sort is never simply a technical process – this is certainly the case when teaching online.
What is the point of building this surveillance architecture if we can’t use it to save lives in a scary emergency like this one?
Of course, all of this would come at an enormous cost to our privacy. This is usually the point in an essay where I’d break out the old Ben Franklin quote: “those who would give up essential liberty to purchase a little temporary safety deserve neither.”
But this proposal doesn’t require us to give up any liberty that we didn’t already sacrifice long ago, on the altar of convenience. The terrifying surveillance infrastructure this project requires exists and is maintained in good working order in the hands of private industry, where it is entirely unregulated and is currently being used to try to sell people skin cream. Why not use it to save lives?
This is a wicked question. John Naughton raises the concern that such a decision would constitute ‘crossing the Rubicon’:
If we use the technology for this purpose we will have crossed the Rubicon into nightmare territory. And if we do cross, there’s unlikely to be a way back — because once states have acquired access to this technology, they rarely give it up. So will we do it?
I guess Ceglowski’s point is that the genie is already out of the bottle; the challenge is using such powers for good.
I continue to believe that living in a surveillance society is incompatible in the long term with liberty. But a prerequisite of liberty is physical safety. If temporarily conscripting surveillance capitalism as a public health measure offers us a way out of this crisis, then we should take it, and make full use of it. At the same time, we should reflect on why such a powerful surveillance tool was instantly at hand in this crisis, and what its continuing existence means for our long-term future as a free people.
Many consumers and businesses are using Zoom teleconferences during the coronavirus pandemic, raising privacy concerns. Consumer Reports has guidance for both users and hosts.
In another post, Joseph Cox discusses the way in which Zoom shares information about iOS users with Facebook:
The Zoom app notifies Facebook when the user opens the app, details on the user’s device such as the model, the time zone and city they are connecting from, which phone carrier they are using, and a unique advertiser identifier created by the user’s device which companies can use to target a user with advertisements
Surveillance does not equal safety; it undermines student trust in their learning environments, isn’t effective at keeping them safe, and reinforces systemic injustice. Schools need to slam the brakes and think about what kind of dystopia they’re creating for their students.
Ever since I have been in education, there have been applications to support surveillance. I think what has changed is their breadth and reach. Where applications providing the ability to view screens were once restricted to the computer lab, the rise of portable devices and cloud computing has made such surveillance possible anywhere, at any time.
Although focusing on data, Ben Williamson provides his own take on the current climate within education:
Few people have ever been that bothered by data in education.
But it's at the centre of battles over facial recognition. Huge datasets on millions of young people. Big tech all over schools. Emotion AI, brain tech & bioinformatics all being tried out.
Education is the experimental lab.
— Ben Williamson (@BenPatrickWill) February 29, 2020
The challenge, as I see it, is to understand that consent is something we inadvertently give each time we tap into an application. Staying informed is therefore a constant process of becoming. In an ever-changing world, with the goalposts forever moving, we can never quite be fully informed.
One of the issues with this is the danger of being black and white in such conversations. I recently read a piece that discussed how the problem with communicating science research is one of narrative, rather than just explaining the facts. I think the same applies to discussions around surveillance capitalism.
Although people like Douglas Rushkoff have raised concerns about narrative and storytelling, I feel that until we have different people talking about the topic, it is not going to go anywhere.
As the growing scale of facial recognition shows, more data can always be extracted.
As that kind of surveillance grows, catalyzed by free-range viral videos recorded wherever an embarrassing incident unfolds, coupled with a contest to name the bad actors and where they work, the demand for pseudonymity will require more than non-revealing Twitter handles. As yesterday’s locks are supplemented by today’s networked home-security cams, companies will market tools for us to secure the manifold ways in which our identities could leak. Nico Sell (which may or may not be her real name) has led the way: She’s a digital-security researcher who has worked hard to never be publicly photographed without wearing sunglasses. Researchers at Carnegie Mellon have designed special glasses to confuse facial recognition without requiring shades, and the artist Adam Harvey has pioneered an open tool kit of new fashions for the same purpose. Next up will be shoe inserts to stymie gait detection, and the commandeering of Auto-Tune to prevent voice recognition.
In contrast, Transcriptworld is the all-visible world of online identity, where surveillance captures our every move and slowly manipulates us.
Transcriptworld may appear normal, but it’s really the Truman Show, a highly realistic but still completely tailored video game where nothing happens by chance. It’s a hall of mirrors whose horizons and features are digitally generated and honed for each person, in which even what constitutes “normal” is defined by the system: both in the type of world— violent or peaceful, pessimistic or hopeful—that’s presented, and in the ways that people will rapidly adjust to try to avoid the penalties of the system’s definition of negative behavior.
… when government doesn’t embrace the rule of law, Transcriptworld provides the soil—fertilized by commercial data processing—in which to grow the authoritarian nightmares we’ve come to call Orwellian.
New facial recognition technology and its potential uses are justifiably raising fears.
They describe the use of facial recognition to deter marathon runners from cheating, relieve teachers of the burden of taking the class roll, and save home-owners the worry of losing their door keys.
They also discuss some of the function creep, such as tracking the whereabouts of the workforce and of customers in stores. The authors close with a call that we must all pay more attention, especially as governments begin to debate regulation.
Amid growing calls in the US and elsewhere for outright bans on all forms of facial recognition, it’s time for Australia to begin to pay closer attention. At present, most uses of the technology remain speculative or in the early stages of development. There’s a brief window for us all to have an influence on what happens next. It will be important to regulate not just the use of public databases, but also the creation of large-scale private databases, and the uses to which these can be put.