A Black woman who feared for her safety creates a system. Later, a white man develops an iteration of this system because he is annoyed that people are ringing his doorbell too often. It becomes a tool for Amazon's loss prevention. Eventually, it leads to a boom not only in home security products like the Amazon suite, Google's security cameras, and a variety of others, but also in escalating measures to make the home, the neighborhood, and all public and private spaces a 24/7 watched fortress, complete with cameras, drones, security robots, and automated license plate readers. Amid this escalation, one urgent question arises: What are we defending ourselves against?
I was left thinking of something Chris Gilliard ironically tweeted:
It’s okay not to tweet today. Really it is. https://t.co/qOSDQgEowb
— if you can remote proctor me, you’re too close (@hypervisible) January 7, 2021
As we consider Facebook’s place in our lives and in our society, particularly during a revolutionary moment, when the abolition of technologies and institutions is now a serious discussion after being dismissed as impossible for so long, we should ask ourselves: How much white supremacy and hate are we willing to tolerate in exchange for whatever “good” one thinks Facebook does? It’s similar to asking, “How much lead do you want in your water?” “How much E. coli do you want in your food?” or “How many heavy metals would you like in your farmland?”
For what it’s worth, my answer is “none.” A company whose business model necessitates that it consistently discharge poison into the environment should be dismantled.
How much toxic waste is Facebook willing to spill into the environment? Its answer seems to have been — and to remain — “as much as we can get away with.”
What is happening is an example of what is sometimes called “performative wokeness.” These companies issuing a statement that they “stand with the Black community” is the absolute least they can do. It would be better to remain silent than to reveal their rank hypocrisy. Many of these companies generate profit by exploiting Black labor, by amplifying hate and extremism that directly harm Black folks, or both. If Amazon truly felt that Black lives matter, its executives would change the way it treats its workforce, stop selling its facial recognition software, Rekognition, and dismantle its Ring doorbell and Neighbors programs. If Facebook truly stood with the Black community, it would eliminate the widespread organizing of white supremacy on its platform. But it’s unlikely that those changes will happen anytime soon.
Rather than ease or eliminate friction, these technologies often increase feelings of unease, anxiety, and fear on the part of both the watcher and the watched. Inasmuch as those tensions (whether acknowledged or not) come from a fear of the other, more cameras, devices, tracking, alerts, and notifications will not deliver on their promises. Rather, these technologies will continue to fuel a self-reinforcing feedback loop between individuals and communities on both ends of the surveillance spectrum, where the only real winners are the companies that profit from the fear they help to manufacture.
Automated license plate readers (ALPRs) tend to be hidden. However, like so many aspects of police surveillance, they are not a secret. In true panoptic fashion, the preponderance of ALPRs establishes the possibility that you are always being observed. And as with Ring, powerful and connected surveillance tech in the hands of “regular” citizens ramps up fear with constant notices of “invasions” by outsiders.
Gilliard reflects on how such technology reshapes the situations it enters.
In a technologically-created environment where “crime” becomes content, people will be moved to find crime.
This call for privacy regulation by the New York Times editorial board serves invasive tracking scripts from at least 15 outside domains. I have asked the NYT to disclose this fundamental conflict of interest on their Privacy Project page, to no avail. https://t.co/9fhcL1E4BU pic.twitter.com/iy5n8SgREM
— Pinboard (@Pinboard) June 9, 2019
An experiment in privacy, and the discussion that ensued, offers unexpected lessons in who gets watched, and how.
The end game of a surveillance society, from the perspective of those being watched, is to be subjected to the whims of black-boxed code governing the navigation of spaces that are systematically stripped of important social and cultural cues. Personalized surveillance tech, meanwhile, will not make people less racist; it will make them more comfortable and protected in their racism.