Bookmarked A Black Woman Invented Home Security. Why Did It Go So Wrong? by Chris Gilliard (WIRED)

A Black woman who feared for her safety creates a system. A white guy develops an iteration of this system later because he is annoyed that people are ringing his doorbell too often. This becomes a tool to manage Amazon’s loss prevention. Eventually, it leads to a boom not only in home security products like the Amazon suite and Google’s security cameras, along with a variety of others, but increasing measures to make the home, the neighborhood, and all public and private spaces a 24/7 watched fortress, complete with cameras, drones, security robots, and automated license plate readers. But amid this escalation, one urgent question arises: What are we defending ourselves against?

Chris Gilliard discusses the history of surveillance and its association with racism.
Replied to Shocked, but not surprised by wiobyrne

The mob that rampaged the halls of Congress included infamous white supremacists and conspiracy theorists.

Thousands invaded the highest centers of power, and the first thing they did was take selfies and videos. They were making content as spoils to take back to the digital empires where they dwell.

Members of the mob also used a site called Dlive to livestream while they rampaged.

A coup with no plot, no end to achieve, no plan but to pose.

Ian, I have been thinking about the Gram piece and wondering if sharing events is in fact fuelling things, even more so from abroad.

I was left thinking of something Chris Gilliard ironically tweeted:

Liked Facebook Cannot Separate Itself From the Hate It Spreads (Medium)

As we consider Facebook’s place in our lives and in our society, particularly during a revolutionary moment, when the abolition of technologies and institutions is now a serious discussion after being dismissed as impossible for so long, we should ask ourselves: How much white supremacy and hate are we willing to tolerate in exchange for whatever “good” one thinks Facebook does? It’s similar to asking “how much lead do you want in your water?”, “How much E. coli do you want in your food?”, or “How many heavy metals would you like in your farmland?”

For what it’s worth, my answer is “none.” A company whose business model necessitates that it consistently discharge poison into the environment should be dismantled.

How much toxic waste is Facebook willing to spill into the environment? Its answer seems to have been — and to remain — “as much as we can get away with.”

Liked Tech companies caring about Black Lives Matter is too little, too late (Fast Company)

What is happening is an example of what is sometimes called “performative wokeness.” These companies issuing a statement that they “stand with the Black community” is the absolute least they can do. It would be better to remain silent rather than reveal their rank hypocrisy. Many of these companies generate profit either by exploiting Black labor and/or by amplifying hate and extremism that directly harms Black folks. If Amazon truly felt that Black lives matter, its executives would change the way they treat their workforce, stop selling their facial recognition software Rekognition, and dismantle their Ring Doorbell and Neighbors programs. If Facebook truly stood with the Black community, it would eliminate the widespread organizing of white supremacy on its platform. But it’s unlikely that those changes will happen anytime soon.

Replied to Digital Justice, Surveillance & Invisible Walls

Ian and Kristen are joined by Dr. Chris Gilliard, a Professor of English at Macomb Community College in Michigan. His scholarship concentrates on privacy, institutional tech policy, digital redlining, and the re-inventions of discriminatory practices through data mining and algorithmic decision-making, especially as these apply to college students.

You can reach out to Dr. Gilliard at and on Twitter at @hypervisible.

Big questions:

In this episode, we discuss privacy, security, algorithms, & “surveillance capitalism.” How can parents help children to be more reflective about the activities in which they engage?

I really enjoyed Chris’ discussion of ‘consent’ and what this means in a world of increasing surveillance. It was a useful provocation to stop and consider the world around us and the implied consent we hand over each and every day.
Bookmarked Caught in the Spotlight | Urban Omnibus (Urban Omnibus)

Rather than ease or eliminate friction, these technologies often increase feelings of unease, anxiety, and fear on the part of both the watcher and the watched. Inasmuch as those tensions (whether acknowledged or not) come from a fear of the other, more cameras, devices, tracking, alerts, and notifications will not deliver on their promises. Rather, these technologies will continue to fuel a negative feedback loop between individuals and communities on both ends of the surveillance spectrum, where the only real winners are the companies who profit from the fear they help to manufacture.

Chris Gilliard explores how technologies that track create different spatial experiences for users. He compares ankle monitors with fitness trackers, and discusses the panoptic nature of Ring doorbells and automated license plate readers (ALPRs).

ALPRs tend to be hidden. However, like so many aspects of police surveillance, they are not a secret. In true panoptic fashion, the preponderance of ALPRs establishes the possibility that you are always being observed. And as with Ring, powerful and connected surveillance tech in the hands of “regular” citizens ramps up fear with constant notices of “invasions” by outsiders.

Gilliard reflects on the influence that such technology has on various situations.

In a technologically-created environment where “crime” becomes content, people will be moved to find crime.

Chris Gilliard reflects on the New York Times Privacy Project. This is something that the Luddbrarian has also critiqued. Pinboard also wonders about the irony of a series on privacy containing so many tracking cookies:

Bookmarked Privacy’s not an abstraction by Chris Gilliard (Fast Company)

An experiment in privacy–and the discussion that ensued–offer unexpected lessons in who gets watched, and how.

Chris Gilliard responds to a post by Kate Klonick in the New York Times exploring the teaching of privacy by modelling surveillance. This all just highlights how important it is to have discussions about privacy and how hard this can be.
Bookmarked Friction-Free Racism by Chris Gilliard (Real Life)

The end game of a surveillance society, from the perspective of those being watched, is to be subjected to whims of black-boxed code extended to the navigation of spaces, which are systematically stripped of important social and cultural clues. The personalized surveillance tech, meanwhile, will not make people less racist; it will make them more comfortable and protected in their racism.

Chris Gilliard unpacks the inherent racism encoded into the operations of the surveillance state. See, for example, Spotify’s recent announcement that it will add genealogy data to its algorithm. As part of this investigation, Gilliard provides a number of questions to consider when thinking about such data.