Bookmarked Caught in the Spotlight (Urban Omnibus)

Rather than ease or eliminate friction, these technologies often increase feelings of unease, anxiety, and fear on the part of both the watcher and the watched. Inasmuch as those tensions (whether acknowledged or not) come from a fear of the other, more cameras, devices, tracking, alerts, and notifications will not deliver on their promises. Rather, these technologies will continue to fuel a negative feedback loop between individuals and communities on both ends of the surveillance spectrum, where the only real winners are the companies who profit from the fear they help to manufacture.

Chris Gilliard explores how technologies that track create different spatial experiences for users. He compares ankle monitors with fitness trackers, and discusses the panoptic nature of Ring doorbells and Automated License Plate Readers (ALPRs).

ALPRs tend to be hidden. However, like so many aspects of police surveillance, they are not a secret. In true panoptic fashion, the preponderance of ALPRs establishes the possibility that you are always being observed. And as with Ring, powerful and connected surveillance tech in the hands of “regular” citizens ramps up fear with constant notices of “invasions” by outsiders.

Gilliard reflects on the influence that such technology has on various situations.

In a technologically-created environment where “crime” becomes content, people will be moved to find crime.

Liked One Ring to rule them all: Surveillance ‘smart’ tech won’t make Canadian cities safer (The Conversation)

Ring represents an emerging governance system that, once established, we can neither vote for nor pull the curtains against. Framing Ring as a simple safety app fails to paint an accurate picture of the dangers of a makeshift corporate surveillance infrastructure.

People may assume there’s no risk to them, so long as they have nothing to hide. Regardless, surveillance of this kind still creates risks. At the societal level, the ocean of datafication created by pervasive smart technologies blurs the boundaries between financial, consumer and governmental systems. The datafication of our personal information ultimately reduces citizens to a collection of data points, open to misinterpretation, manipulation and monetization.

Bookmarked Opinion | We’re Banning Facial Recognition. We’re Missing the Point. (nytimes.com)

Today, facial recognition technologies are receiving the brunt of the tech backlash, but focusing on them misses the point. We need to have a serious conversation about all the technologies of identification, correlation and discrimination, and decide how much we as a society want to be spied on by governments and corporations — and what sorts of influence we want them to have over our lives.

Bruce Schneier argues that simply banning facial recognition is far too simplistic.

In all cases, modern mass surveillance has three broad components: identification, correlation and discrimination. Let’s take them in turn.

As Cory Doctorow summarises,

Schneier says that we need to regulate more than facial recognition, we need to regulate recognition itself — and the data-brokers whose data-sets are used to map recognition data to peoples’ identities.

Bookmarked Ten weird tricks for resisting surveillance capitalism in and through the classroom . . . next term! (HASTAC)

Check out these ten weird tricks for resisting surveillance capitalism in and through the classroom . . . next term! Listed with handy difficulty levels because we know Teach is busy! Add your own brilliant ideas and strategies by commenting here or on this tweet. And remember only we, the people, can truly bring the world closer together.

Erin Glass shares a number of strategies for responding to surveillance capitalism. They include engaging with community-driven tools, exploring terms of service, owning your data, and exploring the topic further. This touches on Audrey Watters’ discussion of a domain of one’s own, Glass’ presentation with Autumm Caines, and the reading list from the Librarianshipwreck.

Liked Opinion | Twelve Million Phones, One Dataset, Zero Privacy (nytimes.com)

What we learned from the spy in your pocket.

Stuart Thompson and Charlie Warzel dig into the location data scraped by apps and smartphones. To explain the systemic surveillance that we are all a part of, they unpack a single data source from a location data company.

The data reviewed by Times Opinion didn’t come from a telecom or giant tech company, nor did it come from a governmental surveillance operation. It originated from a location data company, one of dozens quietly collecting precise movements using software slipped onto mobile phone apps. You’ve probably never heard of most of the companies — and yet to anyone who has access to this data, your life is an open book. They can see the places you go every moment of the day, whom you meet with or spend the night with, where you pray, whether you visit a methadone clinic, a psychiatrist’s office or a massage parlor.

This information is often used in combination with other data points to create a shadow profile.

As revealing as our searches of Washington were, we were relying on just one slice of data, sourced from one company, focused on one city, covering less than one year. Location data companies collect orders of magnitude more information every day than the totality of what Times Opinion received.

Until governments step in to curb such practices, we need to be a little more paranoid, as Kara Swisher suggests, while John Naughton wonders how the West is any different from China:

It throws an interesting light on western concerns about China. The main difference between there and the US, it seems, is that in China it’s the state that does the surveillance, whereas in the US it’s the corporate sector that conducts it – with the tacit connivance of a state that declines to control it. So maybe those of us in glass houses ought not to throw so many stones.

Another example supporting Naughton’s point comes from the Washington Post, which reported on how some colleges have taken to using smartphones to track student movements.

Bookmarked Colleges are turning students’ phones into surveillance machines, tracking the locations of hundreds of thousands (Washington Post)

The systems highlight how widespread surveillance has increasingly become a fact of life: Students “should have all the rights, responsibilities and privileges that an adult has. So why do we treat them so differently?”

As someone who supports schools with attendance, I understand to a degree where this is all coming from. However, this does not mean it is right. Along with the take-up of video surveillance as perpetuated by companies such as Looplearn, the use of phones as a means of tracking is raising a lot of questions about the purpose and place of technology within learning.

The Chicago-based company has experimented with ways to make the surveillance fun, gamifying students’ schedules with colorful Bitmoji or digital multiday streaks. But the real value may be for school officials, who Carter said can split students into groups, such as “students of color” or “out-of-state students,” for further review. When asked why an official would want to segregate out data on students of color, Carter said many colleges already do so, looking for patterns in academic retention and performance, adding that it “can provide important data for retention. Even the first few months of recorded data on class attendance and performance can help predict how likely a group of students is to” stay enrolled.

What is most disconcerting is the hype around such data.

The company also claims to see much more than just attendance. By logging the time a student spends in different parts of the campus, Benz said, his team has found a way to identify signs of personal anguish: A student avoiding the cafeteria might suffer from food insecurity or an eating disorder; a student skipping class might be grievously depressed. The data isn’t conclusive, Benz said, but it can “shine a light on where people can investigate, so students don’t slip through the cracks.”

Here I am reminded of the work by Cathy O’Neil in regards to big data.

Liked Messaging app ToTok is reportedly a secret UAE surveillance tool (Mashable)

Rather than sticking to strictly messaging-app-like activities, ToTok reportedly intended to use that access to surveil its users. And by blocking other chat apps in the country, the U.A.E. practically ensured the app’s success.

“You don’t need to hack people to spy on them if you can get people to willingly download this app to their phone,” Wardle told the New York Times. “By uploading contacts, video chats, location, what more intelligence do you need?”

Bookmarked ‘Absolutely No Mercy’: Leaked Files Expose How China Organized Mass Detentions of Muslims (nytimes.com)

More than 400 pages of internal Chinese documents provide an unprecedented inside look at the crackdown on ethnic minorities in the Xinjiang region.

This exposé on the crackdown on Uighurs in Xinjiang paints a daunting picture of the future.

Replied to Leaked documents document China’s plan for mass arrests and concentration-camp internment of Uyghurs and other ethnic minorities in Xinjiang (Boing Boing)

In addition to setting out a number of logistical and planning guidelines — such as sanitation and public health measures — the documents detail a system of points-based “behavior modification” tools to punish and reward prisoners who modify their conduct to the specifications of the Chinese state. This points-based system runs in parallel to the “predictive policing” tools that the Chinese state uses to identify and target people for rendering to its camps.

The use of points-based behavior modification sounds like ClassDojo?

Replied to Jon Andrews (@Obi_Jon_) | Twitter (twitter.com)
Just because Looplearn might be able to do something, it does not mean we should. As someone who supports admins with attendance, I understand to a degree where they are coming from. However, this does not mean it is right. Sadly, this is not new; there was a report about Curtin University using facial recognition software in 2017.
Liked Opinion: Ban Facial Recognition Before It’s Too Late (BuzzFeed News)

There is no amount of regulation, transparency, or oversight that will fix the dangers inherent in widespread face surveillance. Only a full ban — a federal ban, covering the use of facial recognition by government agencies, in public places, and in public contracts with private entities — can prevent our nightmares from becoming reality.

Liked HEWN, No. 328 (hewn.substack.com)

Ed-tech is “grooming students for a lifetime of surveillance.” But let’s be clear: this grooming is happening at school, and it’s also happening at home.

Bookmarked The Delicate Ethics of Using Facial Recognition in Schools (Wired)

A growing number of districts are deploying cameras and software to prevent attacks. But the systems are also used to monitor students—and adult critics.

Tom Simonite and Gregory Barber discuss the rise of facial recognition within US schools. This software often derives from security settings such as Israeli checkpoints. It serves as a ‘free’ and ‘efficient’ means of maintaining student safety, at the cost of standardising a culture of surveillance. What is worse is the argument that the use of facial recognition is a case of fighting fire with fire:

“You meet superior firepower with superior firepower,” Matranga says. Texas City schools can now mount a security operation appropriate for a head of state. During graduation in May, four SWAT team officers waited out of view at either end of the stadium, snipers perched on rooftops, and lockboxes holding AR-15s sat on each end of the 50-yard line, just in case.

I am with Audrey Watters here: what is ‘delicate’ ethics?

Replied to How IBM’s Technology Powered the Holocaust (kottke.org)

It’s not difficult to see the relevance of this episode today. Should Microsoft-owned GitHub provide software to ICE for possible use in the agency’s state-sanctioned persecution of immigrants and asylum seekers? Should Twitter allow Donald Trump to incite terrorism on their service? Should Google provide AI to the Pentagon for the potential development of deadlier weapons? And Christ, where do you even start with Facebook? Palantir, Apple, and Amazon have also been criticized recently for allowing unethical usage of their technology and platforms. “It’s just business” and the belief in the neutrality of technology (and technology platforms) have combined to produce a shield that contemporary companies use to protect themselves from activists’ ethical criticisms. And increasingly, the customers and employees of these companies aren’t buying it because they don’t want history to repeat itself.

According to a book by human rights journalist Edwin Black, Hitler needed logistical help in carrying out the genocide of Europe’s Jews.

I wonder about the technology behind China’s social credit system and the links there. It would seem that what is different is that a lot of this technology is designed by the state for the state?

Liked The Domino’s ‘pizza checker’ is just the beginning – workplace surveillance is coming for you | Arwa Mahdawi (the Guardian)

Many of us are resigned to – and perhaps even fine with – the idea that our employer can scan our emails or keep track of how much time we waste on social media. But we are entering a new world of workplace surveillance in which we are watched 24/7 and every move is scrutinised. And things are only going to get more intrusive as corporations treat us less like human beings and more like machines. Last year, for example, Amazon patented an “ultrasonic bracelet” to be worn by workers to “monitor performance of assigned tasks”. Meanwhile, companies are implanting chips under workers’ skin and China is monitoring employees’ brain waves. It won’t be long until we have all been implanted with chips that keep track of our productivity and trigger a self-combustion protocol when we are no longer deemed useful to our AI overlords. But, hey, while the future may look bleak, at least there is consistently prepared pizza to look forward to.

Watched STEALING UR FEELINGS from stealingurfeelin.gs

‘Stealing Ur Feelings’ uses dark humor to expose how Snapchat, Instagram, and Facebook can use AI to profit off users’ faces and emotions.

Stealing Ur Feelings is an interactive documentary that explains the science of facial emotion recognition technology and demystifies how the software picks out features like your eyes and mouth to understand whether you’re happy, sad, angry, or disgusted. It is part of a campaign to get Snapchat to openly state how it plans to use users’ emotional data in the future.

Bookmarked

Arvind Narayanan discusses three papers investigating the ways in which smart TVs watch their users while the users are watching them.