Manoush Zomorodi explores the surprising environmental impact of the internet in this episode of IRL. While it’s easy to think of the internet as living only on your screen, it is in fact powered by massive server farms, running around the clock, all over the world. What exactly is the internet’s carbon footprint? And what can we do about it?
The philosopher Timothy Morton calls global warming a ‘hyperobject’: a thing that surrounds us, envelops and entangles us, but that is literally too big to see in its entirety (p. 77).
The argument, in the end, is that with the rise of surveillance capitalism we have moved over time from ‘we might use your data’ to ‘we will use your data’, rendering privacy policies seemingly null and void.
For more on privacy policies, Bill Fitzgerald argues that we need to move beyond compliance to focus on privacy:
The more we can ground these conversations [around privacy] in personal elements the better: what do you want to show? Why? How? Do you ever want to retract it?
Alternatively, Amy Collier provides the following list to consider:
- Audit student data repositories and policies associated with third-party providers.
- Have a standard and well-known policy about how to handle external inquiries for student data and information.
- Provide an audit of data to students who want to know what data is kept on them, how the data is kept, where it is kept, and who else has access.
- Have clear guidelines and regulations for how data is communicated and transmitted between offices.
- Take seriously the data policies of third-party vendors.
- Closely examine and rethink student-tracking protocols.
- Give students technological agency in interacting with the institution.
In regard to privacy policies associated with third-party vendors, Fitzgerald suggests searching for the following terms associated with consent: third party, affiliations, change, update and modify.
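As a rough illustration of this kind of keyword review (a minimal sketch, not a tool from Fitzgerald; the function name and sample text are my own), one could flag the sentences of a policy that contain any of the suggested consent-related terms for closer reading:

```python
import re

# Consent-related terms Fitzgerald suggests searching for.
KEYWORDS = ["third party", "affiliation", "change", "update", "modify"]

def flag_sentences(policy_text):
    """Return the sentences that contain any consent-related keyword."""
    # Naive sentence split on terminal punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", policy_text)
    return [s for s in sentences
            if any(k in s.lower() for k in KEYWORDS)]

# Hypothetical policy excerpt for demonstration only.
sample = ("We may share data with third party partners. "
          "We reserve the right to modify this policy at any time. "
          "Contact us with questions.")
print(flag_sentences(sample))
```

Running this flags the first two sentences, which is exactly the point of the exercise: the passages that reserve the right to share or change are the ones worth reading in full.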
For a different approach, Amy Wang reports on the terms of service associated with Instagram. She also includes extracts from a lawyer, Jenny Afia, who rewrote the document in plain English. This is similar to Terms of Service; Didn’t Read, a site designed not only to summarise terms of service, but also to highlight aspects to consider.
When you shop, your data may be the most valuable thing for sale. This isn’t just true online — your data follows you into brick-and-mortar stores now as well. Manoush Zomorodi explores the hidden costs of shopping, online and off. Meet Meta Brown, a data scientist who unveils the information Amazon captures about you when you make an online purchase; Joseph Turow, who discusses how retailers are stripping us of our privacy; and Alana Semuels, who talks about becoming a hoarder with the advent of online shopping. Plus, learn about a college coffee shop where you can actually buy a drink with your data. (Is it worth it?)
ClassDojo has been dealing with privacy concerns since its inception, and it has well-rehearsed responses. Its reply to The Times was: ‘No part of our mission requires the collection of sensitive information, so we don’t collect any. … We don’t ask for or receive any other information [such as] gender, no email, no phone number, no home address.’ But this possibly misses the point. The ‘sensitive information’ contained in ClassDojo is the behavioural record built up from teachers tapping reward points into the app.
It would seem that sometimes it is not the comments we make on social media or the food we purchase from Uber Eats, but the act of purchasing such items, that matters. To focus on the noun ignores the ‘information’ provided by the verb.
On this week’s episode of IRL, we sit down with Luke Dormehl, author of Thinking Machines and The Formula, to explore the impact of algorithms, on and offline. Staci Burns and James Bridle, author of “Something is wrong on the internet,” investigate the human cost of gaming YouTube recommendations. Anthropologist Nick Seaver talks about the danger of automating the status quo. And researcher Safiya Noble looks at how to prevent racial bias from seeping into code.