Listened IRL Podcast: Checking Out Online Shopping from irlpodcast.org

When you shop, your data may be the most valuable thing for sale. This isn’t just true online — your data follows you into brick and mortar stores now as well. Manoush Zomorodi explores the hidden costs of shopping, online and off. Meet Meta Brown, a data scientist who unveils the information Amazon captures about you when you make an online purchase; Joseph Turow, who discusses how retailers are stripping us of our privacy; and Alana Semuels, who talks about becoming a hoarder with the advent of online shopping. Plus, learn about a college coffee shop where you can actually buy a drink with your data. (Is it worth it?)

This discussion of big data reminds me of a comment by Ben Williamson regarding the 'sensitive' data captured by ClassDojo:

ClassDojo has been dealing with privacy concerns since its inception, and it has well-rehearsed responses. Its reply to The Times was: ‘No part of our mission requires the collection of sensitive information, so we don’t collect any. … We don’t ask for or receive any other information [such as] gender, no email, no phone number, no home address.’ But this possibly misses the point. The ‘sensitive information’ contained in ClassDojo is the behavioural record built up from teachers tapping reward points into the app.

It would seem that sometimes it is not the comments we make on social media or the food we order from Uber Eats that matters, but the act of purchasing such items itself. To focus on the noun ignores the 'information' provided by the verb.

Listened IRL Podcast Episode 12: Algorisky from irlpodcast.org
On this week's episode of IRL, we sit down with Luke Dormehl, author of Thinking Machines and The Formula, to explore the impact of algorithms on and offline. Staci Burns and James Bridle, author of "Something is wrong on the internet," investigate the human cost of gaming YouTube recommendations. Anthropologist Nick Seaver talks about the danger of automating the status quo. And researcher Safiya Noble looks at how to prevent racial bias from seeping into code.