Listened Does data science need a Hippocratic oath? from ABC Radio National

The use and misuse of our data can have enormous personal and societal consequences, so what ethical constraints are there on data scientists?

Continuing the conversation about forgetting and ethics, Antony Funnell speaks with Kate Eichhorn and Kate Mannell about digital forgetting.

Eichhorn, the author of The End of Forgetting, discusses the long and complicated histories that children now carry and the challenges these pose for identity. She explains that our ability to control what is forgotten has been diminished in the age of social media. Although new platforms may allow us to connect, they also create their own problems and consequences, such as the calcification of polarised politics. Eichhorn would like to say things are going to change, but she argues that there is little incentive for big tech to do so. Although young people are becoming more cynical and there may be resistance, she holds little hope for a return to an equitable, utopian web.

Kate Mannell explores the idea of instilling a sense of ethics through the form of a Hippocratic oath. Some of the problems with this are that there are many versions of the oath, it does not resolve the systemic problems, and it is hard to swear an oath of no harm when it is not even clear what harms are actually at play. In the end, it risks being a soft form of self-regulation.

I found Eichhorn’s comments about resistance interesting when thinking about my engagement with the IndieWeb and Domain of One’s Own. I guess sometimes all we have is hope. Mannell’s point about pledging no harm when it is not even clear what harm is at play reminds me of Zeynep Tufekci’s discussion of shadow profiles, the complications of inherited datasets, and the challenges of the next machine age. In regard to education, the issue centres on artificial intelligence and facial recognition.