🎧 Does data science need a Hippocratic oath? (Future Tense)

Listened Does data science need a Hippocratic oath? from ABC Radio National

The use and misuse of our data can have enormous personal and societal consequences, so what ethical constraints are there on data scientists?

Continuing the conversation about forgetting and ethics, Antony Funnell speaks with Kate Eichhorn and Kate Mannell about digital forgetting.

Eichhorn, the author of The End of Forgetting, discusses the long and complicated history of childhood media and the challenges this poses for identity. She explains that our ability to control what is forgotten has diminished in the age of social media. Although new platforms allow us to connect, this creates its own problems and consequences, such as the calcification of polarised politics. Eichhorn would like to say things are going to change, but she argues that there is little incentive for big tech to do so. Young people are becoming more cynical, and while there may be resistance, there is little hope for a return to an equitable, utopian web.

Kate Mannell explores the idea of instilling a sense of ethics through the form of a Hippocratic oath. Among the problems with this are that there are many versions of the oath, that it does not resolve the systemic problems, and that it is hard to swear an oath of ‘no harm’ when it is not even clear what harms are actually at play. In the end, it risks being a soft form of self-regulation.

I found Eichhorn’s comments about resistance interesting when thinking about my engagement with the IndieWeb and Domain of One’s Own. I guess sometimes all we have is hope. Mannell’s point about doing no harm when it is not even clear what harm is at play reminds me of Zeynep Tufekci’s discussion of shadow profiles, the complications of inherited datasets and the challenges of the next machine age. In regards to education, the pressing issues are artificial intelligence and facial recognition.

4 responses on “🎧 Does data science need a Hippocratic oath? (Future Tense)”

  1. With the proposed changes to the right to abortion in the United States, Zeynep Tufekci explains how we need to take back our privacy. She provides a number of examples of data use associated with Grindr, Uber and phone companies, highlighting how easily supposedly anonymised data can be de-anonymised.

    In 2020, Consumer Reports exposed that GoodRX, a popular drug discount and coupons service, was selling information on what medications people were searching or buying to Facebook, Google and other data marketing firms. GoodRX said it would stop, but there is no law against them, or any pharmacy, doing this.
    That data becomes even more powerful when merged. A woman who regularly eats sushi and suddenly stops, or stops taking Pepto-Bismol, or starts taking vitamin B6 may be easily identified as someone following guidelines for pregnancy. If that woman doesn’t give birth she might find herself being questioned by the police, who may think she had an abortion. (Already, in some places, women who seek medical help after miscarriages have reported questioning to this effect.)
    https://www.nytimes.com/2022/05/19/opinion/privacy-technology-data.html

    When Tufekci says ‘we’, she is talking about more than personal action; she means collective change through law. She highlights how attempts to turn off location settings, use a burner phone or stay away from big tech are fraught, and explains why we need more systemic change.

    Congress, and states, should restrict or ban the collection of many types of data, especially those used solely for tracking, and limit how long data can be retained for necessary functions — like getting directions on a phone.
    Selling, trading and merging personal data should be restricted or outlawed. Law enforcement could obtain it subject to specific judicial oversight.
    https://www.nytimes.com/2022/05/19/opinion/privacy-technology-data.html

    Sadly, as she demonstrates with the example of Louis Brandeis responding in 1890 to a Kodak camera small enough to carry and loaded with 100 shots, calls to protect privacy are not new.
    It is interesting to think of this in regards to discussions around digital forgetting and the idea of a Hippocratic oath. I guess Tufekci’s point is that maybe some things should not be ‘remembered’ in the first place. We often worry about the threat of cyber attacks, when the greatest fear may be hiding in plain sight.
