Bookmarked Privacy Postcards, or Poison Pill Privacy by Bill Fitzgerald (FunnyMonkey)

For those who want to use this structure to create your own Privacy Postcards, I have created a skeleton structure on Github. Please, feel free to clone this, copy it, modify it, and make it your own.

Bill Fitzgerald provides a framework for unpacking privacy when it comes to apps, especially in the Play Store.
Bookmarked The 12-month turnaround: How the dumpers drove oBike out of town (The Age)
The controversial and distinctive yellow bicycles operated by Singaporean company oBike will soon disappear as quickly as they appeared.
I remember when I first saw an oBike in action: a guy rolled up to a train station and dumped it near the on-ramp. In this article from The Age, Simone Fox Koob reflects on their rise and fall in Melbourne. The dockless bike share scheme is managed by a mobile app. After the concerns raised around Uber, I was sceptical of the data collected by the company. I feel the disruption may have gone too far, provoking a revolt in return.

The ET oBike

It will be interesting to see how competitors respond and what – if any – changes they make.

Liked Facebook Gave Device Makers Deep Access to Data on Users and Friends by Gabriel J.X. Dance (nytimes.com)
In the furor that followed, Facebook’s leaders said that the kind of access exploited by Cambridge in 2014 was cut off by the next year, when Facebook prohibited developers from collecting information from users’ friends. But the company officials did not disclose that Facebook had exempted the makers of cellphones, tablets and other hardware from such restrictions.
Liked Digital Identities: Six Key Selves of Networked Publics | the theoryblog (theory.cribchronicles.com)

1. The Performative, Public Self

2. The Quantified – or Articulated – Self

3. The Participatory Self

4. The Asynchronous Self

5. The PolySocial – or Augmented Reality – Self

6. The Neo-Liberal, Branded Self

Bookmarked Better visions of ourselves: Human futures, user data, & The Selfish Ledger (W. Ian O'Byrne)
I think there is a reasoned response to technopanic. Perhaps a sense of technoagency is necessary. Now more than ever, faster than ever, technology is driving change. The future is an unknown, and that scares us. However, we can overcome these fears and utilize these new technologies to better equip ourselves and steer us in a positive direction.
Ian O’Byrne reflects on the internal video produced by Google Project X focusing on speculative design and the notion of a ledger that does not actually belong to the user, but is instead managed by some grand AI.

Although this was designed as a case of ‘what if’, it is a reminder of what could happen. It therefore provides a useful provocation, especially in light of Cambridge Analytica and GDPR. O’Byrne suggests that this is an opportunity to take ownership of our ledger, something in part captured by the #IndieWeb.

I agree with the thinking about this ledger, but do not agree with how it is situated in the video. I would see an opportunity for the individual to determine what information comes in to the ledger, and how it is displayed. As an example, each of the arrows pointing in to the ledger could be streams of information from your website, Twitter feed, Strava running app, and any other metrics you’d like to add. Each of these would come in with modified read/write access and sharing settings from the originating app/program/service. As the individual, you’d be in control of dictating what you present, and how you present this information in your ledger.

Interestingly, Douglas Rushkoff made the case in a recent episode of Team Human for including less not more on the ledger:

Bookmarked It’s time to be honest with parents about NAPLAN: your child’s report is misleading, here’s how by Nicole Mockler (EduResearch Matters)

At the national level, however, the story is different. What NAPLAN is good for, and indeed what it was originally designed for, is to provide a national snapshot of student ability, and conducting comparisons between different groups (for example, students with a language background other than English and students from English-speaking backgrounds) on a national level.

This is important data to have. It tells us where support and resources are needed in particular. But we could collect the data we need by using a rigorous sampling method, where a smaller number of children are tested (a sample) rather than having every student in every school sit tests every few years. This is a move that would be a lot more cost effective, both financially and in terms of other costs to our education system.

Nicole Mockler summarises Margaret Wu’s work around the limitations of NAPLAN in regard to statistical testing. Moving forward, Mockler suggests that NAPLAN should become a sample-based test (like PISA), as it is better suited as a tool for system-wide analysis. To me, there is a strange balance where on the one hand many agree that NAPLAN is flawed, yet again and again we return to it as a source of ‘truth’.
Bookmarked What We Talk About When We Talk About Digital Capabilities: Keynote for #udigcap | Donna Lanclos–The Anthropologist in the Stacks (donnalanclos.com)

The history of Anthropology tells us that categorizing people is lesser than understanding them. Colonial practices were all about the describing and categorizing, and ultimately, controlling and exploiting. It was in service of empire, and anthropology facilitated that work.

It shouldn’t any more, and it doesn’t have to now.

You don’t need to compile a typology of students or staff. You need to engage with them.

In a keynote at the UCISA Digital Capabilities event at Warwick University, Donna Lanclos unpacks the effect of analytics and the problems of profiling when trying to identify improvements. A skills approach is an issue when decisions get made on your behalf based on the results of a pre-conceived checklist:

I want to draw a line from quiz-type testing that offers people an opportunity to profile themselves and the problems inherent in reducing knowledge work to a list of skills. And I also want to draw attention to the risks to which we expose our students and staff, if we use these “profiles” to predict, limit, or otherwise determine what might be possible for them in the future.

Lanclos suggests that we need to go beyond the inherent judgments contained within metaphors and deficit models, and instead start with context:

We need to start with people’s practices, and recognize their practice as effective for them in certain contexts.

And then ask them questions. Ask them what they want to do. Don’t give them categories, labels are barriers. Who they are isn’t what they can do.

Please, let’s not profile people.

When you are asking your students and staff questions, perhaps it should not be in a survey. When you are trying to figure out how to help people, why not assume that the resources you provide should be seen as available to all, not just the ones with “identifiable need?”

The reason deficit models persist is not a pedagogical one, it’s a political one.

She closes with the remark:

When we ask students questions, it shouldn’t be in a survey.

This reminds me of coaching and the fluidity of the conversation. It also touches on my concern with emotional intelligence as a conversational tool.

The interior of the triangle is where people map the practices that are bounded by their institution and the work they do in institutional digital platforms and places. The exterior of the triangle is where they can map everything else–what they do that is not bounded by the institution. This can be their personal lives, or their work that does not take place in official channels, but rather on the open web, in self-hosted or commercial platforms.

There is also a recording of this presentation:

Listened Golden State Killer: the end of DNA privacy? Chips with Everything podcast by Jordan Erica Webber from the Guardian

US investigators recently tracked down the suspect of a 40-year-old murder case after uploading DNA to a genealogy website. Jordan Erica Webber weighs up the pros of finding ancestors with the cons of selling privacy

Jordan Erica Webber talks to Prof Charles Tumosa of the University of Baltimore, Prof Denise Syndercombe-Court of King’s College and Lee Rainie of the Pew Research Center. This is a challenging conversation and comes back to notions of ‘informed consent’.

Maggie Koerth-Baker discusses changes in data arguing that we need to stop seeing privacy as a ‘personal’ thing:

Experts say these examples show that we need to think about online privacy less as a personal issue and more as a systemic one. Our digital commons is set up to encourage companies and governments to violate your privacy. If you live in a swamp and an alligator attacks you, do you blame yourself for being a slow swimmer? Or do you blame the swamp for forcing you to hang out with alligators?

Bookmarked I am a data factory (and so are you) by Nicholas Carr (ROUGH TYPE)
The shift of data ownership from the private to the public sector may well succeed in reducing the economic power of Silicon Valley, but what it would also do is reinforce and indeed institutionalize Silicon Valley’s computationalist ideology, with its foundational, Taylorist belief that, at a personal and collective level, humanity can and should be optimized through better programming.
Nicholas Carr reflects on the metaphors that we use and demonstrates some of their flaws, particularly when they are used against us inadvertently. This is something brought to the fore with Google’s effort to support wellbeing. As Arielle Pardes explains:

While Google says “digital wellness” is now part of the company’s ethos, not once during the Google I/O keynote did anyone mention “privacy.”