Liked Rethinking the Context of Edtech

If we know that we have reached the limits of what education technology can do (edtech 2.0), we now need to think about what education technology should do (edtech 3.0). I strongly believe we should be grounding edtech in the core of the disciplinary conversation, rather than leaving it at the periphery.

Listened Artificial intelligence, ethics and education from Radio National

AI holds enormous potential for transforming the way we teach, says education technology expert Simon Buckingham Shum, but first we need to define what kind of education system we want.

Also, the head of the UK’s new Centre for Data Ethics and Innovation warns democratic governments that they urgently need an ethics and governance framework for emerging technologies.

And Cognizant’s Bret Greenstein on when it would be unethical not to use AI.

Guests

Roger Taylor – Chair of the UK Government’s Centre for Data Ethics and Innovation

Simon Buckingham Shum – Professor of Learning Informatics, University of Technology Sydney, leader of the Connected Intelligence Centre; co-founder and former Vice-President of the Society for Learning Analytics Research

Bret Greenstein – Senior Vice President and Global head of AI and Analytics, Cognizant

In this episode of RN Future Tense, Antony Funnell leads an exploration of artificial intelligence, educational technology and ethics. Simon Buckingham Shum discusses the current landscape and points out that we need to define the education we want, while Roger Taylor raises the concern that if we do not find a position that fits our own state, we will instead be dictated to by either America’s market-based solutions or China’s focus on the state. This is a topic that has been discussed on a number of fronts, including by Erica Southgate. This also reminds me of Naomi Barnes’ 20 Thoughts on Automated Schooling.

Bookmarked Unraveling the Secret Origins of an AmazonBasics Battery (Medium)

The battery becomes less trackable the further it progresses down the chain. This is overwhelmingly due to U.S. shipping rules that allow companies to move product virtually in secret. And as Amazon expands into all modes of transport — cars, trucks, air and ocean freight — its logistics will likely become even more invisible.

Sarah Emerson reflects on her experience with the AmazonBasics battery. In the process she follows the thread back to a Fujitsu factory in Indonesia. Although there is no concrete data in regards to the environmental impact, she makes an effort to put the pieces of the puzzle back together. For example, she discusses a paper co-authored by Jay Turner and Leah Nugent, in which they argued that:

"It takes more than 100 times the energy to manufacture an alkaline battery than is available during its use phase." And when the entirety of a battery’s emissions are added up — including sourcing, production, and shipping — its greenhouse gas emissions are 30 times that of the average coal-fired power plant, per watt-hour.

Beyond the problem that companies are not required to log such information, Emerson also highlights that the impact is often focused only on the disposal of the item.

Bookmarked The Delicate Ethics of Using Facial Recognition in Schools (Wired)

A growing number of districts are deploying cameras and software to prevent attacks. But the systems are also used to monitor students—and adult critics.

Tom Simonite and Gregory Barber discuss the rise of facial recognition within US schools. This software is often derived from security contexts, such as Israeli checkpoints. It serves as a ‘free’ and ‘efficient’ means of maintaining student safety at the cost of normalising a culture of surveillance. What is worse is the argument that the use of facial recognition is a case of fighting fire with fire:

“You meet superior firepower with superior firepower,” Matranga says. Texas City schools can now mount a security operation appropriate for a head of state. During graduation in May, four SWAT team officers waited out of view at either end of the stadium, snipers perched on rooftops, and lockboxes holding AR-15s sat on each end of the 50-yard line, just in case. (source)

I am with Audrey Watters here: what is ‘delicate’ ethics?

Replied to The War On the Smartphone: Has Data Cherry-Picking Destroyed a Generation?

The truth is that most issues that are associated with “problem technology use” have their roots elsewhere. Bullying existed before smartphones, as did pornography, screen addiction, and social isolation. While it is true that smartphones can exacerbate or facilitate these things, they can also have significant positive benefits for learning, social connection, and communication. We can’t teach students to balance their screen time with personal interaction by taking the choice away from them. It is difficult to pursue lessons in the pernicious reality of data privacy and surveillance capitalism without a real and critical engagement with these issues.

I am not so concerned about ‘access’ to smartphones, Mike, as I am about the opportunity for ethical technology. Although we can preach digital minimalism or rooting devices, why can’t there be a solution that actually supports users’ rights and privacy by default?

Liked The Internet Can Make Us Feel Awful. It Doesn’t Have to Be That Way (Time)

Over our history, we’ve found ways to create tools and spaces that call out and amplify the best parts of human nature. That’s the great story of civilization—the development of technologies like written language that have moderated our animal impulses. What we need now is a new technological enlightenment—a turn from our behaviorally optimized dark age to an era of online spaces that embrace what makes us truly human. We need online spaces that treat us as the unique, moral beings we are—that treat us, and encourage us to treat one another, with care, respect and dignity.

Liked Ethical design is not superficial

We should embrace being uncomfortable. We live in a political and social hellscape. The majority of us have no job security, we can’t afford houses and we can’t afford to have families. Many of us can’t even afford healthcare. None of this is comfortable, so we may as well do something to change that for our futures, and for future generations.

Replied to AI and Human Freedom by Cameron Paterson

Historian Yuval Noah Harari writes, “The algorithms are watching you right now.  They are watching where you go, what you buy, who you meet.  Soon they will monitor all your steps, all your breaths, all your heartbeats.  They are relying on Big Data and machine learning to get to know you bette…

This is a useful provocation, Cameron. In part it reminds me of James Bridle’s contribution to the rethinking of Human Rights for the 21st century. I think we are entering, or are already in, a challenging time when consuming (or prosuming) comes before being informed, something I have elaborated on elsewhere. With AI, do we know the consequences anymore, and what does it mean to discuss this in the humanities, not just the tech class?

Also on: Read Write Collect

Bookmarked You can’t buy an ethical smartphone today (Engadget)

Right now, it’s impossible to buy a smartphone you can be certain was produced entirely ethically. Any label on the packaging wouldn’t stand a chance of explaining the litany of factors that go into its construction. The problem is bigger than one company, NGO or trade policy, and will require everyone’s effort to make things better.

Daniel Cooper explains the challenges associated with buying an ethical smartphone. He touches on the conditions of construction (often in Shenzhen) and the number of rare materials involved.

Devices vary, but your average smartphone may use more than 60 different metals. Many of them are rare earth metals, so-called because they’re available in smaller quantities than many other metals, if not genuinely rare.

There are also limitations on the ability to recycle or refurbish devices, with significant challenges associated with replacing parts. This is also something that Adam Greenfield discusses in his book Radical Technologies.

via Douglas Rushkoff

Bookmarked Tools come and go. Learning should not. And what’s a “free” edtech tool, anyway? by Lyn (lynhilt.com)

Do I need this tool? Why? How does it really support learning?
What are the costs, both monetary and otherwise, of using this service? Do the rewards of use outweigh the risks?
Is there a paid service I could explore that will meet my needs and better protect the privacy of my information and my students’ information?
How can I inform parents/community members about our use of this tool and what mechanisms are in place for parents to opt their children out of using it?
When this tool and/or its plan changes, how will we adjust? What will our plans be to make seamless transitions to other tools or strategies when the inevitable happens?

Lyn Hilt reflects on Padlet’s recent pivot to a paid subscription. She argues that if we stop and reflect on what we are doing in the classroom, there are often other options. Hilt also uses this as an opportunity to remind us what ‘free’ actually means, and it is not free as in beer. We therefore need to address some of the ethical questions around data and privacy, a point highlighted by the revelations of the ever-expanding Cambridge Analytica breach.

Listened Who needs ethics anyway? by Jordan Erica Webber from Chips with Everything podcast

There has been a quiet push lately by tech industry giants to get ethical about future technologies. But is it anything more than PR? And how do we teach technology students to preempt a possible ethical disaster? Jordan Erica Webber explores the issues.

This is a useful introduction to the debate about ethics and technology. One of the interesting points made was in regards to Google and the situation where Google Photos mislabelled people with dark skin as gorillas. This is a consequence of years of racism and a focus on whiteness within technology.

Watch Dr Simon Longstaff’s presentation for more on ethics.

On the 8th of December at The Overseas Passenger Terminal in Sydney, Australia, BVN hosted its bi-annual conference – Futures Forum 2. The theme was ‘Knowledge and Ethics in the Next Machine Age’.

23:21 Larry Prusak: Knowledge and its Practices in the 21st Century

Prusak discusses the changes in knowledge over time and the impact that this has. This reminds me of Weinberger’s book Too Big To Know. Some quotes that stood out were:

Knowledge won’t flow without trust

and

Schools measure things they can measure even if it is not valuable

Again and again Prusak talks about going wide, getting out and meeting new people.

1:21:59 Professor Genevieve Bell: Being Human in a Digital Age

Bell points out that computing has become about the creation, circulation, curation and resistance of data. All companies are data companies now. For example, Westfield used to be a real estate company, but it is now a data company.

The problem with algorithms is that they are based on the familiar and the retrospective; they do not account for wonder and serendipity.

As we design and develop standards for tomorrow, we need to think about the diversity of those boards and committees. If there are only white males at the table, how does this account for other perspectives?

We do want to be able to disconnect, even if Silicon Valley is built around being permanently connected. One of the things that we need to consider is what it means to have an analogue footprint.

Building on the discussion of data and trust, Bell makes the point:

The thing about trust is that you only get it once.

The question remains: who do we trust when our smart devices start selling our data?

In regards to the rise of the robots, our concern should be the artificial intelligence within them. One of the big problems is that robots follow rules and we don’t.

The future we need to be aspiring to is one where technology can support us in our art, wonder and curiosity.


A comment made during the presentation and shared after Bell had finished:

Is your current job the best place for you to make the world a better place?


2:49:51 Phillip Bernstein: The Future of Making Things: Design Practice in the Era of Connected Technology

Bernstein unpacks six technical disruptions – data, computational design, simulation analysis, the internet of things, industrial construction and machine learning – and looks at the implications for architecture.

3:51:44 Dr Simon Longstaff: Ethics in the Next Machine Age

Dr Longstaff explores the ethics associated with technology. This includes the consideration of ethical design, a future vision – Athens or Eden – and the purpose of making. Discussing the technology of WWII, Longstaff states:

Technical mastery devoid of ethics is the root of all evil

He notes that just because we can, it does not mean we ought.

A screenshot from Dr Longstaff: a collection of points to consider in regards to ethics in technology

He also used two ads from AOL to contrast the choices for tomorrow.


H/T Tom Barrett