Liked Pedagogy, Not Outcomes – How to Do Maker Models for Language Arts by Dave Cormier (davecormier.com)
But the journey of maker into language arts isn’t just a matter of finding time in the day. It makes sense because of narrative. So much of the creative is about coming up with a narrative for what you’re doing. Whether that’s just the name of the thing that has evolved out of your creative process or a whole story about it. The communication. The writing. The collaboration. The reflection. These are key skills that are needed for citizenship. Team that up with some coding and some maker skills and you’ve got a killer combination.
Listened Disrupting the Disruptors from Radio National
Has our contemporary embrace of disruption become a problem rather than a solution?
Antony Funnell speaks with a number of guests, including Mark Pesce – Honorary Associate, Ian Verrender and Professor Gregory Whitwell, about the idea of disruption today. However, the most interesting conversation is with Professor Andrew King. He has done considerable work testing Clayton Christensen’s theory and highlights some of its limitations. These include Christensen’s approach to ongoing research and modelling, in which he collects data, theorises, then tests against new data and adjusts his initial theory. You can read more on King’s work here.
Listened Ep. 75 Live From San Francisco at Gray Area Foundation for the Arts Pt.1: Annalee Newitz | Team Human by Douglas Rushkoff from Team Human
If you have slavery in any part of your culture, the entire culture is infected by it.
In this conversation between Annalee Newitz and Douglas Rushkoff, they talk about robots, ethics, autonomy, slavery, gender and cats.

📰 eLearn Update (February 2018)

Here is a collection of links and resources associated with GSuite and Hapara for February 2018.

Updates

Resources

Drive

Chrome

Docs

Slides

Forms

Sheets

Sites

Classroom

Drawings

Geo Tools

YouTube

Photos

  • Capture more of your favorite moments with Google Clips – Google announces the release of Clips, a new type of camera that captures the moments that happen in between posed pictures by using on-device machine learning to look for great facial expressions from the people—and pets—in your life. It turns these into short clips without you having to use video editing software.

General

Bookmarked The tools matter and the tools don’t matter (austinkleon.com)
You have to find the right tools to help your voice sing.
Austin Kleon reflects on the pride and place of the tool for the artist. Although he suggests that the tool does not necessarily matter, he also argues that we need to find the right tool to help our voice sing. So often we talk about transformation or redefinition, but how often do we consider that the tool that helps each student ‘sing’ may be different?
Bookmarked Opinion | The Tyranny of Convenience (nytimes.com)
All the personal tasks in our lives are being made easier. But at what cost?
Tim Wu plots a history of convenience, with the first revolution being of the household (oven, vacuum etc.) and the second being personal (Walkman, Facebook etc.). He argues that the irony of this individualisation is the creation of ‘templated selves’:

The paradoxical truth I’m driving at is that today’s technologies of individualization are technologies of mass individualization. Customization can be surprisingly homogenizing. Everyone, or nearly everyone, is on Facebook: It is the most convenient way to keep track of your friends and family, who in theory should represent what is unique about you and your life. Yet Facebook seems to make us all the same. Its format and conventions strip us of all but the most superficial expressions of individuality, such as which particular photo of a beach or mountain range we select as our background image.

I do not want to deny that making things easier can serve us in important ways, giving us many choices (of restaurants, taxi services, open-source encyclopedias) where we used to have only a few or none. But being a person is only partly about having and exercising choices. It is also about how we face up to situations that are thrust upon us, about overcoming worthy challenges and finishing difficult tasks — the struggles that help make us who we are. What happens to human experience when so many obstacles and impediments and requirements and preparations have been removed?

Wu argues that struggling and working things out is about identity:

We need to consciously embrace the inconvenient — not always, but more of the time. Nowadays individuality has come to reside in making at least some inconvenient choices. You need not churn your own butter or hunt your own meat, but if you want to be someone, you cannot allow convenience to be the value that transcends all others. Struggle is not always a problem. Sometimes struggle is a solution. It can be the solution to the question of who you are.

I recently reflected on the impact of convenience on learning. I guess that is a part of my ‘identity’.

via Audrey Watters

📓 Technology is a System

Responding to yet another school shooting, Audrey Watters pushes back on those who argue that guns are not ‘ed-tech’. Instead she argues that what we define as ‘technology’ is the problem. She provides a quote from Ursula Franklin’s 1989 CBC Massey Lectures that captures this thinking:

Technology is not the sum of the artefacts, of the wheels and gears, of the rails and electronic transmitters. Technology is a system. It entails far more than its individual material components. Technology involves organization, procedures, symbols, new words, equations, and, most of all, a mindset.

Watters explains that this includes many elements within schools and should not be merely reduced to ‘computers’. In a second post, she explains that:

“Hardening schools” is an education technology endeavor, whether or not we take seriously anyone’s suggestions about giving teachers guns. For now, “hardening schools” explicitly calls for hardware like those items listed by Governor Scott: metal detectors and bulletproof windows, as well as surveillance cameras and various sensors that can detect gunfire. It also implies software – social media monitoring and predictive analytics tools, for example, that claim they can identify students “at risk” of violence or political extremism.

Coming at this problem from a different perspective, Genevieve Bell responded to questions of data and ‘neutrality’ in the Q&A associated with her Boyer Lectures. Given the example of the supposed innocence of a train timetable, she explained how Amazon uses variables such as timetables to continually adjust the price of goods.

Bookmarked The Case Against Google by Charles Duhigg (nytimes.com)
Antitrust has never been just about costs and benefits or fairness. It’s never been about whether we love the monopolist. People loved Standard Oil a century ago, and Microsoft in the 1990s, just as they love Google today. Rather, antitrust has always been about progress. Antitrust prosecutions are part of how technology grows. Antitrust laws ultimately aren’t about justice, as if success were something to be condemned; instead, they are a tool that society uses to help start-ups build on a monopolist’s breakthroughs without, in the process, being crushed by the monopolist. And then, if those start-ups prosper and make discoveries of their own, they eventually become monopolies themselves, and the cycle starts anew. If Microsoft had crushed Google two decades ago, no one would have noticed. Today we would happily be using Bing, unaware that a better alternative once existed. Instead, we’re lucky a quixotic antitrust lawsuit helped to stop that from happening. We’re lucky that antitrust lawyers unintentionally guaranteed that Google would thrive.
Charles Duhigg takes a look at the history of antitrust law and the breaking up of monopolies. From oil to IBM, he explains why it is important for these large companies to be broken up: not for the sake of the consumer, but for the sake of development and innovation.

He uses the case of the vertical search site Foundem.com to demonstrate the way in which Google kills competition by removing rivals from its search results.

In 2006, Google instituted a shift in its search algorithm, known as the Big Daddy update, which penalized websites with large numbers of subpages but few inbound links. A few years later, another shift, known as Panda, penalized sites that copied text from other websites. When adjustments like these occurred, Google explained to users, they were aimed at combating “individuals or systems seeking to ‘game’ our systems in order to appear higher in search results — using low-quality ‘content farms,’ hidden text and other deceptive practices.”

Left unsaid was that Google itself generates millions of new subpages without inbound links each day, a fresh page each time someone performs a search. And each of those subpages is filled with text copied from other sites. By programming its search engine to ignore other sites doing the same thing that Google was doing, critics say, the company had made it nearly impossible for competing vertical-search engines, like Foundem, to show up high in Google’s results.
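The critics’ point in the passage above can be sketched as a toy scoring rule. Everything here is hypothetical – the field names, weights and thresholds are invented for illustration, and nothing reflects Google’s actual, unpublished algorithms – but it shows how an owner exemption lets one player escape the very penalties that bury its rivals:

```python
def search_score(page, owner_exempt=False):
    """Hypothetical ranking sketch of the penalties described above."""
    score = page["relevance"]
    if owner_exempt:
        # The engine's own result pages skip the checks entirely.
        return score
    # 'Big Daddy'-style penalty: many subpages, few inbound links.
    if page["subpages"] > 1000 and page["inbound_links"] < 10:
        score *= 0.1
    # 'Panda'-style penalty: text largely copied from other sites.
    if page["copied_ratio"] > 0.5:
        score *= 0.1
    return score

# A vertical-search site and the engine's own results page can look
# identical on these metrics, yet be scored very differently:
vertical = {"relevance": 1.0, "subpages": 50_000,
            "inbound_links": 3, "copied_ratio": 0.8}
print(search_score(vertical))                     # heavily penalised
print(search_score(vertical, owner_exempt=True))  # unpenalised
```

On these invented numbers the outsider’s score is cut a hundredfold while the identical-looking owner page keeps its full relevance, which is the asymmetry the Foundem complaint describes.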

Rather than living off their innovation, Adam and Shivaun Raff have spent the last twelve years campaigning against Google. Supported by Gary Reback, they took their case to the European Commission in Brussels.

Reback had told Adam and Shivaun that it was important for them to keep up their fight, no matter the setbacks, and as evidence he pointed to the Microsoft trial. Anyone who said that the 1990s prosecution of Microsoft didn’t accomplish anything — that it was companies like Google, rather than government lawyers, that humbled Microsoft — didn’t know what they were talking about, Reback said. In fact, he argued, the opposite was true: The antitrust attacks on Microsoft made all the difference. Condemning Microsoft as a monopoly is why Google exists today, he said.

If such change depends on individuals like the Raffs standing up, it makes you wonder how many simply throw it all in. Cory Doctorow captures this scenario in his novel Makers.

Listened Ep. 74 Damien Williams from shows.pippa.io
Technology philosopher Damien Williams on how the algorithms running society are embedded with the same biases as the people who program them.
Douglas Rushkoff and Damien Williams discuss the biases that we build into our technologies through their design and the problems this creates for machine learning. In some respects, this touches on the work of Cathy O’Neil.
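A minimal sketch of that point, using a deliberately trivial ‘model’ and invented data (nothing here comes from the episode itself): a system learned from biased records simply automates the bias.

```python
from collections import Counter

def train_majority_model(decisions):
    """Learn the crudest possible model: always predict the most
    common outcome in the training data. The point is that the model
    can only reflect whatever pattern, fair or not, the data encodes."""
    return Counter(decisions).most_common(1)[0][0]

# Invented historical records that encode a human bias:
past_hiring_decisions = ["reject", "reject", "reject", "hire"]
model_prediction = train_majority_model(past_hiring_decisions)
# The 'algorithm' now faithfully reproduces the bias it was shown.
```

Real machine-learning systems are vastly more sophisticated, but the underlying dynamic Williams describes is the same: the design choices and training data decide what gets amplified.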

Rushkoff also begins with a reflection on the use of social media by schools. He wonders why it is so easy for people to lose sight of the design and purpose behind these platforms. He argues that, other than for teaching media, social media (Twitter, Facebook, Instagram etc.) should never be used by schools. Use blogs or a space that you manage yourself to tell your story – something that I have touched upon in the past – but feeding the ad algorithms is the wrong approach.

Bookmarked The offloading ape: the human is the beast that automates – Antone Martinho-Truswell | Aeon Essays (Aeon)
It’s not tools, culture or communication that make humans unique but our knack for offloading dirty work onto machines
Antone Martinho-Truswell looks into the differences between humans and animals, suggesting that what sets us apart is cognitive and physical automation.

There are two ways to give tools independence from a human, I’d suggest. For anything we want to accomplish, we must produce both the physical forces necessary to effect the action, and also guide it with some level of mental control. Some actions (eg, needlepoint) require very fine-grained mental control, while others (eg, hauling a cart) require very little mental effort but enormous amounts of physical energy. Some of our goals are even entirely mental, such as remembering a birthday. It follows that there are two kinds of automation: those that are energetically independent, requiring human guidance but not much human muscle power (eg, driving a car), and those that are also independent of human mental input (eg, the self-driving car). Both are examples of offloading our labour, physical or mental, and both are far older than one might first suppose.

Although automation can be misconstrued as making us stupid, its goal is complexity:

The goal of automation and exportation is not shiftless inaction, but complexity. As a species, we have built cities and crafted stories, developed cultures and formulated laws, probed the recesses of science, and are attempting to explore the stars. This is not because our brain itself is uniquely superior – its evolutionary and functional similarity to other intelligent species is striking – but because our unique trait is to supplement our bodies and brains with layer upon layer of external assistance.

My question is whether some automation today is actually intended to be stupid, or too convenient, as a means of control. This touches on Douglas Rushkoff’s warning to ‘program or be programmed’. I therefore wonder what the balance is between automation and manually completing various tasks in order to create more complexity.