A year ago, the city of Dunkirk in France made its bus system entirely free — causing a boom in ridership, as well as a drop in car usage.
Behold StoryAI: Type in a few sentences, and the model autocompletes your story, writing a few hundred more words that, by its calculations, could plausibly follow. It was created by the folks at B…
Is it possible the crabs devoured the human body and dragged the bones back to their burrows? Back in 1940, when the researcher originally found the site with the 13 bones, he noted that “coconut crabs had scattered many bones.” To test whether this crab theft was possible, an Earhart-hunting expedition that has been exploring the island performed a few experiments:
Clive Thompson discusses the hypothesis that Amelia Earhart crashed in the Pacific and her body was broken up by coconut crabs. This is an interesting resource regarding the interpretive nature of history.
On the upside, the rise of AI tools could spur entirely new genres. Fresh music technologies often do. The electric guitar gave us rock, the synth helped create new wave, electronic drum machines and samplers catalyzed the growth of hip-hop. Auto-Tune was a dirty little secret of the record industry, a way to clean up bad singing performances, until artists like Cher and T-Pain used it to craft entirely new, wild vocal styles. The next great trend in music could be sparked by an artist who takes the AI capabilities and runs with them. “Someone can make their own and really develop an identity of, I’m the person who knows how to use this,” says Magenta project engineer Adam Roberts. “A violin—this was technology that when you give it to Mozart, he goes, ‘Look what I can do with this piece of technology!’” exclaims Cohen, the Orchard co-founder. “If Mozart was a teenager in 2019, what would he do with AI?”
Clive Thompson looks at the marriage of music and machine learning to create tracks on demand. He discusses some of the possibilities, such as generating hours of ambient music on the fly or creating quick and easy soundtracks. It is interesting to think about this alongside software music and the innovation driven by broken machines.
Rather than adjusting the stylesheet, I have taken to using Granary to make a feed and follow tweets via a feed reader. I then respond using Brid.gy via my own site. I have not gotten to the point of regularly using a Micropub client, like Indigenous, but have tinkered with one. Although I miss out on some of the nuances, I like avoiding the cruft, such as who to follow and adverts.
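As a rough sketch of what the feed-reader side of this workflow involves, here is a minimal Atom parser using only Python's standard library. The sample document is made up for illustration; it simply stands in for the kind of feed a tool like Granary would serve, and the function name is my own:

```python
import xml.etree.ElementTree as ET

ATOM_NS = {"atom": "http://www.w3.org/2005/Atom"}

def entry_titles(atom_xml):
    """Return the title of each entry in an Atom feed document."""
    root = ET.fromstring(atom_xml)
    return [e.findtext("atom:title", namespaces=ATOM_NS)
            for e in root.findall("atom:entry", ATOM_NS)]

# A tiny hand-written Atom document standing in for a real feed.
sample = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Tweets</title>
  <entry><title>First tweet</title></entry>
  <entry><title>Second tweet</title></entry>
</feed>"""

print(entry_titles(sample))  # → ['First tweet', 'Second tweet']
```

A real feed reader obviously does far more (fetching, deduplication, read state), but the core of following tweets this way really is just parsing entries out of a feed.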
I remember when I first saw one in action in Melbourne: a guy rolled up to a train station and dumped it near the on-ramp. It just seemed wrong. I understand there are limitations to having organised parking spots, however I think that merely dumping the scooters or bikes wherever is irresponsible. My other concern is the data collection associated with such ventures.
Clive Thompson discusses the power of big data to support making clearer decisions around climate change. In New Dark Age, James Bridle argues that there is a certain irony associated with using technology to solve the problems of technology.
Thinking about climate change is degraded by climate change itself, just as communications networks are undermined by the softening ground, just as our ability to debate and act on entangled environmental and technological change is diminished by our inability to conceptualise complex systems. And yet at the heart of our current crisis is the hyperobject of the network: the internet and the modes of life and ways of thinking it weaves together (Page 79)
The other problem is when the data gets manipulated to support vested interests.
Clive Thompson explains how the values of code become the norm, and how some coders are successfully avoiding the Lust for Scale.
Clive Thompson and Douglas Rushkoff reminisce about the early days of coding, when it was more akin to origami. Thompson argues that the scale and efficiency demanded by the venture capital model break the social system. This produces a focus on measurement, engagement and A/B testing over humans and representation. It is interesting to consider this alongside Thompson’s previous book Smarter Than You Think. The challenge we have is to move away from the magic of coding and automation, and the focus on coding as a career.
I recently finished reading Team Human. Although I should not be surprised, I found it less about technology (what I have come to expect from Rushkoff) and more about the capacity of humans. In some respects it feels like a modern day book of meditations.
I have started reading Coders. I had read Thompson’s post on women and learning to code, but like yourself bought the book on reputation. Am loving it so far. He really has a knack for telling a story.
In light of Clive Thompson’s new book, he reflects on ten lessons associated with learning to code:
- The online world is your friend. Start there.
- Don’t stress over what language to pick.
- Code every day.
- Automate your life.
- Prepare for constant, grinding frustration.
- Build things. Build lots of things.
- “View Source”: Take other people’s code, pick it apart, and reuse it.
- Build things for you—code you need and want.
- Learn how to learn.
- Reach out to other coders.
Two points that stood out to me from Thompson’s list were coding every day and doing so with purpose. I have been doing quite a bit with Google Sheets lately, and I find myself needing to relearn things after stepping away for a few weeks. Repetition is important.
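Thompson’s “automate your life” and “build things for you” lessons can be as small as a throwaway script. As a purely illustrative sketch (the function name and folder layout are my own invention, not anything from the book), this is the kind of everyday automation he means — tidying a downloads folder by file type:

```python
import shutil
from pathlib import Path

def sort_by_extension(folder):
    """Move each file in `folder` into a subfolder named after its extension."""
    folder = Path(folder)
    moved = []
    for item in folder.iterdir():
        if item.is_file():
            # "notes.txt" goes into "txt/"; files with no extension go into "misc/"
            dest = folder / (item.suffix.lstrip(".") or "misc")
            dest.mkdir(exist_ok=True)
            shutil.move(str(item), str(dest / item.name))
            moved.append(item.name)
    return moved

# Usage (on a real folder): sort_by_extension("/home/me/Downloads")
```

Nothing here is clever, which is rather the point: code you need and want, written for an audience of one, practised daily.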
I was also reminded of Richard Olsen’s post on why coding is the vanguard for modern learning.
Oops, forgot to pre-order, but just purchased it if that helps 🤷♂️
Enjoyed the extracts so far, so excited to read the full book.
The current political moment is incredibly interesting. Anyone who wants to deal with climate change may have only a brief window to sell the public on a plan. In his new book The Uninhabitable Earth: Life After Warming, the writer David Wallace-Wells talks about the value of panic to pushing collective action; Doctorow says it’s the point “where you divert your energy from convincing people there’s a problem to convincing them there’s a solution.”
Computer programming once had much better gender balance than it does today. What went wrong?
In an article adapted from “Coders: The Making of a New Tribe and the Remaking of the World,” Clive Thompson breaks down the gender divide when it comes to coding. He profiles the early journeys of Mary Allen Wilkes and Elsie Shutt. This is contrasted with the current context of a male-dominated space. He explores a number of reports, practices and policies that led to the transition from these early years to today, as well as attempts to push back on this.
Once the first generation of personal computers, like the Commodore 64 or the TRS-80, found their way into homes, teenagers were able to play around with them, slowly learning the major concepts of programming in their spare time. By the mid-’80s, some college freshmen were showing up for their first class already proficient as programmers. They were remarkably well prepared for and perhaps even a little jaded about what Computer Science 101 might bring. As it turned out, these students were mostly men, as two academics discovered when they looked into the reasons women’s enrollment was so low.
What Margolis heard from students — and from faculty members, too — was that there was a sense in the classroom that if you hadn’t already been coding obsessively for years, you didn’t belong. The “real programmer” was the one who “had a computer-screen tan from being in front of the monitor all the time,” as Margolis puts it. “The idea was, you just have to love being with a computer all the time, and if you don’t do it 24/7, you’re not a ‘real’ programmer.” The truth is, many of the men themselves didn’t fit this monomaniacal stereotype. But there was a double standard: While it was O.K. for the men to want to engage in various other pursuits, women who expressed the same wish felt judged for not being “hard core” enough. By the second year, many of these women, besieged by doubts, began dropping out of the program. (The same was true for the few black and Latino students who also arrived on campus without teenage programming experience.)
By the ’80s, the early pioneering work done by female programmers had mostly been forgotten. In contrast, Hollywood was putting out precisely the opposite image: Computers were a male domain. In hit movies like “Revenge of the Nerds,” “Weird Science,” “Tron,” “WarGames” and others, the computer nerds were nearly always young white men.
If biology were the reason so few women are in coding, it would be impossible to explain why women were so prominent in the early years of American programming, when the work could be, if anything, far harder than today’s programming. It was an uncharted new field, in which you had to do math in binary and hexadecimal formats, and there were no helpful internet forums, no Google to query, for assistance with your bug. It was just your brain in a jar, solving hellish problems.
Changing the culture at schools is one thing. Most female veterans of code I’ve spoken to say that what is harder is shifting the culture of the industry at large, particularly the reflexive sexism and racism still deeply ingrained in Silicon Valley. Some, like Sue Gardner, sometimes wonder if it’s even ethical for her to encourage young women to go into tech. She fears they’ll pour out of computer-science programs in increasing numbers, arrive at their first coding job excited and thrive early on, but then gradually get beaten down by industry. “The truth is, we can attract more and different people into the field, but they’re just going to hit that wall in midcareer, unless we change how things happen higher up,” she says.
This reminds me of Clive Thompson’s discussion of memory in his book Smarter Than You Think. From memory, his argument is that our memory is never as good as we think:
Our brains are remarkably bad at remembering details. They’re great at getting the gist of something, but they consistently muff the specifics. Whenever we read a book or watch a TV show or wander down the street, we extract the meaning of what we see—the parts of it that make sense to us and fit into our overall picture of the world—but we lose everything else, in particular discarding the details that don’t fit our predetermined biases. This sounds like a recipe for disaster, but scientists point out that there’s an upside to this faulty recall. If we remembered every single detail of everything, we wouldn’t be able to make sense of anything. Forgetting is a gift and a curse: by chipping away at what we experience in everyday life, we leave behind a sculpture that’s meaningful to us, even if sometimes it happens to be wrong. (Page 28)
In an age where we write more than ever, emoji is the new language of the heart.