Bookmarked crumple zones by Alex Hern

In 2018, Rafaela Vasquez was working as a “safety driver” for Uber in Arizona. Employed to sit in a “self-driving car” and seize control if something went wrong, she was behind the wheel when the car, a modified Volvo, hit and killed a pedestrian. The details, as they always are, are messy. The car had been altered to disable Volvo’s own automatic braking function, so as to test Uber’s machine learning system. The pedestrian was crossing the road outside a designated spot. Arizona had passed wildly permissive laws allowing the testing of self-driving vehicles with minimal oversight, in an effort to tempt valuable engineering jobs away from companies like Google and Uber. And Vasquez, at the time of the collision, was watching TV.

Alex Hern discusses self-driving cars and the difference between Level 4 and Level 5 autonomy:

Waymo now says that experience was crucial in guiding how it approached self-driving cars. Rather than aiming for so-called “level 4” autonomy, where the car can mostly drive itself but a human needs to take over in emergencies, the company decided to jump straight to “level 5” – where a human driver is never needed. Their experience was that human drivers simply weren’t capable of serving as a back-up to a nearly-but-not-entirely infallible robot.

In the end, the reality is that although full automation is the goal, companies like Uber still rely on humans to step in when needed, and that is easier said than done.