Last month, a Tesla was involved in a fatal single-car collision near Houston in which both the owner of the vehicle and one passenger died. The incident has raised some controversy because of preliminary reports suggesting that there was no one in the driver’s seat at the time of the collision and that the car was traveling at high speed. The car’s battery was damaged in the crash, resulting in a fire that burned for several hours and destroyed both the car and its event data recorder (black box).
It now seems likely that the car was not in self-driving mode, that there was a driver behind the wheel, and that the accident was not a case of a self-driving Tesla accelerating wildly and running off the road. (Glenn Reynolds links a Car and Driver story on the incident.)
Elon Musk commented shortly after the accident that “standard Autopilot would require lane lines to turn on, which this street did not have.” He also noted that the owner of the car had not purchased the FSD (full self-driving) package.
The National Transportation Safety Board (NTSB) issued its preliminary report on the accident yesterday. The report notes that investigators were unable to activate the self-steering feature at the location where the accident occurred, supporting the conclusion that the feature was likely not engaged.
A few people here may remember the mystery of the Audi 5000 and its “unintended acceleration” problem. Back in the 1980s, some drivers of Audi 5000s experienced what they described as uncontrollable acceleration, often backward, often with catastrophic consequences. The drivers claimed that their cars accelerated wildly despite the driver’s foot being firmly pressed down on the brake pedal. The incidents achieved considerable notoriety, including a feature on CBS’s 60 Minutes program that appeared to back up the claims.
The resulting lawsuits and bad press led to a collapse in Audi sales, almost driving Audi from the US market entirely.
It seems remarkable, now, that people actually believed the stories of drivers frantically pressing the brakes as their cars sped wildly out of control. Post-accident investigations revealed that the brakes worked perfectly, and there is not a production car in America that will accelerate with the brakes firmly engaged. Audis were crashing because Audi drivers were pressing the accelerator, not the brake — pressing it just as hard as they could, likely in mute terror as their cars did exactly as they were being told to do.
Audi wasn’t blameless. Its cars could be shifted into reverse while the accelerator, rather than the brake, was depressed. (The incidents led to the widespread adoption of now-common shift interlocks, which require the brake to be pressed before shifting out of park.) Audis also had narrower pedal placement than was common on U.S. automobiles, which may have made inexperienced drivers more likely to press the wrong pedal, particularly when turned in their seats while backing up, a common scenario in the unintended-acceleration reports.
Whatever the problem, what is obvious now, and should have been obvious then, is that the cars weren’t failing in the way people claimed they were failing. The fundamental problem was one of operator error.
There was nothing equivalent to Tesla’s “black boxes” back then to record precisely what the drivers did, but engineers in the 1980s had two advantages that Tesla engineers today lack: they were dealing with relatively simple mechanical systems, and there was no question as to who, or what, was in the driver’s seat. Even so, they didn’t trust what common sense and simple engineering told them.
I become uneasy when I read comments claiming that the Houston Tesla “couldn’t have been in self-driving mode.” The confident predictions of computer programmers and electronic engineers that “the machine can’t do that” have led to tragedy in the past, and will again. I suspect that the accident in Houston was the result of a driver being reckless with a fast car, but it’s a little unnerving to read comments suggesting that, since the software isn’t supposed to let such-and-such happen, it follows that such-and-such couldn’t possibly have happened. Compared to hydraulics and gears and the simple muscle of a 1980s automobile, the Tesla is a technological enigma.
Never put that much faith in software and in the people who write it — not even when they work for someone as cool as Elon Musk.