What That Houston Tesla Couldn’t Possibly Do

 

Last month, a Tesla was involved in a fatal single-car collision near Houston in which both the owner of the vehicle and one passenger died. The incident has raised some controversy because of preliminary reports suggesting that there was no one in the driver’s seat at the time of the collision and that the car was traveling at high speed. The car’s battery was damaged in the crash, resulting in a fire that burned for several hours and destroyed both the car and its event data recorder (black box).

It now seems likely that the car was not in self-driving mode, that there was a driver behind the wheel, and that the accident was not a case of a self-driving Tesla accelerating wildly and running off the road. (Glenn Reynolds links a Car and Driver story on the incident.)

Elon Musk commented shortly after the accident that “standard Autopilot would require lane lines to turn on, which this street did not have.” He also noted that the owner of the car had not purchased the FSD (Full Self-Driving) package.

The National Transportation Safety Board’s (NTSB) preliminary report on the accident, issued yesterday, notes that investigators were unable to activate the self-steering feature at the location where the accident occurred, confirming that the feature was likely not enabled.


A few people here may remember the mystery of the Audi 5000 and its “unintended acceleration” problem. Back in the 1980s, some drivers of Audi 5000s experienced what they described as uncontrollable acceleration, often backward, often with catastrophic consequences. The drivers claimed that their cars accelerated wildly despite their feet being pressed firmly down on the brake pedal. The incidents achieved considerable notoriety, including a feature on CBS’s 60 Minutes program that appeared to back up the claims.

The resulting lawsuits and bad press led to a collapse in Audi sales, almost driving Audi from the U.S. market entirely.

It seems remarkable, now, that people actually believed the stories of drivers frantically pressing the brakes as their cars sped wildly out of control. Post-accident investigations revealed that the brakes worked perfectly, and there is not a production car in America that will accelerate with the brakes firmly engaged. Audis were crashing because their drivers were pressing the accelerator, not the brake — pressing it just as hard as they could, likely in mute terror, as their cars did exactly what they were told to do.

Audi wasn’t blameless. It was possible to shift their cars into reverse with the accelerator, rather than the brake, depressed. (The incidents led to the widespread addition of now-common shift interlocks requiring the brake to be depressed before shifting out of park.) Audis also featured narrower pedal placement than is common on U.S. automobiles, which may have made inexperienced drivers more likely to press the wrong pedal (particularly when turned in their seats while backing up, a common scenario for the unintended acceleration events).

Whatever the problem, what is obvious now, and should have been obvious then, is that the cars weren’t failing in the way people claimed they were failing. The fundamental problem was one of operator error.

There was nothing equivalent to Tesla’s “black boxes” back then to record precisely what the drivers did, but engineers in the 1980s had two advantages that Tesla engineers today lack: they were dealing with relatively simple mechanical systems, and there was no question as to who, or what, was in the driver’s seat. Even so, many people didn’t trust what common sense and simple engineering told them.


I become uneasy when I read comments claiming that the Houston Tesla “couldn’t have been in self-driving mode.” The confident predictions of computer programmers and electronic engineers that “the machine can’t do that” have led to tragedy in the past, and will again. I suspect that the accident in Houston was the result of a driver being reckless with a fast car, but it’s a little unnerving to read comments suggesting that, since the software isn’t supposed to let such-and-such happen, it follows that such-and-such couldn’t possibly have happened. Compared to hydraulics and gears and the simple muscle of a 1980s automobile, the Tesla is a technological enigma.

Never put that much faith in software and in the people who write it — not even when they work for someone as cool as Elon Musk.

Published in Science & Technology

There are 8 comments.

  1. DonG (2+2=5. Say it!), Coolidge (@DonG)

    Human factors engineering is underappreciated. Radar in cars is cool.

  2. She, Member (@She)

    Henry Racette: Never put that much faith in software and in the people who write it

    Umm, yeah.  Still, I can’t help but think that, as with many other things, it’s often the case that more damage is done by the cover-up than was originally caused by the bugs.  Especially when large sums of money, prestige, or national reputations are involved.  Here’s one example:

    QOTD: Which office do I go to, to get my reputation back?

    I’m sure there are others….

  3. The Reticulator, Member (@TheReticulator)

    It wasn’t just Audis that were accused of having that problem, was it? It seems that some Fords were so accused as well. It stuck in my mind because at my workplace I was driving one of the accused car models for local errands. Pretty sure it was a Ford, but it was definitely American-made. If it knows what’s good for it, a Michigan public university buys only American-made cars. This would have been in the very early 1980s; maybe even as early as 1979.

  4. Henry Racette, Member (@HenryRacette)

    The Reticulator (View Comment):

    It wasn’t just Audis that were accused of having that problem, was it? It seems that some Fords were so accused as well. It stuck in my mind because at my workplace I was driving one of the accused car models for local errands. Pretty sure it was a Ford, but it was definitely American-made. If it knows what’s good for it, a Michigan public university buys only American-made cars. This would have been in the very early 1980s; maybe even as early as 1979.

    Toyota subsequently faced a bout of it as well. Operator error is confined to no single brand.

  5. Jerry Giordano (Arizona Patrio…), Member (@ArizonaPatriot)

    Henry Racette (View Comment):

    The Reticulator (View Comment):

    It wasn’t just Audis that were accused of having that problem, was it? It seems that some Fords were so accused as well. It stuck in my mind because at my workplace I was driving one of the accused car models for local errands. Pretty sure it was a Ford, but it was definitely American-made. If it knows what’s good for it, a Michigan public university buys only American-made cars. This would have been in the very early 1980s; maybe even as early as 1979.

    Toyota subsequently faced a bout of it as well. Operator error is confined to no single brand.

    Yeah, it was multiple vehicle lines.  I had at least one SUA (“sudden unintended acceleration”) case back in the day, when I was doing automotive products liability defense.

    The BTSI was the solution that Hank mentioned in the OP.  Pronounced “bitsy,” BTSI means “brake-transmission shift interlock.”  The introduction of the BTSI essentially ended the SUA cases, incidentally demonstrating that they were operator error.

  6. Randy Webster, Inactive (@RandyWebster)

    PJ O’Rourke had a chapter about the Audi 5000 in Parliament of Whores.  Actually, the chapter was about the NTSB, but the Audi problem was what it was dealing with when PJ was writing the book.

  7. John Racette, Inactive (@JohnRacette)

    Good post, Bro.

    Oh, and, speaking of Musk, I installed two more Starlinks this past week, and have another one coming up on Friday. 

    But there have been reports of rapid, uncontrollable download speeds, even when nobody is on the Information Superhighway, so…

    It’s probably a hardware problem.

  8. Doug Watt, Member (@DougWatt)

    Shifting the blame to anything but the driver is not uncommon. Drivers who have run a red light will claim a traffic-light malfunction to try to avoid a cite.

    There was one intersection that I monitored because there was a high number of what is called Ped Struck (pedestrians struck in a crosswalk). Before I wrote that cite, I watched the lights for about ten minutes to be sure they were all cycling correctly. If I had a dollar for every time a driver claimed he had a green light and the signal had malfunctioned, I’d be a rich man.
