A ‘No Driver’ Tesla?

 

Imagine my surprise when I saw a Tesla backing out of a parking space with no one inside—not even in the driver’s seat! But I should tell you the whole story first, and then I’d like to get your input.

The other day we were working out at our gym. An older fellow has been showing up fairly regularly on the days we’re there. He’s very incapacitated and works with a personal trainer; he’s tall and thin, and although his mind seems intact, his body is severely limited, as he moves very slowly. I didn’t pay much attention to him, since I keep busy monitoring my own workout.

When my husband and I left the facility, we headed to our car parked in the lot. We noticed a fellow standing near our car, staring at the car next to ours—which was slowly backing out of its parking space and turning toward our car. But no one was in the moving car!

We all called out, and suddenly the car stopped. Then we saw the man I described earlier, slowly making his way toward us. He explained that he can drive the car remotely to where he is standing, and it works fine as long as no one and nothing is in the way. He basically has the car pick him up.

Right.

Meanwhile, he moved his car forward with his remote control, got in his car, and spent a couple of minutes orienting himself. We looked at the logo on the car, and it was a Tesla.

Figures.

As we got in our own car and finally caught our breath, we started to ask ourselves questions: Is it legal for him to do that? Should he even be responsible for “moving” the car? What if it had hit our car? What if he had hit a person in the parking lot? Should we have talked to him more? Should we have reported him to someone?

If I see him again at the gym, I may initiate a polite conversation with him.

Is this crazy or what?

Published in Culture

This post was promoted to the Main Feed by a Ricochet Editor at the recommendation of Ricochet members.

There are 51 comments.

Become a member to join the conversation. Or sign in if you're already a member.
  1. CACrabtree Coolidge
    CACrabtree
    @CACrabtree

    I was watching TV last night, and a commercial came on with a safety engineer stating that the Tesla software was no good. I’m not sure who is putting these out.

    • #1
  2. Henry Racette Member
    Henry Racette
    @HenryRacette

    Susan,

    I don’t like self-driving cars. I think it’s wildly over-hyped technology, something I don’t want and hope never to have foisted on me (and I’m old enough to be pretty confident it won’t be). I think we are many years from it being truly able to operate vehicles in an unattended fashion.

    Having said that, I think there’s nothing to be alarmed about in the situation you describe. The fellow wasn’t actually “driving” the car in the usual sense of driving. What he was doing was summoning it — sending it a signal that told the car to come to him. The car was then handling all the details of checking its environment, backing out, etc.

    This is actually about the easiest thing a self-driving car can do. There is very little traffic, it is moving at a very low speed, it can come to a complete stop without blocking traffic, and the traffic laws are the simplified subset that define driving in a parking lot. This “automated valet parking” idea is, in my opinion, a sensible application (forgetting the cost for a moment) of this technology: it’s practical and safe, and a real boon for people with limited mobility.

    I haven’t looked at the statistics for a couple of years, but I suspect that self-driving Teslas are probably now safer than human-driven cars in terms of fatalities and serious injuries. Last I checked, they were prone to minor fender-benders and very low-speed collisions, particularly because they don’t merge well and so tend to slow down unexpectedly, but they’re usually good about paying attention at higher speeds and erring on the side of caution.

    They aren’t good enough to drive on busy roads with the driver’s hands away from the steering wheel, and I think that “almost good enough” state is a recipe for lulling people into a false sense of security and getting them into trouble. Automation is great for helping people pay attention, warning when they’re approaching too fast, keeping them in their lane, managing slippery roads, etc. But the almost-competent self-driving features just seem to invite daydreaming and inattention, and that seems counter-productive to me.

    But remotely asking your car to carefully back out of a parking spot and come over to you — or to parallel park for you, for that matter — is fine. I don’t want it, but I understand that some people do. I wouldn’t worry.

    H.


    PS Regarding CACrabtree’s comment #1: There’s a guy running for office who is on an anti-Tesla kick, saying he knows how to build essentially perfect self-driving software and condemning Tesla’s efforts. He has competing software products that he develops for other car companies, and he isn’t good about disclosing that. I don’t trust him. — H.

    • #2
  3. Stad Coolidge
    Stad
    @Stad

    I like the concept of self-driving vehicles.  Just test the beejeezus out of them before unleashing them on the public.

    Personally, I can’t see the technology ever advancing to the point where autonomous vehicles become safe and effective. There are so many variables involved with driving that even the most advanced biological computer (the human brain) has difficulty behind the wheel now and then.

    • #3
  4. Susan Quinn Contributor
    Susan Quinn
    @SusanQuinn

    Henry Racette (View Comment):
    But remotely asking your car to carefully back out of a parking spot and come over to you — or to parallel park for you, for that matter — is fine.

    Thanks, Hank. I’m feeling a little more comfortable about the situation. There is the question, however, of whether he should be driving the car once he’s inside. If he had to re-test for a driver’s license, I question whether he’d pass. But that’s a different issue.

    • #4
  5. Mad Gerald Coolidge
    Mad Gerald
    @Jose

    Henry Racette (View Comment):
    They aren’t good enough to drive on busy roads with the driver’s hands away from the steering wheel, and I think that “almost good enough” state is a recipe for lulling people into a false sense of security and getting them into trouble.

    There are cases of drivers taking a nap while the vehicle is on auto-pilot.  The current systems are just not safe enough for that.

    Until the self driving cars reach the point where they are safe I think a “dead man’s switch” is appropriate.  It requires the driver to consciously make an input on a recurring basis or the vehicle will stop.  These are currently mandatory, I believe, on all modern locomotives.

    Personally I would find this feature extremely annoying.  But if they are required to drive a locomotive, I don’t see a reason not to require them on self driving cars, until the cars are actually safe enough for the driver to take a nap.
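
    As a rough illustration of the recurring-input idea described above (a hypothetical sketch, not code from any real locomotive or vehicle system; the class name, 15-second timeout, and method names are all made up), the logic might look like:

```python
import time

class DeadMansSwitch:
    """Illustrative sketch: the driver must provide an input at least
    every `timeout_s` seconds, or the vehicle is commanded to stop."""

    def __init__(self, timeout_s=15.0, clock=time.monotonic):
        self.timeout_s = timeout_s
        self.clock = clock            # injectable clock, handy for testing
        self.last_input = clock()
        self.stopped = False

    def driver_input(self):
        # Called whenever the driver touches the wheel, a pedal, or a button.
        if not self.stopped:
            self.last_input = self.clock()

    def tick(self):
        # Called periodically by the control loop. Returns True while the
        # vehicle may keep moving; once the timeout lapses, it latches into
        # a stop and stays stopped until some separate reset procedure.
        if self.clock() - self.last_input > self.timeout_s:
            self.stopped = True
        return not self.stopped
```

    Note the latch: once tripped, later driver inputs do not restart the vehicle on their own, which matches how an alerter on a locomotive forces a deliberate recovery rather than a casual tap.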

    • #5
  6. Full Size Tabby Member
    Full Size Tabby
    @FullSizeTabby

    I have not seen this function operate in person, but I have seen videos of the self-parking (and unparking) function in operation, not just on Tesla cars. The legal liabilities when things go wrong are still being worked out.

    My 2019 Ford Escape will supposedly park itself, though I need to be in the driver’s seat while it does so. I have never tried the function. 

    I don’t think the function is crazy. Parking and unparking is one of the simpler “self-driving” functions to include in a car. 

    • #6
  7. Susan Quinn Contributor
    Susan Quinn
    @SusanQuinn

    Full Size Tabby (View Comment):
    I don’t think the function is crazy. Parking and unparking is one of the simpler “self-driving” functions to include in a car.

    I guess I was taking the entire situation into account, including the owner’s obvious limitations. And I guess I’m old-fashioned, too: I think there should always be someone in the driver’s seat. I just can’t accept that it is wise to let a car operate without anyone inside.

    Edit: you might have felt a little differently if your car had been parked next to his!

    • #7
  8. OldPhil Coolidge
    OldPhil
    @OldPhil

    I saw a TV ad (can’t remember the brand) that was bragging because the car parks itself. The occupants were pulling into a bar/restaurant/something, and the one spot left was too narrow for them to open their doors. So they got out and the car pulled itself into a spot with about 6 inches on each side.

    Brilliant!

    • #8
  9. Rodin Member
    Rodin
    @Rodin

    Eventually it’s going to be only AI-driven vehicles. That’s when it will be safe(st). But the transition is going to be messy.

    • #9
  10. Henry Racette Member
    Henry Racette
    @HenryRacette

    Rodin (View Comment):

    Eventually it’s going to be only AI-driven vehicles. That’s when it will be safe(st). But the transition is going to be messy.

    I live half an hour south of Canada, where the roads are snow-covered for three or four months out of the year, and snowbanks make seeing around corners nearly impossible. I will be very impressed when a self-driving car can navigate our icy winter streets and highways.

    I’ve been thinking of writing a short story about someone like me, an old guy who doesn’t want a self-driving car, so keeps moving further north, into snowier and more challenging environments, as self-driving cars are mandated at more southern latitudes.

    • #10
  11. Rodin Member
    Rodin
    @Rodin

    Henry Racette (View Comment):

    Rodin (View Comment):

    Eventually it’s going to be only AI-driven vehicles. That’s when it will be safe(st). But the transition is going to be messy.

    I live half an hour south of Canada, where the roads are snow-covered for three or four months out of the year, and snowbanks make seeing around corners nearly impossible. I will be very impressed when a self-driving car can navigate our icy winter streets and highways.

    I’ve been thinking of writing a short story about someone like me, an old guy who doesn’t want a self-driving car, so keeps moving further north, into snowier and more challenging environments, as self-driving cars are mandated at more southern latitudes.

    With climate change you might have to move south :).

    • #11
  12. Gossamer Cat Coolidge
    Gossamer Cat
    @GossamerCat

    Susan Quinn (View Comment):

    Henry Racette (View Comment):
    But remotely asking your car to carefully back out of a parking spot and come over to you — or to parallel park for you, for that matter — is fine.

    Thanks, Hank. I’m feeling a little more comfortable about the situation. There is the question, however, of whether he should be driving the car once he’s inside. If he had to re-test for a driver’s license, I question whether he’d pass. But that’s a different issue.

    You are correct, Susan. There is the issue of fetching his parked car, and the issue of whether he should be driving at all. You didn’t say exactly how old he is, but if he is that physically frail, I would say the answer is a firm “no.” Reaction times matter in driving, even if you let the car do most of the driving itself at this point in time.

    I have a beautiful cousin whose life was destroyed at age 22 by an elderly driver. She was severely brain-damaged. I think eventually self-driving cars will be a good answer for elderly drivers. I’m just not sure quite yet.

    • #12
  13. Susan Quinn Contributor
    Susan Quinn
    @SusanQuinn

    Henry Racette (View Comment):

    Rodin (View Comment):

    Eventually it’s going to be only AI-driven vehicles. That’s when it will be safe(st). But the transition is going to be messy.

    I live half an hour south of Canada, where the roads are snow-covered for three or four months out of the year, and snowbanks make seeing around corners nearly impossible. I will be very impressed when a self-driving car can navigate our icy winter streets and highways.

    I’ve been thinking of writing a short story about someone like me, an old guy who doesn’t want a self-driving car, so keeps moving further north, into snowier and more challenging environments, as self-driving cars are mandated at more southern latitudes.

    I like it! I’d buy a copy!

    • #13
  14. Susan Quinn Contributor
    Susan Quinn
    @SusanQuinn

    Is there ever going to be a time when even the most innocuous conveniences reflect our willingness to give up too much of our own power?

    • #14
  15. Barfly Member
    Barfly
    @Barfly

    Henry Racette (View Comment):
    I don’t like self-driving cars.

    And everything else Henry said. 

    Driving a car requires human intelligence. Think about it – it’s natural that the task of “driving” should evolve to comfortably suit its available resources. If the mass of people were measurably smarter or dumber, then “driving” would be different too.

    Machine learning as implemented today is nothing like human intelligence. I don’t mean “way less,” I mean “not the same thing at all.” And when we do get to where a machine can drive a car well enough to join us on the road, we’ll have bigger things to worry about.

    • #15
  16. Limestone Cowboy Coolidge
    Limestone Cowboy
    @LimestoneCowboy

    @susanquinn 

    My problem with self driving cars is that driving involves making moral judgements as well as technical  judgements. Here’s a trivial example.

    I’m driving under low-visibility conditions in a school zone. In my peripheral vision I see some motion off my right front side. A dog or a kid? I’m not sure. If it’s a dog, I’m not going to go through heroic and risky maneuvers (for me and any possible bystanders) to avoid it; tough luck for the dog. But if it’s a kid, it’s a completely different moral question. The trouble is that we can’t be sure, nor, in all likelihood, can an onboard AI system. Maybe it’s been programmed to minimize damage to a) driver and passengers, b) vehicle, and c) external objects, in that order.

    Me? If I believed it possible that it was a kid, I’d risk totalling my car.

    All this is to say that driving involves frequent assessment of risk and value. I’m pretty sure that AI is nowhere near where it needs to be to do that. AI driver assist? OK. But the driver must be awake, alert, and hands on the wheel.

    • #16
  17. Full Size Tabby Member
    Full Size Tabby
    @FullSizeTabby

    OldPhil (View Comment):

    I saw a TV ad (can’t remember the brand) that was bragging because the car parks itself. The occupants were pulling into a bar/restaurant/something, and the one spot left was too narrow for them to open their doors. So they got out and the car pulled itself into a spot with about 6 inches on each side.

    Brilliant!

    Of course, then the operators of the adjacent cars cannot open their doors, and therefore cannot get into their vehicles. So are you prepared to accept liability for effectively holding them prisoner? Or can they hire a truck to drag your vehicle out forcibly so that they can access their own vehicles?

    • #17
  18. Susan Quinn Contributor
    Susan Quinn
    @SusanQuinn

    Barfly (View Comment):
    Machine learning as implemented today is nothing like human intelligence. I don’t mean “way less,” I mean “not the same thing at all.” And when we do get to where a machine can drive a car well enough to join us on the road, we’ll have bigger things to worry about.

    An interesting take, Barfly. I think programming a car is only tangentially about human intelligence. Maybe that’s part of my discomfort.

    • #18
  19. Susan Quinn Contributor
    Susan Quinn
    @SusanQuinn

    Limestone Cowboy (View Comment):
    My problem with self driving cars is that driving involves making moral judgements as well as technical  judgements. Here’s a trivial example.

    Well said! A person could say, how often is that likely to happen? But do you want to be driving when it does?

    • #19
  20. CACrabtree Coolidge
    CACrabtree
    @CACrabtree

    Susan Quinn (View Comment):

    Limestone Cowboy (View Comment):
    My problem with self driving cars is that driving involves making moral judgements as well as technical judgements. Here’s a trivial example.

    Well said! A person could say, how often is that likely to happen? But do you want to be driving when it does?

    On the plus side, it’s tough to flip off one that’s driving badly…

    • #20
  21. Full Size Tabby Member
    Full Size Tabby
    @FullSizeTabby

    Limestone Cowboy (View Comment):

    @ susanquinn

    My problem with self driving cars is that driving involves making moral judgements as well as technical judgements. Here’s a trivial example.

    I’m driving under low-visibility conditions in a school zone. In my peripheral vision I see some motion off my right front side. A dog or a kid? I’m not sure. If it’s a dog, I’m not going to go through heroic and risky maneuvers (for me and any possible bystanders) to avoid it; tough luck for the dog. But if it’s a kid, it’s a completely different moral question. The trouble is that we can’t be sure, nor, in all likelihood, can an onboard AI system. Maybe it’s been programmed to minimize damage to a) driver and passengers, b) vehicle, and c) external objects, in that order.

    Me? If I believed it possible that it was a kid, I’d risk totalling my car.

    All this is to say that driving involves frequent assessment of risk and value. I’m pretty sure that AI is nowhere near where it needs to be to do that. AI driver assist.. OK. But driver must be awake, alert and hands on the wheel.

    I read a while ago some more about that problem of the nebulous concept of “judgment,” looking at the crashes that involved driverless cars being tested. At the time, the driverless car was technically not at fault in any of the crashes, but detailed examination found that many of the crashes occurred because the driverless car was following some specific rule that meant it did not behave the way the human drivers around it expected.

    For example, a common type of crash occurred when an obstacle appeared in the driving lane: the car abruptly stopped instead of squeezing slightly over the road’s center-line marking to go around the obstacle, leading a following car to crash into the suddenly stopped self-driving car. Technically the crash was not the fault of the self-driving car, but the following driver would not have been expecting a stop if there were room to squeeze past the obstacle by crossing the center-line marking. The self-driving car, however, had been programmed never to cross the center line. You and I know that sometimes (no oncoming traffic, good visibility) it is perfectly safe to cross the center line in order to avoid something on our side of the road.

    • #21
  22. Susan Quinn Contributor
    Susan Quinn
    @SusanQuinn

    CACrabtree (View Comment):
    On the plus side, it’s tough to flip off one that’s driving badly…

    Now that would be really annoying!

    • #22
  23. Susan Quinn Contributor
    Susan Quinn
    @SusanQuinn

    Full Size Tabby (View Comment):
    I read a while ago some more about that problem of the nebulous concept of “judgment,” looking at the crashes that involved driverless cars being tested.

    Now there’s another good point. Unless we are all on a “monorail” type system, we are leaving a lot to chance–or supposed judgment.

    • #23
  24. WillowSpring Member
    WillowSpring
    @WillowSpring

    Having been a programmer of real-time control systems for most of my career, it would scare me to death to be responsible for software that drives cars on the road. Even in purely mechanical/thermal systems, a great deal of the code is spent dealing with edge conditions, many of which take some testing to uncover.

    If you put a human in the loop, I defy you to determine all of the edge conditions.  Years ago, I read of a fire in a supposedly super safe (according to the models, anyway) nuclear power plant (Brown’s Ferry??).  The fire was caused by operators using a lit candle to find a leak in some duct-work.  I suspect that behavior wasn’t thought of by the system designers.

    It would be the same with an automatically driving car.

    I also think the possibility of the driver being lulled into not paying attention is pretty serious.  Last week, I had a loaner car which had some sort of “lane-keeping” software which would nudge the car back into the lane if it detected a change without the turn signals.  It drove me crazy and when I tried to work out what its rules were, it seemed pretty unreliable.

     

    • #24
  25. Susan Quinn Contributor
    Susan Quinn
    @SusanQuinn

    WillowSpring (View Comment):

    Having been a programmer of real-time control systems for most of my career, it would scare me to death to be responsible for software that drives cars on the road.

    Thanks so much for weighing in, especially with your background, Willow Spring. You’ve expanded on the conversation in a powerful way!

    • #25
  26. OldPhil Coolidge
    OldPhil
    @OldPhil

    Full Size Tabby (View Comment):

    OldPhil (View Comment):

    I saw a TV ad (can’t remember the brand) that was bragging because the car parks itself. The occupants were pulling into a bar/restaurant/something, and the one spot left was too narrow for them to open their doors. So they got out and the car pulled itself into a spot with about 6 inches on each side.

    Brilliant!

    Of course, then the operators of the adjacent cars cannot open their doors, and therefore cannot get into their vehicles. So are you prepared to accept liability for effectively holding them prisoner? Or can they hire a truck to drag your vehicle out forcibly so that they can access their own vehicles?

    Of course, my “brilliant” comment was in italics, my usual sarcasm font.

    • #26
  27. Blue Yeti Admin
    Blue Yeti
    @BlueYeti

    Tesla owner here. The summon feature (that’s what it’s called) is very limited, only works at 5 MPH or less, and you can only get it with a software upgrade that costs an additional $6,000 (the Full Self Driving upgrade costs $12,000, which is ridiculous, and no, I did not buy it).

    I have tried it, though (you can also pay $200 to try out the Full Self Driving package for a month), and it’s really more of a parlor trick than anything else. It’s pretty good at backing the car out of or into a packed home garage from the driveway (in a straight line), but in my limited experience, in a crowded parking lot it failed (by just stopping in place when it got confused) more often than it actually worked. Maybe it would be nice to have in the rain and snow, but here in Southern California that’s not a use case we have much need for.

    Every Tesla comes with Auto Pilot which will steer and maintain a speed and distance from the other cars. It works very well on the highway and I use it all the time. It’s especially handy in bumper to bumper freeway traffic, which we sadly have no shortage of around these parts. But it does nag you to keep your hands on the wheel and move it slightly every 15 to 30 seconds and there is a camera inside the car pointed at you at all times. If it sees you pick up your phone or look away, it will immediately alert you to put your hands back on the wheel.  I know that sounds creepy and it’s certainly not for everyone. I know some people who have put tape over the inside camera and don’t use Auto Pilot at all. I think it’s worth it though, and soon all new cars are going to have some version of this — including the inside camera.

    The Full Self Driving package will drive your car (freeway only), change lanes, and even take the exit you designate on the navigation system. The FSD package also has a beta program that you have to qualify for in order to get into (yes, the car monitors and scores your driving) and that system will navigate city streets from your starting point to your destination. There are tons of videos on YouTube of it in action and it’s gotten much better in the past year. But you can’t get in your car and take a nap while it drives you to the office. At least, not yet.

    • #27
  28. I Walton Member
    I Walton
    @IWalton

    I’m OK with turning them loose as long as they are not subsidized, have proved themselves in controlled environments, and we work out the liability issues. That means no subsidies of any kind, with manufacturers paying for all the R&D, of course.

    • #28
  29. J Ro Member
    J Ro
    @JRo

    When I fly, I often watch the airport workers in various vehicles transfer luggage, meals, trash, and fuel to/from commercial aircraft. These vehicles operate in highly restricted areas on rather short, simple, well-marked paths between very precise locations: e.g., from pump #14 on the fuel farm to the spot under the wing of a 777 precisely parked at Gate 51, or from the Delta cargo-handling area under the terminal to the cargo door of the 737 parked with its nose wheels right on a painted mark at Gate B22.

    There must be hundreds of thousands of these short, routine trips every single day. So far as I can tell, each one always has a human driver. I bet it costs lots of money, like having a third pilot (a “flight engineer”) in the cockpit just to manage fuel and electrical systems. They’re all gone now because airlines love to cut costs if they can do it safely. 

    When the AI for these simple driving tasks inside the fence lines of our airports is safe and reliable enough that drivers are no longer needed, I might think about having some “self driving” cars and trucks on our busy, crowded, and complicated public roads. 

    • #29
  30. Miffed White Male Member
    Miffed White Male
    @MiffedWhiteMale

    My 2022 Hyundai Sonata has a “remote park assist” feature. Essentially, with the remote in my hand, I can push a button and have the car slowly move backward or forward about 20-25 feet. The direction of travel has to be pretty much a straight line, although the wheels will turn slightly if an obstruction is noted. It’s not going to parallel park. All of the forward and rear collision sensors are engaged to prevent it from hitting anything; in fact, I can’t use it to park the car in my garage, because there’s a storage rack on the back wall and the car refuses to get close enough to it for the garage door to close.

    Its primary purpose is letting you pull the car into or out of a tight space, or avoid a puddle or a post/obstruction that you don’t want to deal with. It’s good for parking next to snow banks, too.

    The doors have to be locked to use it, so nobody can try jumping in or out while the car is in motion.

     

    • #30