A ‘No Driver’ Tesla?

 

Imagine my surprise when I saw a Tesla backing out of a parking space with no one inside, not even in the driver’s seat! But let me tell you the whole story, and then I’d like to get your input.

The other day we were working out at our gym. An older fellow has been showing up fairly regularly on the days we’re there. He’s tall and thin, very incapacitated, and works with a personal trainer; although his mind seems intact, his body is severely limited, and he moves very slowly. I didn’t pay much attention to him, since I keep busy monitoring my own workout.

When my husband and I left the facility, we headed to our car parked in the lot. We noticed a fellow standing near our car, staring at the car next to ours—which was slowly backing out of its parking space and turning toward our car. But no one was in the moving car!

We all called out, and suddenly the car stopped. And then we saw the man I described earlier, slowly making his way toward us. He explained that he is able to drive the car remotely to where he is standing, and it works fine as long as no one or nothing is in the way. He basically has the car pick him up.

Right.

Meanwhile, he moved his car forward with his remote control, got in his car, and spent a couple of minutes orienting himself. We looked at the logo on the car, and it was a Tesla.

Figures.

As we got in our own car and finally caught our breaths, we started to ask ourselves questions: is it legal for him to do that? Should he even be responsible for “moving” the car? What if the car had hit our car? What if he had hit a person in the parking lot? Should we have talked to him more? Should we have reported him to someone?

If I see him again at the gym, I may initiate a polite conversation with him.

Is this crazy or what?

Published in Culture


There are 51 comments.

  1. Bishop Wash Member
    Bishop Wash
    @BishopWash

    Henry Racette (View Comment):
    I’ve been thinking of writing a short story about someone like me, an old guy who doesn’t want a self-driving car, so keeps moving further north, into snowier and more challenging environments, as self-driving cars are mandated at more southern latitudes.

    Like it. My coworker said his brother retired out of the Air Force in a northern state. His plan was to strap a snow blower to his roof and drive south until people asked him what was on his roof. 

    • #31
  2. TBA Coolidge
    TBA
    @RobtGilsdorf

    Blue Yeti (View Comment):

Tesla owner here. The summon feature (that’s what it’s called) is very limited, only works at 5 mph or less, and you can only get it with a software upgrade that costs an additional $6,000 (the Full Self Driving upgrade costs $12,000, which is ridiculous, and no, I did not buy it).

I have tried it though (you can also pay $200 to try out the Full Self Driving package for a month) and it’s really more of a parlor trick than anything else. It’s pretty good at backing the car out of or into a packed home garage from the driveway (in a straight line), but in my limited experience with it in a crowded parking lot, it failed (by just stopping in place when it got confused) more often than it actually worked. Maybe it would be nice to have in the rain and snow, but here in Southern California, that’s not a use case we have much need for.

    Every Tesla comes with Auto Pilot which will steer and maintain a speed and distance from the other cars. It works very well on the highway and I use it all the time. It’s especially handy in bumper to bumper freeway traffic, which we sadly have no shortage of around these parts. But it does nag you to keep your hands on the wheel and move it slightly every 15 to 30 seconds and there is a camera inside the car pointed at you at all times. If it sees you pick up your phone or look away, it will immediately alert you to put your hands back on the wheel. I know that sounds creepy and it’s certainly not for everyone. I know some people who have put tape over the inside camera and don’t use Auto Pilot at all. I think it’s worth it though, and soon all new cars are going to have some version of this — including the inside camera.

    The Full Self Driving package will drive your car (freeway only), change lanes, and even take the exit you designate on the navigation system. The FSD package also has a beta program that you have to qualify for in order to get into (yes, the car monitors and scores your driving) and that system will navigate city streets from your starting point to your destination. There are tons of videos on YouTube of it in action and it’s gotten much better in the past year. But you can’t get in your car and take a nap while it drives you to the office. At least, not yet.

    ~designs a self-driving car that maps other self-driving cars’ parameters so it can drive like an a-hole~ 

    • #32
  3. Blue Yeti Admin
    Blue Yeti
    @BlueYeti

Here’s a guy in San Francisco on the FSD beta letting a Tesla Model Y drive itself to In-N-Out:

    • #33
  4. Blue Yeti Admin
    Blue Yeti
    @BlueYeti

    Miffed White Male (View Comment):

    My 2022 Hyundai Sonata has a “remote park assist” feature. Essentially, with the remote in my hand, I can push a button and have the car slowly move backward or forward about 20-25 feet. The direction of travel has to be pretty much a straight line, although the wheels will turn slightly if an obstruction is noted. It’s not going to parallel park. All of the forward and rear collision sensors are engaged to prevent it hitting anything – in fact I can’t use it to park the car in my garage because there’s a storage rack on the back wall and the car refuses to get close enough to it to be able to close the garage door.

Its primary purpose is for being able to pull the car into or out of a tight space, or to avoid a puddle or post/obstruction that you don’t want to have to deal with. It’s good for parking next to snowbanks too.

    The doors have to be locked to use it, so nobody can try jumping in or out while the car is in motion.

    Part of the Tesla FSD package includes auto park, but it’s not very good. Our other car is a 2019 Ford Edge SUV and it has auto park (both parallel and straight in) and it’s much better than Tesla’s version. 

    • #34
  5. Franco Member
    Franco
    @Franco

    I have made a bit of a study on Tesla in the last two years. In part because it’s been a victim ( along with Musk himself) of fake news.

I told my financial-advisor friend to buy the stock at $180, before the split and before it went up to $1,200 after the split. Why? Because the company was attacked in the business media, which is over-aligned with legacy carmakers, who, for one thing, spend a lot of money advertising on those networks. Secondarily, these people are stuck in old models and outdated frameworks.

There were quite a few influential short-sellers, including Bill Gates of all people, who had a vested interest in talking the company down to protect their earlier bets, and they were spectacularly wrong about the company’s long-term prospects.

As to conservatives’ overall naysaying, it comes mostly from the climate change hoax, which I understand in a way, but it’s not entirely valid. That’s a whole other comment or post, but EVs are superior technology, and because Tesla has succeeded so completely, other carmakers have to catch up or become obsolete. A Tesla has a lifespan of 500,000 miles, needs much less maintenance, and is cheaper to run by the mile even when gas is cheap.

As to self-driving, Tesla is far ahead of other carmakers on several levels, and will likely be licensing its FSD technology to Ford, GM, and VW within five years. FSD is still developing, and it’s easy to highlight various flaws here and there, but they are addressing each one continually.

It has already been shown that, in general, FSD is much safer than human drivers. In individual circumstances humans are better, but those kinks are being worked out iteratively.
Yes, it’s creepy to see a car ‘drive’ itself. But in five years it will be pretty common, and we will get used to it.

I remember when a Princeton alumnus, a friend of my stepfather, claimed in the ’80s that computers would go the way of the CB radio.

For myself, I could not understand how Google could possibly be worth as much as its stock price indicated, and I couldn’t fathom how a company that sold books online (Amazon) could justify such a high P/E ratio. Well, I didn’t do any research, for one thing.

    AI is here, and it will be quite capable of doing a lot of things we have trouble imagining.

Did we imagine the smartphone? That’s Dick Tracy stuff! Did we imagine rockets landing perfectly on a barge in the ocean? (Thanks to Elon.)

Musk is not an ideologue on climate change, BTW. He tweeted recently that we should drill for more oil when gas prices rose abruptly.

     

    • #35
  6. Stad Coolidge
    Stad
    @Stad

Let me provide an example of why human thought can be superior:

    Ships in the ocean operate by the “Rules of the Road.”  These rules are designed primarily to tell ships how to operate in order to avoid collisions.  However, there is something within the Rules called the General Prudential Rule.  In plain English, it says you can operate outside the Rules of the Road as necessary to avoid a collision.  This was added because circumstances may arise where blind adherence to the Rules would still result in a collision.

Now, let’s look at a hypothetical programmer tasked with designing an autonomous ship.  The first thing he’s going to do is write code that incorporates the rules.  However, when it comes to the General Prudential Rule, he runs into a snag.  How does he write a subroutine that allows the autonomous ship to exit the coded rules and operate on its own?  How would he ensure the AI picks the correct actions to take to avoid hitting another ship?  If the programmer doesn’t know what the situation is, how can he write the code?

    I’m not saying it can’t be done, just that I have my doubts . . .
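    [Editor’s note: the snag above can at least be sketched in code. The following is a purely hypothetical toy, not how any real maritime autopilot works; every function name, bearing convention, and threshold here is invented for illustration. The idea is that the coded rules propose a maneuver, while a separate "prudential" check is allowed to override them when following the rules would still leave a collision likely.]

    ```python
    def rules_of_the_road(own_heading, contact_bearing):
        """Simplified stand-in for the coded rules: turn 20 degrees to
        starboard when a contact is roughly dead ahead, else hold course."""
        if -10 <= contact_bearing <= 10:
            return own_heading + 20
        return own_heading

    def prudential_override(own_heading, contact_bearing, closing_rate, cpa):
        """The escape hatch: if the predicted closest point of approach (cpa,
        in meters) is dangerously small and the contact is still closing,
        depart from the rules and turn hard away from the contact."""
        if cpa < 50 and closing_rate > 0:
            return own_heading - 60 if contact_bearing >= 0 else own_heading + 60
        return None  # no override needed; the ordinary rules suffice

    def choose_heading(own_heading, contact_bearing, closing_rate, cpa):
        """Prefer the prudential override when it fires; otherwise follow
        the coded rules -- mirroring the priority the rule itself implies."""
        override = prudential_override(own_heading, contact_bearing,
                                       closing_rate, cpa)
        return override if override is not None else rules_of_the_road(
            own_heading, contact_bearing)
    ```

    Of course, this dodges the hard part Stad identifies: the toy override only handles a situation its author anticipated (a small predicted CPA), whereas the real General Prudential Rule covers situations nobody anticipated.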

    • #36
  7. GrannyDude Member
    GrannyDude
    @GrannyDude

    Susan Quinn (View Comment):

Is there ever going to be a time when the most innocuous conditions reflect our willingness to give up too much of our own power?

    Interestingly, this is the reason people who are really too old to drive…keep driving. 

    They don’t want to give up too much of their own power. 

    Having said that, in my state at least, you can contact the Secretary of State’s office and tell them you believe someone is driving who really ought not to be. That person can be contacted by the SoS to be re-tested.  Nobody has the right to a driver’s license. 

    Don’t know how that works in Florida, though! 

    • #37
  8. Full Size Tabby Member
    Full Size Tabby
    @FullSizeTabby

    GrannyDude (View Comment):

    Susan Quinn (View Comment):

Is there ever going to be a time when the most innocuous conditions reflect our willingness to give up too much of our own power?

    Interestingly, this is the reason people who are really too old to drive…keep driving.

    They don’t want to give up too much of their own power.

    Having said that, in my state at least, you can contact the Secretary of State’s office and tell them you believe someone is driving who really ought not to be. That person can be contacted by the SoS to be re-tested. Nobody has the right to a driver’s license.

    Don’t know how that works in Florida, though!

    I like the concept of self-driving cars if we can keep government away from controlling when and where they go because the self-driving cars would allow people to stay mobile (have power over where and when they go places) as they age or develop limitations on their capacities. 

    • #38
  9. Flicker Coolidge
    Flicker
    @Flicker

    Stad (View Comment):

Let me provide an example of why human thought can be superior:

    Ships in the ocean operate by the “Rules of the Road.” These rules are designed primarily to tell ships how to operate in order to avoid collisions. However, there is something within the Rules called the General Prudential Rule. In plain English, it says you can operate outside the Rules of the Road as necessary to avoid a collision. This was added because circumstances may arise where blind adherence to the Rules would still result in a collision.

Now, let’s look at a hypothetical programmer tasked with designing an autonomous ship. The first thing he’s going to do is write code that incorporates the rules. However, when it comes to the General Prudential Rule, he runs into a snag. How does he write a subroutine that allows the autonomous ship to exit the coded rules and operate on its own? How would he ensure the AI picks the correct actions to take to avoid hitting another ship? If the programmer doesn’t know what the situation is, how can he write the code?

    I’m not saying it can’t be done, just that I have my doubts . . .

    I’ve said this before, but I don’t think it’s being done, because driverless cars still can’t drive safely unless the human driver occasionally takes control.

    Take a hundred or a thousand very good drivers who can drive in any conditions safely, courteously and effectively, and equip their cars with eye movement monitoring equipment.  This can be done passively.  Then watch what the drivers look at and concentrate on every millisecond while driving.  People’s eyes flash back and forth not only to see, but to evaluate distances, and relative speeds, and changes in relative speeds of everything on the road.  The drivers’ reactions to every glance not only indicate what they do, but why they do it, and how they prioritize decisions.

    Then anybody can deduce what the human algorithm for safe and effective driving is based on how humans actually drive, as opposed to what the rules of the road are and what the car’s braking, steering, and accelerating capabilities are.  This includes courtesies like slowing and driving onto the yellow line when passing bicyclists, slowing down when approaching cars poised to pull out into traffic in front of you, and, yes, even driving onto the shoulder when approaching an obstruction.

    As an aside, I’ve always been intrigued by how when a moderately fast-moving line of traffic suddenly has to stop (often presaged by brake lights a half a mile up the road — but never mind that), each driver swerves and brakes to the opposite direction from the car braking ahead.  Essentially a 300-foot line of cars collapses down to a staggered double line of cars no longer than maybe 100 feet.  It’s amazing to watch.

    • #39
  10. GrannyDude Member
    GrannyDude
    @GrannyDude

    Full Size Tabby (View Comment):

    GrannyDude (View Comment):

    Susan Quinn (View Comment):

Is there ever going to be a time when the most innocuous conditions reflect our willingness to give up too much of our own power?

    Interestingly, this is the reason people who are really too old to drive…keep driving.

    They don’t want to give up too much of their own power.

    Having said that, in my state at least, you can contact the Secretary of State’s office and tell them you believe someone is driving who really ought not to be. That person can be contacted by the SoS to be re-tested. Nobody has the right to a driver’s license.

    Don’t know how that works in Florida, though!

    I like the concept of self-driving cars if we can keep government away from controlling when and where they go because the self-driving cars would allow people to stay mobile (have power over where and when they go places) as they age or develop limitations on their capacities.

    Me, too. I want one, because I desperately want to knit, and nap, and read in the car instead of having to drive the damned thing.

    • #40
  11. Blue Yeti Admin
    Blue Yeti
    @BlueYeti

    Flicker (View Comment):

    Stad (View Comment):

     

    I’ve said this before, but I don’t think it’s being done, because driverless cars still can’t drive safely unless the human driver occasionally takes control.

    This is the basis for all iterations of Tesla’s FSD system and there’s no chance that regulators are going to approve a fully autonomous driving system any time soon. 

    • #41
  12. Susan Quinn Contributor
    Susan Quinn
    @SusanQuinn

    GrannyDude (View Comment):

    Full Size Tabby (View Comment):

    GrannyDude (View Comment):

    Susan Quinn (View Comment):

Is there ever going to be a time when the most innocuous conditions reflect our willingness to give up too much of our own power?

    Interestingly, this is the reason people who are really too old to drive…keep driving.

    They don’t want to give up too much of their own power.

    Having said that, in my state at least, you can contact the Secretary of State’s office and tell them you believe someone is driving who really ought not to be. That person can be contacted by the SoS to be re-tested. Nobody has the right to a driver’s license.

    Don’t know how that works in Florida, though!

    I like the concept of self-driving cars if we can keep government away from controlling when and where they go because the self-driving cars would allow people to stay mobile (have power over where and when they go places) as they age or develop limitations on their capacities.

    Me, too. I want one, because I desperately want to knit, and nap, and read in the car instead of having to drive the damned thing.

    I get it!

    • #42
  13. Stad Coolidge
    Stad
    @Stad

    Flicker (View Comment):
    driverless cars still can’t drive safely unless the human driver occasionally takes control.

    And I envision situations where the human driver cannot take control fast enough.

    • #43
  14. Stad Coolidge
    Stad
    @Stad

    GrannyDude (View Comment):

    Full Size Tabby (View Comment):

    GrannyDude (View Comment):

    Susan Quinn (View Comment):

Is there ever going to be a time when the most innocuous conditions reflect our willingness to give up too much of our own power?

    Interestingly, this is the reason people who are really too old to drive…keep driving.

    They don’t want to give up too much of their own power.

    Having said that, in my state at least, you can contact the Secretary of State’s office and tell them you believe someone is driving who really ought not to be. That person can be contacted by the SoS to be re-tested. Nobody has the right to a driver’s license.

    Don’t know how that works in Florida, though!

    I like the concept of self-driving cars if we can keep government away from controlling when and where they go because the self-driving cars would allow people to stay mobile (have power over where and when they go places) as they age or develop limitations on their capacities.

    Me, too. I want one, because I desperately want to knit, and nap, and read in the car instead of having to drive the damned thing.

    That’s one of the problems.  If you get into a situation where you have to take manual control, you might be too busy doing something else to realize it.

One reason often cited in favor of driverless cars is helping people who drink too much.  Well, that won’t work either.  You still have to be sober enough to take control, so letting your autonomous car carry your drunken butt home won’t work . . .

    • #44
  15. Blue Yeti Admin
    Blue Yeti
    @BlueYeti

    Stad (View Comment):

    Flicker (View Comment):
    driverless cars still can’t drive safely unless the human driver occasionally takes control.

    And I envision situations where the human driver cannot take control fast enough.

    So we should just stop trying to develop this technology? What does that solve?

    You can make similar arguments about lots of things that are commonplace now: aircraft autopilots, electronic banking, automated rail systems, etc, etc. 

    • #45
  16. Henry Racette Member
    Henry Racette
    @HenryRacette

    Blue Yeti (View Comment):

    Stad (View Comment):

    Flicker (View Comment):
    driverless cars still can’t drive safely unless the human driver occasionally takes control.

    And I envision situations where the human driver cannot take control fast enough.

    So we should just stop trying to develop this technology? What does that solve?

    You can make similar arguments about lots of things that are commonplace now: aircraft autopilots, electronic banking, automated rail systems, etc, etc.

    I didn’t get the impression that Stad (who, let’s all admit, is crazy) is arguing that we shouldn’t continue developing self-driving technology. I think it’s a boutique boondoggle, though some useful safety features will come out of it, but I wouldn’t ask the free market to stop spending time on it. I love the free market, and what Tesla does with Tesla’s revenue is up to Tesla.

    I share what I think is Stad’s skepticism about it being as good as a careful driver. And I think there are some difficult-to-resolve liability issues that will come out when self-driving vehicles start making those tough decisions about which of two things to run into. (Good luck convincing a jury that the resulting tragedy was really “the most acceptable outcome,” and that the car “chose well.” We have sympathy for people making impossible choices; I’m not sure that sympathy will transfer to software.)

    Anyway. I don’t want people taken out of the loop. That’s how we got Skynet.

    • #46
  17. Blue Yeti Admin
    Blue Yeti
    @BlueYeti

    Henry Racette (View Comment):

    Blue Yeti (View Comment):

    Stad (View Comment):

    Flicker (View Comment):
    driverless cars still can’t drive safely unless the human driver occasionally takes control.

    And I envision situations where the human driver cannot take control fast enough.

    So we should just stop trying to develop this technology? What does that solve?

    You can make similar arguments about lots of things that are commonplace now: aircraft autopilots, electronic banking, automated rail systems, etc, etc.

    I didn’t get the impression that Stad (who, let’s all admit, is crazy) is arguing that we shouldn’t continue developing self-driving technology. I think it’s a boutique boondoggle, though some useful safety features will come out of it, but I wouldn’t ask the free market to stop spending time on it. I love the free market, and what Tesla does with Tesla’s revenue is up to Tesla.

    I share what I think is Stad’s skepticism about it being as good as a careful driver. And I think there are some difficult-to-resolve liability issues that will come out when self-driving vehicles start making those tough decisions about which of two things to run into. (Good luck convincing a jury that the resulting tragedy was really “the most acceptable outcome,” and that the car “chose well.” We have sympathy for people making impossible choices; I’m not sure that sympathy will transfer to software.)

    Anyway. I don’t want people taken out of the loop. That’s how we got Skynet.

    Which is why I said we won’t have fully autonomous self driving for a long, long time, if ever. 

    • #47
  18. Miffed White Male Member
    Miffed White Male
    @MiffedWhiteMale

    Stad (View Comment):

    Flicker (View Comment):
    driverless cars still can’t drive safely unless the human driver occasionally takes control.

    And I envision situations where the human driver cannot take control fast enough.

And if people are no longer driving routinely, their skills when things go bad won’t exactly be up to snuff.  Think about it: you rarely if ever drive anymore, but now the car wants you to take over when the weather is so bad that the sensors and self-driving system can’t handle it?

    • #48
  19. Miffed White Male Member
    Miffed White Male
    @MiffedWhiteMale

    Henry Racette (View Comment):

    Blue Yeti (View Comment):

    Stad (View Comment):

    Flicker (View Comment):
    driverless cars still can’t drive safely unless the human driver occasionally takes control.

    And I envision situations where the human driver cannot take control fast enough.

    So we should just stop trying to develop this technology? What does that solve?

    You can make similar arguments about lots of things that are commonplace now: aircraft autopilots, electronic banking, automated rail systems, etc, etc.

    I didn’t get the impression that Stad (who, let’s all admit, is crazy) is arguing that we shouldn’t continue developing self-driving technology. I think it’s a boutique boondoggle, though some useful safety features will come out of it, but I wouldn’t ask the free market to stop spending time on it. I love the free market, and what Tesla does with Tesla’s revenue is up to Tesla.

    I share what I think is Stad’s skepticism about it being as good as a careful driver. And I think there are some difficult-to-resolve liability issues that will come out when self-driving vehicles start making those tough decisions about which of two things to run into. (Good luck convincing a jury that the resulting tragedy was really “the most acceptable outcome,” and that the car “chose well.” We have sympathy for people making impossible choices; I’m not sure that sympathy will transfer to software.)

    Anyway. I don’t want people taken out of the loop. That’s how we got Skynet.

I think self-driving vehicles will be useful and usable, if not ubiquitous, on limited-access interstate highways, probably within the next decade or so.

I don’t think they’ll be useful and available on urban city streets, especially in areas with weather, in my lifetime.

    • #49
  20. Flicker Coolidge
    Flicker
    @Flicker

    Stad (View Comment):

    Flicker (View Comment):
    driverless cars still can’t drive safely unless the human driver occasionally takes control.

    And I envision situations where the human driver cannot take control fast enough.

    That has always been the case.  But at least a human driver has reasonable intelligence.  If they don’t improve driverless cars by a factor of ten, AI drivers will be no better.

    It’s possible, I suppose, that once cars are all driverless that they can avoid all collisions [and traffic jams], but that removes from me the joy of driving (the freedom, not the collisions).

    • #50
  21. Stad Coolidge
    Stad
    @Stad

    Blue Yeti (View Comment):

    Stad (View Comment):

    Flicker (View Comment):
    driverless cars still can’t drive safely unless the human driver occasionally takes control.

    And I envision situations where the human driver cannot take control fast enough.

    So we should just stop trying to develop this technology? What does that solve?

    You can make similar arguments about lots of things that are commonplace now: aircraft autopilots, electronic banking, automated rail systems, etc, etc.

I never said we should stop trying.  I said I have my doubts.  We’ve made a lot of advances when people said something couldn’t be done, so I’m glad there are engineers and programmers out there who aren’t listening to me . . .

    • #51