Your Self-driving Car Is Programmed to Kill You

 

From MIT Technology Review comes this vision of the future:

How should [a self-driving] car be programmed to act in the event of an unavoidable accident? Should it minimize the loss of life, even if it means sacrificing the occupants, or should it protect the occupants at all costs? Should it choose between these extremes at random?

Not sure “random” is what I’d choose, but let’s keep going:

Here is the nature of the dilemma. Imagine that in the not-too-distant future, you own a self-driving car. One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time but it can avoid killing 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do?
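
For the programmers in the audience, here is a minimal, purely illustrative sketch (in Python) of the two extremes the article describes: minimize total loss of life, or protect the occupants at all costs. Every name and number in it is hypothetical; nobody’s actual car runs anything this simple.

    # Toy sketch only: hypothetical names and numbers, not any manufacturer's real code.
    from dataclasses import dataclass

    @dataclass
    class Maneuver:
        name: str
        expected_occupant_deaths: float
        expected_pedestrian_deaths: float

    def choose_utilitarian(options):
        # Pick the maneuver with the fewest expected deaths overall.
        return min(options, key=lambda m: m.expected_occupant_deaths + m.expected_pedestrian_deaths)

    def choose_self_protective(options):
        # Pick the maneuver that is safest for the occupants, ignoring everyone else.
        return min(options, key=lambda m: m.expected_occupant_deaths)

    # The scenario from the article: stay the course into the crowd, or swerve into the wall.
    options = [
        Maneuver("continue into crowd", expected_occupant_deaths=0.0, expected_pedestrian_deaths=10.0),
        Maneuver("swerve into wall", expected_occupant_deaths=1.0, expected_pedestrian_deaths=0.0),
    ]

    print(choose_utilitarian(options).name)      # "swerve into wall"
    print(choose_self_protective(options).name)  # "continue into crowd"

Same car, same crash, opposite answers. The entire debate is over which one-line objective the manufacturer ships.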

The problem, for people who will build and sell self-driving cars, is that if we know they’re programmed to kill us, we may be less inclined to buy them. We’re funny that way.

But researchers decided to explore public opinion on this very topic. They held focus groups and did some polling and came up with this conclusion. Brace yourselves:

People are in favor of cars that sacrifice the occupant to save other lives—as long they don’t have to drive one themselves.

That about sums up the entirety of the human experience.

Like anything, though, the more questions you ask the more complex it gets:

Is it acceptable for an autonomous vehicle to avoid a motorcycle by swerving into a wall, considering that the probability of survival is greater for the passenger of the car, than for the rider of the motorcycle? Should different decisions be made when children are on board, since they both have a longer time ahead of them than adults, and had less agency in being in the car in the first place? If a manufacturer offers different versions of its moral algorithm, and a buyer knowingly chose one of them, is the buyer to blame for the harmful consequences of the algorithm’s decisions?

Or, how about this: Drive your own [expletive redacted] car.

Published in Culture, Technology

There are 47 comments.

  1. DrewInWisconsin Member
    DrewInWisconsin
    @DrewInWisconsin

    When the people who want to force us all into driverless cars are the same ones who want to take away all guns, I daresay the game is given away.

    This line . . .

    If you want a vision of the future, Winston, imagine a boot, stamping on the floor where once there was an accelerator pedal, forever.

    . . . is why James Lileks is a national treasure.

    • #1
  2. James Gawron Inactive
    James Gawron
    @JamesGawron

    Rob,

    It doesn’t make any difference what the programmers choose to do. My lawyer just can’t wait for the self-driving car no matter how it’s programmed. He says that he intends to make a fortune (cookie) off of it. I’m not sure what he means by that.

    Whiplash Willie

    All I know is that he’s only lost the one case. I hope those car manufacturers know what they’re doing. Somebody could get hurt.

    Regards,

    Jim

    • #2
  3. Eric Hines Inactive
    Eric Hines
    @EricHines

    Rob Long: If a manufacturer offers different versions of its moral algorithm, and a buyer knowingly chose one of them, is the buyer to blame for the harmful consequences of the algorithm’s decisions?

    Nothing magic about this.  The user of the car is responsible for everything the car does, just as he would be for every other tool or toy he uses.

    Eric Hines

    • #3
  4. Don Tillman Member
    Don Tillman
    @DonTillman

    Aftermarket accessory.  Problem solved.  :-)

    • #4
  5. Aaron Miller Inactive
    Aaron Miller
    @AaronMiller

    Cars might not even be available for purchase in the future. They might be provided as services, rather than products. As with software, the user agreement might “reserve the right” to alter or cancel your “user experience” at any time.

    But this certainly gives new meaning to canceling your user experience.

    • #5
  6. Ontheleftcoast Inactive
    Ontheleftcoast
    @Ontheleftcoast

    Don’t forget, these are all going to be massively connected; expect all parties’ auto and medical insurance companies to be interested parties whose interconnected AIs will have a say in the outcome.

    Tesla is pioneering push updates for your onboard software; once cars with this capability are a significant presence on the roads, one of the first things to happen will probably be for the car to autonomously drive to a safe spot, lock the doors and call for the cops if any mandatory insurance (car, medical, whatever) has lapsed.

    • #6
  7. John Hanson Coolidge
    John Hanson
    @JohnHanson

    They’ll just lower the national speed limit to about 10 mph, or whatever speed guarantees, with high probability, that the occupants can’t be killed, then drive the car to protect those outside it.  The slow speed won’t be a problem, since the central planners will not let you leave your house except to work for the state anyway, and they will be working toward the day when no one has to go anywhere for anything.  One will sit at home on the dole, watch the can’t-shut-it-off TV whenever one is awake, and get all medical services, etc., from the robots, which will be working to ensure man is wiped from the face of the earth so the environmentalists will be happy.  The trips to work will end when you are moved into the workplace to live, until your job, too, can be ended.

    A rather gloomy view of the future, driven by the big step of removing the freedom a self-driven car allows.  It terrifies us when our kids get their licenses, but in a world without drivers it becomes way too easy for the visions of 1984 or Brave New World to actually be implemented.

    • #7
  8. Matty Van Inactive
    Matty Van
    @MattyVan

    James: “My lawyer just can’t wait for the self-driving car no matter how it’s programmed. He says that he intends to make a fortune (cookie) off of it. I’m not sure what he means by that.”

    MV: “Maybe this. If traffic deaths go from thirty thousand to three thousand, will car manufacturers get 27,000 thank-you notes? No. They’ll get three thousand lawsuits. Just like in medicine, lawyers will see this as an opportunity.”

    Drew: “When the people who want to force us all into driverless cars are the same ones who want to take away all guns, I daresay the game is given away.”

    MV: “But what if in some wonderful future we have private transportation systems rather than our current socialized ones? And the owner of a freeway decides to only allow driverless cars in order to improve the safety of his customers?”

    Personally, I can’t wait. I have things I’d much rather be doing with my travel time than driving. Or cruising for parking. I’d have it drop me off in front of my destination, send it off somewhere, and call it back when I need it. I imagine I’ll be sending it to pick up some groceries after calling in my order. Paying next to nothing for insurance (assuming we can keep lawyers out of the game). Or traveling a hundred and twenty miles an hour while reading a book on the above mentioned private freeway. The possibilities are endless when we let free markets work their magic.

    Or maybe I won’t even have the hassles of personal ownership. I’ll just call a driverless Uber whenever I need one. With millions of others doing the same, there will certainly be one waiting just around the corner.

    • #8
  9. Marion Evans Inactive
    Marion Evans
    @MarionEvans

    The car should have a programmed instinct for self-preservation, just as a person does. Otherwise, it will be bumping into all sorts of things when smaller dilemmas arise too. For example, swerving and losing a side mirror instead of honking at the old lady crossing the road. It should therefore do what causes the least damage to itself. Anything else will result in a lot of minor accidents.

    • #9
  10. Eric Hines Inactive
    Eric Hines
    @EricHines

    Matty Van: Personally, I can’t wait. I have things I’d much rather be doing with my travel time than driving.

    So do I.  But if I don’t have a manual override–I’m a better driver than any machine–I’m not interested.  This is what hackers are for.

    Eric Hines

    • #10
  11. Ball Diamond Ball Member
    Ball Diamond Ball
    @BallDiamondBall

    Rob Long: (quoted material) Is it acceptable for an autonomous vehicle to avoid a motorcycle by swerving into a wall, considering that the probability of survival is greater for the passenger of the car, than for the rider of the motorcycle?

    Not so simple.  Is the motorcycle showboating?  Is the car speeding?  Without taking into account the good or bad behavior of both sides, this will devolve into a system which can be gamed.  I submit that it can’t be done, because we only loosely manage to do it with people instead of computers.  Or to re-phrase that, by the time this algorithm is reliable in a sustainable way, humans won’t be making these sorts of decisions.

    • #11
  12. Ball Diamond Ball Member
    Ball Diamond Ball
    @BallDiamondBall

    Marion Evans:The car should have a programmed instinct for self-preservation, just as a person does. Otherwise, it will be bumping into all sorts of things when smaller dilemmas arise too. For example, swerving and losing a side mirror instead of honking at the old lady crossing the road. It should therefore do what causes the least damage to itself. Anything else will result in a lot of minor accidents.

    The car will know that that’s an endangered variant of possum but this is a common oak.

    • #12
  13. DrewInWisconsin Member
    DrewInWisconsin
    @DrewInWisconsin

    Matty Van: Personally, I can’t wait. I have things I’d much rather be doing with my travel time than driving. Or cruising for parking. I’d have it drop me off in front of my destination, send it off somewhere, and call it back when I need it. I imagine I’ll be sending it to pick up some groceries after calling in my order. Paying next to nothing for insurance (assuming we can keep lawyers out of the game). Or traveling a hundred and twenty miles an hour while reading a book on the above mentioned private freeway. The possibilities are endless when we let free markets work their magic.

    The fantasy of a world with driverless cars is 180 degrees flipped from the reality of such a world. The reality will have nothing to do with the free market or with your independence and everything to do with government central planning. Surely you have enough real-world experience to know this.

    • #13
  14. Matty Van Inactive
    Matty Van
    @MattyVan

    Drew, I hope you are wrong but fear you are right. As with computers, though, maybe this new technology can move too fast for govt to keep up, and offer too good a thing for people to allow too much govt interference.

    • #14
  15. Aaron Miller Inactive
    Aaron Miller
    @AaronMiller

    Squirrels will become as common as cockroaches without drivers to keep the population in check.

    • #15
  16. Ontheleftcoast Inactive
    Ontheleftcoast
    @Ontheleftcoast

    Goes to show that even though you’re a PhD chemist with an IQ around 160, you don’t necessarily see all the ramifications of your ideas:

    A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

    • #16
  17. Bob W Member
    Bob W
    @WBob

    What if a paper bag blows in front of the car? A person instantly knows what it is, and knows not to take possibly dangerous evasive maneuvers. Will the driverless car know what it is? Can it distinguish a bag from a dog from a child? Either it will be programmed to try to distinguish between these, in which case, unlike a person, it will not always be right and will therefore sometimes kill a dog or child. Or it will be programmed to always take evasive maneuvers without trying to figure out what the obstacle is. And that would get old quick. Either way, I think people are going to wish they could drive their own cars.

    Not to mention, everyone talks about this as if people only use cars to get from point A to point B. They don’t. How does a driverless car handle that? Let’s just drive around this part of the city and see what houses are for sale, or just get a feel for what the neighborhoods are like… The car could respond to voice directions, but when you think about how you actually drive in that type of situation, voicing it really sounds arduous. There’s going to be major buyer’s remorse with this gee-whiz novelty.

    • #17
  18. 1967mustangman Inactive
    1967mustangman
    @1967mustangman

    Can we please not descend into this mass hysteria every single time someone mentions a self-driving car?

    • The lawsuits aren’t going to bankrupt anyone (did the Pinto-related lawsuits bankrupt Ford?).  Self-driving cars will get in accidents at such a low rate that the automakers will simply buy insurance and increase the price they charge for the car by .005%.
    • No, they aren’t going to become a tool of an oppressive state, or they will immediately be “jailbroken.”
    • No, people aren’t going to be dying left and right in cars.  Yes, it might suck to die when the software malfunctions, but that will probably happen at a rate 1,000 to 4,000 times lower than the rate at which people currently die in cars.

    Seriously, people.  Chill out, pour yourself a bourbon, and imagine all the exciting Ricochet posts you could write during that 90-minute commute your car now does for you.

    • #18
  19. DrewInWisconsin Member
    DrewInWisconsin
    @DrewInWisconsin

    Matty Van:Drew, I hope you are wrong but fear you are right. As with computers, though, maybe this new technology can move too fast for govt to keep up, and offer too good a thing for people to allow too much govt interference.

    People don’t have a say in government anymore. : (

    • #19
  20. Percival Thatcher
    Percival
    @Percival

    Doctor Susan Calvin, please pick up the white courtesy phone…

    • #20
  21. Ball Diamond Ball Member
    Ball Diamond Ball
    @BallDiamondBall

    Percival:Doctor Susan Calvin, please pick up the white courtesy phone…

    Spiff?

    • #21
  22. Percival Thatcher
    Percival
    @Percival

    Ontheleftcoast:Goes to show that even though you’re a PhD chemist with an IQ around 160 you don’t necessarily see all the ramifications of your ideas:

    A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

    I scanned through too fast and missed that.

    • #22
  23. Percival Thatcher
    Percival
    @Percival

    Ball Diamond Ball:

    Percival:Doctor Susan Calvin, please pick up the white courtesy phone…

    Spiff?

    Nah.

    • #23
  24. Jim Kearney Member
    Jim Kearney
    @JimKearney

    1967mustangman: Seriously people.  Chill out, pour yourself a bourbon …

    Bourbon seems to have hired kale’s publicist. It’s everywhere!

    It’s probably been everywhere for a while. I’m a late-if-ever adopter. The bourbon blitz found me via Steve’s Small Batch Bourbon Vanilla Ice Cream on a recent high-pressure visit back east. Yum!

    Great piece, Rob! Wish George Orwell were still around to expand MIT Technology Review’s little foray into “algorithmic morality” into a series treatment for Syfy.

    • #24
  25. James Gawron Inactive
    James Gawron
    @JamesGawron

    Percival:

    Ball Diamond Ball:

    Percival:Doctor Susan Calvin, please pick up the white courtesy phone…

    Spiff?

    Nah.

    Dr. Asimov, please pick up the white courtesy phone…. Whoops, I think he’s dead already. Now who will explain the Three Laws of Robotics?

    I sure hope it isn’t this guy.

    Regards,

    Jim

    • #25
  26. Ontheleftcoast Inactive
    Ontheleftcoast
    @Ontheleftcoast

    Ball Diamond Ball: Not so simple. Is the motorcycle showboating? Is the car speeding? Without taking into account the good or bad behavior of both sides, this will devolve into a system which can be gamed. I submit that it can’t be done, because we only loosely manage to do it with people instead of computers. Or to re-phrase that, by the time this algorithm is reliable in a sustainable way, humans won’t be making these sorts of decisions.

    Sorry, showboaters’ lives matter, speeders’ lives matter. That part of the algorithm is easy.

    • #26
  27. Tim H. Inactive
    Tim H.
    @TimH

    Aaron Miller:Squirrels will become as common as cockroaches without drivers to keep the population in check.

    This ought to be on a bumper sticker.  It’s genius!  And I say that as someone fond of squirrels.

    • #27
  28. Marion Evans Inactive
    Marion Evans
    @MarionEvans

    Ball Diamond Ball:

    Marion Evans:The car should have a programmed instinct for self-preservation, just as a person does. Otherwise, it will be bumping into all sorts of things when smaller dilemmas arise too. For example, swerving and losing a side mirror instead of honking at the old lady crossing the road. It should therefore do what causes the least damage to itself. Anything else will result in a lot of minor accidents.

    The car will know that that’s an endangered variant of possum but this is a common oak.

    The point is: no one will ride in a car that would drive itself into a wall to save something or someone else.

    • #28
  29. Dustoff Inactive
    Dustoff
    @Dustoff

    Off topic, Rob, but you’ve got me thinking. Next time my Uber driver shows up, before entering I’ll ask the question: In the event of an unavoidable accident, do you intend to minimize the loss of life, even if it means sacrificing the life of your occupant…?

    Assuming he or she doesn’t immediately lock the doors, drive off, and give me a Five-Star Insanity rating, but instead answers the question, at least it will be a window into the soul. A programming feature surely missing in our future driverless cars.

    • #29
  30. Misthiocracy Member
    Misthiocracy
    @Misthiocracy

    What do human drivers normally do when a group of idiots steps into the middle of the street? I find it hard to believe they do not attempt to swerve out of the way.

    • #30