Moral Dilemma That Self-Driving Cars and Our Society Face Today

Reading Time: 3 minutes

What Are Self-Driving Cars and How Do They Work? 

An autonomous car is a vehicle capable of sensing its environment and operating without human involvement, using a combination of cameras, sensors, radar and, of course, artificial intelligence (AI). That means a driverless car can go anywhere a traditional car goes and do everything an experienced human driver does.
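
To make that description a little more concrete, here is a minimal, purely illustrative sketch of the sense-plan-act loop such a vehicle runs many times per second. Every class and function name below is hypothetical; this is not code from any real autonomous-driving system.

```python
# Purely illustrative sketch of the sense -> plan -> act loop described
# above. Every class and function name here is hypothetical; this is not
# code from any real autonomous-driving system.

from dataclasses import dataclass, field

@dataclass
class SensorFrame:
    camera_objects: list = field(default_factory=list)  # objects seen by the cameras
    radar_tracks: list = field(default_factory=list)    # distance/speed tracks from radar
    lidar_points: list = field(default_factory=list)    # 3D points from lidar, if fitted

def perceive(frame: SensorFrame) -> list:
    """Fuse camera, radar and lidar data into one list of obstacles."""
    return frame.camera_objects + frame.radar_tracks + frame.lidar_points

def plan(obstacles: list) -> str:
    """Choose a manoeuvre given what was perceived."""
    return "brake" if obstacles else "keep_lane"

def act(manoeuvre: str) -> None:
    """Send steering/throttle/brake commands to the vehicle."""
    print(f"executing: {manoeuvre}")

# One iteration of the driving loop, repeated many times per second.
frame = SensorFrame(camera_objects=["pedestrian crossing ahead"])
act(plan(perceive(frame)))
```

In a real system each of these steps is a large subsystem in its own right (neural-network perception, trajectory planning, low-level control), but the overall loop has this shape.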

What is the most obvious advantage of bringing such technology into our everyday lives? The first thing that comes to mind is eradicating human error. The US Department of Transportation estimated that last year about 42,000 people died in traffic crashes in the US alone. Worldwide, 1.3 million people die every year in traffic accidents. If there were a way to eliminate 90 percent of those accidents (roughly 1.2 million lives a year), would you support it? Of course you would. This is what driverless car technology promises to achieve by eliminating the main source of accidents – human error.

However, is our society ready for that, given the moral dilemmas involved?

Human decision making is influenced by an enormous number of factors: driving experience, moral principles, life experience, psychological and physical condition at the moment of the accident, and so on. This shows that moral choices are not universal. Moreover, in the set of traffic rules we are accustomed to, there is a clear division of responsibility between the car manufacturer and the driver.

Now let me ask you a couple of questions…

How, then, should we train artificial intelligence to make decisions if we have no clear algorithm for the right choice? And who should take responsibility for the decisions that are made: the driver, the programmer or the manufacturer?

Let me try to illustrate the moral dilemma with a practical example of a no-win situation.

Try to imagine yourself in a self-driving car in the year 2029. You are sitting in the back watching a series on Netflix. All of a sudden, the car suffers a mechanical failure and is unable to stop. If it continues straight ahead, it will crash into a group of pedestrians crossing the street; it could instead swerve and hit a single bystander, killing them to save the pedestrians. Or what if the car could swerve into a wall, killing you, the passenger, in order to save those pedestrians? What should the car do, who should decide, and who should take responsibility for the car's actions?

Should the car stay on its course even if that harms more people, or take the action that minimizes total harm by killing a single human being?

This situation can be considered from different perspectives. From the point of view of an ordinary pedestrian, sacrificing the passenger would clearly be the right decision. Manufacturers of such technology, however, will most likely choose to protect the passenger and sacrifice the bystander instead. The reason is obvious: would you still be interested in buying such a car if it were programmed to kill you in an accident like this? I don't think so. That choice could shrink market demand or wipe it out entirely.
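
To see how differently these perspectives can be encoded, here is a small thought-experiment sketch. It is not real vehicle software; the option names, casualty counts and policy functions are all invented purely to make the disagreement visible.

```python
# A thought-experiment sketch, not real vehicle software: hypothetical
# "ethical policies" applied to the scenario above, with invented numbers.

options = {
    "continue_straight":   {"pedestrians": 5, "bystanders": 0, "passengers": 0},
    "swerve_to_bystander": {"pedestrians": 0, "bystanders": 1, "passengers": 0},
    "swerve_into_wall":    {"pedestrians": 0, "bystanders": 0, "passengers": 1},
}

def total_harm(outcome: dict) -> int:
    """Total number of people harmed by one manoeuvre."""
    return sum(outcome.values())

def minimize_total_harm(options: dict) -> str:
    """Utilitarian policy: pick the manoeuvre with the fewest casualties.
    It cannot separate the two swerve options (one life each), so the
    tie-break (dictionary order) is itself a hidden moral decision."""
    return min(options, key=lambda name: total_harm(options[name]))

def protect_passenger_first(options: dict) -> str:
    """A manufacturer-friendly policy: never sacrifice the passenger,
    then minimize harm among whatever manoeuvres remain."""
    safe = {name: o for name, o in options.items() if o["passengers"] == 0}
    return min(safe, key=lambda name: total_harm(safe[name]))

def sacrifice_passenger_first(options: dict) -> str:
    """The ordinary pedestrian's preferred policy: the people inside the
    car accepted the risk, so harm them before anyone outside it."""
    sparing_others = {name: o for name, o in options.items()
                      if o["pedestrians"] == 0 and o["bystanders"] == 0}
    return min(sparing_others, key=lambda name: total_harm(sparing_others[name]))

print(minimize_total_harm(options))       # -> swerve_to_bystander (tie broken by order)
print(protect_passenger_first(options))   # -> swerve_to_bystander
print(sacrifice_passenger_first(options)) # -> swerve_into_wall
```

Notice that the pure harm-minimizing policy cannot even distinguish the two swerve options (both cost exactly one life), so whatever tie-break it uses is itself a moral choice, which is exactly the dilemma described above.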

Even though we do not yet have the right solution, we must understand that technological progress is impossible without errors, mistakes and some sacrifices. At the moment, any solution may seem unacceptable, even terrible, but in order to make progress and achieve real safety in this area, we will have to accept some risk now to improve traffic safety in the future.

Let me know what you think in the comments!

Sources:

https://www.synopsys.com/automotive/what-is-autonomous-car.html

https://www.forbes.com/advisor/legal/auto-accident/car-accident-deaths/

One thought on “Moral Dilemma That Self-Driving Cars and Our Society Face Today”

  1. Wiktor Kuranowski says:

    The self-driving car industry, despite its high advancement, still has some big problems to deal with. And we are still far from fully autonomous cars. The problem you presented is hard to solve. The prospect of making a decision to kill a human being by artificial intelligence is terrifying. Personally, I don’t think there should be such a thing as fully self-driving cars. For me it should be limited to motorways or expressways to minimize the risk of such situations.
