Is Tesla’s autopilot killing people?

Reading Time: 4 minutes
Confirmed: Tesla’s “autopilot” has already killed four people (Hi-News.ru)


If we look at the statistics, the story began much earlier, but 2018 was the year when 21 people died in accidents caused by Tesla’s autopilot (more than in any previous year), and the following year the number of autopilot victims grew to 50. Some will say that the autopilot was still “raw” back then and did not work properly, but unfortunately the deaths continue. Fatalities involving Teslas driven under automatic control were recorded in 2021 as well. In April of this year, two people in the United States died when a Tesla Model S crashed while the electric car was being controlled by autopilot. One of the most recent serious accidents occurred in November of this year, when a Tesla accelerated to 200 kilometers per hour, killed three road users and finally crashed into a stage; the driver was hospitalized. The car did not respond to the brake pedal, presumably because of a software glitch. Tesla representatives said they were willing to assist the investigation and would actively provide any help needed.

Elon Musk acknowledged the problem

It was only in August 2021 that Tesla CEO Elon Musk admitted there were serious problems with the operation of Tesla’s autopilot. At the same time, Musk clarified that version 9.2 was the problematic one and that the problems would supposedly be fixed in 9.3. But people were dying in Teslas even before version 9.2 was released…

In defense of Autopilot and Tesla

On the one hand, Tesla continues to kill people; on the other hand, drivers are at fault for not following the instructions. The instructions say that the driver must keep their hands on the steering wheel at all times and immediately take over control if a dangerous situation arises on the road. In other words, owners are told that they cannot rely entirely on Tesla’s autopilot.

Criticism of autopilot

On the other hand, if a company that aspires to be called the best electric car maker in the world and sets the fashion in the electric car industry creates an autopilot function, then it should work flawlessly. Otherwise, what is the point of an autopilot that has to be supervised 100% of the time? With or without autopilot, the driver must constantly monitor the road. So why do we need such an autopilot at all? Or are we still looking at a demo version, a work in progress? If so, why was it released as a finished product?
Usually, automakers recall entire batches of cars if something does not work correctly, yet here the component responsible for the safety of the driver, passengers and other road users does not work correctly! At the same time, Musk has no intention of disabling autopilot via software, and he does not even advise owners to stop using it. By saying that the next version will be better, he is effectively asking Tesla owners to pray for their lives and wait for a miracle update.

This is not the first time the author of the Dan O’Dowd Media channel has criticized Tesla, including for its decision to call a driver-assistance system “autopilot”. He conducted an experiment showing that Full Self-Driving is capable, even in simple conditions, of causing serious injury or even death to a pedestrian in a collision. The test was conducted in clear weather, with the car moving strictly in a straight line. A child-sized dummy was placed in the Tesla’s path; it was static and did not move or appear suddenly. In all three runs, the autopilot failed to recognize the obstacle and hit the human figure. Moreover, the system did not even attempt to slow down, which points to the “blindness” of Full Self-Driving in this case. Or is it a manufacturing defect?

Tesla “sees” the world around it with the help of stereo cameras that transmit a picture of what is happening “here and now”; it does not rely on pre-built 3D maps of the area. Dan O’Dowd’s experiment showed that this system has a serious flaw. He pointed out that more than one hundred thousand Tesla cars are already on the roads, and their drivers rely completely on Full Self-Driving. Tesla may make a mistake “at the most inopportune moment” on the very streets where children walk to school.
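To make that failure mode concrete, here is a minimal sketch, in Python, of how a camera-only perception loop can turn a missed detection into no reaction at all. This is not Tesla’s code: the names (`Detection`, `detect_objects`, `control_step`) and all thresholds are hypothetical, and the logic only illustrates the general point that the braking decision sits entirely downstream of detection.

```python
# A minimal, purely illustrative sketch (not Tesla's implementation): how a
# camera-only perception loop can turn a missed detection into "no braking".
# All names, classes and thresholds below are invented for illustration.

from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    label: str          # e.g. "pedestrian", "motorcycle"
    distance_m: float   # distance estimated from stereo disparity


def detect_objects(left_frame, right_frame) -> List[Detection]:
    """Placeholder for the vision model that turns stereo frames into detections.

    In a real system this would be a neural network. If the network fails to
    recognise an obstacle (the "blindness" described above), the object simply
    never appears in the returned list.
    """
    return []  # nothing detected in this hypothetical run


def control_step(left_frame, right_frame, speed_mps: float) -> str:
    """Decide whether to brake based only on what the cameras report."""
    braking_decel = 6.0  # assumed braking deceleration, m/s^2 (made up)
    safety_margin = 5.0  # assumed extra buffer, metres (made up)

    for det in detect_objects(left_frame, right_frame):
        stopping_distance = speed_mps ** 2 / (2 * braking_decel)
        if det.distance_m < stopping_distance + safety_margin:
            return "BRAKE"

    # No detection means no reaction: a perception miss silently becomes
    # "maintain speed", which matches the behaviour reported in the dummy test.
    return "MAINTAIN_SPEED"


if __name__ == "__main__":
    # A stationary dummy is ahead, but detect_objects misses it, so the
    # controller never issues a brake command.
    print(control_step(left_frame=None, right_frame=None, speed_mps=11.0))
```

The only point of the sketch is that if the vision model never reports the obstacle, nothing downstream ever asks for the brakes, which is consistent with the “did not even attempt to slow down” behaviour described above.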

“Tesla cars are killing people”: Huawei fired the head of its autonomous driving department over these words

Here are two more examples of Tesla’s autopilot not working properly.

Last month, two cases were recorded in which Tesla cars caused accidents on the motorway at night. Both ended with the death of a motorcyclist. It turned out that in some situations the automated driving system misreads the road situation and does not “see” motorcycles.
It is assumed that an automated driver assistance system is to blame for both accidents. In both cases the motorcyclist was riding in front of the car, and the car simply caught up with the motorcycle and crashed into it.
Michael Brooks, executive director of the Center for Auto Safety, called on the NHTSA (National Highway Traffic Safety Administration) to recall Tesla’s Autopilot because it does not recognize motorcyclists, ambulances or pedestrians. “It’s already clear to me and to many Tesla owners that this thing is not working properly, will not meet expectations and is endangering innocent people on the roads,” he said.

Whether the fault lies with the autopilot or with the driver is for you to decide, but so far the cases and the statistics show that there are clearly open questions about autopilot. As a car, a Tesla is not bad: compact and eco-friendly. But is it worth trusting it with your life, taking your hands off the steering wheel and simply enjoying the ride?

Sources:
https://eco-drive.biz/avtopilot-tesla-ubivaet-lyudej/
https://impakter.com/tesla-autopilot-crashes-with-at-least-a-dozen-dead-whos-fault-man-or-machine/
https://www.tesladeaths.com/index-amp.html


3 thoughts on “Is Tesla’s autopilot killing people?”

  1. 47524 says:

    Something like this needs a huge amount of data, and that can only come with time. In my opinion, we are nowhere close to cars being automated enough to drive us around safely.

  2. Mariia Golovchenko says:

    As for me, our society is not ready for a complete transition to self-driving cars at the moment, neither technologically nor from a moral point of view. However, in the future, introducing such technologies into our daily lives is the only way to achieve complete safety in this area, by eliminating the main cause of accidents, namely human error.

  3. 47544 says:

    I have watched many videos of Tesla autopilot crashes. Personally, I do not trust this feature, and I think it will take a lot of time to change my opinion.
