Tag Archives: autopilot

Is Tesla’s autopilot killing people?

Reading Time: 4 minutes
Confirmed: Tesla's "autopilot" has already killed four people (Hi-News.ru)


If we look at the statistics, it all started much earlier. But it was in 2018 that 21 people died in accidents caused by Tesla's autopilot (more than in previous years), and the following year the number of autopilot victims grew to 50. Someone will say that the autopilot was "raw" back then and did not work properly, but unfortunately the deaths continue. In 2021 there were also deaths resulting from Teslas driving under automatic control. In April of that year, two people in the United States died when a Tesla Model S crashed while the electric car was being controlled by autopilot. One of the most recent serious accidents occurred in November, when a Tesla accelerated to 200 kilometers per hour, killed three road users and finally crashed into a building; the driver was hospitalized. The electric car did not react to the brake pedal, presumably because of a software glitch. Tesla representatives expressed their willingness to assist the investigation and said they would actively provide the necessary help.

Elon Musk acknowledged the problem

It was only in August 2021 that Tesla CEO Elon Musk admitted there were serious problems with the operation of Tesla's autopilot. At the same time, Musk clarified that version 9.2 was the problematic one, and that the problems would allegedly be fixed in 9.3. But people were dying in Teslas even before version 9.2 was released…

In defense of Autopilot and Tesla

On the one hand, Tesla cars continue to kill people; on the other hand, drivers are at fault for not following the instructions. The manual says that the driver must keep his hands on the steering wheel at all times and, in the event of a dangerous situation on the road, immediately take control. In other words, owners are told that it is impossible to rely entirely on Tesla's autopilot.

Criticism of autopilot

On the other hand, if a company that aspires to be called the best electric-car developer in the world and sets the trends for the entire industry creates an autopilot feature, then it should work flawlessly. Otherwise, what is the point of an autopilot that has to be supervised 100% of the time? With or without autopilot, the driver must constantly monitor the road. So why do we need such an autopilot at all? Or are we still looking at a demo version, a work in progress? And if so, why was it released as a finished product?
Usually, automakers recall entire batches of cars if something does not work correctly, yet here the very component responsible for the safety of the driver, passengers and other road users malfunctions! At the same time, Musk does not even consider disabling autopilot in software, let alone asking owners to stop using it. By promising that the next version will be better, he is effectively asking Tesla owners to pray for their lives and wait for a miracle update.

This is not the first time the author of the Dan O'Dowd Media channel has criticized Tesla, including for its decision to call a driver-assistance system an "autopilot". He conducted an experiment showing that Full Self-Driving is capable, even in simple conditions, of seriously injuring or even killing a pedestrian in a collision. The test was conducted in clear weather, with the car moving strictly in a straight line. A child-sized dummy was placed in the Tesla's path; it was static and did not move or appear suddenly. In all three runs, the autopilot failed to recognize the obstacle and crashed into the human figure. Moreover, the system did not even attempt to slow down, which points to the "blindness" of Full Self-Driving in this scenario. Or is it a manufacturing defect?

Tesla "sees" the world around it with stereo cameras that transmit a picture of what is happening "here and now", and does not rely on pre-built 3D maps of the area. Dan O'Dowd's experiment showed that this system has a serious flaw. He recalled that more than one hundred thousand Tesla cars are on the roads, with drivers who rely entirely on Full Self-Driving, and that Tesla may make a mistake "at the most inopportune moment" on the very streets where children walk to school.

"Tesla cars are killing people": Huawei fired the head of its autonomous-driving department over these words

Here are two more examples of Tesla's autopilot not working properly.

Last month, two cases were recorded in which Tesla cars caused accidents on the motorway at night. Both ended with the death of a motorcyclist. It turned out that in some situations, automated driving systems incorrectly assess the road situation and do not "see" motorcycles.
An automated driver-assistance system is assumed to be to blame for both accidents. In both cases, the motorcyclists were riding in front of the cars, and the car simply caught up with the motorcycle and crashed into it.
Michael Brooks, executive director of the Center for Automotive Safety, called on the NHTSA (National Highway Traffic Safety Administration) to recall the Tesla autopilot because it does not recognize motorcyclists, ambulances or pedestrians. “It’s already clear to me and to many Tesla owners that this thing is not working properly, will not meet expectations and is endangering innocent people on the roads,” he said.

Whether the autopilot or the driver is at fault is for you to decide, but so far the cases and statistics indicate that there are clearly open questions about autopilot. As a car, a Tesla is not bad: compact and eco-friendly. But is it worth trusting it with your life, taking your hands off the steering wheel and just enjoying the ride?

Sources:

https://eco-drive.biz/avtopilot-tesla-ubivaet-lyudej/

https://impakter.com/tesla-autopilot-crashes-with-at-least-a-dozen-dead-whos-fault-man-or-machine/

https://www.tesladeaths.com/index-amp.html


Tesla, fatal accidents and false advertising

Reading Time: 3 minutes

Unfortunately, on 18 April 2021, two people died after crashing into a tree in a fatal accident in Woodlands, Texas. The car these two people were in was a Tesla, and there was no one in the driver's seat. This crash has sparked a discussion about whether Tesla is falsely advertising its cars' driving abilities, or at least whether its naming scheme is misleading. So far, Tesla has denied that the vehicle was operating in Full Self-Driving mode, because the vehicle didn't have an active subscription to that system, and has also said that, according to the car's data logs, Autopilot was not engaged either. People tried to disprove that the vehicle could have been operating in these modes by showing that if you unbuckle the seatbelt, the car warns the driver and eventually pulls over, as shown in this short video:

Video showing how a Tesla behaves after you unbuckle the driver's seatbelt

What the video fails to show is that this entire system can be cheated extremely easily by simply buckling the seatbelt behind your back. This disengages the seat's weight sensor (it foolishly assumes that when the seatbelt is buckled, someone is sitting there). The last thing separating you from "driving" from any other seat is cheating the sensors embedded in the steering wheel, which has been proven to be extremely easy (for example, by hanging a water bottle on the steering wheel).

Of course, Tesla shouldn't be held responsible for its clients' stupidity, but it should prevent them from exploiting these systems, especially this easily. Although Tesla's website now explicitly states that the Autopilot and Full Self-Driving systems are only driver-assistance, so-called "hands-on" (the steering wheel) systems, why name them as if they let you leave everything up to the car, when this is clearly not the case? Even recently, in emails to the California DMV, Tesla admitted that its supposed Full Self-Driving system, available only to 2,000 beta drivers, works only as a Level 2 (hands-on) autonomous system (on a scale from Level 0 to Level 5). In my opinion, all of the "autonomous" systems in Tesla's lineup are just Advanced Driver Assistance Systems and should be advertised as such, everywhere. The naming scheme is also highly suggestive, implying the cars can do much more than they actually can.

Elon Musk has defended the name "Autopilot" by pointing out that airplanes also have an autopilot system, and the pilot still has to be attentive and ready to take over. I think this argument is invalid, because pilots (especially commercial ones) are actually trained to use these features and must have a certain amount of experience before being allowed to operate such machines.

Although for now officials state that the car was not operating in any of Tesla's autonomous modes, how was it operating at all if there was no one in the driver's seat? Assuming the car actually had Autopilot engaged, this is not the first accident involving a Tesla operating in this mode, despite statistics saying that Tesla's Autopilot is almost 10 times safer than the average vehicle, with one accident every 4.19 million miles traveled with Autopilot engaged versus one crash every 484,000 miles for human-operated cars. Even though the number is impressive, it could be improved further if everyone driving a Tesla understood what the systems their cars are equipped with really are.
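Taking the crash-rate figures quoted above at face value, a quick back-of-the-envelope check (just a sketch using the numbers from this post, not an independent dataset) shows the ratio is closer to 8.7x than 10x:

```python
# Sanity check of the safety ratio using only the figures quoted above.
miles_per_accident_autopilot = 4_190_000  # one accident per 4.19M miles (Autopilot engaged)
miles_per_accident_human = 484_000        # one crash per 484k miles (human-operated)

ratio = miles_per_accident_autopilot / miles_per_accident_human
print(f"Autopilot is ~{ratio:.1f}x safer by this metric")  # ~8.7x, i.e. "almost 10 times"
```

Note that this comparison also ignores confounders (Autopilot is mostly engaged on highways, which are safer per mile than city streets), so the real gap may be smaller still.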

Stay safe, and remember to always pay attention to the road, whether you are using ADAS systems or not. Don’t text and drive. Stay hydrated.

Sources:

https://www.latimes.com/business/story/2021-04-19/tesla-on-autopilot-kills-two-where-are-the-regulators

https://finance.yahoo.com/news/tesla-crash-texas-leads-2-192437979.html?guccounter=1&guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&guce_referrer_sig=AQAAANNqmvFndKvefa11nrNSutIEtRYcKe0cBs3MiCvK5pUa4ut6f6FRN49hxcCd8hh2YmAmnOi-AMNa38sp_zUNterB_iwgMblh1VcHNy3cIUJOO9WPB7OnyugCswP5vkt4swfevVASDCE74Ac3BRgO3oVo5vAEfQ1LIAbM4D8aEfG9

https://edition.cnn.com/2021/04/19/business/tesla-fatal-crash-no-one-in-drivers-seat/index.html

https://www.thedrive.com/tech/39647/tesla-admits-current-full-self-driving-beta-will-always-be-a-level-2-system-emails


Tesla’s Autopilot: two contrasting cases

Reading Time: 2 minutes

Autopilot saves a life


Joshua Neally, a 37-year-old attorney from Springfield, claims that his new Tesla Model X saved his life thanks to its autopilot feature.

One day he was hurrying home from work to be on time for his daughter's birthday. Suddenly he felt something like a steel pole going through his chest. At the time, he didn't know he was having a pulmonary embolism. Neally had trouble concentrating on the road but calculated that it would be much faster to drive himself than to wait on the side of the road for an ambulance. Thanks to the built-in autopilot, he was able to let his Model X take the wheel for more than 20 miles of highway driving. He took over the steering on the final stretch, when he reached the off-ramp near the hospital in Branson. After reaching the hospital, he made his way to the emergency room.

He survived, and it might be partly thanks to the autopilot, as pulmonary embolism kills 50 thousand people a year, with seventy percent of those deaths coming within an hour of the first symptoms.

Neally says, "I'm very thankful I had it for this experience," and admits he is unsure how this story would have ended if he hadn't bought a car with autopilot a week prior to the incident.

 

Fatal accident 

In June, Joshua Brown, a 40-year-old from Canton, Ohio, was killed in an accident while using the autopilot of his Tesla Model S.

Before the accident, Brown used to say "I do drive it a LOT," as he used the Autopilot feature on his highway commute to work. During the first nine months, he put more than 45 thousand miles on his car. During that time, he uploaded several videos to YouTube of himself driving long distances on autopilot.


The fatal accident happened when Brown's Tesla, on autopilot, crashed into a truck, then hit a fence and a power pole. At the time of the accident, he was supposedly watching Harry Potter on a DVD player (later found by police in his car).

Why did the accident happen?

  • Brown was not paying any attention to the road
  • the car's sensor system failed to distinguish the truck against a bright sky (the truck's color was similar to the sky's)
  • Tesla uses cameras, radar (microwaves) and ultrasonic sensors; emergency braking is triggered only when both the radar and the vision system agree that there is an obstacle.

If any of these factors had been absent, the accident would not have happened. As for the third point, adding a laser scanner or swapping one in should have triggered emergency braking even in such a situation (this technology, called lidar, is used by Google in its cars).
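To illustrate the third point, here is a minimal sketch (my own illustration, not Tesla's actual code) of the sensor-agreement logic described above: when braking requires both radar and vision to agree, a single vision failure like the bright-sky case suppresses braking entirely, while an independent lidar vote would not be fooled by the truck's color.

```python
# Illustrative sketch only: simplified boolean sensor fusion, not Tesla's real logic.

def brake_radar_and_vision(radar_hit: bool, vision_hit: bool) -> bool:
    # Braking requires BOTH sensors to agree, so a vision miss
    # (truck blending into a bright sky) suppresses braking.
    return radar_hit and vision_hit

def brake_with_lidar(radar_hit: bool, vision_hit: bool, lidar_hit: bool) -> bool:
    # Lidar measures distance directly and doesn't depend on color or
    # contrast, so it can confirm the radar return on its own.
    return (radar_hit and vision_hit) or (radar_hit and lidar_hit)

# Brown's crash scenario: radar detects something, vision mistakes the
# white trailer for sky, and the car has no lidar.
print(brake_radar_and_vision(radar_hit=True, vision_hit=False))  # False: no braking
print(brake_with_lidar(radar_hit=True, vision_hit=False, lidar_hit=True))  # True: brakes
```

The design trade-off is that requiring two sensors to agree reduces false-positive "phantom braking", at the cost of missing obstacles that one sensor systematically cannot see.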

 

Are you a supporter of autopilot in cars?

Sources:

http://www.cnbc.com/2016/08/05/man-says-tesla-autopilot-saved-his-life-by-driving-him-to-the-hospital.html

http://tvn24bis.pl/moto,99/autopilot-w-tesli-dowiozl-kierowce-na-pogotowie,666778.html

http://www.nytimes.com/2016/07/02/business/joshua-brown-technology-enthusiast-tested-the-limits-of-his-tesla.html

http://www.ky3.com/content/news/Self-driving-Tesla-SUV-saves-the-day-389392262.html

http://www.nytimes.com/interactive/2016/07/01/business/inside-tesla-accident.html?_r=0
